Rel 6 1 mergeback (#3927)

* Bump for CVE (#3856)

* Bump for CVE

* Bump spring-data version

* Fix compile

* Cut over to spring bom

* Bump to RC1

* remove RC

* do not constrain reindex for common SP updates (#3876)

* only fast-track jobs with exactly one chunk (#3879)

* Fix IllegalStateException when an exception is thrown during stream response (#3882)

* Finish up changelog, minor refactor

* reset buffer only

* Hack for some replacements

* Failure handling

* wip

* Fixed the issue (#3845)

* Fixed the issue

* Changelog modification

* Changelog modification

* Implemented seventh character extended code and the corresponding dis… (#3709)

* Implemented seventh character extended code and the corresponding display

* Modifications

* Changes on previous test according to modifications made in ICD10-CM XML file

* Subscription sending delete events being skipped (#3888)

* fixed bug and added test

* refactor

* Update for CVE (#3895)

* updated pointcuts to work as intended (#3903)

* updated pointcuts to work as intended

* added changelog

* review fixes

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* 3904 during $delete expunge job hibernate search indexed documents are left orphaned (#3905)

* Add test and implementation

* Add changelog

* 3899 code in limits (#3901)

* Add implementation, changelog, test

* Update hapi-fhir-jpaserver-test-utilities/src/test/java/ca/uhn/fhir/jpa/provider/r4/ResourceProviderR4Test.java

Co-authored-by: Ken Stevens <khstevens@gmail.com>

Co-authored-by: Ken Stevens <khstevens@gmail.com>

* 3884 overlapping searchparameter undetected rel 6 1 (#3909)

* Applying all changes from previous dev branch to current one pointing to rel_6_1

* Fixing merge conflict related to Msg.code value.

* Fixing Msg.code value.

* Making checkstyle happy.

* Making sure that all tests are passing.

* Passing all tests after fixing Msg.code

* Passing all tests.

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* 3745 - fixed NPE for bundle with duplicate conditional create resourc… (#3746)

* 3745 - fixed NPE for bundle with duplicate conditional create resources and a conditional delete

* created unit test for skipping the delete operation while processing duplicate create entries

* moved unit test to FhirSystemDaoR4Test

* 3379 mdm fixes (#3906)

* added MdmLinkCreateSvcimplTest

* fixed creating mdm-link not setting the resource type correctly

* fixed a bug where ResourcePersistenceId was being duplicated instead of passed on

* Update hapi-fhir-jpaserver-mdm/src/test/java/ca/uhn/fhir/jpa/mdm/svc/MdmLinkCreateSvcImplTest.java

Change order of tests such that assertEquals takes expected value then actual value

Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>

* added changelog, also changed a setup function in test to beforeeach

Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>

* Fix to the issue (#3855)

* Fix to the issue

* Progress

* fixed the issue

* Addressing suggestions

* add response status code to MethodOutcome

* Addressing suggestions

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Fix for caching appearing broken in batch2 for bulkexport jobs (#3912)

* Respect caching in bulk export, fix bug with completed date on empty jobs

* add changelog

* Add impl

* Add breaking test

* Complete failing test

* more broken tests

* Fix more tests

* Fix paging bug

* Fix another brittle test

* 3915 do not collapse rules with filters (#3916)

* do not attempt to merge compartment permissions with filters

* changelog

* Rename to IT for concurrency problems

Co-authored-by: Tadgh <garygrantgraham@gmail.com>

* Version bump

* fix $mdm-submit output (#3917)

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Gl3407 bundle offset size (#3918)

* begin with failing test

* fixed

* change log

* rollback default count change and corresponding comments

Co-authored-by: Ken Stevens <ken@smilecdr.com>

* Offset interceptor now only works for external calls

* Initialize some beans (esp interceptors) later in the boot process so they don't slow down startup.

* do not reindex searchparam jobs on startup

* Fix oracle non-enterprise attempting online index add (#3925)

* 3922 delete expunge large dataset (#3923)

* lower batch size of delete requests so that we do not get SQL exceptions

* blah

* fix test

* updated tests to not fail

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* add index

* Fix up column grab

* Bump Version for non-breaking purposes

* Fix broken test

* bump error code

* revert internal logic

* Revert offset mode change

* wip

* Revert fix for null/system request details checks for reindex purposes

* Fix bug and add test for SP Validating Interceptor (#3930)

* wip

* Fix uptests

* Fix index online test

* Fix SP validating interceptor logic

Co-authored-by: JasonRoberts-smile <85363818+JasonRoberts-smile@users.noreply.github.com>
Co-authored-by: Qingyixia <106992634+Qingyixia@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: kateryna-mironova <107507153+kateryna-mironova@users.noreply.github.com>
Co-authored-by: longma1 <32119004+longma1@users.noreply.github.com>
Co-authored-by: Long Ma <long@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Tadgh committed via GitHub on 2022-08-18 17:21:27 -07:00
parent 7f7b7d6303
commit e7a4c49aac
162 changed files with 9739 additions and 455 deletions

@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -1346,7 +1346,8 @@ public enum Pointcut implements IPointcut {
"org.hl7.fhir.instance.model.api.IBaseResource",
"ca.uhn.fhir.rest.api.server.RequestDetails",
"ca.uhn.fhir.rest.server.servlet.ServletRequestDetails",
"ca.uhn.fhir.rest.api.server.storage.TransactionDetails"
"ca.uhn.fhir.rest.api.server.storage.TransactionDetails",
"ca.uhn.fhir.interceptor.model.RequestPartitionId"
),
/**

@@ -0,0 +1,37 @@
package ca.uhn.fhir.interceptor.model;
/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2022 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
public class PartitionIdRequestDetails {
/**
* The currently provided request partition
*/
private RequestPartitionId myRequestPartitionId;
public RequestPartitionId getRequestPartitionId() {
return myRequestPartitionId;
}
public void setRequestPartitionId(RequestPartitionId theRequestPartitionId) {
myRequestPartitionId = theRequestPartitionId;
}
}

@@ -26,7 +26,7 @@ import org.hl7.fhir.instance.model.api.IIdType;
import javax.annotation.Nullable;
public class ReadPartitionIdRequestDetails {
public class ReadPartitionIdRequestDetails extends PartitionIdRequestDetails {
private final String myResourceType;
private final RestOperationTypeEnum myRestOperationType;
@@ -67,6 +67,8 @@ public class ReadPartitionIdRequestDetails {
return myConditionalTargetOrNull;
}
public static ReadPartitionIdRequestDetails forSearchType(String theResourceType, Object theParams, IBaseResource theConditionalOperationTargetOrNull) {
return new ReadPartitionIdRequestDetails(theResourceType, RestOperationTypeEnum.SEARCH_TYPE, null, theParams, theConditionalOperationTargetOrNull);
}

@@ -39,6 +39,7 @@ public class MethodOutcome {
private IBaseResource myResource;
private Map<String, List<String>> myResponseHeaders;
private Collection<Runnable> myResourceViewCallbacks;
private int myResponseStatusCode;
/**
* Constructor
@@ -238,4 +239,11 @@
return myResource != null;
}
public void setStatusCode(int theResponseStatusCode) {
myResponseStatusCode = theResponseStatusCode;
}
public int getResponseStatusCode() {
return myResponseStatusCode;
}
}
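This hunk exposes the raw HTTP response status on MethodOutcome. A minimal sketch of how a caller might use the new accessor (the client setup and the 201 status for a create are assumptions, not part of this diff):

import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.Patient;

public class MethodOutcomeStatusExample {
	// theClient is assumed to be an IGenericClient already pointed at a FHIR endpoint
	static void createAndCheck(IGenericClient theClient) {
		MethodOutcome outcome = theClient.create().resource(new Patient()).execute();
		// New in this change: read the HTTP status of the server response
		if (outcome.getResponseStatusCode() == 201) {
			System.out.println("Created: " + outcome.getId().getValue());
		}
	}
}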

@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -3,14 +3,14 @@
<modelVersion>4.0.0</modelVersion>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-bom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<packaging>pom</packaging>
<name>HAPI FHIR BOM</name>
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -26,8 +26,11 @@ import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor;
import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.util.ParametersUtil;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.Options;
@@ -43,6 +46,7 @@ import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.instance.model.api.IBase;
import org.hl7.fhir.instance.model.api.IBaseParameters;
import org.hl7.fhir.instance.model.api.IBaseResource;
@@ -106,8 +110,6 @@ public class BulkImportCommand extends BaseCommand {
@Override
public void run(CommandLine theCommandLine) throws ParseException, ExecutionException {
ourEndNow = false;
parseFhirContext(theCommandLine);
String baseDirectory = theCommandLine.getOptionValue(SOURCE_DIRECTORY);
@@ -143,14 +145,43 @@
.execute();
ourLog.info("Got response: {}", myFhirCtx.newJsonParser().setPrettyPrint(true).encodeResourceToString(outcome));
ourLog.info("Bulk import is now running. Do not terminate this command until all files have been downloaded.");
ourLog.info("Bulk import is now running. Do not terminate this command until all files have been uploaded.");
checkJobComplete(outcome.getIdElement().toString(), client);
}
private void checkJobComplete(String url, IGenericClient client) {
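// Polls the bulk import poll-status operation until the server reports the job
// done (HTTP 200) or an error status is returned, at which point the loop exits.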
String jobId = url.substring(url.indexOf("=") + 1);
while (true) {
if (ourEndNow) {
MethodOutcome response;
// handle NullPointerException
if (jobId == null) {
ourLog.error("The jobId cannot be null.");
break;
}
}
try {
response = client
.operation()
.onServer()
.named(JpaConstants.OPERATION_IMPORT_POLL_STATUS)
.withSearchParameter(Parameters.class, "_jobId", new StringParam(jobId))
.returnMethodOutcome()
.execute();
} catch (InternalErrorException e){
// handle ERRORED status
ourLog.error(e.getMessage());
break;
}
if (response.getResponseStatusCode() == 200) {
break;
} else {
// still in progress
continue;
}
}
}
@Nonnull
@@ -252,9 +283,5 @@
}
}
public static void setEndNowForUnitTest(boolean theEndNow) {
ourEndNow = theEndNow;
}
}

@@ -3,7 +3,9 @@ package ca.uhn.fhir.cli;
import ca.uhn.fhir.batch2.api.IJobCoordinator;
import ca.uhn.fhir.batch2.jobs.imprt.BulkDataImportProvider;
import ca.uhn.fhir.batch2.jobs.imprt.BulkImportJobParameters;
import ca.uhn.fhir.batch2.model.JobInstance;
import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
import ca.uhn.fhir.batch2.model.StatusEnum;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
import ca.uhn.fhir.rest.server.interceptor.LoggingInterceptor;
@@ -13,6 +15,7 @@ import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.hl7.fhir.r4.model.InstantType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
@@ -34,20 +37,22 @@ import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Date;
import java.util.zip.GZIPOutputStream;
import static org.awaitility.Awaitility.await;
import static org.hamcrest.Matchers.equalTo;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.timeout;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
@ExtendWith(MockitoExtension.class)
public class BulkImportCommandTest {
public class BulkImportCommandIT {
private static final Logger ourLog = LoggerFactory.getLogger(BulkImportCommandTest.class);
private static final Logger ourLog = LoggerFactory.getLogger(BulkImportCommandIT.class);
static {
System.setProperty("test", "true");
@@ -78,7 +83,6 @@ public class BulkImportCommandTest {
public void afterEach() throws IOException {
ourLog.info("Deleting temp directory: {}", myTempDir);
FileUtils.deleteDirectory(myTempDir.toFile());
BulkImportCommand.setEndNowForUnitTest(true);
}
private Batch2JobStartResponse createJobStartResponse(String theId) {
@@ -90,13 +94,19 @@
@Test
public void testBulkImport() throws IOException {
JobInstance jobInfo = new JobInstance()
.setStatus(StatusEnum.COMPLETED)
.setCreateTime(parseDate("2022-01-01T12:00:00-04:00"))
.setStartTime(parseDate("2022-01-01T12:10:00-04:00"));
when(myJobCoordinator.getInstance(eq("THE-JOB-ID"))).thenReturn(jobInfo);
String fileContents1 = "{\"resourceType\":\"Observation\"}\n{\"resourceType\":\"Observation\"}";
String fileContents2 = "{\"resourceType\":\"Patient\"}\n{\"resourceType\":\"Patient\"}";
writeNdJsonFileToTempDirectory(fileContents1, "file1.json");
writeNdJsonFileToTempDirectory(fileContents2, "file2.json");
when(myJobCoordinator.startInstance(any()))
.thenReturn(createJobStartResponse("THE-JOB-ID"));
when(myJobCoordinator.startInstance(any())).thenReturn(createJobStartResponse("THE-JOB-ID"));
// Start the command in a separate thread
new Thread(() -> App.main(new String[]{
@@ -125,6 +135,13 @@
@Test
public void testBulkImport_GzippedFile() throws IOException {
JobInstance jobInfo = new JobInstance()
.setStatus(StatusEnum.COMPLETED)
.setCreateTime(parseDate("2022-01-01T12:00:00-04:00"))
.setStartTime(parseDate("2022-01-01T12:10:00-04:00"));
when(myJobCoordinator.getInstance(eq("THE-JOB-ID"))).thenReturn(jobInfo);
String fileContents1 = "{\"resourceType\":\"Observation\"}\n{\"resourceType\":\"Observation\"}";
String fileContents2 = "{\"resourceType\":\"Patient\"}\n{\"resourceType\":\"Patient\"}";
writeNdJsonFileToTempDirectory(fileContents1, "file1.json.gz");
@@ -177,6 +194,9 @@
}
}
}
private Date parseDate(String theString) {
return new InstantType(theString).getValue();
}
}

@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-cli</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../../hapi-deployable-pom</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -1514,6 +1514,7 @@ public class GenericClient extends BaseClient implements IGenericClient {
MethodOutcome retVal = new MethodOutcome();
retVal.setResource(response);
retVal.setCreatedUsingStatusCode(theResponseStatusCode);
retVal.setStatusCode(theResponseStatusCode);
retVal.setResponseHeaders(theHeaders);
return retVal;
}

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -0,0 +1,5 @@
---
type: fix
issue: 3379
title: "Fixed an issue where FindCandidateByLinkSvc still uses long to store persistent ids rather than ResourcePersistentId,
also fixed an issue where MdmLinkCreateSvc was not setting the resource type of MdmLinks properly"

@@ -0,0 +1,6 @@
---
type: fix
issue: 3706
jira: SMILE-1119
title: "Codes with the Seventh Character will be dynamically generated while uploading the ICD10-CM diagnosis code file,
and the corresponding extended description can be extracted using the code."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3745
jira: SMILE-4543
title: "Fixed a bug where NPE occurs while updating references for posting a bundle with duplicate resource in
conditional create and another conditional verb (delete)."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3837
jira: SMILE-4467
title: "Previously, when doing a GET operation with an _include as a bundle entry could occasionally miss returning some results.
This has been corrected, and queries like '/Medication?_include=Medication:ingredient' will now correctly include the relevant target resources."

@@ -0,0 +1,5 @@
---
type: fix
issue: 3854
title: "Previously, command prompt was not returned after initiating bulk import operation and even when the importing was completed.
This has been fixed, and the command prompt will return after the uploading process finished."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3875
title: "Previously, when modifying one of the common search parameters, if the `Mark Resources For Reindexing Upon
Search Parameter Change` configuration parameter was enabled, an invalid reindex batch job request would be created.
This has been fixed, and the batch job request will contain no URL constraints, causing all resources to be reindexed."

@@ -0,0 +1,5 @@
type: fix
issue: 3881
jira: SMILE-4083
title: "Previously, if an error was thrown after the outgoing response stream had started being written to, an OperationOutcome was not being returned. Instead, an HTML
servlet error was being thrown. This has been corrected."

@@ -0,0 +1,4 @@
---
type: fix
issue: 3884
title: "Prevent creating overlapping SearchParameter resources which would lead to random search behaviors."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3887
jira: SMILE-3975
title: "Removed a bug with subscriptions that supported sending deletes. Previously, if there was a previously-registered subscription which did not support deletes, that would shortcircuit processing and cause subsequent
subscriptions not to fire. This has been corrected."

@@ -0,0 +1,4 @@
---
type: fix
issue: 3899
title: "Searches using `code:in` and `code:not-in` will now expand the valueset up to the Maximum Expansion Size defined in the DaoConfig."

@@ -0,0 +1,13 @@
---
type: fix
issue: 3902
jira: SMILE-4500
title: "
Fixed bug where creating a new resource in partitioned mode using a PUT operation
invoked pointcut STORAGE_PARTITION_IDENTIFY_READ during validation. This caused
errors because read interceptors (listing on Pointcut.STORAGE_PARTITION_IDENTIFY_READ)
and write interceptors (listening on Pointcut.STORAGE_PARTITION_IDENTIFY_CREATE) could
return different partitions (say, all vs default).
Now, only the CREATE pointcut will be invoked, and the same partition will be used
for any reads during UPDATE.
"

@@ -0,0 +1,4 @@
---
type: fix
issue: 3887
title: "Previously, if Fulltext Search was enabled, and a `$delete-expunge` job was run, it could leave orphaned documents in the Fulltext index. This has been corrected."

@@ -0,0 +1,4 @@
---
type: fix
issue: 3911
title: "Fixed a bug where repeated calls to the same Bulk Export request would not return the cached result, but would instead start a new Bulk Export job."

@@ -0,0 +1,8 @@
---
type: fix
issue: 3915
title: "When multiple permissions exist for a user, granting access to the same compartment for different
owners, the permissions will be collapsed into one rule. Previously, if these permissions had filters, only the
filter of the first permission in the list would be applied, but it would apply to all of the owners.
This has been fixed by turning off the collapse function for permissions with filters, converting each one
to a separate rule."

@@ -0,0 +1,5 @@
---
type: fix
issue: 3918
title: "When offset searches are enforced on the server, not all of the search parameters were loading and this caused a NPE.
Now an ERROR will be logged instead."

@@ -0,0 +1,5 @@
---
type: fix
issue: 3922
title: "Fixed an issue with $delete-expunge that resulted in SQL errors on large amounts
of deleted data"

@@ -11,7 +11,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -4,7 +4,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -35,6 +35,7 @@ import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
import ca.uhn.fhir.jpa.entity.Batch2WorkChunkEntity;
import ca.uhn.fhir.jpa.util.JobInstanceUtil;
import ca.uhn.fhir.model.api.PagingIterator;
import ca.uhn.fhir.narrative.BaseThymeleafNarrativeGenerator;
import org.apache.commons.collections4.ListUtils;
import org.apache.commons.lang3.Validate;
import org.slf4j.Logger;
@@ -159,20 +160,30 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
}
@Override
public List<JobInstance> fetchInstances(FetchJobInstancesRequest theRequest, int theStart, int theBatchSize) {
public List<JobInstance> fetchInstances(FetchJobInstancesRequest theRequest, int thePage, int theBatchSize) {
String definitionId = theRequest.getJobDefinition();
String params = theRequest.getParameters();
Set<StatusEnum> statuses = theRequest.getStatuses();
Pageable pageable = Pageable.ofSize(theBatchSize).withPage(theStart);
Pageable pageable = PageRequest.of(thePage, theBatchSize);
// TODO - consider adding a new index... on the JobDefinitionId (and possibly Status)
List<JobInstance> instances = myJobInstanceRepository.findInstancesByJobIdAndParams(
definitionId,
params,
pageable
);
List<Batch2JobInstanceEntity> instanceEntities;
return instances == null ? new ArrayList<>() : instances;
if (statuses != null && !statuses.isEmpty()) {
instanceEntities = myJobInstanceRepository.findInstancesByJobIdParamsAndStatus(
definitionId,
params,
statuses,
pageable
);
} else {
instanceEntities = myJobInstanceRepository.findInstancesByJobIdAndParams(
definitionId,
params,
pageable
);
}
return toInstanceList(instanceEntities);
}
@Override

@@ -24,6 +24,7 @@ import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.svc.IBatch2DaoSvc;
import ca.uhn.fhir.jpa.api.svc.IDeleteExpungeSvc;
import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
import ca.uhn.fhir.jpa.dao.data.IResourceLinkDao;
import ca.uhn.fhir.jpa.dao.expunge.ResourceTableFKProvider;
import ca.uhn.fhir.jpa.dao.index.IJpaIdHelperService;
@@ -42,8 +43,8 @@ public class Batch2SupportConfig {
}
@Bean
public IDeleteExpungeSvc deleteExpungeSvc(EntityManager theEntityManager, DeleteExpungeSqlBuilder theDeleteExpungeSqlBuilder) {
return new DeleteExpungeSvcImpl(theEntityManager, theDeleteExpungeSqlBuilder);
public IDeleteExpungeSvc deleteExpungeSvc(EntityManager theEntityManager, DeleteExpungeSqlBuilder theDeleteExpungeSqlBuilder, IFulltextSearchSvc theFullTextSearchSvc) {
return new DeleteExpungeSvcImpl(theEntityManager, theDeleteExpungeSqlBuilder, theFullTextSearchSvc);
}
@Bean

@@ -156,6 +156,7 @@ import static org.apache.commons.lang3.StringUtils.isNotBlank;
public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends BaseHapiFhirDao<T> implements IFhirResourceDao<T> {
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(BaseHapiFhirResourceDao.class);
public static final String BASE_RESOURCE_NAME = "resource";
@Autowired
protected PlatformTransactionManager myPlatformTransactionManager;
@@ -333,6 +334,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
.add(IBaseResource.class, theResource)
.add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest)
.add(RequestPartitionId.class, theRequestPartitionId)
.add(TransactionDetails.class, theTransactionDetails);
doCallHooks(theTransactionDetails, theRequest, Pointcut.STORAGE_PRESTORAGE_RESOURCE_CREATED, hookParams);
@@ -976,7 +978,7 @@
protected void requestReindexForRelatedResources(Boolean theCurrentlyReindexing, List<String> theBase, RequestDetails theRequestDetails) {
// Avoid endless loops
if (Boolean.TRUE.equals(theCurrentlyReindexing)) {
if (Boolean.TRUE.equals(theCurrentlyReindexing) || shouldSkipReindex(theRequestDetails)) {
return;
}
@@ -984,11 +986,9 @@
ReindexJobParameters params = new ReindexJobParameters();
theBase
.stream()
.map(t -> t + "?")
.map(url -> myUrlPartitioner.partitionUrl(url, theRequestDetails))
.forEach(params::addPartitionedUrl);
if (!isCommonSearchParam(theBase)) {
addAllResourcesTypesToReindex(theBase, theRequestDetails, params);
}
ReadPartitionIdRequestDetails details = new ReadPartitionIdRequestDetails(null, RestOperationTypeEnum.EXTENDED_OPERATION_SERVER, null, null, null);
RequestPartitionId requestPartition = myRequestPartitionHelperService.determineReadPartitionForRequest(theRequestDetails, null, details);
@@ -1006,6 +1006,29 @@
mySearchParamRegistry.requestRefresh();
}
private boolean shouldSkipReindex(RequestDetails theRequestDetails) {
if (theRequestDetails == null) {
return false;
}
Object shouldSkip = theRequestDetails.getUserData().getOrDefault(JpaConstants.SKIP_REINDEX_ON_UPDATE, false);
return Boolean.parseBoolean(shouldSkip.toString());
}
private void addAllResourcesTypesToReindex(List<String> theBase, RequestDetails theRequestDetails, ReindexJobParameters params) {
theBase
.stream()
.map(t -> t + "?")
.map(url -> myUrlPartitioner.partitionUrl(url, theRequestDetails))
.forEach(params::addPartitionedUrl);
}
private boolean isCommonSearchParam(List<String> theBase) {
// If the base contains the special resource "Resource", this is a common SP that applies to all resources
return theBase.stream()
.map(String::toLowerCase)
.anyMatch(BASE_RESOURCE_NAME::equals);
}
@Override
@Transactional
public <MT extends IBaseMetaType> MT metaAddOperation(IIdType theResourceId, MT theMetaAdd, RequestDetails theRequest) {

@@ -445,4 +445,14 @@ public class FulltextSearchSvcImpl implements IFulltextSearchSvc {
}
@Override
public void deleteIndexedDocumentsByTypeAndId(Class theClazz, List<Object> theGivenIds) {
SearchSession session = Search.session(myEntityManager);
SearchIndexingPlan indexingPlan = session.indexingPlan();
for (Object givenId : theGivenIds) {
indexingPlan.purge(theClazz, givenId, null);
}
indexingPlan.process();
indexingPlan.execute();
}
}

@@ -100,5 +100,13 @@ public interface IFulltextSearchSvc {
boolean supportsAllOf(SearchParameterMap theParams);
/**
* Given a resource type that is indexed by Hibernate Search, and a list of objects representing the IDs you wish to delete,
* this method will delete the resources from the Hibernate Search index. This is useful for situations where a deletion occurred
* outside a Hibernate ORM session, leaving dangling documents in the index.
*
* @param theClazz The class, which must be annotated with {@link org.hibernate.search.mapper.pojo.mapping.definition.annotation.Indexed}
* @param theGivenIds The list of IDs for the given document type. Note that while this is a List<Object>, the type must match the type of the `@Id` field on the given class.
*/
void deleteIndexedDocumentsByTypeAndId(Class theClazz, List<Object> theGivenIds);
}
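The $delete-expunge service shown later in this diff is the intended caller; a minimal sketch of a direct call (assuming an injected service, with IDs matching ResourceTable's Long @Id field):

import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import java.util.Arrays;
import java.util.List;

public class FulltextPurgeExample {
	static void purgeOrphans(IFulltextSearchSvc theFulltextSvc) {
		// The IDs must be Longs because ResourceTable's @Id field is a Long
		List<Object> ids = Arrays.asList(123L, 456L);
		theFulltextSvc.deleteIndexedDocumentsByTypeAndId(ResourceTable.class, ids);
	}
}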

@@ -23,6 +23,7 @@ package ca.uhn.fhir.jpa.dao.data;
import ca.uhn.fhir.batch2.model.JobInstance;
import ca.uhn.fhir.batch2.model.StatusEnum;
import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
import org.hibernate.engine.jdbc.batch.spi.Batch;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Modifying;
@@ -46,22 +47,16 @@ public interface IBatch2JobInstanceRepository extends JpaRepository<Batch2JobIns
@Query("UPDATE Batch2JobInstanceEntity e SET e.myCurrentGatedStepId = :currentGatedStepId WHERE e.myId = :id")
void updateInstanceCurrentGatedStepId(@Param("id") String theInstanceId, @Param("currentGatedStepId") String theCurrentGatedStepId);
@Query(
value = "SELECT * from Batch2JobInstanceEntity WHERE DEFINITION_ID = :defId AND PARAMS_JSON = :params AND STAT IN( :stats )",
nativeQuery = true
)
List<JobInstance> findInstancesByJobIdParamsAndStatus(
@Query("SELECT b from Batch2JobInstanceEntity b WHERE b.myDefinitionId = :defId AND b.myParamsJson = :params AND b.myStatus IN( :stats )")
List<Batch2JobInstanceEntity> findInstancesByJobIdParamsAndStatus(
@Param("defId") String theDefinitionId,
@Param("params") String theParams,
@Param("stats") Set<StatusEnum> theStatus,
Pageable thePageable
);
@Query(
value = "SELECT * from Batch2JobInstanceEntity WHERE DEFINITION_ID = :defId AND PARAMS_JSON = :params",
nativeQuery = true
)
List<JobInstance> findInstancesByJobIdAndParams(
@Query("SELECT b from Batch2JobInstanceEntity b WHERE b.myDefinitionId = :defId AND b.myParamsJson = :params")
List<Batch2JobInstanceEntity> findInstancesByJobIdAndParams(
@Param("defId") String theDefinitionId,
@Param("params") String theParams,
Pageable thePageable

@@ -21,6 +21,8 @@ package ca.uhn.fhir.jpa.delete.batch2;
*/
import ca.uhn.fhir.jpa.api.svc.IDeleteExpungeSvc;
import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -29,6 +31,7 @@ import org.springframework.transaction.annotation.Transactional;
import javax.persistence.EntityManager;
import java.util.List;
import java.util.stream.Collectors;
@Transactional(propagation = Propagation.MANDATORY)
public class DeleteExpungeSvcImpl implements IDeleteExpungeSvc {
@@ -36,10 +39,12 @@ public class DeleteExpungeSvcImpl implements IDeleteExpungeSvc {
private final EntityManager myEntityManager;
private final DeleteExpungeSqlBuilder myDeleteExpungeSqlBuilder;
private final IFulltextSearchSvc myFullTextSearchSvc;
public DeleteExpungeSvcImpl(EntityManager theEntityManager, DeleteExpungeSqlBuilder theDeleteExpungeSqlBuilder) {
public DeleteExpungeSvcImpl(EntityManager theEntityManager, DeleteExpungeSqlBuilder theDeleteExpungeSqlBuilder, IFulltextSearchSvc theFullTextSearchSvc) {
myEntityManager = theEntityManager;
myDeleteExpungeSqlBuilder = theDeleteExpungeSqlBuilder;
myFullTextSearchSvc = theFullTextSearchSvc;
}
@Override
@@ -52,9 +57,24 @@
ourLog.trace("Executing sql " + sql);
totalDeleted += myEntityManager.createNativeQuery(sql).executeUpdate();
}
ourLog.info("{} records deleted", totalDeleted);
clearHibernateSearchIndex(thePersistentIds);
// TODO KHS instead of logging progress, produce result chunks that get aggregated into a delete expunge report
}
/**
* If we are running with HS enabled, the expunge operation will cause dangling documents because Hibernate Search is not aware of custom SQL queries that delete resources.
* This method clears the Hibernate Search index for the given resources.
*/
private void clearHibernateSearchIndex(List<ResourcePersistentId> thePersistentIds) {
if (myFullTextSearchSvc != null) {
List<Object> objectIds = thePersistentIds.stream().map(ResourcePersistentId::getIdAsLong).collect(Collectors.toList());
myFullTextSearchSvc.deleteIndexedDocumentsByTypeAndId(ResourceTable.class, objectIds);
ourLog.info("Cleared Hibernate Search indexes.");
}
}
}

@@ -23,7 +23,10 @@ package ca.uhn.fhir.jpa.interceptor;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import org.apache.commons.lang3.Validate;
/**
@@ -43,7 +46,8 @@ public class ForceOffsetSearchModeInterceptor {
}
@Hook(Pointcut.STORAGE_PRESEARCH_REGISTERED)
public void storagePreSearchRegistered(SearchParameterMap theMap) {
public void storagePreSearchRegistered(SearchParameterMap theMap, RequestDetails theRequestDetails) {
if (theMap.getOffset() == null) {
theMap.setOffset(0);
}
@@ -52,4 +56,5 @@
}
}
}

@@ -1134,7 +1134,7 @@ public class SearchBuilder implements ISearchBuilder {
continue;
}
paths = param.getPathsSplitForResourceType(resType);
paths = theReverseMode ? param.getPathsSplitForResourceType(resType) : param.getPathsSplit();
String targetResourceType = defaultString(nextInclude.getParamTargetType(), null);
for (String nextPath : paths) {

@@ -32,6 +32,7 @@ import ca.uhn.fhir.context.support.ValidationSupportContext;
import ca.uhn.fhir.context.support.ValueSetExpansionOptions;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.dao.predicate.SearchFilterParser;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndexedSearchParam;
import ca.uhn.fhir.jpa.model.entity.ModelConfig;
@@ -90,6 +91,8 @@ public class TokenPredicateBuilder extends BaseSearchParamPredicateBuilder {
private ModelConfig myModelConfig;
@Autowired
private FhirContext myContext;
@Autowired
private DaoConfig myDaoConfig;
/**
* Constructor
@@ -177,7 +180,10 @@
if (modifier == TokenParamModifier.IN || modifier == TokenParamModifier.NOT_IN) {
if (myContext.getVersion().getVersion().isNewerThan(FhirVersionEnum.DSTU2)) {
IValidationSupport.ValueSetExpansionOutcome expanded = myValidationSupport.expandValueSet(new ValidationSupportContext(myValidationSupport), new ValueSetExpansionOptions(), code);
ValueSetExpansionOptions valueSetExpansionOptions = new ValueSetExpansionOptions();
valueSetExpansionOptions.setCount(myDaoConfig.getMaximumExpansionSize());
IValidationSupport.ValueSetExpansionOutcome expanded = myValidationSupport.expandValueSet(new ValidationSupportContext(myValidationSupport), valueSetExpansionOptions, code);
codes.addAll(extractValueSetCodes(expanded.getValueSet()));
} else {
codes.addAll(myTerminologySvc.expandValueSetIntoConceptList(null, code));

@@ -37,6 +37,14 @@ public class Icd10CmLoader {
private final TermCodeSystemVersion myCodeSystemVersion;
private int myConceptCount;
private static final String SEVEN_CHR_DEF = "sevenChrDef";
private static final String VERSION = "version";
private static final String EXTENSION = "extension";
private static final String CHAPTER = "chapter";
private static final String SECTION = "section";
private static final String DIAG = "diag";
private static final String NAME = "name";
private static final String DESC = "desc";
/**
* Constructor
@@ -73,8 +81,8 @@
private void extractCode(Element theDiagElement, TermConcept theParentConcept) {
String code = theDiagElement.getElementsByTagName("name").item(0).getTextContent();
String display = theDiagElement.getElementsByTagName("desc").item(0).getTextContent();
String code = theDiagElement.getElementsByTagName(NAME).item(0).getTextContent();
String display = theDiagElement.getElementsByTagName(DESC).item(0).getTextContent();
TermConcept concept;
if (theParentConcept == null) {
@@ -86,13 +94,50 @@
concept.setCode(code);
concept.setDisplay(display);
for (Element nextChildDiag : XmlUtil.getChildrenByTagName(theDiagElement, "diag")) {
for (Element nextChildDiag : XmlUtil.getChildrenByTagName(theDiagElement, DIAG)) {
extractCode(nextChildDiag, concept);
if (XmlUtil.getChildrenByTagName(theDiagElement, SEVEN_CHR_DEF).size() != 0){
extractExtension(theDiagElement, nextChildDiag, concept);
}
}
myConceptCount++;
}
private void extractExtension(Element theDiagElement, Element theChildDiag, TermConcept theParentConcept) {
for (Element nextChrNote : XmlUtil.getChildrenByTagName(theDiagElement, SEVEN_CHR_DEF)){
for (Element nextExtension : XmlUtil.getChildrenByTagName(nextChrNote, EXTENSION)){
String baseCode = theChildDiag.getElementsByTagName(NAME).item(0).getTextContent();
String sevenChar = nextExtension.getAttributes().item(0).getNodeValue();
String baseDef = theChildDiag.getElementsByTagName(DESC).item(0).getTextContent();
String sevenCharDef = nextExtension.getTextContent();
TermConcept concept = theParentConcept.addChild(TermConceptParentChildLink.RelationshipTypeEnum.ISA);
concept.setCode(getExtendedCode(baseCode, sevenChar));
concept.setDisplay(getExtendedDisplay(baseDef, sevenCharDef));
}
}
}
private String getExtendedDisplay(String theBaseDef, String theSevenCharDef) {
return theBaseDef + ", " + theSevenCharDef;
}
/**
* The Seventh Character must be placed at the seventh position of the code
* If the base code has fewer than seven characters, "X" placeholders are appended until it reaches seven
*/
private String getExtendedCode(String theBaseCode, String theSevenChar) {
String placeholder = "X";
String code = theBaseCode;
for (int i = code.length(); i < 7; i++){
code += placeholder;
}
code += theSevenChar;
return code;
}
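// Worked example (illustrative values, not from this diff):
//   getExtendedCode("A00.0", "A")   -> "A00.0XXA" (five characters, padded with two "X" placeholders)
//   getExtendedCode("S06.0X1", "A") -> "S06.0X1A" (already seven characters, no padding needed)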
public int getConceptCount() {
return myConceptCount;

@@ -254,10 +254,11 @@ class JpaJobPersistenceImplTest {
// setup
int pageStart = 1;
int pageSize = 132;
JobInstance job1 = createJobInstanceWithDemoData();
FetchJobInstancesRequest req = new FetchJobInstancesRequest(job1.getInstanceId(), "params");
JobInstance job2 = createJobInstanceWithDemoData();
List<JobInstance> instances = Arrays.asList(job1, job2);
Batch2JobInstanceEntity job1 = createBatch2JobInstanceEntity();
FetchJobInstancesRequest req = new FetchJobInstancesRequest(job1.getId(), "params");
Batch2JobInstanceEntity job2 = createBatch2JobInstanceEntity();
List<Batch2JobInstanceEntity> instances = Arrays.asList(job1, job2);
// when
when(myJobInstanceRepository
@@ -271,7 +272,8 @@
// verify
assertEquals(instances.size(), retInstances.size());
assertEquals(instances, retInstances);
assertEquals(instances.get(0).getId(), retInstances.get(0).getInstanceId());
assertEquals(instances.get(1).getId(), retInstances.get(1).getInstanceId());
ArgumentCaptor<Pageable> pageableCaptor = ArgumentCaptor.forClass(Pageable.class);
verify(myJobInstanceRepository)

@@ -7,7 +7,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>
@@ -24,7 +24,6 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-jpa</artifactId>
<version>${spring_data_version}</version>
</dependency>
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>

@@ -89,6 +89,7 @@ public class MdmLinkCreateSvcImpl implements IMdmLinkCreateSvc {
IMdmLink mdmLink = myMdmLinkDaoSvc.getOrCreateMdmLinkByGoldenResourceAndSourceResource(theGoldenResource, theSourceResource);
mdmLink.setLinkSource(MdmLinkSourceEnum.MANUAL);
mdmLink.setMdmSourceType(sourceType);
if (theMatchResult == null) {
mdmLink.setMatchResult(MdmMatchResultEnum.MATCH);
} else {

@@ -51,7 +51,7 @@ public class FindCandidateByLinkSvc extends BaseCandidateFinder {
if (targetPid != null) {
Optional<? extends IMdmLink> oLink = myMdmLinkDaoSvc.getMatchedLinkForSourcePid(targetPid);
if (oLink.isPresent()) {
ResourcePersistentId goldenResourcePid = new ResourcePersistentId(oLink.get().getGoldenResourcePersistenceId().getIdAsLong());
ResourcePersistentId goldenResourcePid = oLink.get().getGoldenResourcePersistenceId();
ourLog.debug("Resource previously linked. Using existing link.");
retval.add(new MatchedGoldenResourceCandidate(goldenResourcePid, oLink.get()));
}

@@ -0,0 +1,93 @@
package ca.uhn.fhir.jpa.mdm.svc;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.api.svc.IIdHelperService;
import ca.uhn.fhir.jpa.entity.MdmLink;
import ca.uhn.fhir.jpa.mdm.dao.MdmLinkDaoSvc;
import ca.uhn.fhir.jpa.mdm.util.MdmPartitionHelper;
import ca.uhn.fhir.mdm.api.IMdmLink;
import ca.uhn.fhir.mdm.api.IMdmSettings;
import ca.uhn.fhir.mdm.api.MdmLinkSourceEnum;
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
import ca.uhn.fhir.mdm.model.MdmTransactionContext;
import ca.uhn.fhir.mdm.util.MdmResourceUtil;
import ca.uhn.fhir.mdm.util.MessageHelper;
import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
import org.hl7.fhir.r4.model.Patient;
import org.jetbrains.annotations.NotNull;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Spy;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.when;
@ExtendWith(MockitoExtension.class)
class MdmLinkCreateSvcImplTest {
@Spy
FhirContext myFhirContext = FhirContext.forR4();
@Mock
IIdHelperService myIdHelperService;
@Mock
MdmLinkDaoSvc myMdmLinkDaoSvc;
@Mock
IMdmSettings myMdmSettings;
@Mock
MessageHelper myMessageHelper;
@Mock
MdmPartitionHelper myMdmPartitionHelper;
@InjectMocks
MdmLinkCreateSvcImpl myMdmLinkCreateSvc = new MdmLinkCreateSvcImpl();
@Test
public void testCreateLink(){
ArgumentCaptor<IMdmLink> mdmLinkCaptor = ArgumentCaptor.forClass(IMdmLink.class);
when(myMdmLinkDaoSvc.save(mdmLinkCaptor.capture())).thenReturn(new MdmLink());
Patient goldenPatient = new Patient().setActive(true);
MdmResourceUtil.setMdmManaged(goldenPatient);
MdmResourceUtil.setGoldenResource(goldenPatient);
Patient sourcePatient = new Patient();
MdmTransactionContext ctx = new MdmTransactionContext();
myMdmLinkCreateSvc.createLink(goldenPatient, sourcePatient, MdmMatchResultEnum.MATCH, ctx);
IMdmLink mdmLink = mdmLinkCaptor.getValue();
assertEquals(MdmLinkSourceEnum.MANUAL, mdmLink.getLinkSource());
assertEquals("Patient", mdmLink.getMdmSourceType());
}
@BeforeEach
private void setup() {
ResourcePersistentId goldenId = new ResourcePersistentId(1L);
ResourcePersistentId sourceId = new ResourcePersistentId(2L);
when(myIdHelperService.getPidOrThrowException(any()))
.thenReturn(goldenId, sourceId);
when(myMdmLinkDaoSvc.getLinkByGoldenResourcePidAndSourceResourcePid(any(ResourcePersistentId.class), any(ResourcePersistentId.class))).thenReturn(Optional.empty());
when(myMdmLinkDaoSvc.getMdmLinksBySourcePidAndMatchResult(any(ResourcePersistentId.class), any())).thenReturn(new ArrayList<>());
MdmLink resultMdmLink = new MdmLink();
resultMdmLink.setGoldenResourcePersistenceId(goldenId).setSourcePersistenceId(sourceId);
when(myMdmLinkDaoSvc.getOrCreateMdmLinkByGoldenResourceAndSourceResource(any(), any())).thenReturn(resultMdmLink);
when(myMdmSettings.isSupportedMdmType(any())).thenReturn(true);
}
}

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -267,6 +267,8 @@ public class JpaConstants {
*/
public static final String HEADER_REWRITE_HISTORY = "X-Rewrite-History";
public static final String SKIP_REINDEX_ON_UPDATE = "SKIP-REINDEX-ON-UPDATE";
/**
* Non-instantiable
*/

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -31,9 +31,11 @@ import org.quartz.JobExecutionContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Service;
import javax.annotation.PostConstruct;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
@@ -63,7 +65,8 @@ public class ResourceChangeListenerCacheRefresherImpl implements IResourceChange
@Autowired
private ResourceChangeListenerRegistryImpl myResourceChangeListenerRegistry;
@PostConstruct
@EventListener(classes = {ContextRefreshedEvent.class})
@Order
public void start() {
ScheduledJobDefinition jobDetail = new ScheduledJobDefinition();
jobDetail.setId(getClass().getName());

@@ -25,9 +25,11 @@ import ca.uhn.fhir.interceptor.api.IInterceptorService;
import ca.uhn.fhir.interceptor.api.Pointcut;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Service;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
/**
@@ -42,7 +44,9 @@ public class ResourceChangeListenerRegistryInterceptor {
@Autowired
private IResourceChangeListenerRegistry myResourceChangeListenerRegistry;
@PostConstruct
@EventListener(classes = {ContextRefreshedEvent.class})
@Order
public void start() {
myInterceptorBroadcaster.registerInterceptor(this);
}

@@ -148,16 +148,18 @@ public class SearchParamRegistryImpl implements ISearchParamRegistry, IResourceC
ourLog.info("Rebuilding SearchParamRegistry");
SearchParameterMap params = new SearchParameterMap();
params.setLoadSynchronousUpTo(MAX_MANAGED_PARAM_COUNT);
params.setCount(MAX_MANAGED_PARAM_COUNT);
IBundleProvider allSearchParamsBp = mySearchParamProvider.search(params);
List<IBaseResource> allSearchParams = allSearchParamsBp.getResources(0, MAX_MANAGED_PARAM_COUNT);
int size = allSearchParamsBp.sizeOrThrowNpe();
Integer size = allSearchParamsBp.size();
ourLog.trace("Loaded {} search params from the DB", size);
ourLog.trace("Loaded {} search params from the DB", allSearchParams.size());
// Just in case..
if (size >= MAX_MANAGED_PARAM_COUNT) {
if (size == null) {
ourLog.error("Only {} search parameters have been loaded, but there are more than that in the repository. Is offset search configured on this server?", allSearchParams.size());
} else if (size >= MAX_MANAGED_PARAM_COUNT) {
ourLog.warn("Unable to support >" + MAX_MANAGED_PARAM_COUNT + " search params!");
}
@@ -269,11 +271,9 @@
/**
*
* There is a circular reference between this class and the ResourceChangeListenerRegistry:
* SearchParamRegistryImpl -> ResourceChangeListenerRegistry -> InMemoryResourceMatcher -> SearchParamRegistryImpl. Since we only need this once on boot-up, we delay
* until ContextRefreshedEvent.
*
*/
@PostConstruct
public void registerListener() {

@@ -5,7 +5,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE1-SNAPSHOT</version>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>

@@ -122,93 +122,13 @@ public class SubscriptionMatchingSubscriber implements MessageHandler {
Collection<ActiveSubscription> subscriptions = mySubscriptionRegistry.getAll();
ourLog.trace("Testing {} subscriptions for applicability", subscriptions.size());
boolean resourceMatched = false;
boolean anySubscriptionsMatchedResource = false;
for (ActiveSubscription nextActiveSubscription : subscriptions) {
// skip if the partitions don't match
CanonicalSubscription subscription = nextActiveSubscription.getSubscription();
if (subscription != null && theMsg.getPartitionId() != null &&
theMsg.getPartitionId().hasPartitionIds() && !subscription.getCrossPartitionEnabled() &&
!theMsg.getPartitionId().hasPartitionId(subscription.getRequestPartitionId())) {
continue;
}
String nextSubscriptionId = getId(nextActiveSubscription);
if (isNotBlank(theMsg.getSubscriptionId())) {
if (!theMsg.getSubscriptionId().equals(nextSubscriptionId)) {
// TODO KHS we should use a hash to look it up instead of this full table scan
ourLog.debug("Ignoring subscription {} because it is not {}", nextSubscriptionId, theMsg.getSubscriptionId());
continue;
}
}
if (!resourceTypeIsAppropriateForSubscription(nextActiveSubscription, resourceId)) {
continue;
}
if (theMsg.getOperationType().equals(DELETE)) {
if (!nextActiveSubscription.getSubscription().getSendDeleteMessages()) {
ourLog.trace("Not processing modified message for {}", theMsg.getOperationType());
return;
}
}
InMemoryMatchResult matchResult;
if (nextActiveSubscription.getCriteria().getType() == SubscriptionCriteriaParser.TypeEnum.SEARCH_EXPRESSION) {
matchResult = mySubscriptionMatcher.match(nextActiveSubscription.getSubscription(), theMsg);
if (!matchResult.matched()) {
ourLog.trace("Subscription {} was not matched by resource {} {}",
nextActiveSubscription.getId(),
resourceId.toUnqualifiedVersionless().getValue(),
matchResult.isInMemory() ? "in-memory" : "by querying the repository");
continue;
}
ourLog.debug("Subscription {} was matched by resource {} {}",
nextActiveSubscription.getId(),
resourceId.toUnqualifiedVersionless().getValue(),
matchResult.isInMemory() ? "in-memory" : "by querying the repository");
} else {
ourLog.trace("Subscription {} was not matched by resource {} - No search expression",
nextActiveSubscription.getId(),
resourceId.toUnqualifiedVersionless().getValue());
matchResult = InMemoryMatchResult.successfulMatch();
matchResult.setInMemory(true);
}
IBaseResource payload = theMsg.getNewPayload(myFhirContext);
EncodingEnum encoding = null;
if (subscription != null && subscription.getPayloadString() != null && !subscription.getPayloadString().isEmpty()) {
encoding = EncodingEnum.forContentType(subscription.getPayloadString());
}
encoding = defaultIfNull(encoding, EncodingEnum.JSON);
ResourceDeliveryMessage deliveryMsg = new ResourceDeliveryMessage();
deliveryMsg.setPartitionId(theMsg.getPartitionId());
if (payload != null) {
deliveryMsg.setPayload(myFhirContext, payload, encoding);
} else {
deliveryMsg.setPayloadId(theMsg.getPayloadId(myFhirContext));
}
deliveryMsg.setSubscription(subscription);
deliveryMsg.setOperationType(theMsg.getOperationType());
deliveryMsg.setTransactionId(theMsg.getTransactionId());
deliveryMsg.copyAdditionalPropertiesFrom(theMsg);
// Interceptor call: SUBSCRIPTION_RESOURCE_MATCHED
HookParams params = new HookParams()
.add(CanonicalSubscription.class, nextActiveSubscription.getSubscription())
.add(ResourceDeliveryMessage.class, deliveryMsg)
.add(InMemoryMatchResult.class, matchResult);
if (!myInterceptorBroadcaster.callHooks(Pointcut.SUBSCRIPTION_RESOURCE_MATCHED, params)) {
return;
}
resourceMatched |= sendToDeliveryChannel(nextActiveSubscription, deliveryMsg);
anySubscriptionsMatchedResource |= processSubscription(theMsg, resourceId, nextActiveSubscription);
}
if (!resourceMatched) {
if (!anySubscriptionsMatchedResource) {
// Interceptor call: SUBSCRIPTION_RESOURCE_DID_NOT_MATCH_ANY_SUBSCRIPTIONS
HookParams params = new HookParams()
.add(ResourceModifiedMessage.class, theMsg);
@ -216,6 +136,95 @@ public class SubscriptionMatchingSubscriber implements MessageHandler {
}
}
/**
 * Returns true if the subscription matched, processing completed successfully, and the
 * message was sent to the delivery channel; returns false otherwise.
 */
private boolean processSubscription(ResourceModifiedMessage theMsg, IIdType theResourceId, ActiveSubscription theActiveSubscription) {
// skip if the partitions don't match
CanonicalSubscription subscription = theActiveSubscription.getSubscription();
if (subscription != null && theMsg.getPartitionId() != null &&
theMsg.getPartitionId().hasPartitionIds() && !subscription.getCrossPartitionEnabled() &&
!theMsg.getPartitionId().hasPartitionId(subscription.getRequestPartitionId())) {
return false;
}
String nextSubscriptionId = getId(theActiveSubscription);
if (isNotBlank(theMsg.getSubscriptionId())) {
if (!theMsg.getSubscriptionId().equals(nextSubscriptionId)) {
// TODO KHS we should use a hash to look it up instead of this full table scan
ourLog.debug("Ignoring subscription {} because it is not {}", nextSubscriptionId, theMsg.getSubscriptionId());
return false;
}
}
if (!resourceTypeIsAppropriateForSubscription(theActiveSubscription, theResourceId)) {
return false;
}
if (theMsg.getOperationType().equals(DELETE)) {
if (!theActiveSubscription.getSubscription().getSendDeleteMessages()) {
ourLog.trace("Not processing modified message for {}", theMsg.getOperationType());
return false;
}
}
InMemoryMatchResult matchResult;
if (theActiveSubscription.getCriteria().getType() == SubscriptionCriteriaParser.TypeEnum.SEARCH_EXPRESSION) {
matchResult = mySubscriptionMatcher.match(theActiveSubscription.getSubscription(), theMsg);
if (!matchResult.matched()) {
ourLog.trace("Subscription {} was not matched by resource {} {}",
theActiveSubscription.getId(),
theResourceId.toUnqualifiedVersionless().getValue(),
matchResult.isInMemory() ? "in-memory" : "by querying the repository");
return false;
}
ourLog.debug("Subscription {} was matched by resource {} {}",
theActiveSubscription.getId(),
theResourceId.toUnqualifiedVersionless().getValue(),
matchResult.isInMemory() ? "in-memory" : "by querying the repository");
} else {
ourLog.trace("Subscription {} was not matched by resource {} - No search expression",
theActiveSubscription.getId(),
theResourceId.toUnqualifiedVersionless().getValue());
matchResult = InMemoryMatchResult.successfulMatch();
matchResult.setInMemory(true);
}
IBaseResource payload = theMsg.getNewPayload(myFhirContext);
EncodingEnum encoding = null;
if (subscription != null && subscription.getPayloadString() != null && !subscription.getPayloadString().isEmpty()) {
encoding = EncodingEnum.forContentType(subscription.getPayloadString());
}
encoding = defaultIfNull(encoding, EncodingEnum.JSON);
ResourceDeliveryMessage deliveryMsg = new ResourceDeliveryMessage();
deliveryMsg.setPartitionId(theMsg.getPartitionId());
if (payload != null) {
deliveryMsg.setPayload(myFhirContext, payload, encoding);
} else {
deliveryMsg.setPayloadId(theMsg.getPayloadId(myFhirContext));
}
deliveryMsg.setSubscription(subscription);
deliveryMsg.setOperationType(theMsg.getOperationType());
deliveryMsg.setTransactionId(theMsg.getTransactionId());
deliveryMsg.copyAdditionalPropertiesFrom(theMsg);
// Interceptor call: SUBSCRIPTION_RESOURCE_MATCHED
HookParams params = new HookParams()
.add(CanonicalSubscription.class, theActiveSubscription.getSubscription())
.add(ResourceDeliveryMessage.class, deliveryMsg)
.add(InMemoryMatchResult.class, matchResult);
if (!myInterceptorBroadcaster.callHooks(Pointcut.SUBSCRIPTION_RESOURCE_MATCHED, params)) {
ourLog.info("Interceptor has decided to abort processing of subscription {}", nextSubscriptionId);
return false;
}
return sendToDeliveryChannel(theActiveSubscription, deliveryMsg);
}
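// Note on the refactor above: processSubscription() reports a per-subscription result
// (false) instead of the old inline "return", so one subscription declining a DELETE
// message no longer stops the loop from offering that message to the remaining
// subscriptions. See testMultipleSubscriptionsDoNotEarlyReturn below.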
private boolean sendToDeliveryChannel(ActiveSubscription nextActiveSubscription, ResourceDeliveryMessage theDeliveryMsg) {
boolean retVal = false;
ResourceDeliveryJsonMessage wrappedMsg = new ResourceDeliveryJsonMessage(theDeliveryMsg);


@ -62,32 +62,34 @@ public class SubscriptionValidatingInterceptor {
private DaoConfig myDaoConfig;
@Autowired
private SubscriptionStrategyEvaluator mySubscriptionStrategyEvaluator;
@Autowired
private FhirContext myFhirContext;
@Autowired
private IRequestPartitionHelperSvc myRequestPartitionHelperSvc;
@Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_CREATED)
public void resourcePreCreate(IBaseResource theResource, RequestDetails theRequestDetails, RequestPartitionId theRequestPartitionId) {
validateSubmittedSubscription(theResource, theRequestDetails, theRequestPartitionId);
}
@Hook(Pointcut.STORAGE_PRESTORAGE_RESOURCE_UPDATED)
public void resourcePreCreate(IBaseResource theOldResource, IBaseResource theResource, RequestDetails theRequestDetails, RequestPartitionId theRequestPartitionId) {
validateSubmittedSubscription(theResource, theRequestDetails, theRequestPartitionId);
}
@Autowired
public void setFhirContext(FhirContext theFhirContext) {
myFhirContext = theFhirContext;
}
@Deprecated
public void validateSubmittedSubscription(IBaseResource theSubscription) {
validateSubmittedSubscription(theSubscription, null, null);
}
public void validateSubmittedSubscription(IBaseResource theSubscription,
RequestDetails theRequestDetails,
RequestPartitionId theRequestPartitionId) {
if (!"Subscription".equals(myFhirContext.getResourceType(theSubscription))) {
return;
}
@ -109,7 +111,7 @@ public class SubscriptionValidatingInterceptor {
break;
}
validatePermissions(theSubscription, subscription, theRequestDetails, theRequestPartitionId);
mySubscriptionCanonicalizer.setMatchingStrategyTag(theSubscription, null);
@ -140,14 +142,24 @@ public class SubscriptionValidatingInterceptor {
}
}
protected void validatePermissions(IBaseResource theSubscription,
CanonicalSubscription theCanonicalSubscription,
RequestDetails theRequestDetails,
RequestPartitionId theRequestPartitionId) {
// If the subscription has the cross partition tag
if (SubscriptionUtil.isCrossPartition(theSubscription) && !(theRequestDetails instanceof SystemRequestDetails)) {
if (!myDaoConfig.isCrossPartitionSubscriptionEnabled()) {
throw new UnprocessableEntityException(Msg.code(2009) + "Cross partition subscription is not enabled on this server");
}
// if we have a partition id already, we'll use that
// otherwise we might end up with READ and CREATE pointcuts
// returning conflicting partitions (say, all vs default)
RequestPartitionId toCheckPartitionId = theRequestPartitionId != null ?
theRequestPartitionId :
determinePartition(theRequestDetails, theSubscription);
if (!toCheckPartitionId.isDefaultPartition()) {
throw new UnprocessableEntityException(Msg.code(2010) + "Cross partition subscription must be created on the default partition");
}
}
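A minimal standalone sketch of the partition-precedence rule above; the names below are hypothetical stand-ins for RequestPartitionId and determinePartition():

import java.util.Optional;
import java.util.function.Supplier;

class PartitionPrecedenceSketch {
	// Prefer the partition id supplied to the pointcut; only re-derive one when none was
	// given, so READ and CREATE pointcuts cannot disagree (e.g. "all" vs "default").
	static String resolve(String theSuppliedPartition, Supplier<String> theDeterminer) {
		return Optional.ofNullable(theSuppliedPartition).orElseGet(theDeterminer);
	}

	public static void main(String[] args) {
		System.out.println(resolve("DEFAULT", () -> "ALL")); // DEFAULT: the supplied id wins
		System.out.println(resolve(null, () -> "DEFAULT"));  // DEFAULT: derived as a fallback
	}
}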


@ -38,6 +38,7 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.atLeastOnce;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
@ -402,9 +403,13 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
SubscriptionRegistry mySubscriptionRegistry;
@Mock(answer = Answers.RETURNS_DEEP_STUBS)
ActiveSubscription myActiveSubscription;
@Mock(answer = Answers.RETURNS_DEEP_STUBS)
ActiveSubscription myNonDeleteSubscription;
@Mock
CanonicalSubscription myCanonicalSubscription;
@Mock
CanonicalSubscription myNonDeleteCanonicalSubscription;
@Mock
SubscriptionCriteriaParser.SubscriptionCriteria mySubscriptionCriteria;
@Test
@ -445,6 +450,31 @@ public class SubscriptionMatchingSubscriberTest extends BaseBlockingQueueSubscri
verify(myCanonicalSubscription, atLeastOnce()).getSendDeleteMessages();
}
@Test
public void testMultipleSubscriptionsDoNotEarlyReturn() {
ReflectionTestUtils.setField(subscriber, "myInterceptorBroadcaster", myInterceptorBroadcaster);
ReflectionTestUtils.setField(subscriber, "mySubscriptionRegistry", mySubscriptionRegistry);
when(message.getOperationType()).thenReturn(BaseResourceModifiedMessage.OperationTypeEnum.DELETE);
when(myInterceptorBroadcaster.callHooks(
eq(Pointcut.SUBSCRIPTION_BEFORE_PERSISTED_RESOURCE_CHECKED), any(HookParams.class))).thenReturn(true);
when(message.getPayloadId(null)).thenReturn(new IdDt("Patient", 123L));
when(myNonDeleteCanonicalSubscription.getSendDeleteMessages()).thenReturn(false);
when(mySubscriptionRegistry.getAll()).thenReturn(List.of(myNonDeleteSubscription, myActiveSubscription));
when(myActiveSubscription.getSubscription()).thenReturn(myCanonicalSubscription);
when(myActiveSubscription.getCriteria()).thenReturn(mySubscriptionCriteria);
when(myActiveSubscription.getId()).thenReturn("Patient/123");
when(myNonDeleteSubscription.getSubscription()).thenReturn(myNonDeleteCanonicalSubscription);
when(myNonDeleteSubscription.getCriteria()).thenReturn(mySubscriptionCriteria);
when(myNonDeleteSubscription.getId()).thenReturn("Patient/123");
when(mySubscriptionCriteria.getType()).thenReturn(STARTYPE_EXPRESSION);
subscriber.matchActiveSubscriptionsAndDeliver(message);
verify(myNonDeleteCanonicalSubscription, times(1)).getSendDeleteMessages();
verify(myCanonicalSubscription, times(1)).getSendDeleteMessages();
}
@Test
public void matchActiveSubscriptionsAndDeliverSetsPartitionId() {
ReflectionTestUtils.setField(subscriber, "myInterceptorBroadcaster", myInterceptorBroadcaster);

View File

@ -6,7 +6,7 @@
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-deployable-pom</artifactId>
<version>6.2.0-PRE2-SNAPSHOT</version>
<relativePath>../hapi-deployable-pom/pom.xml</relativePath>
</parent>


@ -57,11 +57,16 @@ public class Batch2JobHelper {
}
public JobInstance awaitJobCompletion(String theId) {
return awaitJobCompletion(theId, 10);
}
public JobInstance awaitJobCompletion(String theId, int theSecondsToWait) {
await()
.atMost(theSecondsToWait, TimeUnit.SECONDS)
.until(() -> {
myJobMaintenanceService.runMaintenancePass();
return myJobCoordinator.getInstance(theId).getStatus();
}, equalTo(StatusEnum.COMPLETED));
return myJobCoordinator.getInstance(theId);
}
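// Usage sketch (hypothetical job id; assumes a test class with Batch2JobHelper wired in):
// JobInstance instance = myBatch2JobHelper.awaitJobCompletion("some-job-id", 60); // 60s instead of the 10s default
// assertEquals(StatusEnum.COMPLETED, instance.getStatus());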


@ -25,6 +25,7 @@ import ca.uhn.fhir.jpa.binary.api.IBinaryStorageSvc;
import ca.uhn.fhir.jpa.binstore.MemoryBinaryStorageSvcImpl;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.ModelConfig;
import ca.uhn.fhir.jpa.searchparam.submit.config.SearchParamSubmitterConfig;
import ca.uhn.fhir.jpa.subscription.channel.config.SubscriptionChannelConfig;
import ca.uhn.fhir.jpa.subscription.match.config.SubscriptionProcessorConfig;
import ca.uhn.fhir.jpa.subscription.match.deliver.resthook.SubscriptionDeliveringRestHookSubscriber;
@ -47,7 +48,8 @@ import javax.persistence.EntityManagerFactory;
@Import({
SubscriptionSubmitterConfig.class,
SubscriptionProcessorConfig.class,
SubscriptionChannelConfig.class,
SearchParamSubmitterConfig.class
})
public class TestJPAConfig {
@Bean


@ -0,0 +1,116 @@
package ca.uhn.fhir.jpa.batch2;
import ca.uhn.fhir.batch2.api.IJobPersistence;
import ca.uhn.fhir.batch2.model.FetchJobInstancesRequest;
import ca.uhn.fhir.batch2.model.JobInstance;
import ca.uhn.fhir.batch2.model.StatusEnum;
import ca.uhn.fhir.jpa.dao.data.IBatch2JobInstanceRepository;
import ca.uhn.fhir.jpa.entity.Batch2JobInstanceEntity;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.PageRequest;
import java.util.Date;
import java.util.List;
import java.util.Set;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.hasSize;
public class JobInstanceRepositoryTest extends BaseJpaR4Test {
@Autowired
private IBatch2JobInstanceRepository myJobInstanceRepository;
@Autowired
private IJobPersistence myJobPersistenceSvc;
private String myParams = "{\"param1\":\"value1\"}";
private String myJobDefinitionId = "my-job-def-id";
private String myInstanceId = "abc-123";
@Test
public void testSearchByJobParamsAndStatuses_SingleStatus() {
Set<StatusEnum> statuses = Set.of(StatusEnum.IN_PROGRESS);
List<Batch2JobInstanceEntity> instancesByJobIdParamsAndStatus = myJobInstanceRepository.findInstancesByJobIdParamsAndStatus(myJobDefinitionId, myParams, statuses, PageRequest.of(0, 10));
assertThat(instancesByJobIdParamsAndStatus, hasSize(1));
}
@Test
public void testSearchByJobParamsAndStatuses_MultiStatus() {
Set<StatusEnum> statuses = Set.of(StatusEnum.IN_PROGRESS, StatusEnum.COMPLETED);
List<Batch2JobInstanceEntity> instances = myJobInstanceRepository.findInstancesByJobIdParamsAndStatus(myJobDefinitionId, myParams, statuses, PageRequest.of(0, 10));
assertThat(instances, hasSize(2));
}
@Test
public void testSearchByJobParamsWithoutStatuses() {
List<Batch2JobInstanceEntity> instances = myJobInstanceRepository.findInstancesByJobIdAndParams(myJobDefinitionId, myParams, PageRequest.of(0, 10));
assertThat(instances, hasSize(4));
}
@Test
public void testServiceLogicIsCorrectWhenNoStatusesAreUsed() {
FetchJobInstancesRequest request = new FetchJobInstancesRequest(myJobDefinitionId, myParams);
List<JobInstance> jobInstances = myJobPersistenceSvc.fetchInstances(request, 0, 1000);
assertThat(jobInstances, hasSize(4));
}
@Test
public void testServiceLogicIsCorrectWithStatuses() {
//Given
FetchJobInstancesRequest request = new FetchJobInstancesRequest(myJobDefinitionId, myParams);
request.addStatus(StatusEnum.IN_PROGRESS);
request.addStatus(StatusEnum.COMPLETED);
//When
List<JobInstance> jobInstances = myJobPersistenceSvc.fetchInstances(request, 0, 1000);
//Then
assertThat(jobInstances, hasSize(2));
}
@BeforeEach
public void beforeEach() {
// Create in-progress job. (JUnit 5 lifecycle methods must not be private.)
Batch2JobInstanceEntity instance = new Batch2JobInstanceEntity();
instance.setId(myInstanceId);
instance.setStatus(StatusEnum.IN_PROGRESS);
instance.setCreateTime(new Date());
instance.setDefinitionId(myJobDefinitionId);
instance.setParams(myParams);
myJobInstanceRepository.save(instance);
Batch2JobInstanceEntity completedInstance = new Batch2JobInstanceEntity();
completedInstance.setId(myInstanceId + "-2");
completedInstance.setStatus(StatusEnum.COMPLETED);
completedInstance.setCreateTime(new Date());
completedInstance.setDefinitionId(myJobDefinitionId);
completedInstance.setParams(myParams);
myJobInstanceRepository.save(completedInstance);
Batch2JobInstanceEntity cancelledInstance = new Batch2JobInstanceEntity();
cancelledInstance.setId(myInstanceId + "-3");
cancelledInstance.setStatus(StatusEnum.CANCELLED);
cancelledInstance.setCreateTime(new Date());
cancelledInstance.setDefinitionId(myJobDefinitionId);
cancelledInstance.setParams(myParams);
myJobInstanceRepository.save(cancelledInstance);
Batch2JobInstanceEntity failedInstance = new Batch2JobInstanceEntity();
failedInstance.setId(myInstanceId + "-4");
failedInstance.setStatus(StatusEnum.FAILED);
failedInstance.setCreateTime(new Date());
failedInstance.setDefinitionId(myJobDefinitionId);
failedInstance.setParams(myParams);
myJobInstanceRepository.save(failedInstance);
}
@AfterEach
public void afterEach() {
myJobInstanceRepository.deleteAll();
}
}
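For orientation, the repository methods exercised above plausibly reduce to JPQL along these lines; this is an illustrative assumption only, and the entity property names (myDefinitionId, myParamsJson, myStatus) are invented:

import java.util.List;
import java.util.Set;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

public interface JobInstanceRepositorySketch extends JpaRepository<Batch2JobInstanceEntity, String> {
	// Illustrative only; the real definition lives in IBatch2JobInstanceRepository.
	@Query("SELECT e FROM Batch2JobInstanceEntity e WHERE e.myDefinitionId = :defId " +
		"AND e.myParamsJson = :params AND e.myStatus IN :statuses")
	List<Batch2JobInstanceEntity> findInstancesByJobIdParamsAndStatus(
		@Param("defId") String theDefinitionId,
		@Param("params") String theParams,
		@Param("statuses") Set<StatusEnum> theStatuses,
		Pageable thePageable);
}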


@ -655,4 +655,41 @@ public class BulkDataExportProviderTest {
assertThat(parameters.isUseExistingJobsFirst(), is(equalTo(false)));
}
@Test
public void testProviderReturnsSameIdForSameJob() throws IOException {
// given
Batch2JobStartResponse startResponse = createJobStartResponse();
startResponse.setUsesCachedResult(true);
startResponse.setJobId(A_JOB_ID);
when(myJobRunner.startNewJob(any(Batch2BaseJobParameters.class)))
.thenReturn(startResponse);
// when
Parameters input = new Parameters();
input.addParameter(JpaConstants.PARAM_EXPORT_OUTPUT_FORMAT, new StringType(Constants.CT_FHIR_NDJSON));
input.addParameter(JpaConstants.PARAM_EXPORT_TYPE, new StringType("Patient, Practitioner"));
// then
callExportAndAssertJobId(input, A_JOB_ID);
callExportAndAssertJobId(input, A_JOB_ID);
}
private void callExportAndAssertJobId(Parameters input, String theExpectedJobId) throws IOException {
HttpPost post = new HttpPost("http://localhost:" + myPort + "/" + JpaConstants.OPERATION_EXPORT);
post.addHeader(Constants.HEADER_PREFER, Constants.HEADER_PREFER_RESPOND_ASYNC);
post.addHeader(Constants.HEADER_CACHE_CONTROL, Constants.CACHE_CONTROL_NO_CACHE);
post.setEntity(new ResourceEntity(myCtx, input));
ourLog.info("Request: {}", post);
try (CloseableHttpResponse response = myClient.execute(post)) {
ourLog.info("Response: {}", response.toString());
assertEquals(202, response.getStatusLine().getStatusCode());
assertEquals("Accepted", response.getStatusLine().getReasonPhrase());
assertEquals("http://localhost:" + myPort + "/$export-poll-status?_jobId=" + theExpectedJobId, response.getFirstHeader(Constants.HEADER_CONTENT_LOCATION).getValue());
}
}
}
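The caching contract the test pins down, reduced to a self-contained sketch (assumption: the runner keys cached jobs on the normalized export parameters):

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

class ExportJobCacheSketch {
	private final Map<String, String> myJobIdsByParams = new ConcurrentHashMap<>();

	// Same normalized parameters => same job id, which is what lets the provider
	// return the original $export-poll-status URL for a repeated request.
	String startOrReuse(String theNormalizedParams) {
		return myJobIdsByParams.computeIfAbsent(theNormalizedParams, p -> UUID.randomUUID().toString());
	}

	public static void main(String[] args) {
		ExportJobCacheSketch cache = new ExportJobCacheSketch();
		String first = cache.startOrReuse("_type=Patient&_outputFormat=ndjson");
		String second = cache.startOrReuse("_type=Patient&_outputFormat=ndjson");
		System.out.println(first.equals(second)); // true: the cached job id is reused
	}
}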


@ -1,5 +1,10 @@
package ca.uhn.fhir.jpa.dao;
import ca.uhn.fhir.batch2.api.IJobCoordinator;
import ca.uhn.fhir.batch2.jobs.parameters.PartitionedUrl;
import ca.uhn.fhir.batch2.jobs.parameters.UrlPartitioner;
import ca.uhn.fhir.batch2.jobs.reindex.ReindexJobParameters;
import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
@ -17,6 +22,8 @@ import ca.uhn.fhir.rest.api.server.storage.TransactionDetails;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import com.google.common.collect.Lists;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Patient;
@ -24,6 +31,7 @@ import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
@ -31,7 +39,10 @@ import org.mockito.junit.jupiter.MockitoExtension;
import javax.persistence.EntityManager;
import java.util.List;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.fail;
@ExtendWith(MockitoExtension.class)
@ -49,6 +60,15 @@ class BaseHapiFhirResourceDaoTest {
@Mock
private DaoConfig myConfig;
@Mock
private IJobCoordinator myJobCoordinator;
@Mock
private UrlPartitioner myUrlPartitioner;
@Mock
private ISearchParamRegistry mySearchParamRegistry;
// we won't inject this
private FhirContext myFhirContext = FhirContext.forR4Cached();
@ -166,6 +186,48 @@ class BaseHapiFhirResourceDaoTest {
Assertions.assertEquals(id.getValue(), outcome.getId().getValue());
}
@Test
public void requestReindexForRelatedResources_withValidBase_includesUrlsInJobParameters() {
List<String> base = Lists.newArrayList("Patient", "Group");
Mockito.when(myUrlPartitioner.partitionUrl(Mockito.any(), Mockito.any())).thenAnswer(i -> {
PartitionedUrl partitionedUrl = new PartitionedUrl();
partitionedUrl.setUrl(i.getArgument(0));
return partitionedUrl;
});
mySvc.requestReindexForRelatedResources(false, base, new ServletRequestDetails());
ArgumentCaptor<JobInstanceStartRequest> requestCaptor = ArgumentCaptor.forClass(JobInstanceStartRequest.class);
Mockito.verify(myJobCoordinator).startInstance(requestCaptor.capture());
JobInstanceStartRequest actualRequest = requestCaptor.getValue();
assertNotNull(actualRequest);
assertNotNull(actualRequest.getParameters());
ReindexJobParameters actualParameters = actualRequest.getParameters(ReindexJobParameters.class);
assertEquals(2, actualParameters.getPartitionedUrls().size());
assertEquals("Patient?", actualParameters.getPartitionedUrls().get(0).getUrl());
assertEquals("Group?", actualParameters.getPartitionedUrls().get(1).getUrl());
}
@Test
public void requestReindexForRelatedResources_withSpecialBaseResource_doesNotIncludeUrlsInJobParameters() {
List<String> base = Lists.newArrayList("Resource");
mySvc.requestReindexForRelatedResources(false, base, new ServletRequestDetails());
ArgumentCaptor<JobInstanceStartRequest> requestCaptor = ArgumentCaptor.forClass(JobInstanceStartRequest.class);
Mockito.verify(myJobCoordinator).startInstance(requestCaptor.capture());
JobInstanceStartRequest actualRequest = requestCaptor.getValue();
assertNotNull(actualRequest);
assertNotNull(actualRequest.getParameters());
ReindexJobParameters actualParameters = actualRequest.getParameters(ReindexJobParameters.class);
assertEquals(0, actualParameters.getPartitionedUrls().size());
}
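// Editorial sketch of the URL-building rule the two tests above pin down (assumption:
// a base of "Resource" means "reindex everything", so no type-scoped URLs are added):
static List<String> toReindexUrlsSketch(List<String> theBaseTypes) {
	return theBaseTypes.contains("Resource")
		? List.of()
		: theBaseTypes.stream().map(t -> t + "?").collect(java.util.stream.Collectors.toList());
}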
static class TestResourceDao extends BaseHapiFhirResourceDao<Patient> {
private final DaoConfig myDaoConfig = new DaoConfig();


@ -122,7 +122,6 @@ public class TransactionProcessorTest {
}
}
@Configuration
public static class MyConfig {


@ -37,6 +37,7 @@ import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;


@ -4,9 +4,11 @@ import ca.uhn.fhir.context.RuntimeResourceDefinition;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.api.HookParams;
import ca.uhn.fhir.interceptor.api.IAnonymousInterceptor;
import ca.uhn.fhir.interceptor.api.IInterceptorService;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.entity.Search;
import ca.uhn.fhir.jpa.interceptor.ForceOffsetSearchModeInterceptor;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.ModelConfig;
import ca.uhn.fhir.jpa.model.entity.NormalizedQuantitySearchLevel;
@ -201,6 +203,8 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(FhirResourceDaoR4SearchNoFtTest.class);
@Autowired
MatchUrlService myMatchUrlService;
@Autowired
IInterceptorService myInterceptorService;
@AfterEach
public void afterResetSearchSize() {
@ -252,6 +256,20 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
}
@Test
public void testRebuildSearchParamRegistryWithOffsetSearches() {
ForceOffsetSearchModeInterceptor forceOffsetSearchModeInterceptor = new ForceOffsetSearchModeInterceptor();
forceOffsetSearchModeInterceptor.setDefaultCount(1);
try {
myInterceptorService.registerInterceptor(forceOffsetSearchModeInterceptor);
createObservationIssueSearchParameter();
mySearchParamRegistry.forceRefresh();
} finally {
myInterceptorService.unregisterInterceptor(forceOffsetSearchModeInterceptor);
}
}
@Test
public void testSearchInExistingTransaction() {
createPatient(withBirthdate("2021-01-01"));
@ -4380,16 +4398,8 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
// Add a search parameter to Observation.issued, so that between that one
// and the existing one on Observation.effective, we have 2 date search parameters
// on the same resource
createObservationIssueSearchParameter();
mySearchParamRegistry.forceRefresh();
// Dates are reversed on these two observations
IIdType obsId1;
@ -4474,6 +4484,16 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
}
private void createObservationIssueSearchParameter() {
SearchParameter sp = new SearchParameter();
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase("Observation");
sp.setType(Enumerations.SearchParamType.DATE);
sp.setCode("issued");
sp.setExpression("Observation.issued");
mySearchParameterDao.create(sp);
}
@Test
public void testSearchWithFetchSizeDefaultMaximum() {
myDaoConfig.setFetchSizeDefaultMaximum(5);


@ -4528,4 +4528,68 @@ public class FhirSystemDaoR4Test extends BaseJpaR4SystemTest {
assertEquals("3", id.getVersionIdPart());
}
@Test
public void testTransactionWithDuplicateConditionalCreateAndDelete() {
Bundle request = new Bundle();
request.setType(Bundle.BundleType.TRANSACTION);
String duplicateUrl = "/Patient";
String duplicateIfNoneExist = "identifier=http://acme.org/mrns|12345";
String firstPatientFullUrl = "urn:uuid:3ac4fde3-089d-4a2d-829b-f3ef68cae371";
String secondPatientFullUrl = "urn:uuid:2ab44de3-019d-432d-829b-f3ee08cae395";
Patient firstPatient = new Patient();
Identifier firstPatientIdentifier = new Identifier();
firstPatientIdentifier.setSystem("http://acme.org/mrns");
firstPatientIdentifier.setValue("12345");
firstPatient.setIdentifier(List.of(firstPatientIdentifier));
request
.addEntry()
.setResource(firstPatient)
.setFullUrl(firstPatientFullUrl)
.getRequest()
.setMethod(Bundle.HTTPVerb.POST)
.setUrl(duplicateUrl)
.setIfNoneExist(duplicateIfNoneExist);
Patient secondPatient = new Patient();
Identifier secondPatientIdentifier = new Identifier();
secondPatientIdentifier.setSystem("http://acme.org/mrns");
secondPatientIdentifier.setValue("12346");
secondPatient.setIdentifier(List.of(secondPatientIdentifier));
request
.addEntry()
.setResource(secondPatient)
.setFullUrl(secondPatientFullUrl)
.getRequest()
.setMethod(Bundle.HTTPVerb.POST)
.setUrl(duplicateUrl)
.setIfNoneExist(duplicateIfNoneExist);
String deleteUrl = "Observation?patient=urn:uuid:2ac34de3-069d-442d-829b-f3ef68cae371";
request
.addEntry()
.getRequest()
.setMethod(Bundle.HTTPVerb.DELETE)
.setUrl(deleteUrl);
Bundle resp = mySystemDao.transaction(mySrd, request);
ourLog.info(myFhirContext.newXmlParser().setPrettyPrint(true).encodeResourceToString(resp));
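// Expect 2 entries: the duplicate conditional creates (same If-None-Exist criteria)
// collapse into a single 201, and the conditional delete contributes the 204.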
int expectedEntries = 2;
assertEquals(BundleType.TRANSACTIONRESPONSE, resp.getTypeElement().getValue());
assertEquals(expectedEntries, resp.getEntry().size());
String status201 = "201 Created";
String status204 = "204 No Content";
assertEquals(status201, resp.getEntry().get(0).getResponse().getStatus());
assertEquals(status204, resp.getEntry().get(1).getResponse().getStatus());
}
}


@ -4,6 +4,7 @@ import ca.uhn.fhir.batch2.api.IJobCoordinator;
import ca.uhn.fhir.batch2.jobs.expunge.DeleteExpungeAppCtx;
import ca.uhn.fhir.batch2.jobs.expunge.DeleteExpungeJobParameters;
import ca.uhn.fhir.batch2.model.JobInstanceStartRequest;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
@ -69,7 +70,21 @@ public class DeleteExpungeJobTest extends BaseJpaR4Test {
// validate
assertEquals(1, myObservationDao.search(SearchParameterMap.newSynchronous()).size());
assertDocumentCountMatchesResourceCount(myObservationDao);
assertEquals(1, myDiagnosticReportDao.search(SearchParameterMap.newSynchronous()).size());
assertDocumentCountMatchesResourceCount(myDiagnosticReportDao);
assertEquals(2, myPatientDao.search(SearchParameterMap.newSynchronous()).size());
assertDocumentCountMatchesResourceCount(myPatientDao);
}
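/**
 * Asserts that the Hibernate Search index holds exactly one document per surviving
 * resource of the given type, i.e. $delete-expunge left no orphaned indexed documents.
 */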
public void assertDocumentCountMatchesResourceCount(IFhirResourceDao dao) {
String resourceType = myFhirContext.getResourceType(dao.getResourceType());
long resourceCount = dao.search(new SearchParameterMap().setLoadSynchronous(true)).size();
runInTransaction(() -> {
assertEquals(resourceCount, myFulltestSearchSvc.count(resourceType, new SearchParameterMap()));
});
}
}


@ -4,8 +4,11 @@ import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.model.ReadPartitionIdRequestDetails;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome;
import ca.uhn.fhir.jpa.dao.r4.BaseJpaR4SystemTest;
import ca.uhn.fhir.jpa.entity.PartitionEntity;
import ca.uhn.fhir.jpa.interceptor.ex.PartitionInterceptorReadAllPartitions;
@ -15,42 +18,44 @@ import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.partition.IPartitionLookupSvc;
import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.rest.api.RestOperationTypeEnum;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import ca.uhn.fhir.util.ExtensionUtil;
import ca.uhn.fhir.util.HapiExtensions;
import com.google.common.collect.Sets;
import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.Validate;
import org.hamcrest.Matchers;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.StructureDefinition;
import org.hl7.fhir.r4.model.Subscription;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.mock.web.MockHttpServletRequest;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Consumer;
import static ca.uhn.fhir.jpa.dao.r4.PartitioningSqlR4Test.assertLocalDateFromDbMatches;
import static org.apache.commons.lang3.StringUtils.countMatches;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.in;
import static org.hamcrest.Matchers.matchesPattern;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;
@SuppressWarnings("unchecked")
@ -72,6 +77,8 @@ public class PartitioningInterceptorR4Test extends BaseJpaR4SystemTest {
myInterceptorRegistry.unregisterInterceptor(myPartitionInterceptor);
myDaoConfig.setIndexMissingFields(new DaoConfig().getIndexMissingFields());
myInterceptorRegistry.unregisterAllInterceptors();
}
@BeforeEach
@ -88,6 +95,56 @@ public class PartitioningInterceptorR4Test extends BaseJpaR4SystemTest {
myDaoConfig.setIndexMissingFields(DaoConfig.IndexEnabledEnum.ENABLED);
}
@Test
public void testCrossPartitionUpdate() {
// setup
String id = "RED";
ServletRequestDetails dets = new ServletRequestDetails();
dets.setRestOperationType(RestOperationTypeEnum.UPDATE);
dets.setServletRequest(new MockHttpServletRequest());
AtomicInteger readIndex = new AtomicInteger();
AtomicInteger writeIndex = new AtomicInteger();
Subscription subscription = new Subscription();
subscription.setId("Subscription/" + id);
subscription.setStatus(Subscription.SubscriptionStatus.ACTIVE);
subscription.addExtension(
HapiExtensions.EXTENSION_SUBSCRIPTION_CROSS_PARTITION,
new BooleanType(true)
);
subscription.setCriteria("[*]");
Subscription.SubscriptionChannelComponent subscriptionChannelComponent =
new Subscription.SubscriptionChannelComponent()
.setType(Subscription.SubscriptionChannelType.RESTHOOK)
.setEndpoint("https://tinyurl.com/2p95e27r");
subscription.setChannel(subscriptionChannelComponent);
// set up partitioning for subscriptions
myDaoConfig.setCrossPartitionSubscription(true);
// register interceptors that return different partition ids
MySubscriptionReadInterceptor readInterceptor = new MySubscriptionReadInterceptor();
MySubscriptionWriteInterceptor writeInterceptor = new MySubscriptionWriteInterceptor();
myInterceptorRegistry.unregisterInterceptor(myPartitionInterceptor);
readInterceptor.setObjectConsumer((obj) -> {
readIndex.getAndIncrement();
});
writeInterceptor.setObjectConsumer((obj) -> {
writeIndex.getAndIncrement();
});
myInterceptorRegistry.registerInterceptor(readInterceptor);
myInterceptorRegistry.registerInterceptor(writeInterceptor);
// run test
IFhirResourceDao<Subscription> dao = myDaoRegistry.getResourceDao(Subscription.class);
DaoMethodOutcome outcome = dao.update(subscription, dets);
// verify
assertNotNull(outcome);
assertEquals(id, outcome.getResource().getIdElement().getIdPart());
assertEquals(0, readIndex.get()); // should be no read interactions
assertEquals(1, writeIndex.get());
}
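// The zero read-count assertion above encodes the fix: a cross-partition Subscription
// update should resolve its partition via the CREATE pointcut only, never the READ one.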
@Test
public void testCreateNonPartionableResourceWithPartitionDate() {
@ -260,4 +317,41 @@ public class PartitioningInterceptorR4Test extends BaseJpaR4SystemTest {
}
@Interceptor
public static class MySubscriptionReadInterceptor {
private Consumer<Object> myObjectConsumer;
public void setObjectConsumer(Consumer<Object> theConsumer) {
myObjectConsumer = theConsumer;
}
@Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_READ)
public RequestPartitionId identifyForRead(ReadPartitionIdRequestDetails theReadDetails, RequestDetails theRequestDetails) {
if (myObjectConsumer != null) {
myObjectConsumer.accept(theReadDetails);
}
return RequestPartitionId.allPartitions();
}
}
@Interceptor
public static class MySubscriptionWriteInterceptor {
private Consumer<Object> myObjectConsumer;
public void setObjectConsumer(Consumer<Object> theConsumer) {
myObjectConsumer = theConsumer;
}
@Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_CREATE)
public RequestPartitionId PartitionIdentifyCreate(IBaseResource theResource, ServletRequestDetails theRequestDetails) {
assertNotNull(theResource);
if (myObjectConsumer != null) {
myObjectConsumer.accept(theResource);
}
// doesn't matter; just not allPartitions
return RequestPartitionId.defaultPartition(LocalDate.of(2021, 2, 22));
}
}
}


@ -9,6 +9,7 @@ import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.dao.data.INpmPackageDao;
import ca.uhn.fhir.jpa.dao.data.INpmPackageVersionDao;
import ca.uhn.fhir.jpa.dao.data.INpmPackageVersionResourceDao;
import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.NpmPackageEntity;
@ -749,7 +750,6 @@ public class NpmR4Test extends BaseJpaR4Test {
NpmPackageVersionEntity versionEntity = myPackageVersionDao.findByPackageIdAndVersion("hl7.fhir.uv.shorthand", "0.12.0").orElseThrow(() -> new IllegalArgumentException());
assertEquals(true, versionEntity.isCurrentVersion());
});
}
@Test
@ -766,7 +766,7 @@ public class NpmR4Test extends BaseJpaR4Test {
PackageInstallationSpec spec = new PackageInstallationSpec().setName("test-exchange.fhir.us.com").setVersion("2.1.1").setInstallMode(PackageInstallationSpec.InstallModeEnum.STORE_AND_INSTALL);
myPackageInstallerSvc.install(spec);
IBundleProvider spSearch = mySearchParameterDao.search(SearchParameterMap.newSynchronous("code", new TokenParam("network-id")), new SystemRequestDetails());
assertEquals(1, spSearch.sizeOrThrowNpe());
SearchParameter sp = (SearchParameter) spSearch.getResources(0, 1).get(0);
assertEquals("network-id", sp.getCode());
@ -782,7 +782,8 @@ public class NpmR4Test extends BaseJpaR4Test {
myPractitionerRoleDao.create(pr);
SearchParameterMap map = SearchParameterMap.newSynchronous("network-id", new ReferenceParam(orgId.getValue()));
spSearch = myPractitionerRoleDao.search(map, new SystemRequestDetails());
assertEquals(1, spSearch.sizeOrThrowNpe());
// Install newer version


@ -9,10 +9,13 @@ import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.interceptor.PerformanceTracingLoggingInterceptor;
import ca.uhn.fhir.jpa.model.search.SearchRuntimeDetails;
import ca.uhn.fhir.jpa.model.search.SearchStatusEnum;
import ca.uhn.fhir.jpa.searchparam.submit.interceptor.SearchParamValidatingInterceptor;
import ca.uhn.fhir.parser.IParser;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.MethodOutcome;
import ca.uhn.fhir.rest.api.RestOperationTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
import ca.uhn.fhir.rest.server.interceptor.ServerOperationInterceptorAdapter;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import com.google.common.collect.Lists;
@ -27,15 +30,19 @@ import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.Bundle.BundleEntryComponent;
import org.hl7.fhir.r4.model.Bundle.BundleType;
import org.hl7.fhir.r4.model.Bundle.HTTPVerb;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.Observation;
import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.SearchParameter;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Test;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.Mock;
import org.springframework.beans.factory.annotation.Autowired;
import javax.servlet.ServletException;
import java.io.IOException;
@ -43,12 +50,15 @@ import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import static java.util.Arrays.asList;
import static org.apache.commons.lang3.time.DateUtils.MILLIS_PER_SECOND;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.startsWith;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.junit.jupiter.api.Assertions.fail;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.eq;
import static org.mockito.Mockito.mock;
@ -66,6 +76,9 @@ public class ResourceProviderInterceptorR4Test extends BaseResourceProviderR4Tes
@Captor
private ArgumentCaptor<HookParams> myParamsCaptor;
@Autowired
SearchParamValidatingInterceptor mySearchParamValidatingInterceptor;
@Override
@AfterEach
public void after() throws Exception {
@ -402,6 +415,138 @@ public class ResourceProviderInterceptorR4Test extends BaseResourceProviderR4Tes
ourRestServer.unregisterInterceptor(interceptor);
}
}
@Test
public void testSearchParamValidatingInterceptorNotAllowingOverlappingOnCreate(){
registerSearchParameterValidatingInterceptor();
// let's make sure we don't already have a matching SearchParameter
Bundle bundle = myClient
.search()
.byUrl("SearchParameter?base=AllergyIntolerance&code=patient")
.returnBundle(Bundle.class)
.execute();
assertTrue(bundle.getEntry().isEmpty());
SearchParameter searchParameter = createSearchParameter();
// now, create a SearchParameter
MethodOutcome methodOutcome = myClient
.create()
.resource(searchParameter)
.execute();
assertTrue(methodOutcome.getCreated());
// attempting to create an overlapping SearchParameter should fail
try {
methodOutcome = myClient
.create()
.resource(searchParameter)
.execute();
fail();
} catch (UnprocessableEntityException e) {
// all is good
assertTrue(e.getMessage().contains("2131"));
}
}
@Test
public void testSearchParamValidatingInterceptorAllowsResourceUpdate(){
registerSearchParameterValidatingInterceptor();
SearchParameter searchParameter = createSearchParameter();
// now, create a SearchParameter
MethodOutcome methodOutcome = myClient
.create()
.resource(searchParameter)
.execute();
assertTrue(methodOutcome.getCreated());
SearchParameter createdSearchParameter = (SearchParameter) methodOutcome.getResource();
createdSearchParameter.setUrl("newUrl");
methodOutcome = myClient
.update()
.resource(createdSearchParameter)
.execute();
assertNotNull(methodOutcome);
}
@Test
public void testSearchParamValidationOnUpdateWithClientAssignedId() {
registerSearchParameterValidatingInterceptor();
SearchParameter searchParameter = createSearchParameter();
searchParameter.setId("my-custom-id");
// now, create a SearchParameter
MethodOutcome methodOutcome = myClient
.update()
.resource(searchParameter)
.execute();
assertTrue(methodOutcome.getCreated());
SearchParameter createdSearchParameter = (SearchParameter) methodOutcome.getResource();
createdSearchParameter.setUrl("newUrl");
methodOutcome = myClient
.update()
.resource(createdSearchParameter)
.execute();
assertNotNull(methodOutcome);
}
@Test
public void testSearchParamValidatingInterceptorNotAllowingOverlappingOnCreateWithUpdate(){
registerSearchParameterValidatingInterceptor();
String defaultSearchParamId = "clinical-patient";
String overlappingSearchParamId = "allergyintolerance-patient";
SearchParameter defaultSearchParameter = createSearchParameter();
defaultSearchParameter.setId(defaultSearchParamId);
// now, create a SearchParameter with a PUT operation
MethodOutcome methodOutcome = myClient
.update(defaultSearchParamId, defaultSearchParameter);
assertTrue(methodOutcome.getCreated());
SearchParameter overlappingSearchParameter = createSearchParameter();
overlappingSearchParameter.setId(overlappingSearchParamId);
overlappingSearchParameter.setBase(asList(new CodeType("AllergyIntolerance")));
try {
myClient.update(overlappingSearchParamId, overlappingSearchParameter);
fail();
} catch (UnprocessableEntityException e) {
// this is good
assertTrue(e.getMessage().contains("2131"));
}
}
private void registerSearchParameterValidatingInterceptor(){
ourRestServer.getInterceptorService().registerInterceptor(mySearchParamValidatingInterceptor);
}
private SearchParameter createSearchParameter(){
SearchParameter retVal = new SearchParameter()
.setUrl("http://hl7.org/fhir/SearchParameter/clinical-patient")
.setStatus(Enumerations.PublicationStatus.ACTIVE)
.setDescription("someDescription")
.setCode("patient")
.addBase("DiagnosticReport").addBase("ImagingStudy").addBase("DocumentManifest").addBase("AllergyIntolerance")
.setType(Enumerations.SearchParamType.REFERENCE)
.setExpression("someExpression");
return retVal;
}
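A simplified, self-contained model of the overlap check these tests exercise; the real rule lives in SearchParamValidatingInterceptor and is reported as Msg.code 2131, and composite/unique cases are ignored here:

import java.util.Set;

class SearchParamOverlapSketch {
	// Assumption for illustration: two SearchParameters overlap when they share a
	// search code and at least one base resource type.
	static boolean overlaps(String theCodeA, Set<String> theBasesA, String theCodeB, Set<String> theBasesB) {
		return theCodeA.equals(theCodeB) && theBasesA.stream().anyMatch(theBasesB::contains);
	}

	public static void main(String[] args) {
		Set<String> clinicalPatientBases = Set.of("DiagnosticReport", "ImagingStudy", "AllergyIntolerance");
		Set<String> allergyPatientBases = Set.of("AllergyIntolerance");
		System.out.println(overlaps("patient", clinicalPatientBases, "patient", allergyPatientBases)); // true -> rejected
	}
}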


@ -9,6 +9,7 @@ import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.jpa.model.util.UcumServiceUtil;
import ca.uhn.fhir.jpa.search.SearchCoordinatorSvcImpl;
import ca.uhn.fhir.jpa.term.ZipCollectionBuilder;
import ca.uhn.fhir.jpa.test.config.TestR4Config;
import ca.uhn.fhir.model.api.TemporalPrecisionEnum;
import ca.uhn.fhir.model.primitive.IdDt;
@ -103,7 +104,6 @@ import org.hl7.fhir.r4.model.ImagingStudy;
import org.hl7.fhir.r4.model.InstantType;
import org.hl7.fhir.r4.model.IntegerType;
import org.hl7.fhir.r4.model.Location;
import org.hl7.fhir.r4.model.MeasureReport;
import org.hl7.fhir.r4.model.Media;
import org.hl7.fhir.r4.model.Medication;
import org.hl7.fhir.r4.model.MedicationAdministration;
@ -151,6 +151,7 @@ import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;
import javax.annotation.Nonnull;
import javax.sql.DataSource;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
@ -175,7 +176,6 @@ import static ca.uhn.fhir.rest.param.BaseParamWithPrefix.MSG_PREFIX_INVALID_FORM
import static ca.uhn.fhir.util.TestUtil.sleepAtLeast;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.allOf;
import static org.hamcrest.Matchers.contains;
import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.containsInRelativeOrder;
@ -183,10 +183,12 @@ import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.emptyString;
import static org.hamcrest.Matchers.endsWith;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.hasItem;
import static org.hamcrest.Matchers.hasItems;
import static org.hamcrest.Matchers.hasSize;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.lessThan;
import static org.hamcrest.Matchers.lessThanOrEqualTo;
import static org.hamcrest.Matchers.matchesPattern;
@ -3990,8 +3992,46 @@ public class ResourceProviderR4Test extends BaseResourceProviderR4Test {
assertThat(text, containsString("\"OBS1\""));
assertThat(text, not(containsString("\"OBS2\"")));
}
}
@Test
public void testCodeInWithLargeValueSet() throws IOException {
//Given: We load a large codesystem
myDaoConfig.setMaximumExpansionSize(1000);
ZipCollectionBuilder zipCollectionBuilder = new ZipCollectionBuilder();
zipCollectionBuilder.addFileZip("/largecodesystem/", "concepts.csv");
zipCollectionBuilder.addFileZip("/largecodesystem/", "hierarchy.csv");
myTerminologyLoaderSvc.loadCustom("http://hl7.org/fhir/sid/icd-10", zipCollectionBuilder.getFiles(), mySrd);
myTerminologyDeferredStorageSvc.saveAllDeferred();
//And Given: We create two valuesets based on the CodeSystem, one with >1000 codes and one with <1000 codes
ValueSet valueSetOver1000 = loadResourceFromClasspath(ValueSet.class, "/largecodesystem/ValueSetV.json");
ValueSet valueSetUnder1000 = loadResourceFromClasspath(ValueSet.class, "/largecodesystem/ValueSetV1.json");
myClient.update().resource(valueSetOver1000).execute();
myClient.update().resource(valueSetUnder1000).execute();
myTermSvc.preExpandDeferredValueSetsToTerminologyTables();
//When: We create matching and non-matching observations for the valuesets
Observation matchingObs = loadResourceFromClasspath(Observation.class, "/largecodesystem/observation-matching.json");
Observation nonMatchingObs = loadResourceFromClasspath(Observation.class, "/largecodesystem/observation-non-matching.json");
myClient.update().resource(matchingObs).execute();
myClient.update().resource(nonMatchingObs).execute();
//Then: Results should return the same, regardless of count of concepts in the ValueSet
assertOneResult(myClient.search().byUrl("Observation?code:in=http://smilecdr.com/V1").returnBundle(Bundle.class).execute());
assertOneResult(myClient.search().byUrl("Observation?code:not-in=http://smilecdr.com/V1").returnBundle(Bundle.class).execute());
assertOneResult(myClient.search().byUrl("Observation?code:in=http://smilecdr.com/V").returnBundle(Bundle.class).execute());
assertOneResult(myClient.search().byUrl("Observation?code:not-in=http://smilecdr.com/V").returnBundle(Bundle.class).execute());
myDaoConfig.setMaximumExpansionSize(new DaoConfig().getMaximumExpansionSize());
}
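// Why the 1000 cap above (a reading of the test, not a statement of internals): ValueSet V
// exceeds MaximumExpansionSize while V1 stays under it, so the four searches cover both
// the pre-expanded and the in-memory expansion paths; :in and :not-in must agree on each.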
private void assertOneResult(Bundle theResponse) {
assertThat(theResponse.getEntry().size(), is(equalTo(1)));
}
private void printResourceToConsole(IBaseResource theResource) {
ourLog.info(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(theResource));
}
@Test


@ -32,6 +32,8 @@ import org.hl7.fhir.r4.model.Medication;
import org.hl7.fhir.r4.model.MedicationRequest;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.Reference;
import org.hl7.fhir.r4.model.Resource;
import org.hl7.fhir.r4.model.Substance;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
@ -374,6 +376,34 @@ public class SystemProviderTransactionSearchR4Test extends BaseJpaR4Test {
}
}
@Test
public void testSearchIncludeSubstances() {
// Create substance
Substance res = new Substance();
res.addInstance().getQuantity().setSystem("http://foo").setCode("UNIT").setValue(123);
res.addIdentifier().setSystem("urn:oid:2.16.840.1.113883.3.7418.12.4").setValue("11");
mySubstanceDao.create(res, mySrd);
// Create medication
Medication m = new Medication();
m.addIdentifier().setSystem("urn:oid:2.16.840.1.113883.3.7418.12.3").setValue("1");
m.getCode().setText("valueb");
m.addIngredient().setItem(new Reference(res));
ourClient.create().resource(m).execute();
// Search request
Bundle input = new Bundle();
input.setType(BundleType.TRANSACTION);
input.setId("bundle-batch-test");
input.addEntry().getRequest().setMethod(HTTPVerb.GET)
.setUrl("/Medication?_include=Medication:ingredient");
Bundle output = ourClient.transaction().withBundle(input).execute();
ourLog.info(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(output));
Resource resource = output.getEntry().get(0).getResource();
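// The inner search Bundle should hold 2 entries: the Medication plus the _include'd Substance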
assertEquals(2, resource.getChildByName("entry").getValues().size());
}
private List<String> toIds(Bundle theRespBundle) {
ArrayList<String> retVal = new ArrayList<String>();
for (BundleEntryComponent next : theRespBundle.getEntry()) {


@ -500,6 +500,11 @@ public class GiantTransactionPerfTest {
throw new UnsupportedOperationException();
}
@Override
public ResourceHistoryTable getReferenceById(Long theLong) {
throw new UnsupportedOperationException();
}
@Override
public <S extends ResourceHistoryTable> Optional<S> findOne(Example<S> example) {
return Optional.empty();


@ -568,12 +568,55 @@ public class StressTestR4Test extends BaseResourceProviderR4Test {
validateNoErrors(tasks);
}
@Test
public void test_DeleteExpunge_withLargeBatchSizeManyResources() {
// setup
int batchSize = 1000;
myDaoConfig.setAllowMultipleDelete(true);
myDaoConfig.setExpungeEnabled(true);
myDaoConfig.setDeleteExpungeEnabled(true);
// create patients
for (int i = 0; i < batchSize; i++) {
Patient patient = new Patient();
patient.setId("tracer" + i);
patient.setActive(true);
MethodOutcome result = myClient.update().resource(patient).execute();
}
ourLog.info("Patients created");
// parameters
Parameters input = new Parameters();
input.addParameter(ProviderConstants.OPERATION_DELETE_EXPUNGE_URL, "Patient?active=true");
input.addParameter(ProviderConstants.OPERATION_DELETE_BATCH_SIZE, new DecimalType(batchSize));
// execute
myCaptureQueriesListener.clear();
Parameters response = myClient
.operation()
.onServer()
.named(ProviderConstants.OPERATION_DELETE_EXPUNGE)
.withParameters(input)
.execute();
ourLog.info(myFhirContext.newJsonParser().setPrettyPrint(true).encodeResourceToString(response));
String jobId = BatchHelperR4.jobIdFromBatch2Parameters(response);
myBatch2JobHelper.awaitJobCompletion(jobId, 60);
int deleteCount = myCaptureQueriesListener.getDeleteQueries().size();
myCaptureQueriesListener.logDeleteQueries();
assertThat(deleteCount, is(equalTo(88)));
}
@Test
public void testDeleteExpungeOperationOverLargeDataset() {
myDaoConfig.setAllowMultipleDelete(true);
myDaoConfig.setExpungeEnabled(true);
myDaoConfig.setDeleteExpungeEnabled(true);
// setup
Patient patient = new Patient();
patient.setId("tracer");
patient.setActive(true);
@ -584,7 +627,6 @@ public class StressTestR4Test extends BaseResourceProviderR4Test {
patient.getMeta().addTag().setSystem(UUID.randomUUID().toString()).setCode(UUID.randomUUID().toString());
result = myClient.update().resource(patient).execute();
Parameters input = new Parameters();
input.addParameter(ProviderConstants.OPERATION_DELETE_EXPUNGE_URL, "Patient?active=true");
int batchSize = 2;


@ -57,7 +57,7 @@ public class SubscriptionValidatingInterceptorTest {
mySvc.setSubscriptionCanonicalizerForUnitTest(new SubscriptionCanonicalizer(myCtx));
mySvc.setDaoRegistryForUnitTest(myDaoRegistry);
mySvc.setSubscriptionStrategyEvaluatorForUnitTest(mySubscriptionStrategyEvaluator);
mySvc.setFhirContext(myCtx);
mySvc.setDaoConfigForUnitTest(myDaoConfig);
mySvc.setRequestPartitionHelperSvcForUnitTest(myRequestPartitionHelperSvc);
}
@ -195,7 +195,7 @@ public class SubscriptionValidatingInterceptorTest {
@Test
public void testValidate_Cross_Partition_Subscription() {
when(myDaoRegistry.isResourceTypeSupported(eq("Patient"))).thenReturn(true);
when(myDaoConfig.isCrossPartitionSubscriptionEnabled()).thenReturn(true);
when(myRequestPartitionHelperSvc.determineCreatePartitionForRequest(isA(RequestDetails.class), isA(Subscription.class), eq("Subscription"))).thenReturn(RequestPartitionId.defaultPartition());
Subscription subscription = new Subscription();
@ -211,15 +211,15 @@ public class SubscriptionValidatingInterceptorTest {
// No asserts here because the function should throw an UnprocessableEntityException exception if the subscription
// is invalid
assertDoesNotThrow(() -> mySvc.validateSubmittedSubscription(subscription, requestDetails, null));
Mockito.verify(myDaoConfig, times(1)).isCrossPartitionSubscriptionEnabled();
Mockito.verify(myDaoRegistry, times(1)).isResourceTypeSupported(eq("Patient"));
Mockito.verify(myRequestPartitionHelperSvc, times(1)).determineCreatePartitionForRequest(isA(RequestDetails.class), isA(Subscription.class), eq("Subscription"));
}
@Test
public void testValidate_Cross_Partition_Subscription_On_Wrong_Partition() {
when(myDaoConfig.isCrossPartitionSubscriptionEnabled()).thenReturn(true);
when(myRequestPartitionHelperSvc.determineCreatePartitionForRequest(isA(RequestDetails.class), isA(Subscription.class), eq("Subscription"))).thenReturn(RequestPartitionId.fromPartitionId(1));
Subscription subscription = new Subscription();
@ -234,7 +234,7 @@ public class SubscriptionValidatingInterceptorTest {
requestDetails.setRestOperationType(RestOperationTypeEnum.CREATE);
try {
mySvc.validateSubmittedSubscription(subscription, requestDetails, null);
fail();
} catch (UnprocessableEntityException theUnprocessableEntityException) {
assertEquals(Msg.code(2010) + "Cross partition subscription must be created on the default partition", theUnprocessableEntityException.getMessage());
@ -243,7 +243,7 @@ public class SubscriptionValidatingInterceptorTest {
@Test
public void testValidate_Cross_Partition_Subscription_Without_Setting() {
when(myDaoConfig.isCrossPartitionSubscriptionEnabled()).thenReturn(false);
Subscription subscription = new Subscription();
subscription.setStatus(Subscription.SubscriptionStatus.ACTIVE);
@ -257,7 +257,7 @@ public class SubscriptionValidatingInterceptorTest {
requestDetails.setRestOperationType(RestOperationTypeEnum.CREATE);
try {
mySvc.validateSubmittedSubscription(subscription, requestDetails, null);
fail();
} catch (UnprocessableEntityException theUnprocessableEntityException) {
assertEquals(Msg.code(2009) + "Cross partition subscription is not enabled on this server", theUnprocessableEntityException.getMessage());
@@ -281,8 +281,8 @@ public class SubscriptionValidatingInterceptorTest {
 // No asserts here because the function should throw an UnprocessableEntityException exception if the subscription
 // is invalid
-mySvc.validateSubmittedSubscription(subscription, requestDetails);
-Mockito.verify(myDaoConfig, never()).isCrossPartitionSubscription();
+mySvc.validateSubmittedSubscription(subscription, requestDetails, null);
+Mockito.verify(myDaoConfig, never()).isCrossPartitionSubscriptionEnabled();
 Mockito.verify(myDaoRegistry, times(1)).isResourceTypeSupported(eq("Patient"));
 Mockito.verify(myRequestPartitionHelperSvc, never()).determineCreatePartitionForRequest(isA(RequestDetails.class), isA(Patient.class), eq("Patient"));
 }
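Net effect of these hunks: the DaoConfig getter is renamed to isCrossPartitionSubscriptionEnabled(), and validateSubmittedSubscription() gains a third parameter, which every call site in these tests passes as null. A minimal sketch of the updated happy path, using only identifiers visible in the diff; the package locations and the SubscriptionValidatingInterceptor class name are assumptions based on the usual HAPI FHIR 6.x layout, and the meaning of the third argument is not shown by this diff:

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.subscription.submit.interceptor.SubscriptionValidatingInterceptor;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import org.hl7.fhir.r4.model.Subscription;

class CrossPartitionValidationSketch {
	static void validateHappyPath(SubscriptionValidatingInterceptor theSvc, RequestDetails theRequestDetails) {
		// The cross-partition flag is now read through the renamed getter.
		DaoConfig daoConfig = mock(DaoConfig.class);
		when(daoConfig.isCrossPartitionSubscriptionEnabled()).thenReturn(true);
		theSvc.setDaoConfigForUnitTest(daoConfig);

		Subscription subscription = new Subscription();
		subscription.setStatus(Subscription.SubscriptionStatus.ACTIVE);

		// Updated signature: the third argument is null throughout these tests.
		theSvc.validateSubmittedSubscription(subscription, theRequestDetails, null);
	}
}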


@@ -45,7 +45,7 @@ public class TerminologyLoaderSvcIcd10cmJpaTest extends BaseJpaR4Test {
 assertEquals(0, myTermValueSetDao.count());
 assertEquals(0, myTermConceptMapDao.count());
 assertEquals(1, myResourceTableDao.count());
-assertEquals(17, myTermConceptDao.count());
+assertEquals(43, myTermConceptDao.count());
 TermCodeSystem codeSystem = myTermCodeSystemDao.findByCodeSystemUri(ITermLoaderSvc.ICD10CM_URI);
 assertEquals("2021", codeSystem.getCurrentVersion().getCodeSystemVersionId());


@@ -4,6 +4,7 @@ import ca.uhn.fhir.jpa.entity.TermCodeSystemVersion;
 import ca.uhn.fhir.jpa.entity.TermConcept;
 import ca.uhn.fhir.util.ClasspathUtil;
 import org.hamcrest.Matchers;
+import org.junit.jupiter.api.BeforeAll;
 import org.junit.jupiter.api.Test;
 import org.xml.sax.SAXException;
@@ -18,29 +19,56 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
 public class Icd10CmLoaderTest {
-@Test
-public void testLoadIcd10Cm() throws IOException, SAXException {
+private static TermCodeSystemVersion codeSystemVersion;
+@BeforeAll
+public static void beforeClass() throws IOException, SAXException {
 StringReader reader = new StringReader(ClasspathUtil.loadResource("icd/icd10cm_tabular_2021.xml"));
-TermCodeSystemVersion codeSystemVersion = new TermCodeSystemVersion();
+codeSystemVersion = new TermCodeSystemVersion();
 Icd10CmLoader loader = new Icd10CmLoader(codeSystemVersion);
 loader.load(reader);
+}
+@Test
+public void testLoadIcd10CmCheckVersion() {
 assertEquals("2021", codeSystemVersion.getCodeSystemVersionId());
+}
+@Test
+public void testLoadIcd10CmCheckRootConcepts() {
 List<TermConcept> rootConcepts = new ArrayList<>(codeSystemVersion.getConcepts());
-assertEquals(2, rootConcepts.size());
+assertEquals(4, rootConcepts.size());
 assertEquals("A00", rootConcepts.get(0).getCode());
 assertEquals("Cholera", rootConcepts.get(0).getDisplay());
 List<String> conceptNames = rootConcepts.stream().map(t -> t.getCode()).collect(Collectors.toList());
-assertThat(conceptNames.toString(), conceptNames, Matchers.contains("A00", "A01"));
+assertThat(conceptNames.toString(), conceptNames, Matchers.contains("A00", "A01","H40","R40"));
+}
+@Test
+public void testLoadIcd10CmCheckChildCode() {
+List<TermConcept> rootConcepts = new ArrayList<>(codeSystemVersion.getConcepts());
 assertEquals(3, rootConcepts.get(0).getChildCodes().size());
 TermConcept firstChildCode = rootConcepts.get(0).getChildCodes().get(0);
 assertEquals("A00.0", firstChildCode.getCode());
 assertEquals("Cholera due to Vibrio cholerae 01, biovar cholerae", firstChildCode.getDisplay());
-conceptNames = rootConcepts.get(0).getChildCodes().stream().map(t -> t.getCode()).collect(Collectors.toList());
+List<String> conceptNames = rootConcepts.get(0).getChildCodes().stream().map(t -> t.getCode()).collect(Collectors.toList());
 assertThat(conceptNames.toString(), conceptNames, Matchers.contains("A00.0", "A00.1", "A00.9"));
 }
+@Test
+public void testLoadIcd10CmCheckExtendedChildCode() {
+List<TermConcept> rootConcepts = new ArrayList<>(codeSystemVersion.getConcepts());
+List<String> conceptNames = rootConcepts.get(2).getChildCodes().get(0).getChildCodes().stream().map(t -> t.getCode()).collect(Collectors.toList());
+assertThat(conceptNames.toString(), conceptNames, Matchers.contains("H40.40", "H40.40X0", "H40.40X1", "H40.40X2", "H40.40X3", "H40.40X4", "H40.41", "H40.41X0", "H40.41X1", "H40.41X2", "H40.41X3", "H40.41X4"));
+TermConcept ExtendedChildCode = rootConcepts.get(2).getChildCodes().get(0).getChildCodes().get(1);
+assertEquals("H40.40X0", ExtendedChildCode.getCode());
+assertEquals("Glaucoma secondary to eye inflammation, unspecified eye, stage unspecified", ExtendedChildCode.getDisplay());
+ExtendedChildCode = rootConcepts.get(3).getChildCodes().get(0).getChildCodes().get(0).getChildCodes().get(0).getChildCodes().get(0).getChildCodes().get(0).getChildCodes().get(3);
+assertEquals("R40.2112", ExtendedChildCode.getCode());
+assertEquals("Coma scale, eyes open, never, at arrival to emergency department", ExtendedChildCode.getDisplay());
+}
 }
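The new testLoadIcd10CmCheckExtendedChildCode assertions pin down the 7th-character rule exercised by the fixture below: the base code is padded with the placeholder 'X' until it reaches seven characters (counting the decimal point), then the extension character from sevenChrDef is appended, so H40.40 plus '0' becomes H40.40X0, while the already-seven-character R40.211 plus '2' becomes R40.2112. A minimal sketch of that rule (my illustration, not the Icd10CmLoader source; it assumes the base code already contains its decimal point):

public final class SevenCharSketch {
	// Pad with 'X' to seven characters (e.g. "H40.40" -> "H40.40X"),
	// then append the extension character from <sevenChrDef>.
	static String toExtendedCode(String baseCode, char extension) {
		StringBuilder code = new StringBuilder(baseCode);
		while (code.length() < 7) {
			code.append('X');
		}
		return code.append(extension).toString();
	}

	public static void main(String[] args) {
		System.out.println(toExtendedCode("H40.40", '0'));  // H40.40X0
		System.out.println(toExtendedCode("R40.211", '2')); // R40.2112
	}
}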


@@ -1,105 +1,196 @@
 <?xml version="1.0" encoding="utf-8"?>
 <ICD10CM.tabular>
-<version>2021</version>
-<introduction>
-<introSection type="title">
-<title>ICD-10-CM TABULAR LIST of DISEASES and INJURIES</title>
-</introSection>
-</introduction>
-<chapter>
-<name>1</name>
-<desc>Certain infectious and parasitic diseases (A00-B99)</desc>
-<sectionIndex>
-<sectionRef first="A00" last="A09" id="A00-A09">
-Intestinal infectious diseases
-</sectionRef>
-</sectionIndex>
-<section id="A00-A09">
-<desc>Intestinal infectious diseases (A00-A09)</desc>
-<diag>
-<name>A00</name>
-<desc>Cholera</desc>
-<diag>
-<name>A00.0</name>
-<desc>Cholera due to Vibrio cholerae 01, biovar cholerae</desc>
-<inclusionTerm>
-<note>Classical cholera</note>
-</inclusionTerm>
-</diag>
-<diag>
-<name>A00.1</name>
-<desc>Cholera due to Vibrio cholerae 01, biovar eltor</desc>
-<inclusionTerm>
-<note>Cholera eltor</note>
-</inclusionTerm>
-</diag>
-<diag>
-<name>A00.9</name>
-<desc>Cholera, unspecified</desc>
-</diag>
-</diag>
-<diag>
-<name>A01</name>
-<desc>Typhoid and paratyphoid fevers</desc>
-<diag>
-<name>A01.0</name>
-<desc>Typhoid fever</desc>
-<inclusionTerm>
-<note>Infection due to Salmonella typhi</note>
-</inclusionTerm>
-<diag>
-<name>A01.00</name>
-<desc>Typhoid fever, unspecified</desc>
-</diag>
-<diag>
-<name>A01.01</name>
-<desc>Typhoid meningitis</desc>
-</diag>
-<diag>
-<name>A01.02</name>
-<desc>Typhoid fever with heart involvement</desc>
-<inclusionTerm>
-<note>Typhoid endocarditis</note>
-<note>Typhoid myocarditis</note>
-</inclusionTerm>
-</diag>
-<diag>
-<name>A01.03</name>
-<desc>Typhoid pneumonia</desc>
-</diag>
-<diag>
-<name>A01.04</name>
-<desc>Typhoid arthritis</desc>
-</diag>
-<diag>
-<name>A01.05</name>
-<desc>Typhoid osteomyelitis</desc>
-</diag>
-<diag>
-<name>A01.09</name>
-<desc>Typhoid fever with other complications</desc>
-</diag>
-</diag>
-<diag>
-<name>A01.1</name>
-<desc>Paratyphoid fever A</desc>
-</diag>
-<diag>
-<name>A01.2</name>
-<desc>Paratyphoid fever B</desc>
-</diag>
-<diag>
-<name>A01.3</name>
-<desc>Paratyphoid fever C</desc>
-</diag>
-<diag>
-<name>A01.4</name>
-<desc>Paratyphoid fever, unspecified</desc>
-<inclusionTerm>
-<note>Infection due to Salmonella paratyphi NOS</note>
-</inclusionTerm>
-</diag>
-</diag>
-</section>
-</chapter>
+<version>2021</version>
+<introduction>
+<introSection type="title">
+<title>ICD-10-CM TABULAR LIST of DISEASES and INJURIES</title>
+</introSection>
+</introduction>
+<chapter>
+<name>1</name>
+<desc>Certain infectious and parasitic diseases (A00-B99)</desc>
+<sectionIndex>
+<sectionRef first="A00" last="A09" id="A00-A09">
+Intestinal infectious diseases
+</sectionRef>
+</sectionIndex>
+<section id="A00-A09">
+<desc>Intestinal infectious diseases (A00-A09)</desc>
+<diag>
+<name>A00</name>
+<desc>Cholera</desc>
+<diag>
+<name>A00.0</name>
+<desc>Cholera due to Vibrio cholerae 01, biovar cholerae</desc>
+<inclusionTerm>
+<note>Classical cholera</note>
+</inclusionTerm>
+</diag>
+<diag>
+<name>A00.1</name>
+<desc>Cholera due to Vibrio cholerae 01, biovar eltor</desc>
+<inclusionTerm>
+<note>Cholera eltor</note>
+</inclusionTerm>
+</diag>
+<diag>
+<name>A00.9</name>
+<desc>Cholera, unspecified</desc>
+</diag>
+</diag>
+<diag>
+<name>A01</name>
+<desc>Typhoid and paratyphoid fevers</desc>
+<diag>
+<name>A01.0</name>
+<desc>Typhoid fever</desc>
+<inclusionTerm>
+<note>Infection due to Salmonella typhi</note>
+</inclusionTerm>
+<diag>
+<name>A01.00</name>
+<desc>Typhoid fever, unspecified</desc>
+</diag>
+<diag>
+<name>A01.01</name>
+<desc>Typhoid meningitis</desc>
+</diag>
+<diag>
+<name>A01.02</name>
+<desc>Typhoid fever with heart involvement</desc>
+<inclusionTerm>
+<note>Typhoid endocarditis</note>
+<note>Typhoid myocarditis</note>
+</inclusionTerm>
+</diag>
+<diag>
+<name>A01.03</name>
+<desc>Typhoid pneumonia</desc>
+</diag>
+<diag>
+<name>A01.04</name>
+<desc>Typhoid arthritis</desc>
+</diag>
+<diag>
+<name>A01.05</name>
+<desc>Typhoid osteomyelitis</desc>
+</diag>
+<diag>
+<name>A01.09</name>
+<desc>Typhoid fever with other complications</desc>
+</diag>
+</diag>
+<diag>
+<name>A01.1</name>
+<desc>Paratyphoid fever A</desc>
+</diag>
+<diag>
+<name>A01.2</name>
+<desc>Paratyphoid fever B</desc>
+</diag>
+<diag>
+<name>A01.3</name>
+<desc>Paratyphoid fever C</desc>
+</diag>
+<diag>
+<name>A01.4</name>
+<desc>Paratyphoid fever, unspecified</desc>
+<inclusionTerm>
+<note>Infection due to Salmonella paratyphi NOS</note>
+</inclusionTerm>
+</diag>
+</diag>
+</section>
+</chapter>
+<chapter>
+<name>2</name>
+<desc>Diseases of the eye and adnexa (H00-H59)</desc>
+<sectionIndex>
+<sectionRef first="H40" last="H42" id="H40-H42">
+Glaucoma
+</sectionRef>
+</sectionIndex>
+<section id="H40-H42">
+<desc>Glaucoma (H40-H42)</desc>
+<diag>
+<name>H40</name>
+<desc>Glaucoma</desc>
+<diag>
+<name>H40.4</name>
+<desc>Glaucoma secondary to eye inflammation</desc>
+<codeAlso>
+<note>underlying condition</note>
+</codeAlso>
+<sevenChrNote>
+<note>One of the following 7th characters is to be assigned to each code in subcategory H40.4 to designate the stage of glaucoma</note>
+</sevenChrNote>
+<sevenChrDef>
+<extension char="0">stage unspecified</extension>
+<extension char="1">mild stage</extension>
+<extension char="2">moderate stage</extension>
+<extension char="3">severe stage</extension>
+<extension char="4">indeterminate stage</extension>
+</sevenChrDef>
+<diag>
+<name>H40.40</name>
+<desc>Glaucoma secondary to eye inflammation, unspecified eye</desc>
+</diag>
+<diag>
+<name>H40.41</name>
+<desc>Glaucoma secondary to eye inflammation, right eye</desc>
+</diag>
+</diag>
+</diag>
+</section>
+</chapter>
+<chapter>
+<name>3</name>
+<desc>Symptoms, signs and abnormal clinical and laboratory findings, not elsewhere classified (R00-R99)</desc>
+<sectionIndex>
+<sectionRef first="R40" last="R46" id="R40-R46">
+Symptoms and signs involving cognition, perception, emotional state and behavior
+</sectionRef>
+</sectionIndex>
+<section id="R40-R46">
+<desc>Symptoms and signs involving cognition, perception, emotional state and behavior (R40-R46)</desc>
+<diag>
+<name>R40</name>
+<desc>Somnolence, stupor and coma</desc>
+<diag>
+<name>R40.0</name>
+<desc>Somnolence</desc>
+<diag>
+<name>R40.1</name>
+<desc>Stupor</desc>
+<diag>
+<name>R40.2</name>
+<desc>Coma</desc>
+<diag>
+<name>R40.20</name>
+<desc>Unspecified coma</desc>
+<diag>
+<name>R40.21</name>
+<desc>Coma scale, eyes open</desc>
+<sevenChrNote>
+<note>The following appropriate 7th character is to be added to subcategory R40.21-:</note>
+</sevenChrNote>
+<sevenChrDef>
+<extension char="0">unspecified time</extension>
+<extension char="1">in the field [EMT or ambulance]</extension>
+<extension char="2">at arrival to emergency department</extension>
+<extension char="3">at hospital admission</extension>
+<extension char="4">24 hours or more after hospital admission</extension>
+</sevenChrDef>
+<diag>
+<name>R40.211</name>
+<desc>Coma scale, eyes open, never</desc>
+</diag>
+</diag>
+</diag>
+</diag>
+</diag>
+</diag>
+</diag>
+</section>
+</chapter>
 </ICD10CM.tabular>


@@ -0,0 +1,64 @@
{
"resourceType": "ValueSet",
"id": "V",
"meta": {
"versionId": "1",
"lastUpdated": "2022-08-04T22:39:27.530-04:00"
},
"url": "http://smilecdr.com/V",
"status": "active",
"compose": {
"include": [
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "STI"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "ALCSUBUSE"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "GENTESTINGINFO1"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "HIVAIDS"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "ABORTION"
}
]
}
]
}
}


@@ -0,0 +1,64 @@
{
"resourceType": "ValueSet",
"id": "V1",
"meta": {
"versionId": "1",
"lastUpdated": "2022-08-04T22:39:27.530-04:00"
},
"url": "http://smilecdr.com/V1",
"status": "active",
"compose": {
"include": [
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "STI"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "ALCSUBUSE"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "GENTESTINGINFO1"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "HIVAIDS"
}
]
},
{
"system": "http://hl7.org/fhir/sid/icd-10",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "ABORTION"
}
]
}
]
}
}
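The two ValueSet fixtures (V and V1) are identical apart from id and url: each builds an intensional ValueSet from is-a filters over grouping codes (STI, ALCSUBUSE, GENTESTINGINFO1, HIVAIDS, ABORTION) in the http://hl7.org/fhir/sid/icd-10 system. A hedged sketch of expanding such a ValueSet with the HAPI generic client; the server base URL is made up, and the expansion only succeeds if that CodeSystem, including those grouping concepts, is loaded on the server:

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.ValueSet;

public class ExpandValueSetSketch {
	public static void main(String[] args) {
		FhirContext ctx = FhirContext.forR4();
		// Hypothetical server hosting the ValueSet/V1 fixture.
		IGenericClient client = ctx.newRestfulGenericClient("http://localhost:8080/fhir");

		// Invoke $expand on the ValueSet instance and print the resolved codes.
		ValueSet expansion = client
			.operation()
			.onInstance(new IdType("ValueSet/V1"))
			.named("$expand")
			.withNoParameters(Parameters.class)
			.returnResourceType(ValueSet.class)
			.execute();

		expansion.getExpansion().getContains()
			.forEach(c -> System.out.println(c.getCode() + " " + c.getDisplay()));
	}
}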


@@ -0,0 +1,36 @@
{
"resourceType": "Observation",
"id": "observationtestmatch",
"code": {
"coding": [
{
"system": "http://hl7.org/fhir/sid/icd-10",
"code": "B20",
"display": "Human immunodeficiency virus [HIV] disease"
}
],
"text": "Human immunodeficiency virus [HIV] disease"
},
"valueQuantity": {
"value": 176,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
},
"referenceRange": [
{
"low": {
"value": 135,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
},
"high": {
"value": 180,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
}
}
]
}


@@ -0,0 +1,36 @@
{
"resourceType": "Observation",
"id": "observationtestnomatch",
"code": {
"coding": [
{
"system": "http://loinc.org",
"code": "718-7",
"display": "Hemoglobin [Mass/volume] in Blood"
}
],
"text": "Haemoglobin"
},
"valueQuantity": {
"value": 176,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
},
"referenceRange": [
{
"low": {
"value": 135,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
},
"high": {
"value": 180,
"unit": "g/L",
"system": "http://unitsofmeasure.org",
"code": "g/L"
}
}
]
}
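The match/no-match pair differs only in its coding: ICD-10 code B20 versus a LOINC code. Judging by the ids, the fixtures exercise ValueSet membership; B20 would presumably fall under the HIVAIDS is-a filter, assuming the server's ICD-10 CodeSystem defines that hierarchy. A sketch of the equivalent membership check via the standard $validate-code operation (server URL hypothetical):

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.CodeType;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.UriType;

public class ValidateCodeSketch {
	public static void main(String[] args) {
		IGenericClient client = FhirContext.forR4()
			.newRestfulGenericClient("http://localhost:8080/fhir"); // hypothetical

		// Ask whether the matching Observation's coding is in ValueSet/V.
		Parameters in = new Parameters();
		in.addParameter().setName("system").setValue(new UriType("http://hl7.org/fhir/sid/icd-10"));
		in.addParameter().setName("code").setValue(new CodeType("B20"));

		Parameters out = client
			.operation()
			.onInstance(new IdType("ValueSet/V"))
			.named("$validate-code")
			.withParameters(in)
			.execute();

		// The operation returns a boolean "result" output parameter.
		BooleanType result = (BooleanType) out.getParameter("result");
		System.out.println("in value set: " + result.getValue());
	}
}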


@@ -5,7 +5,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-fhir</artifactId>
-<version>6.2.0-PRE1-SNAPSHOT</version>
+<version>6.2.0-PRE2-SNAPSHOT</version>
 <relativePath>../pom.xml</relativePath>
 </parent>


@@ -7,7 +7,7 @@
 <parent>
 <groupId>ca.uhn.hapi.fhir</groupId>
 <artifactId>hapi-deployable-pom</artifactId>
-<version>6.2.0-PRE1-SNAPSHOT</version>
+<version>6.2.0-PRE2-SNAPSHOT</version>
 <relativePath>../hapi-deployable-pom/pom.xml</relativePath>
 </parent>
@@ -35,7 +35,6 @@
 <dependency>
 <groupId>org.springframework.data</groupId>
 <artifactId>spring-data-commons</artifactId>
-<version>${spring_data_version}</version>
 </dependency>
 <dependency>
 <groupId>ca.uhn.hapi.fhir</groupId>

Some files were not shown because too many files have changed in this diff.