Rel 6 0 mergeback (#3597)
* adding version.yaml, updating version in pom.xml
* It is possible to write to a resource in a partition the user is not authorized to. (#3397)
* fixed
* remove sout
* add msg.code
* fix failed tests
* fix equal sign
* update msg code
* extract method
* Fix up code numbers
* Clean changelog
  Co-authored-by: Tadgh <garygrantgraham@gmail.com>
* Revert final artifact version
* add graphql test (#3585)
* added graphql birthdate test
* fix variable name
* typo
* 3506 mdm log enhancement (#3543)
* Providing Fixme's to be reworked at a later time.
* Adding // FIXME Anna to assist our Austrian friend.
* Adding logging test as a first step in addressing issue #2822.
* hapifhir#3506 part 2: enhance logging for (un)successful MDM matching
* Update MdmMatchLinkSvc.java: #3506 move "narrowed down" log to different place
* #3506 added scores and tests
* #3506 formatting
* #3506 create changelog file
* #3506 fix typo
* #3506 fix part 3, minor formatting
* #3506 fix tests
  Co-authored-by: Etienne Poirier <etienne.poirier@smilecdr.com>
  Co-authored-by: Anna <anna@MacBook-Pro.local>
* mdm matching (#3579)
* Added fix for https://github.com/hapifhir/hapi-fhir-jpaserver-starter… (#3551)
* Added fix for https://github.com/hapifhir/hapi-fhir-jpaserver-starter/issues/328
* Update HapiFhirJpaMigrationTasks.java: Corrected ordering
* Update HapiFhirJpaMigrationTasks.java: Moving index status to be last operation
* Revert "Update HapiFhirJpaMigrationTasks.java": This reverts commit 37bfd3e66e.
* Moved to bottom
  Co-authored-by: Jens Kristian Villadsen <jvi@trifork.com>
* begin with failing test
* fixed
* changelog
* add jira tag
* Update to 6 1 (#3582)
* added changelog folder, upped version
* version enum
* add a few more unit tests to assert proper NO_MATCH exclusion
* revert merge master doh! bad reflexes
* revert merge origin master
* unrevert revert. ugh what a pain
* merge recovery. fix poms.
* merge recovery. more reverting
* merge recovery. more reverting
* merge recovery. more reverting
* Revert "merge recovery. fix poms.": This reverts commit ae6e0ddb06.
* more revert revert reversions
* more revert revert reversions
* pre-review cleanup
  Co-authored-by: Jens Kristian Villadsen <jenskristianvilladsen@gmail.com>
  Co-authored-by: Jens Kristian Villadsen <jvi@trifork.com>
  Co-authored-by: Mark Iantorno <markiantorno@gmail.com>
* Fix regression of 3411 - _lastUpdated gets clobbered during $reindex job (#3586)
* When chunking for the reindex job, don't clobber the lastUpdated if provided by the caller.
* License
* Ks 20220508 log colour (#3592)
* don't use colours when output is redirected to a file
* change log
* Added fix for https://github.com/hapifhir/hapi-fhir-jpaserver-starter… (#3551) (#3594)
* Added fix for https://github.com/hapifhir/hapi-fhir-jpaserver-starter/issues/328
* Update HapiFhirJpaMigrationTasks.java: Corrected ordering
* Update HapiFhirJpaMigrationTasks.java: Moving index status to be last operation
* Revert "Update HapiFhirJpaMigrationTasks.java": This reverts commit 37bfd3e66e.
* Moved to bottom
  Co-authored-by: Jens Kristian Villadsen <jvi@trifork.com>
  Co-authored-by: Jens Kristian Villadsen <jenskristianvilladsen@gmail.com>
  Co-authored-by: Jens Kristian Villadsen <jvi@trifork.com>
* 3584 case sensitive string elasticsearch (#3596)
* Fix for case-sensitive search with newer elasticsearch
* Test and fix ascii normalization too
* more test cases
* comments
* Docs update empi usecase (#3598)
* Update diagram for use case 5
* Add new page for database support
* Licnese
* Mb norm fix (#3604)
* Normalize query since wildcard searches aren't normalized in elastic
* Handle batch2 job cancellation (#3603)
* Handle batch2 job cancellation
* Add CANCELLED status
* Remove unuseful test
  Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
* Add batch2 interfaces to obtain recent instances (#3601)
  Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
* changelog folder
* Documentation change, added warning log and change log for ticket (#3609)
  Co-authored-by: Steven Li <steven@smilecdr.com>
* fix ne for lastUpdated search param (#3589)
* init/wip
* add implementation, add tests
* add checks and some refactoring
* refactor and simplify tests
* refactor
* get rid of some of the warnings
* add changelog
* remove my todos
* redo ne logic to account for less precise dates
* add tests
* refactoring
* rename to follow convention
  Co-authored-by: Justin_Dar <justin.dar@smilecdr.com>
* Version bump

Co-authored-by: markiantorno <markiantorno@gmail.com>
Co-authored-by: katiesmilecdr <88786813+katiesmilecdr@users.noreply.github.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: alackerbauer <33912849+alackerbauer@users.noreply.github.com>
Co-authored-by: Etienne Poirier <etienne.poirier@smilecdr.com>
Co-authored-by: Anna <anna@MacBook-Pro.local>
Co-authored-by: Jens Kristian Villadsen <jenskristianvilladsen@gmail.com>
Co-authored-by: Jens Kristian Villadsen <jvi@trifork.com>
Co-authored-by: michaelabuckley <michael.buckley@smilecdr.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Co-authored-by: StevenXLi <stevenli_8118@hotmail.com>
Co-authored-by: Steven Li <steven@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: Justin_Dar <justin.dar@smilecdr.com>
parent 231c2659b8
commit 54f578c8b1
@@ -8,6 +8,7 @@ import ca.uhn.fhir.parser.DataFormatException;
import ca.uhn.fhir.rest.api.QualifiedParamList;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.util.DateUtils;
import org.apache.commons.lang3.Validate;
import org.hl7.fhir.instance.model.api.IPrimitiveType;

import java.util.ArrayList;
@@ -57,6 +58,17 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
super();
}

/**
* Copy constructor.
*/
@SuppressWarnings("CopyConstructorMissesField")
public DateRangeParam(DateRangeParam theDateRangeParam) {
super();
Validate.notNull(theDateRangeParam);
setLowerBound(theDateRangeParam.getLowerBound());
setUpperBound(theDateRangeParam.getUpperBound());
}

/**
* Constructor which takes two Dates representing the lower and upper bounds of the range (inclusive on both ends)
*
@@ -235,7 +247,7 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
* are the same value. As such, even though the prefixes for the lower and
* upper bounds default to <code>ge</code> and <code>le</code> respectively,
* the resulting prefix is effectively <code>eq</code> where only a single
* date is provided - as required by the FHIR specificiation (i.e. "If no
* date is provided - as required by the FHIR specification (i.e. "If no
* prefix is present, the prefix <code>eq</code> is assumed").
* </p>
*/
@@ -296,12 +308,12 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
break;
case EQUAL:
case GREATERTHAN_OR_EQUALS:
case NOT_EQUAL:
break;
case LESSTHAN:
case APPROXIMATE:
case LESSTHAN_OR_EQUALS:
case ENDS_BEFORE:
case NOT_EQUAL:
throw new IllegalStateException(Msg.code(1926) + "Invalid lower bound comparator: " + myLowerBound.getPrefix());
}
}
@@ -326,11 +338,11 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
break;
case EQUAL:
case LESSTHAN_OR_EQUALS:
case NOT_EQUAL:
break;
case GREATERTHAN_OR_EQUALS:
case GREATERTHAN:
case APPROXIMATE:
case NOT_EQUAL:
case STARTS_AFTER:
throw new IllegalStateException(Msg.code(1927) + "Invalid upper bound comparator: " + myUpperBound.getPrefix());
}
@@ -355,13 +367,13 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
retVal = myLowerBound.getPrecision().add(retVal, 1);
break;
case EQUAL:
case NOT_EQUAL:
case GREATERTHAN_OR_EQUALS:
break;
case LESSTHAN:
case APPROXIMATE:
case LESSTHAN_OR_EQUALS:
case ENDS_BEFORE:
case NOT_EQUAL:
throw new IllegalStateException(Msg.code(1928) + "Invalid lower bound comparator: " + myLowerBound.getPrefix());
}
}
@@ -417,6 +429,7 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
retVal = new Date(retVal.getTime() - 1L);
break;
case EQUAL:
case NOT_EQUAL:
case LESSTHAN_OR_EQUALS:
retVal = myUpperBound.getPrecision().add(retVal, 1);
retVal = new Date(retVal.getTime() - 1L);
@@ -424,7 +437,6 @@ public class DateRangeParam implements IQueryParameterAnd<DateParam> {
case GREATERTHAN_OR_EQUALS:
case GREATERTHAN:
case APPROXIMATE:
case NOT_EQUAL:
case STARTS_AFTER:
throw new IllegalStateException(Msg.code(1929) + "Invalid upper bound comparator: " + myUpperBound.getPrefix());
}

@@ -23,7 +23,7 @@ import ca.uhn.fhir.util.CoverageIgnore;
*/


public class HasAndListParam extends BaseAndListParam<HasOrListParam> {
public class HasAndListParam extends BaseAndListParam<HasOrListParam> {

@Override
HasOrListParam newInstance() {

@@ -0,0 +1,61 @@
package ca.uhn.fhir.util;

/*-
* #%L
* HAPI FHIR - Core Library
* %%
* Copyright (C) 2014 - 2022 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/

import ca.uhn.fhir.rest.param.DateRangeParam;

import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import java.util.Date;

public class DateRangeUtil {

/**
* Narrow the DateRange to be within theStartInclusive, and theEndExclusive, if provided.
* @param theDateRangeParam the initial range, null for unconstrained
* @param theStartInclusive a lower bound to apply, or null for unchanged.
* @param theEndExclusive an upper bound to apply, or null for unchanged.
* @return a DateRange within the original range, and between theStartInclusive and theEnd
*/
@Nonnull
public static DateRangeParam narrowDateRange(@Nullable DateRangeParam theDateRangeParam, @Nullable Date theStartInclusive, @Nullable Date theEndExclusive) {
if (theStartInclusive == null && theEndExclusive == null) {
return theDateRangeParam;
}
DateRangeParam result = theDateRangeParam==null?new DateRangeParam():new DateRangeParam(theDateRangeParam);

if (theStartInclusive != null) {
Date inputStart = result.getLowerBoundAsInstant();
if (theDateRangeParam == null || inputStart == null || inputStart.before(theStartInclusive)) {
result.setLowerBoundInclusive(theStartInclusive);
}
}
if (theEndExclusive != null) {
Date inputEnd = result.getUpperBound() == null? null : result.getUpperBound().getValue();
if (theDateRangeParam == null || inputEnd == null || inputEnd.after(theEndExclusive)) {
result.setUpperBoundExclusive(theEndExclusive);
}
}

return result;
}

}
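As a reading aid (editor-added, not part of the commit), here is a minimal usage sketch of the new helper, using only the API shown above. The class name and date values are illustrative. It demonstrates the behaviour the reindex fix below relies on: a caller-supplied range is kept and only tightened to the chunk window, rather than being replaced.

```java
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.util.DateRangeUtil;

import java.time.Instant;
import java.util.Date;

class NarrowDateRangeSketch {
	public static void main(String[] args) {
		// Range supplied by the caller, e.g. _lastUpdated=ge2022-01-01&_lastUpdated=le2022-12-31
		DateRangeParam callerRange = new DateRangeParam("ge2022-01-01", "le2022-12-31");

		// Chunk window chosen by the batch job (illustrative values)
		Date chunkStart = Date.from(Instant.parse("2022-03-01T00:00:00Z"));
		Date chunkEnd = Date.from(Instant.parse("2022-04-01T00:00:00Z"));

		// The caller's bounds are narrowed to [2022-03-01, 2022-04-01), not clobbered
		DateRangeParam narrowed = DateRangeUtil.narrowDateRange(callerRange, chunkStart, chunkEnd);
		System.out.println(narrowed.getLowerBound() + " .. " + narrowed.getUpperBound());
	}
}
```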
@@ -7,7 +7,9 @@ import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.mockito.Mockito;

import java.time.Instant;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import static org.junit.jupiter.api.Assertions.assertEquals;
@@ -123,4 +125,18 @@ public class DateRangeParamTest {
assertEquals(ParamPrefixEnum.NOT_EQUAL, dateRangeParam.getLowerBound().getPrefix());
assertEquals(ParamPrefixEnum.NOT_EQUAL, dateRangeParam.getUpperBound().getPrefix());
}

@Test
public void testCopyConstructor() {
DateParam dateStart = new DateParam("gt2021-01-01");
DateParam dateEnd = new DateParam("lt2021-02-01");
DateRangeParam input = new DateRangeParam(dateStart, dateEnd);

DateRangeParam copy = new DateRangeParam(input);

assertEquals(dateStart, copy.getLowerBound());
assertEquals(dateEnd, copy.getUpperBound());

}

}

@@ -0,0 +1,112 @@
package ca.uhn.fhir.util;

import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.rest.param.ParamPrefixEnum;
import org.apache.commons.lang3.builder.ToStringBuilder;
import org.apache.commons.lang3.builder.ToStringStyle;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.MethodSource;

import java.time.Instant;
import java.util.Arrays;
import java.util.Date;
import java.util.List;

import static ca.uhn.fhir.rest.param.ParamPrefixEnum.GREATERTHAN;
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.GREATERTHAN_OR_EQUALS;
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN;
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN_OR_EQUALS;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;

class DateRangeUtilTest {

static Date dateOne = Date.from(Instant.parse("2021-01-01T01:00:00Z"));
static Date dateTwo = Date.from(Instant.parse("2021-01-01T02:00:00Z"));
static Date dateThree = Date.from(Instant.parse("2021-01-01T03:00:00Z"));
static Date dateFour = Date.from(Instant.parse("2021-01-01T04:00:00Z"));
static Date dateFive = Date.from(Instant.parse("2021-01-01T05:00:00Z"));
static Date dateSix = Date.from(Instant.parse("2021-01-01T06:00:00Z"));

static class NarrowCase {
final String message;
final DateRangeParam range;
final Date narrowStart;
final Date narrowEnd;
final DateParam resultStart;
final DateParam resultEnd;

public NarrowCase(String theMessage, DateRangeParam theRange, Date theNarrowStart, Date theNarrowEnd, DateParam theResultStart, DateParam theResultEnd) {
message = theMessage;
range = theRange;
narrowStart = theNarrowStart;
narrowEnd = theNarrowEnd;
resultStart = theResultStart;
resultEnd = theResultEnd;
}

static NarrowCase from(String theMessage, DateRangeParam theRange, Date theNarrowStart, Date theNarrowEnd, Date theResultStart, Date theResultEnd) {
return new NarrowCase(theMessage, theRange, theNarrowStart, theNarrowEnd,
theResultStart == null?null:new DateParam(GREATERTHAN_OR_EQUALS, theResultStart),
theResultEnd == null?null:new DateParam(LESSTHAN, theResultEnd));
}

static NarrowCase from(String theMessage, DateRangeParam theRange, Date theNarrowStart, Date theNarrowEnd,
ParamPrefixEnum theResultStartPrefix, Date theResultStart, ParamPrefixEnum theResultEndPrefix, Date theResultEnd) {
return new NarrowCase(theMessage, theRange, theNarrowStart, theNarrowEnd,
new DateParam(theResultStartPrefix, theResultStart), new DateParam(theResultEndPrefix, theResultEnd));
}

@Override
public String toString() {
return new ToStringBuilder(this, ToStringStyle.SIMPLE_STYLE)
.append(message)
.append("range", range)
.append("narrowStart", narrowStart)
.append("narrowEnd", narrowEnd)
.append("resultStart", resultStart)
.append("resultEnd", resultEnd)
.toString();
}
}

static public List<NarrowCase> narrowCases() {

return Arrays.asList(
// null range cases
new NarrowCase("nulls on null yields null", null, null,null, null, null),
NarrowCase.from("start and end narrow null", null, dateTwo,dateThree, dateTwo, dateThree),
NarrowCase.from("start on null provides open range", null, dateTwo, null, dateTwo, null),
NarrowCase.from("end on null provides open range", null, null,dateThree, null, dateThree),
// middle range
// default range is inclusive at top
NarrowCase.from("start and end outside leaves range unchanged", new DateRangeParam(dateTwo, dateFive), dateOne, dateSix, GREATERTHAN_OR_EQUALS, dateTwo, LESSTHAN_OR_EQUALS ,dateFive),
NarrowCase.from("start inside narrows start", new DateRangeParam(dateTwo, dateFive), dateThree, dateSix, GREATERTHAN_OR_EQUALS, dateThree, LESSTHAN_OR_EQUALS ,dateFive),

NarrowCase.from("end inside narrows end", new DateRangeParam(dateTwo, dateFive), dateOne, dateFour, dateTwo, dateFour),
// half-open cases
NarrowCase.from("end inside open end", new DateRangeParam(dateTwo, null), null, dateFour, dateTwo, dateFour),
NarrowCase.from("start inside open start", new DateRangeParam(null, dateFour), dateTwo, null, GREATERTHAN_OR_EQUALS, dateTwo, LESSTHAN_OR_EQUALS, dateFour),
NarrowCase.from("gt case preserved", new DateRangeParam(new DateParam(GREATERTHAN, dateTwo), null), null, dateFour, GREATERTHAN, dateTwo, LESSTHAN, dateFour)

);
}

@ParameterizedTest
@MethodSource("narrowCases")
public void testNarrowCase(NarrowCase c) {
DateRangeParam result = DateRangeUtil.narrowDateRange(c.range, c.narrowStart, c.narrowEnd);

if (c.resultStart == null && c.resultEnd == null) {
assertThat(result, nullValue());
} else {
assertThat(result, notNullValue());
assertThat("range start", result.getLowerBound(), equalTo(c.resultStart));
assertThat("range end", result.getUpperBound(), equalTo(c.resultEnd));
}
}

}

@@ -22,9 +22,6 @@ package ca.uhn.fhir.cli;

import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.util.VersionUtil;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;
import com.helger.commons.io.file.FileHelper;
import org.apache.commons.cli.CommandLine;
import org.apache.commons.cli.DefaultParser;
@@ -60,7 +57,7 @@ public abstract class BaseApp {

static {
System.setProperty(STACKFILTER_PATTERN_PROP, STACKFILTER_PATTERN);
loggingConfigOff();
LogbackUtil.loggingConfigOff();

// We don't use qualified names for loggers in CLI
ourLog = LoggerFactory.getLogger(App.class);
@@ -198,7 +195,7 @@ public abstract class BaseApp {

@SuppressWarnings("ResultOfMethodCallIgnored")
public void run(String[] theArgs) {
loggingConfigOff();
LogbackUtil.loggingConfigOff();
validateJavaVersion();

if (System.getProperty("unit_test") != null) {
@@ -225,7 +222,7 @@ public abstract class BaseApp {
}

Optional<BaseCommand> commandOpt = parseCommand(theArgs);
if (! commandOpt.isPresent()) return;
if (commandOpt.isEmpty()) return;

BaseCommand command = commandOpt.get();

@@ -238,7 +235,14 @@ public abstract class BaseApp {

logAppHeader();
validateJavaVersion();
loggingConfigOn();

if (System.console() == null) {
// Probably redirecting stdout to a file
LogbackUtil.loggingConfigOnWithoutColour();
} else {
// Use colours if we're logging to a console
LogbackUtil.loggingConfigOnWithColour();
}

try {
String[] args = Arrays.copyOfRange(theArgs, 1, theArgs.length);
@@ -248,7 +252,7 @@ public abstract class BaseApp {
}

if (parsedOptions.hasOption("debug")) {
loggingConfigOnDebug();
LogbackUtil.loggingConfigOnDebug();
ourDebugMode = true;
}

@@ -264,7 +268,7 @@ public abstract class BaseApp {

} catch (ParseException e) {
if (!"true".equals(System.getProperty("test"))) {
loggingConfigOff();
LogbackUtil.loggingConfigOff();
}
System.err.println("Invalid command options for command: " + command.getCommandName());
System.err.println(" " + ansi().fg(Ansi.Color.RED).bold() + e.getMessage());
@@ -287,7 +291,7 @@ public abstract class BaseApp {
private Optional<BaseCommand> parseCommand(String[] theArgs) {
Optional<BaseCommand> commandOpt = getNextCommand(theArgs, 0);

if (! commandOpt.isPresent()) {
if (commandOpt.isEmpty()) {
String message = "Unrecognized command: " + ansi().bold().fg(Ansi.Color.RED) + theArgs[0] + ansi().boldOff().fg(Ansi.Color.WHITE);
printMessageToStdout(message);
printMessageToStdout("");
@@ -307,7 +311,7 @@ public abstract class BaseApp {
return;
}
Optional<BaseCommand> commandOpt = getNextCommand(theArgs, 1);
if (! commandOpt.isPresent()) {
if (commandOpt.isEmpty()) {
String message = "Unknown command: " + theArgs[1];
System.err.println(message);
exitDueToProblem(message);
@@ -339,7 +343,7 @@ public abstract class BaseApp {
private void runCleanupHookAndUnregister() {
if (myShutdownHookHasNotRun) {
Runtime.getRuntime().removeShutdownHook(myShutdownHook);
myShutdownHook.run();
myShutdownHook.start();
myShutdownHookHasNotRun = false;
}
}
@@ -371,40 +375,4 @@ public abstract class BaseApp {
public static boolean isDebugMode() {
return ourDebugMode;
}

private static void loggingConfigOff() {
try {
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext((LoggerContext) LoggerFactory.getILoggerFactory());
configurator.doConfigure(App.class.getResourceAsStream("/logback-cli-off.xml"));
} catch (JoranException e) {
e.printStackTrace();
}
}

private static void loggingConfigOn() {
try {
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext((LoggerContext) LoggerFactory.getILoggerFactory());
((LoggerContext) LoggerFactory.getILoggerFactory()).reset();
configurator.doConfigure(App.class.getResourceAsStream("/logback-cli-on.xml"));
ourLog.info("Logging configuration set from file logback-cli-on.xml");
} catch (JoranException e) {
e.printStackTrace();
}
}

private static void loggingConfigOnDebug() {
try {
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext((LoggerContext) LoggerFactory.getILoggerFactory());
((LoggerContext) LoggerFactory.getILoggerFactory()).reset();
configurator.doConfigure(App.class.getResourceAsStream("/logback-cli-on-debug.xml"));
ourLog.info("Logging configuration set from file logback-cli-on-debug.xml");
} catch (JoranException e) {
e.printStackTrace();
}

ourLog.info("Debug logging is enabled");
}
}

@@ -0,0 +1,66 @@
package ca.uhn.fhir.cli;

/*-
* #%L
* HAPI FHIR - Command Line Client - API
* %%
* Copyright (C) 2014 - 2022 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.joran.JoranConfigurator;
import ch.qos.logback.core.joran.spi.JoranException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class LogbackUtil {
private static final Logger ourLog = LoggerFactory.getLogger(LogbackUtil.class);

static void loggingConfigOff() {
try {
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext((LoggerContext) LoggerFactory.getILoggerFactory());
configurator.doConfigure(App.class.getResourceAsStream("/logback-cli-off.xml"));
} catch (JoranException e) {
e.printStackTrace();
}
}

static void loggingConfigOnWithColour() {
setLogbackConfig("/logback-cli-on.xml");
}

static void loggingConfigOnWithoutColour() {
setLogbackConfig("/logback-cli-on-no-colour.xml");
}

static void loggingConfigOnDebug() {
setLogbackConfig("/logback-cli-on-debug.xml");
ourLog.info("Debug logging is enabled");
}

static void setLogbackConfig(String logbackConfigFilename) {
try {
JoranConfigurator configurator = new JoranConfigurator();
configurator.setContext((LoggerContext) LoggerFactory.getILoggerFactory());
((LoggerContext) LoggerFactory.getILoggerFactory()).reset();
configurator.doConfigure(App.class.getResourceAsStream(logbackConfigFilename));
ourLog.info("Logging configuration set from file " + logbackConfigFilename);
} catch (JoranException e) {
e.printStackTrace();
}
}
}

@@ -0,0 +1,57 @@
<configuration>

<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{yyyy-MM-dd} %d{HH:mm:ss.SS} [%thread] %-5level %logger{20} %msg%n
</pattern>
</encoder>
</appender>

<logger name="ca.uhn.fhir.cli" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>
<logger name="ca.cdr.cli" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>

<!-- These two are used by the websocket client -->
<logger name="websocket.RECV" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>
<logger name="websocket.SEND" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>

<!-- These two are used by SynchronizeFhirServersCommand -->
<logger name="sync.SOURCE" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>
<logger name="sync.TARGET" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>

<!--
It's useful to have this log when uploading big terminologies
-->
<logger name="ca.uhn.fhir.jpa.term.BaseTermReadSvcImpl" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>
<logger name="ca.uhn.fhir.jpa.term.TermCodeSystemStorageSvcImpl" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>
<logger name="ca.uhn.fhir.jpa.term.TermDeferredStorageSvcImpl" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>

<!--
Always log the migrator
-->
<logger name="ca.uhn.fhir.jpa.migrate" additivity="false" level="info">
<appender-ref ref="STDOUT" />
</logger>

<root level="warn">
<appender-ref ref="STDOUT" />
</root>

</configuration>

@@ -1,7 +1,7 @@
<configuration>

<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<useJansi>true</useJansi>
<withJansi>true</withJansi>
<encoder>
<pattern>%green(%d{yyyy-MM-dd}) %boldGreen(%d{HH:mm:ss.SS}) %white([%thread]) %white(%-5level) %boldBlue(%logger{20}) %boldWhite(%msg%n)
</pattern>

@@ -0,0 +1,5 @@
---
type: fix
issue: 3608
title: "Documentation on offset paging with _offset doesn't mention possible duplicate entries across different pages.
The documentation has been updated, and a warning log is added to notify this behaviour as well."

@@ -0,0 +1,4 @@
---
type: security
issue: 3396
title: "Previously, it was possible to update a resource with wrong `tenantID`. This issue has been fixed."

@@ -0,0 +1,4 @@
---
type: change
issue: 3506
title: "Changing the MDM logging to contain scores for each applied matcher field. Deleting summary score when creating MDM link."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3579
jira: SMILE-4167
title: "Mdm was not excluding NO_MATCH from golden-resource candidates in eid mode. This caused mdm to produce an error
when a Patient eid is changed after that patient's link was updated to NO_MATCH. This has been corrected"

@@ -0,0 +1,5 @@
---
type: fix
issue: 3584
title: "Unmodified string searches and string `:contains` search were incorrectly case-sensitive
when configured with recent versions of Lucene/Elasticsearch. This has been corrected."

@@ -0,0 +1,6 @@
---
type: fix
issue: 3586
jira: SMILE-3441
title: "While converting the reindexing job to the new batch framework, a regression of [#3441](https://github.com/hapifhir/hapi-fhir/issues/3441) was introduced. Reindexing jobs were not respecting the passed in `_lastUpdated` parameter.
We now take date these date ranges into account when running re-indexing jobs."

@@ -0,0 +1,4 @@
---
type: fix
issue: 3590
title: "When searching with the `_lastUpdated` parameter and using the `ne` prefix, search would fail with HAPI-1928 error. This has been fixed."
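For context, here is a small editor-added sketch (not part of the commit) of the request shape this fix targets. A single `ne`-prefixed date populates both bounds of a `DateRangeParam` with the `NOT_EQUAL` prefix, mirroring the assertions in the `DateRangeParamTest` hunk shown earlier; the fix routes that shape into an "outside the range" query instead of throwing HAPI-1928.

```java
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.rest.param.ParamPrefixEnum;

class NotEqualLastUpdatedSketch {
	public static void main(String[] args) {
		// Equivalent of a client sending ?_lastUpdated=ne2022-05-18
		DateRangeParam range = new DateRangeParam(new DateParam("ne2022-05-18"));

		// Both bounds carry the NOT_EQUAL prefix; this is the shape that
		// SearchQueryBuilder.isNotEqualsComparator() (further down) detects
		System.out.println(range.getLowerBound().getPrefix() == ParamPrefixEnum.NOT_EQUAL);
		System.out.println(range.getUpperBound().getPrefix() == ParamPrefixEnum.NOT_EQUAL);
	}
}
```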
@@ -0,0 +1,6 @@
---
type: fix
issue: 3592
jira: SMILE-691
title: "Command-line log output now only sends colour commands if output is being printed to a console. Otherwise,
(e.g. if output is redirected to a file) the log output will not contain any special colour escape characters."

@@ -0,0 +1,5 @@
---
type: fix
issue: 3602
title: "New batch job implementation (batch2) were staying on IN_PROGRESS status after being cancelled.
That is now fixed. After cancellation status is changed to CANCELLED."

@@ -0,0 +1,3 @@
---
release-date: "2022-05-18"
codename: "Tanuki"

@@ -1,7 +1,8 @@

section.introduction.title=Welcome to HAPI FHIR
page.introduction.table_of_contents=Table of Contents
page.introduction.changelog=Changelog: 2021
page.introduction.changelog=Changelog: 2022
page.introduction.changelog_2021=Changelog: 2021
page.introduction.changelog_2020=Changelog: 2020
page.introduction.changelog_2019=Changelog: 2019
page.introduction.changelog_2018=Changelog: 2018
@@ -53,6 +54,7 @@ section.server_jpa.title=JPA Server
page.server_jpa.introduction=Introduction
page.server_jpa.get_started=Get Started ⚡
page.server_jpa.architecture=Architecture
page.server_jpa.database_support=Database Support
page.server_jpa.schema=Database Schema
page.server_jpa.configuration=Configuration
page.server_jpa.search=Search

(One file's diff is suppressed because its lines are too long; a documentation image changed from 61 KiB to 55 KiB.)

@@ -1,4 +1,4 @@
# Changelog: 2021
# Changelog: 2022

<th:block th:insert="fragment_changelog.md :: changelog('2021', '')"/>
<th:block th:insert="fragment_changelog.md :: changelog('2022', '')"/>

@@ -0,0 +1,5 @@

# Changelog: 2021

<th:block th:insert="fragment_changelog.md :: changelog('2021', '2021')"/>

@@ -0,0 +1,29 @@
# Database Support

HAPI FHIR JPA Server maintains active support for several databases:

- [MS SQL Server](https://www.microsoft.com/en-us/sql-server/sql-server-2019)
- [PostgreSQL](https://www.postgresql.org/)
- [Oracle](https://www.oracle.com/ca-en/database/12c-database/)

Use of any of the above databases is fully supported by HAPI-FHIR, and code is actively written to work with them.

# Experimental Support

HAPI FHIR currently provides experimental for the following databases, but does not actively support them, or write code specifically to work with them:

- [Cockroach DB](https://www.cockroachlabs.com/)

HAPI FHIR uses the Hibernate ORM to provide database abstraction. This means that HAPI FHIR could theoretically also work on other databases supported by Hibernate.
For example, although we do not regularly test or validate on other platforms, community members have reported successfully running HAPI FHIR on:

- DB2
- Cache
- Firebird

# Deprecated Support

These databases were previously supported by HAPI FHIR JPA Server, but have since been deprecated, and should not be used.

- [MySQL](https://www.mysql.com/)
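To make the supported-database list above actionable, here is one illustrative way to wire a PostgreSQL datasource for the JPA server. This sketch is editor-added, not part of the new documentation page: it uses plain Apache Commons DBCP2, and every URL, credential and pool value is a placeholder rather than something prescribed by HAPI FHIR.

```java
import org.apache.commons.dbcp2.BasicDataSource;

import javax.sql.DataSource;

class PostgresDataSourceSketch {

	// Illustrative only: the JDBC URL, credentials and pool size are placeholders.
	static DataSource postgresDataSource() {
		BasicDataSource ds = new BasicDataSource();
		ds.setDriverClassName("org.postgresql.Driver");
		ds.setUrl("jdbc:postgresql://localhost:5432/hapi");
		ds.setUsername("hapi");
		ds.setPassword("change-me");
		ds.setMaxTotal(10);
		return ds;
	}
}
```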
@@ -371,14 +371,19 @@ http://fhir.example.com/Patient?identifier=urn:foo|123&_count=10

## Offset paging with `_offset`

HAPI FHIR supports also paging. Offset specification can be passed into handler methods with [@Offset](/hapi-fhir/apidocs/hapi-fhir-base/ca/uhn/fhir/rest/annotation/Offset.html) annotation.
This annotation is *not* part of the FHIR standard.
**Warning:** Using `_offset` without sorting can result in duplicate entries to show up across the different pages when
following the next page link provided on each page.

HAPI FHIR supports also paging. Offset specification can be passed into handler methods
with [@Offset](/hapi-fhir/apidocs/hapi-fhir-base/ca/uhn/fhir/rest/annotation/Offset.html) annotation.
This annotation is *not* part of the FHIR standard.

There are two possible ways to use paging. It is possible to define `_offset` parameter in the
request which means that when combined with `_count` the paging is done on the database level. This type of
paging benefits from not having to return so many items from the database when paging items. It's also possible
to define default page size (i.e. default `_count` if not given) and maximum page size (i.e. maximum value
for the `_count` parameter). See [RestfulServer](/hapi-fhir/apidocs/hapi-fhir-server/ca/uhn/fhir/rest/server/RestfulServer.html)
for the `_count` parameter).
See [RestfulServer](/hapi-fhir/apidocs/hapi-fhir-server/ca/uhn/fhir/rest/server/RestfulServer.html)
for more information.

```java
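The hunk above cuts off just as the documentation's own example begins, so here is a separate, editor-added sketch of a plain-server search method that accepts the offset and count. The resource type, parameter names and repository call are illustrative; only the @Search, @OptionalParam, @Offset and @Count annotations come from HAPI FHIR.

```java
import ca.uhn.fhir.rest.annotation.Count;
import ca.uhn.fhir.rest.annotation.Offset;
import ca.uhn.fhir.rest.annotation.OptionalParam;
import ca.uhn.fhir.rest.annotation.Search;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.IResourceProvider;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.Patient;

import java.util.List;

public class PatientProvider implements IResourceProvider {

	@Override
	public Class<? extends IBaseResource> getResourceType() {
		return Patient.class;
	}

	@Search
	public List<Patient> search(
			@OptionalParam(name = Patient.SP_IDENTIFIER) TokenParam theIdentifier,
			@Offset Integer theOffset,   // filled from _offset, null if the client did not send it
			@Count Integer theCount) {   // filled from _count, null if the client did not send it

		int offset = theOffset != null ? theOffset : 0;
		int count = theCount != null ? theCount : 20;

		// Push offset/count down to the database query, and apply a stable sort:
		// as the warning above notes, _offset without sorting can produce
		// duplicate entries across pages.
		return findPatients(theIdentifier, offset, count);
	}

	// Placeholder for whatever persistence call the server actually uses
	private List<Patient> findPatients(TokenParam theIdentifier, int theOffset, int theCount) {
		throw new UnsupportedOperationException("illustrative stub");
	}
}
```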
@@ -35,7 +35,6 @@ import org.springframework.data.domain.Sort;

import javax.annotation.Nonnull;
import javax.transaction.Transactional;
import java.util.Collection;
import java.util.Date;
import java.util.List;
import java.util.Optional;
@@ -120,7 +119,7 @@ public class JpaJobPersistenceImpl implements IJobPersistence {
}

@Override
public Collection<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex) {
public List<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex) {
PageRequest pageRequest = PageRequest.of(thePageIndex, thePageSize, Sort.Direction.DESC, "myCreateTime");
return myJobInstanceRepository.findAll(pageRequest).stream().map(this::toInstance).collect(Collectors.toList());
}

@@ -163,6 +163,8 @@ import static org.apache.commons.lang3.StringUtils.isNotBlank;
import static org.apache.commons.lang3.StringUtils.left;
import static org.apache.commons.lang3.StringUtils.trim;

import static ca.uhn.fhir.jpa.model.util.JpaConstants.ALL_PARTITIONS_NAME;

/*
* #%L
* HAPI FHIR JPA Server
@@ -1307,6 +1309,8 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
} else {
requestPartitionId = RequestPartitionId.defaultPartition();
}

failIfPartitionMismatch(theRequest, entity);
mySearchParamWithInlineReferencesExtractor.populateFromResource(requestPartitionId, newParams, theTransactionDetails, entity, theResource, existingParams, theRequest, thePerformIndexing);

changed = populateResourceIntoEntity(theTransactionDetails, theRequest, theResource, entity, true);
@@ -1474,6 +1478,24 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
return retval;
}

/**
* TODO eventually consider refactoring this to be part of an interceptor.
*
* Throws an exception if the partition of the request, and the partition of the existing entity do not match.
* @param theRequest the request.
* @param entity the existing entity.
*/
private void failIfPartitionMismatch(RequestDetails theRequest, ResourceTable entity) {
if (myPartitionSettings.isPartitioningEnabled() && theRequest != null && theRequest.getTenantId() != null && entity.getPartitionId() != null &&
theRequest.getTenantId() != ALL_PARTITIONS_NAME) {
PartitionEntity partitionEntity = myPartitionLookupSvc.getPartitionByName(theRequest.getTenantId());
//partitionEntity should never be null
if (partitionEntity != null && !partitionEntity.getId().equals(entity.getPartitionId().getPartitionId())) {
throw new InvalidRequestException(Msg.code(2079) + "Resource " + entity.getResourceType() + "/" + entity.getId() + " is not known");
}
}
}

private void createHistoryEntry(RequestDetails theRequest, IBaseResource theResource, ResourceTable theEntity, EncodedResource theChanged) {
boolean versionedTags = getConfig().getTagStorageMode() == DaoConfig.TagStorageModeEnum.VERSIONED;
final ResourceHistoryTable historyEntry = theEntity.toHistory(versionedTags);

@@ -37,6 +37,7 @@ import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.util.DateUtils;
import ca.uhn.fhir.util.StringUtil;
import org.apache.commons.collections4.CollectionUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.tuple.Pair;
@@ -52,6 +53,7 @@ import java.time.Instant;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
@@ -214,7 +216,7 @@ public class ExtendedLuceneClauseBuilder {
for (List<? extends IQueryParameterType> nextAnd : stringAndOrTerms) {
Set<String> terms = extractOrStringParams(nextAnd);
ourLog.debug("addStringTextSearch {}, {}", theSearchParamName, terms);
if (terms.size() >= 1) {
if (!terms.isEmpty()) {
String query = terms.stream()
.map(s -> "( " + s + " )")
.collect(Collectors.joining(" | "));
@@ -249,14 +251,29 @@ public class ExtendedLuceneClauseBuilder {
Set<String> terms = extractOrStringParams(nextAnd);
ourLog.debug("addStringContainsSearch {} {}", theSearchParamName, terms);
List<? extends PredicateFinalStep> orTerms = terms.stream()
.map(s ->
myPredicateFactory.wildcard().field(fieldPath).matching("*" + s + "*"))
// wildcard is a term-level query, so queries aren't analyzed. Do our own normalization first.
.map(s-> normalize(s))
.map(s -> myPredicateFactory
.wildcard().field(fieldPath)
.matching("*" + s + "*"))
.collect(Collectors.toList());

myRootClause.must(orPredicateOrSingle(orTerms));
}
}

/**
* Normalize the string to match our standardAnalyzer.
* @see ca.uhn.fhir.jpa.search.HapiLuceneAnalysisConfigurer#STANDARD_ANALYZER
*
* @param theString the raw string
* @return a case and accent normalized version of the input
*/
@Nonnull
private String normalize(String theString) {
return StringUtil.normalizeStringForSearchIndexing(theString).toLowerCase(Locale.ROOT);
}

public void addStringUnmodifiedSearch(String theSearchParamName, List<List<IQueryParameterType>> theStringAndOrTerms) {
String fieldPath = SEARCH_PARAM_ROOT + "." + theSearchParamName + ".string." + IDX_STRING_NORMALIZED;
for (List<? extends IQueryParameterType> nextAnd : theStringAndOrTerms) {
@@ -264,7 +281,10 @@ public class ExtendedLuceneClauseBuilder {
ourLog.debug("addStringUnmodifiedSearch {} {}", theSearchParamName, terms);
List<? extends PredicateFinalStep> orTerms = terms.stream()
.map(s ->
myPredicateFactory.wildcard().field(fieldPath).matching(s + "*"))
myPredicateFactory.wildcard()
.field(fieldPath)
// wildcard is a term-level query, so it isn't analyzed. Do our own case-folding to match the normStringAnalyzer
.matching(normalize(s) + "*"))
.collect(Collectors.toList());

myRootClause.must(orPredicateOrSingle(orTerms));
@@ -355,8 +375,8 @@ public class ExtendedLuceneClauseBuilder {
* }
* </pre>
*
* @param theSearchParamName
* @param theDateAndOrTerms
* @param theSearchParamName e.g code
* @param theDateAndOrTerms The and/or list of DateParam values
*/
public void addDateUnmodifiedSearch(String theSearchParamName, List<List<IQueryParameterType>> theDateAndOrTerms) {
for (List<? extends IQueryParameterType> nextAnd : theDateAndOrTerms) {

@@ -35,6 +35,7 @@ import ca.uhn.fhir.rest.api.SortOrderEnum;
import ca.uhn.fhir.rest.api.SortSpec;
import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.util.DateRangeUtil;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Slice;
@@ -85,7 +86,8 @@ public class ResourceReindexSvcImpl implements IResourceReindexSvc {

SearchParameterMap searchParamMap = myMatchUrlService.translateMatchUrl(theUrl, def);
searchParamMap.setSort(new SortSpec(Constants.PARAM_LASTUPDATED, SortOrderEnum.ASC));
searchParamMap.setLastUpdated(new DateRangeParam(theStart, theEnd));
DateRangeParam chunkDateRange = DateRangeUtil.narrowDateRange(searchParamMap.getLastUpdated(), theStart, theEnd);
searchParamMap.setLastUpdated(chunkDateRange);
searchParamMap.setCount(thePageSize);

IFhirResourceDao<?> dao = myDaoRegistry.getResourceDao(resourceType);

@@ -47,7 +47,6 @@ import ca.uhn.fhir.jpa.search.cache.ISearchCacheSvc;
import ca.uhn.fhir.jpa.search.cache.ISearchResultCacheSvc;
import ca.uhn.fhir.jpa.search.cache.SearchCacheStatusEnum;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.rest.server.interceptor.ServerInterceptorUtil;
import ca.uhn.fhir.model.api.IQueryParameterType;
import ca.uhn.fhir.model.api.Include;
import ca.uhn.fhir.rest.api.CacheControlDirective;
@@ -65,6 +64,7 @@ import ca.uhn.fhir.rest.server.exceptions.BaseServerResponseException;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException;
import ca.uhn.fhir.rest.server.interceptor.ServerInterceptorUtil;
import ca.uhn.fhir.rest.server.method.PageMethodBinding;
import ca.uhn.fhir.rest.server.servlet.ServletRequestDetails;
import ca.uhn.fhir.rest.server.util.CompositeInterceptorBroadcaster;
@@ -612,6 +612,7 @@ public class SearchCoordinatorSvcImpl implements ISearchCoordinatorSvc {
if (theParams.isOffsetQuery()) {
bundleProvider.setCurrentPageOffset(theParams.getOffset());
bundleProvider.setCurrentPageSize(theParams.getCount());
ourLog.warn("Query from search {} is using _offset, may result in duplicate entries across different pages.", theSearchUuid);
}

if (wantCount) {

@@ -20,9 +20,9 @@ package ca.uhn.fhir.jpa.search.builder;
* #L%
*/

import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.RuntimeSearchParam;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.dao.BaseStorageDao;
@@ -329,10 +329,6 @@ public class QueryStack {
return orCondidtion;
}

private Condition createPredicateCompositePart(@Nullable DbColumn theSourceJoinColumn, String theResourceName, String theSpnamePrefix, RuntimeSearchParam theParam, IQueryParameterType theParamValue, RequestPartitionId theRequestPartitionId) {
return createPredicateCompositePart(theSourceJoinColumn, theResourceName, theSpnamePrefix, theParam, theParamValue, theRequestPartitionId, mySqlBuilder);
}

private Condition createPredicateCompositePart(@Nullable DbColumn theSourceJoinColumn, String theResourceName, String theSpnamePrefix, RuntimeSearchParam theParam, IQueryParameterType theParamValue, RequestPartitionId theRequestPartitionId, SearchQueryBuilder theSqlBuilder) {

switch (theParam.getParamType()) {
@@ -930,7 +926,7 @@ public class QueryStack {
}

public Condition createPredicateReferenceForContainedResource(@Nullable DbColumn theSourceJoinColumn,
String theResourceName, String theParamName, List<String> theQualifiers, RuntimeSearchParam theSearchParam,
String theResourceName, RuntimeSearchParam theSearchParam,
List<? extends IQueryParameterType> theList, SearchFilterParser.CompareOperation theOperation,
RequestDetails theRequest, RequestPartitionId theRequestPartitionId) {
// A bit of a hack, but we need to turn off cache reuse while in this method so that we don't try to reuse builders across different subselects
@@ -1093,7 +1089,7 @@ public class QueryStack {
}

private Condition createIndexPredicate(DbColumn theSourceJoinColumn, String theResourceName, String theSpnamePrefix, String theParamName, RuntimeSearchParam theParamDefinition, ArrayList<IQueryParameterType> theOrValues, SearchFilterParser.CompareOperation theOperation, List<String> theQualifiers, RequestDetails theRequest, RequestPartitionId theRequestPartitionId, SearchQueryBuilder theSqlBuilder) {
Condition containedCondition = null;
Condition containedCondition;

switch (theParamDefinition.getParamType()) {
case DATE:
@@ -1312,7 +1308,7 @@ public class QueryStack {
List<IQueryParameterType> tokens = new ArrayList<>();

boolean paramInverted = false;
TokenParamModifier modifier = null;
TokenParamModifier modifier;

for (IQueryParameterType nextOr : theList) {
if (nextOr instanceof TokenParam) {
@@ -1337,7 +1333,7 @@ public class QueryStack {

modifier = id.getModifier();
// for :not modifier, create a token and remove the :not modifier
if (modifier != null && modifier == TokenParamModifier.NOT) {
if (modifier == TokenParamModifier.NOT) {
tokens.add(new TokenParam(((TokenParam) nextOr).getSystem(), ((TokenParam) nextOr).getValue()));
paramInverted = true;
} else {
@@ -1434,7 +1430,7 @@ public class QueryStack {
case Constants.PARAM_PROFILE:
case Constants.PARAM_SECURITY:
if (myDaoConfig.getTagStorageMode() == DaoConfig.TagStorageModeEnum.INLINE) {
return createPredicateSearchParameter(theSourceJoinColumn, theResourceName, theParamName, theAndOrParams, theRequest, theRequestPartitionId, theSearchContainedMode);
return createPredicateSearchParameter(theSourceJoinColumn, theResourceName, theParamName, theAndOrParams, theRequest, theRequestPartitionId);
} else {
return createPredicateTag(theSourceJoinColumn, theAndOrParams, theParamName, theRequestPartitionId);
}
@@ -1443,14 +1439,14 @@ public class QueryStack {
return createPredicateSourceForAndList(theSourceJoinColumn, theAndOrParams);

default:
return createPredicateSearchParameter(theSourceJoinColumn, theResourceName, theParamName, theAndOrParams, theRequest, theRequestPartitionId, theSearchContainedMode);
return createPredicateSearchParameter(theSourceJoinColumn, theResourceName, theParamName, theAndOrParams, theRequest, theRequestPartitionId);

}

}

@Nullable
private Condition createPredicateSearchParameter(@Nullable DbColumn theSourceJoinColumn, String theResourceName, String theParamName, List<List<IQueryParameterType>> theAndOrParams, RequestDetails theRequest, RequestPartitionId theRequestPartitionId, SearchContainedModeEnum theSearchContainedMode) {
private Condition createPredicateSearchParameter(@Nullable DbColumn theSourceJoinColumn, String theResourceName, String theParamName, List<List<IQueryParameterType>> theAndOrParams, RequestDetails theRequest, RequestPartitionId theRequestPartitionId) {
List<Condition> andPredicates = new ArrayList<>();
RuntimeSearchParam nextParamDef = mySearchParamRegistry.getActiveSearchParam(theResourceName, theParamName);
if (nextParamDef != null) {
@@ -1472,7 +1468,6 @@ public class QueryStack {
operation = toOperation(param.getPrefix());
}
andPredicates.add(createPredicateDate(theSourceJoinColumn, theResourceName, null, nextParamDef, nextAnd, operation, theRequestPartitionId));
//andPredicates.add(createPredicateDate(theSourceJoinColumn, theResourceName, nextParamDef, nextAnd, null, theRequestPartitionId));
}
break;
case QUANTITY:
@@ -1488,7 +1483,7 @@ public class QueryStack {
case REFERENCE:
for (List<? extends IQueryParameterType> nextAnd : theAndOrParams) {
if (isEligibleForContainedResourceSearch(nextAnd)) {
andPredicates.add(createPredicateReferenceForContainedResource(theSourceJoinColumn, theResourceName, theParamName, new ArrayList<>(), nextParamDef, nextAnd, null, theRequest, theRequestPartitionId));
andPredicates.add(createPredicateReferenceForContainedResource(theSourceJoinColumn, theResourceName, nextParamDef, nextAnd, null, theRequest, theRequestPartitionId));
} else {
andPredicates.add(createPredicateReference(theSourceJoinColumn, theResourceName, theParamName, new ArrayList<>(), nextAnd, null, theRequest, theRequestPartitionId));
}

@@ -20,8 +20,8 @@ package ca.uhn.fhir.jpa.search.builder.sql;
* #L%
*/

import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.config.HibernatePropertiesProvider;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
@@ -34,8 +34,8 @@ import ca.uhn.fhir.jpa.search.builder.predicate.CoordsPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.DatePredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ForcedIdPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.NumberPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityNormalizedPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.QuantityPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceIdPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceLinkPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.ResourceTablePredicateBuilder;
@@ -45,9 +45,10 @@ import ca.uhn.fhir.jpa.search.builder.predicate.StringPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.TagPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.TokenPredicateBuilder;
import ca.uhn.fhir.jpa.search.builder.predicate.UriPredicateBuilder;
import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.rest.param.ParamPrefixEnum;

import com.healthmarketscience.sqlbuilder.BinaryCondition;
import com.healthmarketscience.sqlbuilder.ComboCondition;
import com.healthmarketscience.sqlbuilder.Condition;
@@ -69,17 +70,16 @@ import org.hibernate.engine.spi.RowSelection;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import ca.uhn.fhir.rest.api.server.storage.ResourcePersistentId;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Locale;
import java.util.Set;
import java.util.UUID;
import java.util.stream.Collectors;

import static ca.uhn.fhir.rest.param.ParamPrefixEnum.*;
import static org.apache.commons.lang3.ObjectUtils.defaultIfNull;

public class SearchQueryBuilder {
@@ -556,21 +556,40 @@ public class SearchQueryBuilder {

public ComboCondition addPredicateLastUpdated(DateRangeParam theDateRange) {
ResourceTablePredicateBuilder resourceTableRoot = getOrCreateResourceTablePredicateBuilder(false);

List<Condition> conditions = new ArrayList<>(2);
BinaryCondition condition;

if (isNotEqualsComparator(theDateRange)) {
condition = createConditionForValueWithComparator(LESSTHAN, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getLowerBoundAsInstant());
conditions.add(condition);
condition = createConditionForValueWithComparator(GREATERTHAN, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getUpperBoundAsInstant());
conditions.add(condition);
return ComboCondition.or(conditions.toArray(new Condition[0]));
}

if (theDateRange.getLowerBoundAsInstant() != null) {
BinaryCondition condition = createConditionForValueWithComparator(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getLowerBoundAsInstant());
condition = createConditionForValueWithComparator(GREATERTHAN_OR_EQUALS, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getLowerBoundAsInstant());
conditions.add(condition);
}

if (theDateRange.getUpperBoundAsInstant() != null) {
BinaryCondition condition = createConditionForValueWithComparator(ParamPrefixEnum.LESSTHAN_OR_EQUALS, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getUpperBoundAsInstant());
condition = createConditionForValueWithComparator(LESSTHAN_OR_EQUALS, resourceTableRoot.getLastUpdatedColumn(), theDateRange.getUpperBoundAsInstant());
conditions.add(condition);
}

return ComboCondition.and(conditions.toArray(new Condition[0]));
}

private boolean isNotEqualsComparator(DateRangeParam theDateRange) {
if (theDateRange != null) {
DateParam lb = theDateRange.getLowerBound();
DateParam ub = theDateRange.getUpperBound();

return lb != null && ub != null && lb.getPrefix().equals(NOT_EQUAL) && ub.getPrefix().equals(NOT_EQUAL);
}
return false;
}


public void addResourceIdsPredicate(List<Long> thePidList) {
DbColumn resourceIdColumn = getOrCreateFirstPredicateBuilder().getResourceIdColumn();
@@ -604,6 +623,8 @@ public class SearchQueryBuilder {
return BinaryCondition.greaterThan(theColumn, generatePlaceholder(theValue));
case GREATERTHAN_OR_EQUALS:
return BinaryCondition.greaterThanOrEq(theColumn, generatePlaceholder(theValue));
case NOT_EQUAL:
return BinaryCondition.notEqualTo(theColumn, generatePlaceholder(theValue));
default:
throw new IllegalArgumentException(Msg.code(1263));
}

@@ -40,10 +40,8 @@ import org.hl7.fhir.instance.model.api.IIdType;
import org.slf4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Sort;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@@ -100,7 +98,7 @@ public class MdmLinkDaoSvc {
mdmLink.setPartitionId(new PartitionablePartitionId(partitionId.getFirstPartitionIdOrNull(), partitionId.getPartitionDate()));
}

String message = String.format("Creating MdmLink from %s to %s -> %s", theGoldenResource.getIdElement().toUnqualifiedVersionless(), theSourceResource.getIdElement().toUnqualifiedVersionless(), theMatchOutcome);
String message = String.format("Creating MdmLink from %s to %s.", theGoldenResource.getIdElement().toUnqualifiedVersionless(), theSourceResource.getIdElement().toUnqualifiedVersionless());
theMdmTransactionContext.addTransactionLogMessage(message);
ourLog.debug(message);
save(mdmLink);
@@ -279,11 +277,11 @@ public class MdmLinkDaoSvc {
* Given a list of criteria, return all links from the database which fits the criteria provided
*
* @param theGoldenResourceId The resource ID of the golden resource being searched.
* @param theSourceId The resource ID of the source resource being searched.
* @param theMatchResult the {@link MdmMatchResultEnum} being searched.
* @param theLinkSource the {@link MdmLinkSourceEnum} being searched.
* @param thePageRequest the {@link MdmPageRequest} paging information
* @param thePartitionId List of partitions ID being searched, where the link's partition must be in the list.
|
||||
* @param theSourceId The resource ID of the source resource being searched.
|
||||
* @param theMatchResult the {@link MdmMatchResultEnum} being searched.
|
||||
* @param theLinkSource the {@link MdmLinkSourceEnum} being searched.
|
||||
* @param thePageRequest the {@link MdmPageRequest} paging information
|
||||
* @param thePartitionId List of partitions ID being searched, where the link's partition must be in the list.
|
||||
* @return a list of {@link MdmLink} entities which match the example.
|
||||
*/
|
||||
public PageImpl<MdmLink> executeTypedQuery(IIdType theGoldenResourceId, IIdType theSourceId, MdmMatchResultEnum theMatchResult, MdmLinkSourceEnum theLinkSource, MdmPageRequest thePageRequest, List<Integer> thePartitionId) {
|
||||
|
@ -383,4 +381,13 @@ public class MdmLinkDaoSvc {
|
|||
}
|
||||
return retval;
|
||||
}
|
||||
|
||||
public Optional<MdmLink> getLinkByGoldenResourceAndSourceResource(@Nullable IAnyResource theGoldenResource, @Nullable IAnyResource theSourceResource) {
|
||||
if (theGoldenResource == null || theSourceResource == null) {
|
||||
return Optional.empty();
|
||||
}
|
||||
return getLinkByGoldenResourcePidAndSourceResourcePid(
|
||||
myJpaIdHelperService.getPidOrNull(theGoldenResource),
|
||||
myJpaIdHelperService.getPidOrNull(theSourceResource));
|
||||
}
|
||||
}
|
||||
|
|
|
@ -40,6 +40,7 @@ import org.slf4j.Logger;
|
|||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import javax.annotation.Nullable;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
|
||||
|
@ -182,6 +183,7 @@ public class MdmEidUpdateService {
|
|||
return myIncomingResourceHasAnEid;
|
||||
}
|
||||
|
||||
@Nullable
|
||||
public IAnyResource getExistingGoldenResource() {
|
||||
return myExistingGoldenResource;
|
||||
}
|
||||
|
|
|
@ -58,7 +58,7 @@ public class MdmLinkSvcImpl implements IMdmLinkSvc {
|
|||
|
||||
@Override
|
||||
@Transactional
|
||||
public void updateLink(IAnyResource theGoldenResource, IAnyResource theSourceResource, MdmMatchOutcome theMatchOutcome, MdmLinkSourceEnum theLinkSource, MdmTransactionContext theMdmTransactionContext) {
|
||||
public void updateLink(@Nonnull IAnyResource theGoldenResource, @Nonnull IAnyResource theSourceResource, MdmMatchOutcome theMatchOutcome, MdmLinkSourceEnum theLinkSource, MdmTransactionContext theMdmTransactionContext) {
|
||||
if (theMatchOutcome.isPossibleDuplicate() && goldenResourceLinkedAsNoMatch(theGoldenResource, theSourceResource)) {
|
||||
log(theMdmTransactionContext, theGoldenResource.getIdElement().toUnqualifiedVersionless() +
|
||||
" is linked as NO_MATCH with " +
|
||||
|
@ -129,10 +129,7 @@ public class MdmLinkSvcImpl implements IMdmLinkSvc {
|
|||
if (theGoldenResource.getIdElement().getIdPart() == null || theCandidate.getIdElement().getIdPart() == null) {
|
||||
return Optional.empty();
|
||||
} else {
|
||||
return myMdmLinkDaoSvc.getLinkByGoldenResourcePidAndSourceResourcePid(
|
||||
myIdHelperService.getPidOrNull(theGoldenResource),
|
||||
myIdHelperService.getPidOrNull(theCandidate)
|
||||
);
|
||||
return myMdmLinkDaoSvc.getLinkByGoldenResourceAndSourceResource(theGoldenResource, theCandidate);
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
@ -57,7 +57,7 @@ public class MdmMatchFinderSvcImpl implements IMdmMatchFinderSvc {
|
|||
.map(candidate -> new MatchedTarget(candidate, myMdmResourceMatcherSvc.getMatchResult(theResource, candidate)))
|
||||
.collect(Collectors.toList());
|
||||
|
||||
ourLog.info("Found {} matched targets for {}", matches.size(), theResourceType);
|
||||
ourLog.info("Found {} matched targets for {}.", matches.size(), theResourceType);
|
||||
return matches;
|
||||
}
|
||||
|
||||
|
|
|
@ -132,11 +132,11 @@ public class MdmMatchLinkSvc {
|
|||
// 1. Get the right helper
|
||||
// 2. Create source resource for the MDM source
|
||||
// 3. UPDATE MDM LINK TABLE
|
||||
|
||||
myMdmLinkSvc.updateLink(newGoldenResource, theResource, MdmMatchOutcome.NEW_GOLDEN_RESOURCE_MATCH, MdmLinkSourceEnum.AUTO, theMdmTransactionContext);
|
||||
}
|
||||
|
||||
private void handleMdmCreate(IAnyResource theTargetResource, MatchedGoldenResourceCandidate theGoldenResourceCandidate, MdmTransactionContext theMdmTransactionContext) {
|
||||
log(theMdmTransactionContext, "MDM has narrowed down to one candidate for matching.");
|
||||
IAnyResource goldenResource = myMdmGoldenResourceFindingSvc.getGoldenResourceFromMatchedGoldenResourceCandidate(theGoldenResourceCandidate, theMdmTransactionContext.getResourceType());
|
||||
|
||||
if (myGoldenResourceHelper.isPotentialDuplicate(goldenResource, theTargetResource)) {
|
||||
|
@ -146,6 +146,8 @@ public class MdmMatchLinkSvc {
|
|||
myMdmLinkSvc.updateLink(newGoldenResource, theTargetResource, MdmMatchOutcome.NEW_GOLDEN_RESOURCE_MATCH, MdmLinkSourceEnum.AUTO, theMdmTransactionContext);
|
||||
myMdmLinkSvc.updateLink(newGoldenResource, goldenResource, MdmMatchOutcome.POSSIBLE_DUPLICATE, MdmLinkSourceEnum.AUTO, theMdmTransactionContext);
|
||||
} else {
|
||||
log(theMdmTransactionContext, "MDM has narrowed down to one candidate for matching.");
|
||||
|
||||
if (theGoldenResourceCandidate.isMatch()) {
|
||||
myGoldenResourceHelper.handleExternalEidAddition(goldenResource, theTargetResource, theMdmTransactionContext);
|
||||
myEidUpdateService.applySurvivorshipRulesAndSaveGoldenResource(theTargetResource, goldenResource, theMdmTransactionContext);
|
||||
|
@ -156,8 +158,8 @@ public class MdmMatchLinkSvc {
|
|||
}
|
||||
|
||||
private void handleMdmWithSingleCandidate(IAnyResource theResource, MatchedGoldenResourceCandidate theGoldenResourceCandidate, MdmTransactionContext theMdmTransactionContext) {
|
||||
log(theMdmTransactionContext, "MDM has narrowed down to one candidate for matching.");
|
||||
if (theMdmTransactionContext.getRestOperation().equals(MdmTransactionContext.OperationType.UPDATE_RESOURCE)) {
|
||||
log(theMdmTransactionContext, "MDM has narrowed down to one candidate for matching.");
|
||||
myEidUpdateService.handleMdmUpdate(theResource, theGoldenResourceCandidate, theMdmTransactionContext);
|
||||
} else {
|
||||
handleMdmCreate(theResource, theGoldenResourceCandidate, theMdmTransactionContext);
|
||||
|
|
|
@ -67,4 +67,8 @@ public class CandidateList {
|
|||
public boolean isEidMatch() {
|
||||
return myStrategy.isEidMatch();
|
||||
}
|
||||
|
||||
public int size() {
|
||||
return myList.size();
|
||||
}
|
||||
}
|
||||
|
|
|
@ -21,6 +21,8 @@ package ca.uhn.fhir.jpa.mdm.svc.candidate;
|
|||
*/
|
||||
|
||||
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
|
||||
import ca.uhn.fhir.jpa.entity.MdmLink;
|
||||
import ca.uhn.fhir.jpa.mdm.dao.MdmLinkDaoSvc;
|
||||
import ca.uhn.fhir.jpa.mdm.svc.MdmResourceDaoSvc;
|
||||
import ca.uhn.fhir.mdm.api.MdmMatchOutcome;
|
||||
import ca.uhn.fhir.mdm.log.Logs;
|
||||
|
@ -46,6 +48,8 @@ public class FindCandidateByEidSvc extends BaseCandidateFinder {
|
|||
private EIDHelper myEIDHelper;
|
||||
@Autowired
|
||||
private MdmResourceDaoSvc myMdmResourceDaoSvc;
|
||||
@Autowired
|
||||
private MdmLinkDaoSvc myMdmLinkDaoSvc;
|
||||
|
||||
@Override
|
||||
protected List<MatchedGoldenResourceCandidate> findMatchGoldenResourceCandidates(IAnyResource theBaseResource) {
|
||||
|
@ -57,6 +61,10 @@ public class FindCandidateByEidSvc extends BaseCandidateFinder {
|
|||
Optional<IAnyResource> oFoundGoldenResource = myMdmResourceDaoSvc.searchGoldenResourceByEID(eid.getValue(), theBaseResource.getIdElement().getResourceType(), (RequestPartitionId) theBaseResource.getUserData(Constants.RESOURCE_PARTITION_ID));
|
||||
if (oFoundGoldenResource.isPresent()) {
|
||||
IAnyResource foundGoldenResource = oFoundGoldenResource.get();
|
||||
// Exclude manually declared NO_MATCH links from candidates
|
||||
if (isNoMatch(foundGoldenResource, theBaseResource)) {
|
||||
continue;
|
||||
}
|
||||
Long pidOrNull = myIdHelperService.getPidOrNull(foundGoldenResource);
|
||||
MatchedGoldenResourceCandidate mpc = new MatchedGoldenResourceCandidate(new ResourcePersistentId(pidOrNull), MdmMatchOutcome.EID_MATCH);
|
||||
ourLog.debug("Matched {} by EID {}", foundGoldenResource.getIdElement(), eid);
|
||||
|
@ -67,6 +75,15 @@ public class FindCandidateByEidSvc extends BaseCandidateFinder {
|
|||
return retval;
|
||||
}
|
||||
|
||||
private boolean isNoMatch(IAnyResource theGoldenResource, IAnyResource theSourceResource) {
|
||||
Optional<MdmLink> oLink = myMdmLinkDaoSvc.getLinkByGoldenResourceAndSourceResource(theGoldenResource, theSourceResource);
|
||||
if (oLink.isEmpty()) {
|
||||
return false;
|
||||
}
|
||||
MdmLink link = oLink.get();
|
||||
return link.isNoMatch();
|
||||
}
|
||||
|
||||
@Override
|
||||
protected CandidateStrategyEnum getStrategy() {
|
||||
return CandidateStrategyEnum.EID;
|
||||
|
|
|
@ -88,11 +88,17 @@ abstract public class BaseMdmR4Test extends BaseJpaR4Test {
|
|||
protected static final String PAUL_ID = "ID.PAUL.456";
|
||||
protected static final String FRANK_ID = "ID.FRANK.789";
|
||||
protected static final String DUMMY_ORG_ID = "Organization/mfr";
|
||||
protected static final String EID_1 = "123";
|
||||
protected static final String EID_2 = "456";
|
||||
|
||||
private static final Logger ourLog = getLogger(BaseMdmR4Test.class);
|
||||
private static final ContactPoint TEST_TELECOM = new ContactPoint()
|
||||
.setSystem(ContactPoint.ContactPointSystem.PHONE)
|
||||
.setValue("555-555-5555");
|
||||
private static final String NAME_GIVEN_FRANK = "Frank";
|
||||
|
||||
|
||||
|
||||
@Autowired
|
||||
protected IFhirResourceDao<Patient> myPatientDao;
|
||||
@Autowired
|
||||
|
@ -604,4 +610,12 @@ abstract public class BaseMdmR4Test extends BaseJpaR4Test {
|
|||
org.setId(DUMMY_ORG_ID);
|
||||
return myOrganizationDao.update(org);
|
||||
}
|
||||
|
||||
@Nonnull
|
||||
protected MdmTransactionContext buildUpdateLinkMdmTransactionContext() {
|
||||
MdmTransactionContext retval = new MdmTransactionContext();
|
||||
retval.setResourceType("Patient");
|
||||
retval.setRestOperation(MdmTransactionContext.OperationType.UPDATE_LINK);
|
||||
return retval;
|
||||
}
|
||||
}
|
||||
|
|
|
@ -0,0 +1,38 @@
|
|||
package ca.uhn.fhir.jpa.mdm.svc;
|
||||
|
||||
import ca.uhn.fhir.jpa.mdm.BaseMdmR4Test;
|
||||
import ca.uhn.fhir.mdm.api.IMdmLinkUpdaterSvc;
|
||||
import ca.uhn.fhir.mdm.model.MdmTransactionContext;
|
||||
import org.hl7.fhir.r4.model.Patient;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
|
||||
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.MATCH;
|
||||
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.NO_MATCH;
|
||||
import static org.junit.jupiter.api.Assertions.assertNotEquals;
|
||||
|
||||
class MdmLinkUpdaterSvcImplTest extends BaseMdmR4Test {
|
||||
@Autowired
|
||||
private IMdmLinkUpdaterSvc myMdmLinkUpdaterSvc;
|
||||
|
||||
@Test
|
||||
public void testUpdateLinkNoMatch() {
|
||||
// setup
|
||||
|
||||
Patient jane = createPatientAndUpdateLinks(addExternalEID(buildJanePatient(), EID_1));
|
||||
Patient originalJaneGolden = getGoldenResourceFromTargetResource(jane);
|
||||
|
||||
MdmTransactionContext mdmCtx = buildUpdateLinkMdmTransactionContext();
|
||||
|
||||
myMdmLinkUpdaterSvc.updateLink(originalJaneGolden, jane, NO_MATCH, mdmCtx);
|
||||
Patient newJaneGolden = getGoldenResourceFromTargetResource(jane);
|
||||
|
||||
assertNotEquals(newJaneGolden.getId(), originalJaneGolden.getId());
|
||||
|
||||
assertLinkCount(2);
|
||||
|
||||
assertLinksMatchResult(NO_MATCH, MATCH);
|
||||
assertLinksCreatedNewResource(true, true);
|
||||
assertLinksMatchedByEid(false, false);
|
||||
}
|
||||
}
|
|
@ -4,6 +4,7 @@ import ca.uhn.fhir.jpa.entity.MdmLink;
|
|||
import ca.uhn.fhir.jpa.mdm.BaseMdmR4Test;
|
||||
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
|
||||
import ca.uhn.fhir.mdm.api.IMdmLinkSvc;
|
||||
import ca.uhn.fhir.mdm.api.IMdmLinkUpdaterSvc;
|
||||
import ca.uhn.fhir.mdm.api.MdmConstants;
|
||||
import ca.uhn.fhir.mdm.api.MdmLinkSourceEnum;
|
||||
import ca.uhn.fhir.mdm.api.MdmMatchOutcome;
|
||||
|
@ -15,13 +16,11 @@ import ca.uhn.fhir.mdm.util.MdmResourceUtil;
|
|||
import ca.uhn.fhir.rest.api.server.IBundleProvider;
|
||||
import ca.uhn.fhir.rest.param.TokenParam;
|
||||
import org.hl7.fhir.instance.model.api.IAnyResource;
|
||||
import org.hl7.fhir.r4.model.Enumerations;
|
||||
import org.hl7.fhir.r4.model.HumanName;
|
||||
import org.hl7.fhir.r4.model.Identifier;
|
||||
import org.hl7.fhir.r4.model.Patient;
|
||||
import org.hl7.fhir.r4.model.Practitioner;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.slf4j.Logger;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
|
||||
import java.util.Date;
|
||||
|
@ -35,22 +34,26 @@ import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.NO_MATCH;
|
|||
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.POSSIBLE_DUPLICATE;
|
||||
import static ca.uhn.fhir.mdm.api.MdmMatchResultEnum.POSSIBLE_MATCH;
|
||||
import static org.hamcrest.MatcherAssert.assertThat;
|
||||
import static org.hamcrest.Matchers.*;
|
||||
import static org.hamcrest.Matchers.blankOrNullString;
|
||||
import static org.hamcrest.Matchers.equalTo;
|
||||
import static org.hamcrest.Matchers.equalToIgnoringCase;
|
||||
import static org.hamcrest.Matchers.hasSize;
|
||||
import static org.hamcrest.Matchers.in;
|
||||
import static org.hamcrest.Matchers.is;
|
||||
import static org.hamcrest.Matchers.not;
|
||||
import static org.junit.jupiter.api.Assertions.assertEquals;
|
||||
import static org.junit.jupiter.api.Assertions.assertFalse;
|
||||
import static org.junit.jupiter.api.Assertions.assertTrue;
|
||||
import static org.slf4j.LoggerFactory.getLogger;
|
||||
|
||||
public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
||||
|
||||
private static final Logger ourLog = getLogger(MdmMatchLinkSvcTest.class);
|
||||
|
||||
@Autowired
|
||||
IMdmLinkSvc myMdmLinkSvc;
|
||||
@Autowired
|
||||
private EIDHelper myEidHelper;
|
||||
@Autowired
|
||||
private GoldenResourceHelper myGoldenResourceHelper;
|
||||
@Autowired
|
||||
private IMdmLinkUpdaterSvc myMdmLinkUpdaterSvc;
|
||||
|
||||
@Test
|
||||
public void testAddPatientLinksToNewGoldenResourceIfNoneFound() {
|
||||
|
@ -211,7 +214,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
//We want to make sure the patients were linked to the same Golden Resource.
|
||||
assertThat(patient, is(sameGoldenResourceAs(janePatient)));
|
||||
|
||||
Patient sourcePatient = (Patient) getGoldenResourceFromTargetResource(patient);
|
||||
Patient sourcePatient = getGoldenResourceFromTargetResource(patient);
|
||||
|
||||
List<Identifier> identifier = sourcePatient.getIdentifier();
|
||||
|
||||
|
@ -397,11 +400,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
|
||||
IBundleProvider bundle = myPatientDao.search(buildGoldenRecordSearchParameterMap());
|
||||
assertEquals(1, bundle.size());
|
||||
Patient sourcePatient = (Patient) bundle.getResources(0, 1).get(0);
|
||||
|
||||
//assertEquals(Person.IdentityAssuranceLevel.LEVEL2, sourcePatient.getLink().get(0).getAssurance());
|
||||
//assertEquals(Person.IdentityAssuranceLevel.LEVEL1, sourcePatient.getLink().get(1).getAssurance());
|
||||
//assertEquals(Person.IdentityAssuranceLevel.LEVEL1, sourcePatient.getLink().get(2).getAssurance());
|
||||
//TODO GGG MDM: Convert these asserts to checking the MPI_LINK table
|
||||
|
||||
assertLinksMatchResult(MATCH, POSSIBLE_MATCH, POSSIBLE_MATCH);
|
||||
|
@ -490,7 +489,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
@Test
|
||||
public void testPatientUpdateOverwritesGoldenResourceDataOnChanges() {
|
||||
Patient janePatient = createPatientAndUpdateLinks(buildJanePatient());
|
||||
Patient janeSourcePatient = (Patient) getGoldenResourceFromTargetResource(janePatient);
|
||||
Patient janeSourcePatient = getGoldenResourceFromTargetResource(janePatient);
|
||||
|
||||
//Change Jane's name to paul.
|
||||
Patient patient1 = buildPaulPatient();
|
||||
|
@ -500,7 +499,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
assertThat(janeSourcePatient, is(sameGoldenResourceAs(janePaulPatient)));
|
||||
|
||||
//Ensure the related GoldenResource was updated with new info.
|
||||
Patient sourcePatientFromTarget = (Patient) getGoldenResourceFromTargetResource(janePaulPatient);
|
||||
Patient sourcePatientFromTarget = getGoldenResourceFromTargetResource(janePaulPatient);
|
||||
HumanName nameFirstRep = sourcePatientFromTarget.getNameFirstRep();
|
||||
|
||||
assertThat(nameFirstRep.getGivenAsSingleString(), is(equalToIgnoringCase("paul")));
|
||||
|
@ -514,7 +513,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
paul.getBirthDateElement().setValueAsString(incorrectBirthdate);
|
||||
paul = createPatientAndUpdateLinks(paul);
|
||||
|
||||
Patient sourcePatientFromTarget = (Patient) getGoldenResourceFromTargetResource(paul);
|
||||
Patient sourcePatientFromTarget = getGoldenResourceFromTargetResource(paul);
|
||||
assertThat(sourcePatientFromTarget.getBirthDateElement().getValueAsString(), is(incorrectBirthdate));
|
||||
|
||||
String correctBirthdate = "1990-06-28";
|
||||
|
@ -522,7 +521,7 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
|
||||
paul = updatePatientAndUpdateLinks(paul);
|
||||
|
||||
sourcePatientFromTarget = (Patient) getGoldenResourceFromTargetResource(paul);
|
||||
sourcePatientFromTarget = getGoldenResourceFromTargetResource(paul);
|
||||
assertThat(sourcePatientFromTarget.getBirthDateElement().getValueAsString(), is(equalTo(correctBirthdate)));
|
||||
assertLinkCount(1);
|
||||
}
|
||||
|
@ -530,33 +529,51 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
@Test
|
||||
// Test Case #3
|
||||
public void testUpdatedEidThatWouldRelinkAlsoCausesPossibleDuplicate() {
|
||||
String EID_1 = "123";
|
||||
String EID_2 = "456";
|
||||
|
||||
Patient paul = createPatientAndUpdateLinks(addExternalEID(buildPaulPatient(), EID_1));
|
||||
Patient originalPaulPatient = (Patient) getGoldenResourceFromTargetResource(paul);
|
||||
Patient originalPaulGolden = getGoldenResourceFromTargetResource(paul);
|
||||
|
||||
Patient jane = createPatientAndUpdateLinks(addExternalEID(buildJanePatient(), EID_2));
|
||||
Patient originalJanePatient = (Patient) getGoldenResourceFromTargetResource(jane);
|
||||
Patient originalJaneGolden = getGoldenResourceFromTargetResource(jane);
|
||||
|
||||
clearExternalEIDs(paul);
|
||||
addExternalEID(paul, EID_2);
|
||||
updatePatientAndUpdateLinks(paul);
|
||||
|
||||
assertThat(originalJanePatient, is(possibleDuplicateOf(originalPaulPatient)));
|
||||
assertThat(originalJaneGolden, is(possibleDuplicateOf(originalPaulGolden)));
|
||||
assertThat(jane, is(sameGoldenResourceAs(paul)));
|
||||
}
|
||||
|
||||
@Test
|
||||
// Test Case #3a
|
||||
public void originalLinkIsNoMatch() {
|
||||
// setup
|
||||
Patient paul = createPatientAndUpdateLinks(addExternalEID(buildPaulPatient(), EID_1));
|
||||
Patient originalPaulGolden = getGoldenResourceFromTargetResource(paul);
|
||||
|
||||
Patient jane = createPatientAndUpdateLinks(addExternalEID(buildJanePatient(), EID_2));
|
||||
Patient originalJaneGolden = getGoldenResourceFromTargetResource(jane);
|
||||
|
||||
MdmTransactionContext mdmCtx = buildUpdateLinkMdmTransactionContext();
|
||||
myMdmLinkUpdaterSvc.updateLink(originalPaulGolden, paul, NO_MATCH, mdmCtx);
|
||||
|
||||
clearExternalEIDs(paul);
|
||||
addExternalEID(paul, EID_2);
|
||||
|
||||
// execute
|
||||
updatePatientAndUpdateLinks(paul);
|
||||
|
||||
// verify
|
||||
assertThat(originalJaneGolden, is(not(possibleDuplicateOf(originalPaulGolden))));
|
||||
assertThat(jane, is(sameGoldenResourceAs(paul)));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testSinglyLinkedGoldenResourceThatGetsAnUpdatedEidSimplyUpdatesEID() {
|
||||
//Use Case # 2
|
||||
String EID_1 = "123";
|
||||
String EID_2 = "456";
|
||||
|
||||
Patient paul = createPatientAndUpdateLinks(addExternalEID(buildPaulPatient(), EID_1));
|
||||
Patient originalPaulPatient = (Patient) getGoldenResourceFromTargetResource(paul);
|
||||
Patient originalPaulGolden = getGoldenResourceFromTargetResource(paul);
|
||||
|
||||
String oldEid = myEidHelper.getExternalEid(originalPaulPatient).get(0).getValue();
|
||||
String oldEid = myEidHelper.getExternalEid(originalPaulGolden).get(0).getValue();
|
||||
assertThat(oldEid, is(equalTo(EID_1)));
|
||||
|
||||
clearExternalEIDs(paul);
|
||||
|
@ -565,8 +582,8 @@ public class MdmMatchLinkSvcTest extends BaseMdmR4Test {
|
|||
paul = updatePatientAndUpdateLinks(paul);
|
||||
assertNoDuplicates();
|
||||
|
||||
Patient newlyFoundPaulPatient = (Patient) getGoldenResourceFromTargetResource(paul);
|
||||
assertThat(originalPaulPatient, is(sameGoldenResourceAs(newlyFoundPaulPatient)));
|
||||
Patient newlyFoundPaulPatient = getGoldenResourceFromTargetResource(paul);
|
||||
assertThat(originalPaulGolden, is(sameGoldenResourceAs(newlyFoundPaulPatient)));
|
||||
String newEid = myEidHelper.getExternalEid(newlyFoundPaulPatient).get(0).getValue();
|
||||
assertThat(newEid, is(equalTo(EID_2)));
|
||||
}
|
||||
|
|
|
@ -0,0 +1,45 @@
|
|||
package ca.uhn.fhir.jpa.mdm.svc.candidate;
|
||||
|
||||
import ca.uhn.fhir.jpa.entity.MdmLink;
|
||||
import ca.uhn.fhir.jpa.mdm.BaseMdmR4Test;
|
||||
import ca.uhn.fhir.jpa.mdm.dao.MdmLinkDaoSvc;
|
||||
import ca.uhn.fhir.mdm.api.MdmLinkSourceEnum;
|
||||
import ca.uhn.fhir.mdm.api.MdmMatchResultEnum;
|
||||
import org.hl7.fhir.r4.model.Patient;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
import static org.hamcrest.MatcherAssert.assertThat;
|
||||
import static org.hamcrest.Matchers.hasSize;
|
||||
import static org.junit.jupiter.api.Assertions.assertEquals;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class MdmGoldenResourceFindingSvcTest extends BaseMdmR4Test {
|
||||
|
||||
@Autowired
|
||||
MdmGoldenResourceFindingSvc myMdmGoldenResourceFindingSvc = new MdmGoldenResourceFindingSvc();
|
||||
@Autowired
|
||||
MdmLinkDaoSvc myMdmLinkDaoSvc;
|
||||
|
||||
@Test
|
||||
public void testNoMatchCandidatesSkipped() {
|
||||
// setup
|
||||
Patient jane = createPatientAndUpdateLinks(addExternalEID(buildJanePatient(), EID_1));
|
||||
|
||||
// hack the link into a NO_MATCH
|
||||
List<MdmLink> links = myMdmLinkDaoSvc.findMdmLinksBySourceResource(jane);
|
||||
assertThat(links, hasSize(1));
|
||||
MdmLink link = links.get(0);
|
||||
link.setMatchResult(MdmMatchResultEnum.NO_MATCH);
|
||||
link.setLinkSource(MdmLinkSourceEnum.MANUAL);
|
||||
myMdmLinkDaoSvc.save(link);
|
||||
|
||||
// the NO_MATCH golden resource should not be a candidate
|
||||
CandidateList candidateList = myMdmGoldenResourceFindingSvc.findGoldenResourceCandidates(jane);
|
||||
assertEquals(0, candidateList.size());
|
||||
}
|
||||
}
|
|
@ -74,6 +74,7 @@ public class HibernateSearchIndexWriter {
|
|||
public void writeStringIndex(String theSearchParam, String theValue) {
|
||||
DocumentElement stringIndexNode = getSearchParamIndexNode(theSearchParam, "string");
|
||||
|
||||
// we are assuming that our analyzer matches StringUtil.normalizeStringForSearchIndexing(theValue).toLowerCase(Locale.ROOT))
|
||||
stringIndexNode.addValue(IDX_STRING_NORMALIZED, theValue);// for default search
|
||||
stringIndexNode.addValue(IDX_STRING_EXACT, theValue);
|
||||
stringIndexNode.addValue(IDX_STRING_TEXT, theValue);
|
||||
|
|
|
@ -38,6 +38,7 @@ import java.util.Map;
|
|||
import java.util.Set;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.*;
|
||||
import static org.apache.commons.lang3.StringUtils.isBlank;
|
||||
import static org.apache.commons.lang3.StringUtils.isNotBlank;
|
||||
|
||||
|
@ -515,9 +516,14 @@ public class SearchParameterMap implements Serializable {
|
|||
|
||||
if (getLastUpdated() != null) {
|
||||
DateParam lb = getLastUpdated().getLowerBound();
|
||||
addLastUpdateParam(b, ParamPrefixEnum.GREATERTHAN_OR_EQUALS, lb);
|
||||
DateParam ub = getLastUpdated().getUpperBound();
|
||||
addLastUpdateParam(b, ParamPrefixEnum.LESSTHAN_OR_EQUALS, ub);
|
||||
|
||||
if (isNotEqualsComparator(lb, ub)) {
|
||||
addLastUpdateParam(b, NOT_EQUAL, getLastUpdated().getLowerBound());
|
||||
} else {
|
||||
addLastUpdateParam(b, GREATERTHAN_OR_EQUALS, lb);
|
||||
addLastUpdateParam(b, LESSTHAN_OR_EQUALS, ub);
|
||||
}
|
||||
}
|
||||
|
||||
if (getCount() != null) {
|
||||
|
@ -566,6 +572,10 @@ public class SearchParameterMap implements Serializable {
|
|||
return b.toString();
|
||||
}
|
||||
|
||||
private boolean isNotEqualsComparator(DateParam theLowerBound, DateParam theUpperBound) {
|
||||
return theLowerBound != null && theUpperBound != null && theLowerBound.getPrefix().equals(NOT_EQUAL) && theUpperBound.getPrefix().equals(NOT_EQUAL);
|
||||
}
|
||||
|
||||
/**
|
||||
* @since 5.5.0
|
||||
*/
|
||||
|
@ -576,10 +586,10 @@ public class SearchParameterMap implements Serializable {
|
|||
@Override
|
||||
public String toString() {
|
||||
ToStringBuilder b = new ToStringBuilder(this, ToStringStyle.SHORT_PREFIX_STYLE);
|
||||
if (isEmpty() == false) {
|
||||
if (!isEmpty()) {
|
||||
b.append("params", mySearchParameterMap);
|
||||
}
|
||||
if (getIncludes().isEmpty() == false) {
|
||||
if (!getIncludes().isEmpty()) {
|
||||
b.append("includes", getIncludes());
|
||||
}
|
||||
return b.toString();
|
||||
|
@ -668,7 +678,7 @@ public class SearchParameterMap implements Serializable {
|
|||
/**
|
||||
* Variant of removeByNameAndModifier for unmodified params.
|
||||
*
|
||||
* @param theName
|
||||
* @param theName the query parameter key
|
||||
* @return an And/Or List of Query Parameters matching the name with no modifier.
|
||||
*/
|
||||
public List<List<IQueryParameterType>> removeByNameUnmodified(String theName) {
|
||||
|
|
|
@ -29,7 +29,7 @@ import static java.time.temporal.ChronoUnit.SECONDS;
|
|||
public class TestElasticsearchContainerHelper {
|
||||
|
||||
|
||||
public static final String ELASTICSEARCH_VERSION = "7.16.3";
|
||||
public static final String ELASTICSEARCH_VERSION = "7.17.3";
|
||||
public static final String ELASTICSEARCH_IMAGE = "docker.elastic.co/elasticsearch/elasticsearch:" + ELASTICSEARCH_VERSION;
|
||||
|
||||
public static ElasticsearchContainer getEmbeddedElasticSearch() {
|
||||
|
|
|
@ -10,7 +10,10 @@ import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
|
|||
import ca.uhn.fhir.rest.api.SortSpec;
|
||||
import ca.uhn.fhir.rest.api.server.IBundleProvider;
|
||||
import ca.uhn.fhir.rest.server.method.SortParameter;
|
||||
import org.hamcrest.Matcher;
|
||||
import org.hamcrest.MatcherAssert;
|
||||
import org.hl7.fhir.instance.model.api.IBaseResource;
|
||||
import org.hl7.fhir.instance.model.api.IIdType;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
|
@ -21,10 +24,16 @@ import javax.annotation.Nonnull;
|
|||
import java.util.List;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import static org.hamcrest.Matchers.everyItem;
|
||||
import static org.hamcrest.Matchers.hasItems;
|
||||
import static org.hamcrest.Matchers.in;
|
||||
import static org.hamcrest.Matchers.not;
|
||||
|
||||
/**
|
||||
* Simplistic implementation of FHIR queries.
|
||||
*/
|
||||
public class TestDaoSearch {
|
||||
|
||||
@Configuration
|
||||
public static class Config {
|
||||
@Bean
|
||||
|
@ -47,6 +56,55 @@ public class TestDaoSearch {
|
|||
myFhirCtx = theFhirCtx;
|
||||
}
|
||||
|
||||
/**
|
||||
* Assert that the FHIR search has theIds in the search results.
|
||||
* @param theReason junit reason message
|
||||
* @param theQueryUrl FHIR query - e.g. /Patient?name=kelly
|
||||
* @param theIds the resource ids to expect.
|
||||
*/
|
||||
public void assertSearchFinds(String theReason, String theQueryUrl, String ...theIds) {
|
||||
assertSearchResultIds(theQueryUrl, theReason, hasItems(theIds));
|
||||
}
|
||||
|
||||
/**
|
||||
* Assert that the FHIR search has theIds in the search results.
|
||||
* @param theReason junit reason message
|
||||
* @param theQueryUrl FHIR query - e.g. /Patient?name=kelly
|
||||
* @param theIds the id-part of the resource ids to expect.
|
||||
*/
|
||||
public void assertSearchFinds(String theReason, String theQueryUrl, IIdType...theIds) {
|
||||
String[] bareIds = idTypeToIdParts(theIds);
|
||||
|
||||
assertSearchResultIds(theQueryUrl, theReason, hasItems(bareIds));
|
||||
}
|
||||
|
||||
public void assertSearchResultIds(String theQueryUrl, String theReason, Matcher<Iterable<String>> matcher) {
|
||||
List<String> ids = searchForIds(theQueryUrl);
|
||||
|
||||
MatcherAssert.assertThat(theReason, ids, matcher);
|
||||
}
|
||||
|
||||
/**
|
||||
* Assert that the FHIR search does not have theIds in the search results.
|
||||
* @param theReason junit reason message
|
||||
* @param theQueryUrl FHIR query - e.g. /Patient?name=kelly
|
||||
* @param theIds the id-part of the resource ids to not-expect.
|
||||
*/
|
||||
public void assertSearchNotFound(String theReason, String theQueryUrl, IIdType ...theIds) {
|
||||
List<String> ids = searchForIds(theQueryUrl);
|
||||
|
||||
MatcherAssert.assertThat(theReason, ids, everyItem(not(in(idTypeToIdParts(theIds)))));
|
||||
}
|
||||
|
||||
@Nonnull
|
||||
private String[] idTypeToIdParts(IIdType[] theIds) {
|
||||
String[] bareIds = new String[theIds.length];
|
||||
for (int i = 0; i < theIds.length; i++) {
|
||||
bareIds[i] = theIds[i].getIdPart();
|
||||
}
|
||||
return bareIds;
|
||||
}
|
||||
|
||||
public List<IBaseResource> searchForResources(String theQueryUrl) {
|
||||
IBundleProvider result = searchForBundleProvider(theQueryUrl);
|
||||
return result.getAllResources();
|
||||
|
|
|
@ -41,7 +41,6 @@ import ca.uhn.fhir.rest.param.HasAndListParam;
|
|||
import ca.uhn.fhir.rest.param.HasOrListParam;
|
||||
import ca.uhn.fhir.rest.param.HasParam;
|
||||
import ca.uhn.fhir.rest.param.NumberParam;
|
||||
import ca.uhn.fhir.rest.param.ParamPrefixEnum;
|
||||
import ca.uhn.fhir.rest.param.QuantityParam;
|
||||
import ca.uhn.fhir.rest.param.ReferenceAndListParam;
|
||||
import ca.uhn.fhir.rest.param.ReferenceOrListParam;
|
||||
|
@ -159,10 +158,18 @@ import java.util.Set;
|
|||
import java.util.TreeSet;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_HAS;
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_ID;
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_PROFILE;
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_SECURITY;
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_TAG;
|
||||
import static ca.uhn.fhir.rest.api.Constants.PARAM_TYPE;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.EQUAL;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.GREATERTHAN;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.GREATERTHAN_OR_EQUALS;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.LESSTHAN_OR_EQUALS;
|
||||
import static ca.uhn.fhir.rest.param.ParamPrefixEnum.NOT_EQUAL;
|
||||
import static org.apache.commons.lang3.StringUtils.countMatches;
|
||||
import static org.apache.commons.lang3.StringUtils.leftPad;
|
||||
import static org.hamcrest.CoreMatchers.is;
|
||||
|
@ -945,7 +952,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
HasAndListParam hasAnd = new HasAndListParam();
|
||||
hasAnd.addValue(new HasOrListParam().add(new HasParam("Observation", "subject", "status", "final")));
|
||||
hasAnd.addValue(new HasOrListParam().add(new HasParam("Observation", "subject", "date", "2001-01-01")));
|
||||
map.add("_has", hasAnd);
|
||||
map.add(PARAM_HAS, hasAnd);
|
||||
List<String> actual = toUnqualifiedVersionlessIdValues(myPatientDao.search(map));
|
||||
assertThat(actual, containsInAnyOrder(p1id.getValue()));
|
||||
|
||||
|
@ -1025,7 +1032,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "subject", "identifier", "urn:system|FOO"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "identifier", "urn:system|FOO"));
|
||||
myCaptureQueriesListener.clear();
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), contains(pid0.getValue()));
|
||||
myCaptureQueriesListener.logSelectQueriesForCurrentThread(0);
|
||||
|
@ -1033,12 +1040,12 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
// No targets exist
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "subject", "identifier", "urn:system|UNKNOWN"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "identifier", "urn:system|UNKNOWN"));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), empty());
|
||||
|
||||
// Target exists but doesn't link to us
|
||||
params = new SearchParameterMap();
|
||||
params.add("_has", new HasParam("Observation", "subject", "identifier", "urn:system|NOLINK"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "identifier", "urn:system|NOLINK"));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), empty());
|
||||
}
|
||||
|
||||
|
@ -1081,7 +1088,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
// Double _has
|
||||
params = new SearchParameterMap();
|
||||
params.add("_has", new HasParam("Observation", "subject", "_has:DiagnosticReport:result:status", "final"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "_has:DiagnosticReport:result:status", "final"));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), containsInAnyOrder(pid0.getValue()));
|
||||
|
||||
}
|
||||
|
@ -1117,13 +1124,13 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
// No targets exist
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "subject", "identifier", "urn:system|UNKNOWN"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "identifier", "urn:system|UNKNOWN"));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), empty());
|
||||
|
||||
// Target exists but doesn't link to us
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "subject", "identifier", "urn:system|NOLINK"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "identifier", "urn:system|NOLINK"));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), empty());
|
||||
}
|
||||
|
||||
|
@ -1131,7 +1138,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
public void testHasParameterInvalidResourceType() {
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation__", "subject", "identifier", "urn:system|FOO"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation__", "subject", "identifier", "urn:system|FOO"));
|
||||
try {
|
||||
myPatientDao.search(params);
|
||||
fail();
|
||||
|
@ -1144,7 +1151,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
public void testHasParameterInvalidSearchParam() {
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "subject", "IIIIDENFIEYR", "urn:system|FOO"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "subject", "IIIIDENFIEYR", "urn:system|FOO"));
|
||||
try {
|
||||
myPatientDao.search(params);
|
||||
fail();
|
||||
|
@ -1157,7 +1164,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
public void testHasParameterInvalidTargetPath() {
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_has", new HasParam("Observation", "soooooobject", "identifier", "urn:system|FOO"));
|
||||
params.add(PARAM_HAS, new HasParam("Observation", "soooooobject", "identifier", "urn:system|FOO"));
|
||||
try {
|
||||
myPatientDao.search(params);
|
||||
fail();
|
||||
|
@ -1199,7 +1206,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
mySystemDao.transaction(mySrd, input);
|
||||
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_id", new TokenParam(null, "DR"));
|
||||
params.add(PARAM_ID, new TokenParam(null, "DR"));
|
||||
params.addInclude(new Include("DiagnosticReport:subject").setRecurse(true));
|
||||
params.addInclude(new Include("DiagnosticReport:result").setRecurse(true));
|
||||
params.addInclude(Observation.INCLUDE_HAS_MEMBER.setRecurse(true));
|
||||
|
@ -1581,17 +1588,17 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), contains(id1));
|
||||
|
||||
params = new SearchParameterMap();
|
||||
params.add("_id", new StringParam(id1));
|
||||
params.add(PARAM_ID, new StringParam(id1));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), contains(id1));
|
||||
|
||||
params = new SearchParameterMap();
|
||||
params.add("_id", new StringParam("9999999999999999"));
|
||||
params.add(PARAM_ID, new StringParam("9999999999999999"));
|
||||
assertEquals(0, toList(myPatientDao.search(params)).size());
|
||||
|
||||
myCaptureQueriesListener.clear();
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam(id2));
|
||||
params.add(PARAM_ID, new StringParam(id2));
|
||||
size = toList(myPatientDao.search(params)).size();
|
||||
myCaptureQueriesListener.logAllQueries();
|
||||
assertEquals(0, size);
|
||||
|
@ -1653,12 +1660,12 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
// inverse
|
||||
params = SearchParameterMap.newSynchronous();
|
||||
params.add("_id", new TokenParam(id1).setModifier(TokenParamModifier.NOT));
|
||||
params.add(PARAM_ID, new TokenParam(id1).setModifier(TokenParamModifier.NOT));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), contains(id2));
|
||||
|
||||
// Non-inverse
|
||||
params = SearchParameterMap.newSynchronous();
|
||||
params.add("_id", new TokenParam(id1));
|
||||
params.add(PARAM_ID, new TokenParam(id1));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(myPatientDao.search(params)), contains(id1));
|
||||
|
||||
}
|
||||
|
@ -1669,7 +1676,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam("DiagnosticReport/123"));
|
||||
params.add(PARAM_ID, new StringParam("DiagnosticReport/123"));
|
||||
myCaptureQueriesListener.clear();
|
||||
myDiagnosticReportDao.search(params).size();
|
||||
List<SqlQuery> selectQueries = myCaptureQueriesListener.getSelectQueriesForCurrentThread();
|
||||
|
@ -1686,7 +1693,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam("DiagnosticReport/123"));
|
||||
params.add(PARAM_ID, new StringParam("DiagnosticReport/123"));
|
||||
params.add("code", new TokenParam("foo", "bar"));
|
||||
myCaptureQueriesListener.clear();
|
||||
myDiagnosticReportDao.search(params).size();
|
||||
|
@ -1707,8 +1714,8 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
public void testSearchByIdParamAndOtherSearchParam_QueryIsMinimal() {
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam("DiagnosticReport/123"));
|
||||
params.add("_id", new StringParam("DiagnosticReport/123"));
|
||||
params.add(PARAM_ID, new StringParam("DiagnosticReport/123"));
|
||||
params.add(PARAM_ID, new StringParam("DiagnosticReport/123"));
|
||||
myCaptureQueriesListener.clear();
|
||||
myDiagnosticReportDao.search(params).size();
|
||||
List<SqlQuery> selectQueries = myCaptureQueriesListener.getSelectQueriesForCurrentThread();
|
||||
|
@ -1744,28 +1751,28 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
param = new StringAndListParam();
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id1.getIdPart())));
|
||||
params.add("_id", param);
|
||||
params.add(PARAM_ID, param);
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), containsInAnyOrder(id1));
|
||||
|
||||
params = new SearchParameterMap();
|
||||
param = new StringAndListParam();
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id2.getIdPart())));
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id1.getIdPart())));
|
||||
params.add("_id", param);
|
||||
params.add(PARAM_ID, param);
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), empty());
|
||||
|
||||
params = new SearchParameterMap();
|
||||
param = new StringAndListParam();
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id2.getIdPart())));
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam("9999999999999")));
|
||||
params.add("_id", param);
|
||||
params.add(PARAM_ID, param);
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), empty());
|
||||
|
||||
params = new SearchParameterMap();
|
||||
param = new StringAndListParam();
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam("9999999999999")));
|
||||
param.addAnd(new StringOrListParam().addOr(new StringParam(id2.getIdPart())));
|
||||
params.add("_id", param);
|
||||
params.add(PARAM_ID, param);
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), empty());
|
||||
|
||||
}
|
||||
|
@ -1791,21 +1798,21 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
}
|
||||
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_id", new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
params.add(PARAM_ID, new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), containsInAnyOrder(id1, id2));
|
||||
|
||||
params = new SearchParameterMap();
|
||||
params.add("_id", new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id1.getIdPart())));
|
||||
params.add(PARAM_ID, new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id1.getIdPart())));
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), containsInAnyOrder(id1));
|
||||
|
||||
params = new SearchParameterMap();
|
||||
params.add("_id", new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam("999999999999")));
|
||||
params.add(PARAM_ID, new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam("999999999999")));
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), containsInAnyOrder(id1));
|
||||
|
||||
// With lastupdated
|
||||
|
||||
params = SearchParameterMap.newSynchronous();
|
||||
params.add("_id", new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
params.add(PARAM_ID, new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
params.setLastUpdated(new DateRangeParam(new Date(betweenTime), null));
|
||||
|
||||
myCaptureQueriesListener.clear();
|
||||
|
@ -1831,7 +1838,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
}
|
||||
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_id", new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
params.add(PARAM_ID, new StringOrListParam().addOr(new StringParam(id1.getIdPart())).addOr(new StringParam(id2.getIdPart())));
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(params)), containsInAnyOrder(id1));
|
||||
|
||||
}
|
||||
|
@ -2041,7 +2048,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
String param = Observation.SP_COMPONENT_VALUE_QUANTITY;
|
||||
|
||||
{
|
||||
QuantityParam v1 = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, 150, "http://bar", "code1");
|
||||
QuantityParam v1 = new QuantityParam(GREATERTHAN_OR_EQUALS, 150, "http://bar", "code1");
|
||||
SearchParameterMap map = new SearchParameterMap().setLoadSynchronous(true).add(param, v1);
|
||||
IBundleProvider result = myObservationDao.search(map);
|
||||
assertThat("Got: " + toUnqualifiedVersionlessIdValues(result), toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(id1.getValue()));
|
||||
|
@ -2072,7 +2079,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
{
|
||||
TokenParam v0 = new TokenParam("http://foo", "code1");
|
||||
QuantityParam v1 = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, 150, "http://bar", "code1");
|
||||
QuantityParam v1 = new QuantityParam(GREATERTHAN_OR_EQUALS, 150, "http://bar", "code1");
|
||||
CompositeParam<TokenParam, QuantityParam> val = new CompositeParam<>(v0, v1);
|
||||
SearchParameterMap map = new SearchParameterMap().setLoadSynchronous(true).add(param, val);
|
||||
myCaptureQueriesListener.clear();
|
||||
|
@ -2082,21 +2089,21 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
}
|
||||
{
|
||||
TokenParam v0 = new TokenParam("http://foo", "code1");
|
||||
QuantityParam v1 = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, 50, "http://bar", "code1");
|
||||
QuantityParam v1 = new QuantityParam(GREATERTHAN_OR_EQUALS, 50, "http://bar", "code1");
|
||||
CompositeParam<TokenParam, QuantityParam> val = new CompositeParam<>(v0, v1);
|
||||
IBundleProvider result = myObservationDao.search(new SearchParameterMap().setLoadSynchronous(true).add(param, val));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(result), containsInAnyOrder(id1.getValue(), id2.getValue()));
|
||||
}
|
||||
{
|
||||
TokenParam v0 = new TokenParam("http://foo", "code4");
|
||||
QuantityParam v1 = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, 50, "http://bar", "code1");
|
||||
QuantityParam v1 = new QuantityParam(GREATERTHAN_OR_EQUALS, 50, "http://bar", "code1");
|
||||
CompositeParam<TokenParam, QuantityParam> val = new CompositeParam<>(v0, v1);
|
||||
IBundleProvider result = myObservationDao.search(new SearchParameterMap().setLoadSynchronous(true).add(param, val));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(result), empty());
|
||||
}
|
||||
{
|
||||
TokenParam v0 = new TokenParam("http://foo", "code1");
|
||||
QuantityParam v1 = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, 50, "http://bar", "code4");
|
||||
QuantityParam v1 = new QuantityParam(GREATERTHAN_OR_EQUALS, 50, "http://bar", "code4");
|
||||
CompositeParam<TokenParam, QuantityParam> val = new CompositeParam<>(v0, v1);
|
||||
IBundleProvider result = myObservationDao.search(new SearchParameterMap().setLoadSynchronous(true).add(param, val));
|
||||
assertThat(toUnqualifiedVersionlessIdValues(result), empty());
|
||||
|
@ -2487,7 +2494,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
// Don't load synchronous
|
||||
SearchParameterMap map = new SearchParameterMap();
|
||||
map.setLastUpdated(new DateRangeParam().setUpperBound(new DateParam(ParamPrefixEnum.LESSTHAN, "2042-01-01")));
|
||||
map.setLastUpdated(new DateRangeParam().setUpperBound(new DateParam(LESSTHAN, "2042-01-01")));
|
||||
IBundleProvider found = myPatientDao.search(map);
|
||||
Set<String> dates = new HashSet<>();
|
||||
String searchId = found.getUuid();
|
||||
|
@ -2534,11 +2541,10 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
}
|
||||
|
||||
SearchParameterMap params;
|
||||
List result;
|
||||
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam("TEST"));
|
||||
params.add(PARAM_ID, new StringParam("TEST"));
|
||||
assertEquals(1, toList(myPatientDao.search(params)).size());
|
||||
|
||||
params = new SearchParameterMap();
|
||||
|
@ -2555,7 +2561,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add("_id", new StringParam("TEST"));
|
||||
params.add(PARAM_ID, new StringParam("TEST"));
|
||||
assertEquals(0, toList(myPatientDao.search(params)).size());
|
||||
|
||||
params = new SearchParameterMap();
|
||||
|
@ -2574,7 +2580,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
public void testSearchForUnknownAlphanumericId() {
|
||||
{
|
||||
SearchParameterMap map = new SearchParameterMap();
|
||||
map.add("_id", new StringParam("testSearchForUnknownAlphanumericId"));
|
||||
map.add(PARAM_ID, new StringParam("testSearchForUnknownAlphanumericId"));
|
||||
IBundleProvider retrieved = myPatientDao.search(map);
|
||||
assertEquals(0, retrieved.size().intValue());
|
||||
}
|
||||
|
@ -2614,55 +2620,37 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
id2 = myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless();
|
||||
}
|
||||
|
||||
List<IIdType> result;
|
||||
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, hasItems(id1a, id1b, id2));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(beforeAny, null));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, hasItems(id1a, id1b, id2));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(beforeR2, null));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, hasItems(id2));
|
||||
assertThat(patients, not(hasItems(id1a, id1b)));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(beforeAny, beforeR2));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients.toString(), patients, not(hasItems(id2)));
|
||||
assertThat(patients.toString(), patients, (hasItems(id1a, id1b)));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(null, beforeR2));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, (hasItems(id1a, id1b)));
|
||||
assertThat(patients, not(hasItems(id2)));
|
||||
}
|
||||
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(beforeAny, null));
|
||||
assertThat(result, hasItems(id1a, id1b, id2));
|
||||
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(beforeR2, null));
|
||||
assertThat(result, hasItems(id2));
|
||||
assertThat(result, not(hasItems(id1a, id1b)));
|
||||
|
||||
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, beforeR2)));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, not(hasItems(id1a, id1b)));
|
||||
assertThat(patients, (hasItems(id2)));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.LESSTHAN_OR_EQUALS, beforeR2)));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, (hasItems(id1a, id1b)));
|
||||
assertThat(patients, not(hasItems(id2)));
|
||||
}
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(beforeAny, beforeR2));
|
||||
assertThat(result.toString(), result, not(hasItems(id2)));
|
||||
assertThat(result.toString(), result, (hasItems(id1a, id1b)));
|
||||
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(null, beforeR2));
|
||||
assertThat(result, (hasItems(id1a, id1b)));
|
||||
assertThat(result, not(hasItems(id2)));
|
||||
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(new DateParam(GREATERTHAN_OR_EQUALS, beforeR2)));
|
||||
assertThat(result, not(hasItems(id1a, id1b)));
|
||||
assertThat(result, (hasItems(id2)));
|
||||
|
||||
result = performSearchLastUpdatedAndReturnIds(new DateRangeParam(new DateParam(LESSTHAN_OR_EQUALS, beforeR2)));
|
||||
assertThat(result, (hasItems(id1a, id1b)));
|
||||
assertThat(result, not(hasItems(id2)));
|
||||
}
|
||||
|
||||
@Test
|
||||
|
@ -2696,40 +2684,66 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
id1b = myPatientDao.create(patient, mySrd).getId().toUnqualifiedVersionless();
|
||||
}
|
||||
|
||||
ourLog.info("Res 1: {}", myPatientDao.read(id0, mySrd).getMeta().getLastUpdatedElement().getValueAsString());
|
||||
ourLog.info("Res 2: {}", myPatientDao.read(id1a, mySrd).getMeta().getLastUpdatedElement().getValueAsString());
|
||||
ourLog.info("Res 3: {}", myPatientDao.read(id1b, mySrd).getMeta().getLastUpdatedElement().getValueAsString());
|
||||
InstantType p0LastUpdated = myPatientDao.read(id0, mySrd).getMeta().getLastUpdatedElement();
|
||||
InstantType p1aLastUpdated = myPatientDao.read(id1a, mySrd).getMeta().getLastUpdatedElement();
|
||||
InstantType p1bLastUpdated = myPatientDao.read(id1b, mySrd).getMeta().getLastUpdatedElement();
|
||||
|
||||
ourLog.info("Res 1: {}", p0LastUpdated.getValueAsString());
|
||||
ourLog.info("Res 2: {}", p1aLastUpdated.getValueAsString());
|
||||
ourLog.info("Res 3: {}", p1bLastUpdated.getValueAsString());
|
||||
|
||||
TestUtil.sleepOneClick();
|
||||
|
||||
long end = System.currentTimeMillis();
|
||||
|
||||
SearchParameterMap map;
|
||||
List<IIdType> result;
|
||||
DateRangeParam dateRange;
|
||||
Date startDate = new Date(start);
|
||||
Date endDate = new Date(end);
|
||||
DateTimeType startDateTime = new DateTimeType(startDate, TemporalPrecisionEnum.MILLI);
|
||||
DateTimeType endDateTime = new DateTimeType(endDate, TemporalPrecisionEnum.MILLI);
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLastUpdated(new DateRangeParam(startDateTime, endDateTime));
|
||||
ourLog.info("Searching: {}", map.getLastUpdated());
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(map)), containsInAnyOrder(id1a, id1b));
|
||||
dateRange = new DateRangeParam(startDateTime, endDateTime);
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id1a, id1b));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, startDateTime), new DateParam(ParamPrefixEnum.LESSTHAN_OR_EQUALS, endDateTime)));
|
||||
ourLog.info("Searching: {}", map.getLastUpdated());
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(map)), containsInAnyOrder(id1a, id1b));
|
||||
dateRange = new DateRangeParam(new DateParam(GREATERTHAN_OR_EQUALS, startDateTime), new DateParam(LESSTHAN_OR_EQUALS, endDateTime));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id1a, id1b));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.GREATERTHAN, startDateTime), new DateParam(ParamPrefixEnum.LESSTHAN, endDateTime)));
|
||||
ourLog.info("Searching: {}", map.getLastUpdated());
|
||||
assertThat(toUnqualifiedVersionlessIds(myPatientDao.search(map)), containsInAnyOrder(id1a, id1b));
|
||||
dateRange = new DateRangeParam(new DateParam(GREATERTHAN, startDateTime), new DateParam(LESSTHAN, endDateTime));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id1a, id1b));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLastUpdated(new DateRangeParam(new DateParam(ParamPrefixEnum.GREATERTHAN, startDateTime.getValue()),
|
||||
new DateParam(ParamPrefixEnum.LESSTHAN, TestUtil.getTimestamp(myPatientDao.read(id1b, mySrd)))));
|
||||
dateRange = new DateRangeParam(new DateParam(GREATERTHAN, startDateTime.getValue()), new DateParam(LESSTHAN, TestUtil.getTimestamp(myPatientDao.read(id1b, mySrd))));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id1a));
|
||||
|
||||
dateRange = new DateRangeParam(new DateParam(EQUAL, p0LastUpdated), new DateParam(EQUAL, p0LastUpdated));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id0));
|
||||
assertThat(result, not(containsInAnyOrder(id1a, id1b)));
|
||||
|
||||
DateTimeType p0LastUpdatedDay = new DateTimeType(p0LastUpdated.getValue(), TemporalPrecisionEnum.DAY);
|
||||
dateRange = new DateRangeParam(new DateParam(EQUAL, p0LastUpdatedDay), new DateParam(EQUAL, p0LastUpdatedDay));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id0, id1a, id1b));
|
||||
|
||||
dateRange = new DateRangeParam(new DateParam(NOT_EQUAL, p0LastUpdated), new DateParam(NOT_EQUAL, p0LastUpdated));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertThat(result, containsInAnyOrder(id1a, id1b));
|
||||
assertThat(result, not(containsInAnyOrder(id0)));
|
||||
|
||||
dateRange = new DateRangeParam(new DateParam(NOT_EQUAL, p0LastUpdatedDay), new DateParam(NOT_EQUAL, p0LastUpdatedDay));
|
||||
result = performSearchLastUpdatedAndReturnIds(dateRange);
|
||||
assertEquals(0, result.size());
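// All three patients were updated on the same calendar day, so a NOT_EQUAL filter at DAY precision
// excludes every resource (this assumes the test run does not straddle midnight).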
|
||||
}
|
||||
|
||||
private List<IIdType> performSearchLastUpdatedAndReturnIds(DateRangeParam theDateRange) {
SearchParameterMap map = new SearchParameterMap();
map.setLastUpdated(theDateRange);
ourLog.info("Searching: {}", map.getLastUpdated());
// Run the search once and return the IDs; asserting a specific ID here would tie the helper to a single caller's expectations
return toUnqualifiedVersionlessIds(myPatientDao.search(map));
}
|
||||
|
||||
@Test
|
||||
|
@ -3704,7 +3718,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
sp.setLastUpdated(new DateRangeParam()
|
||||
.setUpperBound(new DateParam("le2019-02-22T17:50:00"))
|
||||
.setLowerBound(new DateParam("ge2019-02-22T13:50:00")));
|
||||
IBundleProvider retrieved = myMedicationRequestDao.search(sp);
|
||||
myMedicationRequestDao.search(sp);
|
||||
|
||||
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
|
||||
List<String> queries = myCaptureQueriesListener
|
||||
|
@ -3731,7 +3745,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
myCaptureQueriesListener.clear();
|
||||
sp.setLoadSynchronous(true);
|
||||
IBundleProvider retrieved = myProcedureDao.search(sp);
|
||||
myProcedureDao.search(sp);
|
||||
|
||||
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
|
||||
// List<String> queries = myCaptureQueriesListener
|
||||
|
@ -4126,7 +4140,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
map = new SearchParameterMap();
|
||||
map.setLoadSynchronous(true);
|
||||
param = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, new BigDecimal("10"), null, null);
|
||||
param = new QuantityParam(GREATERTHAN_OR_EQUALS, new BigDecimal("10"), null, null);
|
||||
map.add(Observation.SP_VALUE_QUANTITY, param);
|
||||
myCaptureQueriesListener.clear();
|
||||
found = myObservationDao.search(map);
|
||||
|
@ -4141,28 +4155,28 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
|
||||
map = new SearchParameterMap();
|
||||
map.setLoadSynchronous(true);
|
||||
param = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, new BigDecimal("10"), null, methodName + "units");
|
||||
param = new QuantityParam(GREATERTHAN_OR_EQUALS, new BigDecimal("10"), null, methodName + "units");
|
||||
map.add(Observation.SP_VALUE_QUANTITY, param);
|
||||
found = myObservationDao.search(map);
|
||||
assertThat(toUnqualifiedVersionlessIdValues(found), contains(id1));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLoadSynchronous(true);
|
||||
param = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, new BigDecimal("10"), "urn:bar:" + methodName, null);
|
||||
param = new QuantityParam(GREATERTHAN_OR_EQUALS, new BigDecimal("10"), "urn:bar:" + methodName, null);
|
||||
map.add(Observation.SP_VALUE_QUANTITY, param);
|
||||
found = myObservationDao.search(map);
|
||||
assertThat(toUnqualifiedVersionlessIdValues(found), contains(id1));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLoadSynchronous(true);
|
||||
param = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, new BigDecimal("10"), "urn:bar:" + methodName, methodName + "units");
|
||||
param = new QuantityParam(GREATERTHAN_OR_EQUALS, new BigDecimal("10"), "urn:bar:" + methodName, methodName + "units");
|
||||
map.add(Observation.SP_VALUE_QUANTITY, param);
|
||||
found = myObservationDao.search(map);
|
||||
assertThat(toUnqualifiedVersionlessIdValues(found), contains(id1));
|
||||
|
||||
map = new SearchParameterMap();
|
||||
map.setLoadSynchronous(true);
|
||||
param = new QuantityParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, new BigDecimal("1000"), "urn:bar:" + methodName, methodName + "units");
|
||||
param = new QuantityParam(GREATERTHAN_OR_EQUALS, new BigDecimal("1000"), "urn:bar:" + methodName, methodName + "units");
|
||||
map.add(Observation.SP_VALUE_QUANTITY, param);
|
||||
found = myObservationDao.search(map);
|
||||
assertThat(toUnqualifiedVersionlessIdValues(found), empty());
|
||||
|
@ -4312,14 +4326,14 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add(Patient.SP_BIRTHDATE, new DateParam("2011-01-03").setPrefix(ParamPrefixEnum.LESSTHAN));
|
||||
params.add(Patient.SP_BIRTHDATE, new DateParam("2011-01-03").setPrefix(LESSTHAN));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, contains(id2));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add(Patient.SP_BIRTHDATE, new DateParam("2010-01-01").setPrefix(ParamPrefixEnum.LESSTHAN));
|
||||
params.add(Patient.SP_BIRTHDATE, new DateParam("2010-01-01").setPrefix(LESSTHAN));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myPatientDao.search(params));
|
||||
assertThat(patients, empty());
|
||||
}
|
||||
|
@ -4474,7 +4488,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
map.setLoadSynchronous(true);
|
||||
myCaptureQueriesListener.clear();
|
||||
IBundleProvider values = myPatientDao.search(map);
|
||||
assertEquals(null, values.size());
|
||||
assertNull(values.size());
|
||||
assertEquals(5, values.getResources(0, 1000).size());
|
||||
|
||||
String sql = myCaptureQueriesListener.logSelectQueriesForCurrentThread(0);
|
||||
|
@ -4870,13 +4884,13 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_security", new TokenParam("urn:taglist", methodName + "1a"));
|
||||
params.add(PARAM_SECURITY, new TokenParam("urn:taglist", methodName + "1a"));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id));
|
||||
}
|
||||
{
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_profile", new UriParam("http://" + methodName));
|
||||
params.add(PARAM_PROFILE, new UriParam("http://" + methodName));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag2id));
|
||||
}
|
||||
|
@ -4911,14 +4925,14 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
// One tag
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_tag", new TokenParam("urn:taglist", methodName + "1a"));
|
||||
params.add(PARAM_TAG, new TokenParam("urn:taglist", methodName + "1a"));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id));
|
||||
}
|
||||
{
|
||||
// Code only
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_tag", new TokenParam(null, methodName + "1a"));
|
||||
params.add(PARAM_TAG, new TokenParam(null, methodName + "1a"));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id));
|
||||
}
|
||||
|
@ -4928,7 +4942,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
TokenOrListParam orListParam = new TokenOrListParam();
|
||||
orListParam.add(new TokenParam("urn:taglist", methodName + "1a"));
|
||||
orListParam.add(new TokenParam("urn:taglist", methodName + "2a"));
|
||||
params.add("_tag", orListParam);
|
||||
params.add(PARAM_TAG, orListParam);
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id, tag2id));
|
||||
}
|
||||
|
@ -4938,7 +4952,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
TokenOrListParam orListParam = new TokenOrListParam();
|
||||
orListParam.add(new TokenParam("urn:taglist", methodName + "1a"));
|
||||
orListParam.add(new TokenParam("urn:taglist", methodName + "2a"));
|
||||
params.add("_tag", orListParam);
|
||||
params.add(PARAM_TAG, orListParam);
|
||||
params.setLastUpdated(new DateRangeParam(betweenDate, null));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag2id));
|
||||
|
@ -4951,7 +4965,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
TokenAndListParam andListParam = new TokenAndListParam();
|
||||
andListParam.addValue(new TokenOrListParam("urn:taglist", methodName + "1a"));
|
||||
andListParam.addValue(new TokenOrListParam("urn:taglist", methodName + "2a"));
|
||||
params.add("_tag", andListParam);
|
||||
params.add(PARAM_TAG, andListParam);
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertEquals(0, patients.size());
|
||||
}
|
||||
|
@ -4962,7 +4976,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
TokenAndListParam andListParam = new TokenAndListParam();
|
||||
andListParam.addValue(new TokenOrListParam("urn:taglist", methodName + "1a"));
|
||||
andListParam.addValue(new TokenOrListParam("urn:taglist", methodName + "1b"));
|
||||
params.add("_tag", andListParam);
|
||||
params.add(PARAM_TAG, andListParam);
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id));
|
||||
}
|
||||
|
@ -4993,7 +5007,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
// One tag
|
||||
SearchParameterMap params = SearchParameterMap.newSynchronous();
|
||||
params.add("_tag", new TokenParam("urn:taglist", methodName + "1a").setModifier(TokenParamModifier.NOT));
|
||||
params.add(PARAM_TAG, new TokenParam("urn:taglist", methodName + "1a").setModifier(TokenParamModifier.NOT));
|
||||
myCaptureQueriesListener.clear();
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
myCaptureQueriesListener.logSelectQueriesForCurrentThread(0);
|
||||
|
@ -5003,14 +5017,14 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
{
|
||||
// Non existent tag
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_tag", new TokenParam("urn:taglist", methodName + "FOO").setModifier(TokenParamModifier.NOT));
|
||||
params.add(PARAM_TAG, new TokenParam("urn:taglist", methodName + "FOO").setModifier(TokenParamModifier.NOT));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, containsInAnyOrder(tag1id, tag2id));
|
||||
}
|
||||
{
|
||||
// Common tag
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.add("_tag", new TokenParam("urn:taglist", methodName + "1b").setModifier(TokenParamModifier.NOT));
|
||||
params.add(PARAM_TAG, new TokenParam("urn:taglist", methodName + "1b").setModifier(TokenParamModifier.NOT));
|
||||
List<IIdType> patients = toUnqualifiedVersionlessIds(myOrganizationDao.search(params));
|
||||
assertThat(patients, empty());
|
||||
}
|
||||
|
@ -5489,7 +5503,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
|
|||
myCaptureQueriesListener.clear();
|
||||
SearchParameterMap params = new SearchParameterMap();
|
||||
params.setLoadSynchronous(true);
|
||||
params.add(CommunicationRequest.SP_OCCURRENCE, new DateParam(ParamPrefixEnum.GREATERTHAN_OR_EQUALS, "2015-08-10T11:33:00-04:00"));
|
||||
params.add(CommunicationRequest.SP_OCCURRENCE, new DateParam(GREATERTHAN_OR_EQUALS, "2015-08-10T11:33:00-04:00"));
|
||||
IBundleProvider outcome = myCommunicationRequestDao.search(params);
|
||||
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
|
||||
assertThat(toUnqualifiedVersionlessIdValues(outcome), contains(crId));
|
||||
|
|
|
@ -68,6 +68,7 @@ import org.hl7.fhir.r4.model.Reference;
|
|||
import org.hl7.fhir.r4.model.StringType;
|
||||
import org.hl7.fhir.r4.model.ValueSet;
|
||||
import org.junit.jupiter.api.AfterEach;
|
||||
import org.junit.jupiter.api.Assertions;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Disabled;
|
||||
import org.junit.jupiter.api.Nested;
|
||||
|
@ -89,14 +90,14 @@ import org.springframework.transaction.PlatformTransactionManager;
|
|||
|
||||
import javax.persistence.EntityManager;
|
||||
import java.io.IOException;
|
||||
import java.time.Month;
|
||||
import java.net.URLEncoder;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
import java.util.function.Consumer;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import static ca.uhn.fhir.jpa.model.util.UcumServiceUtil.UCUM_CODESYSTEM_URL;
|
||||
import static ca.uhn.fhir.rest.api.Constants.CHARSET_UTF8;
|
||||
import static org.hamcrest.MatcherAssert.assertThat;
|
||||
import static org.hamcrest.Matchers.contains;
|
||||
import static org.hamcrest.Matchers.containsInAnyOrder;
|
||||
|
@ -105,6 +106,7 @@ import static org.hamcrest.Matchers.equalTo;
|
|||
import static org.hamcrest.Matchers.hasItem;
|
||||
import static org.hamcrest.Matchers.hasSize;
|
||||
import static org.hamcrest.Matchers.not;
|
||||
import static org.hamcrest.Matchers.notNullValue;
|
||||
import static org.hamcrest.Matchers.stringContainsInOrder;
|
||||
import static org.junit.jupiter.api.Assertions.assertEquals;
|
||||
import static org.junit.jupiter.api.Assertions.assertNotNull;
|
||||
|
@ -588,7 +590,50 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest {
|
|||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify unmodified, :contains, and :text searches are case-insensitive and normalized;
|
||||
* :exact is still sensitive
|
||||
* https://github.com/hapifhir/hapi-fhir/issues/3584
|
||||
*/
|
||||
@Test
|
||||
void testStringCaseFolding() {
|
||||
IIdType kelly = myTestDataBuilder.createPatient(myTestDataBuilder.withGiven("Kelly"));
|
||||
IIdType keely = myTestDataBuilder.createPatient(myTestDataBuilder.withGiven("Kélly"));
|
||||
|
||||
// un-modified, :contains, and :text are all ascii normalized, and case-folded
|
||||
myTestDaoSearch.assertSearchFinds("lowercase matches capitalized", "/Patient?name=kelly", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("uppercase matches capitalized", "/Patient?name=KELLY", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("unmodified is accent insensitive", "/Patient?name=" + urlencode("Kélly"), kelly, keely);
|
||||
|
||||
myTestDaoSearch.assertSearchFinds("contains case-insensitive", "/Patient?name:contains=elly", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("contains case-insensitive", "/Patient?name:contains=ELLY", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("contains accent-insensitive", "/Patient?name:contains=ELLY", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("contains accent-insensitive", "/Patient?name:contains=" + urlencode("éLLY"), kelly, keely);
|
||||
|
||||
myTestDaoSearch.assertSearchFinds("text also accent and case-insensitive", "/Patient?name:text=kelly", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("text also accent and case-insensitive", "/Patient?name:text=KELLY", kelly, keely);
|
||||
myTestDaoSearch.assertSearchFinds("text also accent and case-insensitive", "/Patient?name:text=" + urlencode("KÉLLY"), kelly, keely);
|
||||
|
||||
myTestDaoSearch.assertSearchFinds("exact case and accent sensitive", "/Patient?name:exact=Kelly", kelly);
|
||||
// Note: our URL parser won't handle raw UTF-8 URLs; non-ASCII characters must be percent-encoded (see urlencode() below).
|
||||
myTestDaoSearch.assertSearchFinds("exact case and accent sensitive", "/Patient?name:exact=" + urlencode("Kélly"), keely);
|
||||
myTestDaoSearch.assertSearchNotFound("exact case and accent sensitive", "/Patient?name:exact=KELLY,kelly", kelly);
|
||||
myTestDaoSearch.assertSearchNotFound("exact case and accent sensitive",
|
||||
"/Patient?name:exact=" + urlencode("KÉLLY,kélly"),
|
||||
keely);
|
||||
|
||||
myTestDaoSearch.assertSearchFinds("exact accent sensitive", "/Patient?name:exact=Kelly", kelly);
|
||||
myTestDaoSearch.assertSearchFinds("exact accent sensitive", "/Patient?name:exact=" + urlencode("Kélly"), keely);
|
||||
myTestDaoSearch.assertSearchNotFound("exact accent sensitive", "/Patient?name:exact=Kelly", keely);
|
||||
myTestDaoSearch.assertSearchNotFound("exact accent sensitive", "/Patient?name:exact=" +
|
||||
urlencode("kélly"), kelly);
|
||||
|
||||
}
|
||||
|
||||
/** Our url parser requires all chars to be single-byte, and in utf8, that means ascii. */
|
||||
private String urlencode(String theParam) {
|
||||
return URLEncoder.encode(theParam, CHARSET_UTF8);
|
||||
}
|
||||
|
||||
private void assertObservationSearchMatchesNothing(String message, SearchParameterMap map) {
|
||||
assertObservationSearchMatches(message, map);
|
||||
|
@ -637,14 +682,13 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest {
|
|||
"}";
|
||||
Observation o = myFhirCtx.newJsonParser().parseResource(Observation.class, json);
|
||||
|
||||
myObservationDao.create(o, mySrd).getId().toUnqualifiedVersionless();
|
||||
IIdType id = myObservationDao.create(o, mySrd).getId().toUnqualifiedVersionless();
|
||||
|
||||
// no error.
|
||||
assertThat(id, notNullValue());
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
@Test
|
||||
public void testExpandWithIsAInExternalValueSet() {
|
||||
createExternalCsAndLocalVs();
|
||||
|
@ -1570,7 +1614,7 @@ public class FhirResourceDaoR4SearchWithElasticSearchIT extends BaseJpaTest {
|
|||
}
|
||||
|
||||
|
||||
@Disabled // keeping to check search scrolling
|
||||
@Disabled("keeping to debug search scrolling")
|
||||
@Test
|
||||
public void withoutCount() {
|
||||
createObservations(600);
|
||||
|
|
|
@ -10,7 +10,9 @@ import org.apache.http.client.methods.HttpPost;
|
|||
import org.apache.http.entity.ContentType;
|
||||
import org.apache.http.entity.StringEntity;
|
||||
import org.hl7.fhir.instance.model.api.IIdType;
|
||||
import org.hl7.fhir.r4.model.DateType;
|
||||
import org.hl7.fhir.r4.model.Patient;
|
||||
import org.intellij.lang.annotations.Language;
|
||||
import org.junit.jupiter.api.MethodOrderer;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.TestMethodOrder;
|
||||
|
@ -43,14 +45,37 @@ public class GraphQLR4Test extends BaseResourceProviderR4Test {
|
|||
try (CloseableHttpResponse response = ourHttpClient.execute(httpGet)) {
|
||||
String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
|
||||
ourLog.info(resp);
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX + "{\n" +
|
||||
" \"name\":[{\n" +
|
||||
" \"family\":\"FAM\",\n" +
|
||||
" \"given\":[\"GIVEN1\",\"GIVEN2\"]\n" +
|
||||
" },{\n" +
|
||||
" \"given\":[\"GivenOnly1\",\"GivenOnly2\"]\n" +
|
||||
" }]\n" +
|
||||
"}" + DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
@Language("json")
|
||||
String expected = """
|
||||
{
|
||||
"name":[{
|
||||
"family":"FAM",
|
||||
"given":["GIVEN1","GIVEN2"]
|
||||
},{
|
||||
"given":["GivenOnly1","GivenOnly2"]
|
||||
}]
|
||||
}""";
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX + expected + DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testInstance_Patient_Birthdate() throws IOException {
|
||||
initTestPatients();
|
||||
|
||||
String query = "{birthDate}";
|
||||
HttpGet httpGet = new HttpGet(ourServerBase + "/Patient/" + myPatientId0.getIdPart() + "/$graphql?query=" + UrlUtil.escapeUrlParam(query));
|
||||
|
||||
try (CloseableHttpResponse response = ourHttpClient.execute(httpGet)) {
|
||||
String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
|
||||
ourLog.info(resp);
|
||||
@Language("json")
|
||||
String expected = """
|
||||
{
|
||||
"birthDate": "1965-08-09"
|
||||
}""";
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX + expected + DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -113,13 +138,13 @@ public class GraphQLR4Test extends BaseResourceProviderR4Test {
|
|||
initTestPatients();
|
||||
|
||||
String uri = ourServerBase + "/$graphql";
|
||||
HttpPost httpGet = new HttpPost(uri);
|
||||
httpGet.setEntity(new StringEntity(INTROSPECTION_QUERY, ContentType.APPLICATION_JSON));
|
||||
HttpPost httpPost = new HttpPost(uri);
|
||||
httpPost.setEntity(new StringEntity(INTROSPECTION_QUERY, ContentType.APPLICATION_JSON));
|
||||
|
||||
// Repeat a couple of times to make sure it doesn't fail after the first one. At one point
|
||||
// the generator polluted the structure's user data and failed the second time
|
||||
for (int i = 0; i < 3; i++) {
|
||||
try (CloseableHttpResponse response = ourHttpClient.execute(httpGet)) {
|
||||
try (CloseableHttpResponse response = ourHttpClient.execute(httpPost)) {
|
||||
String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
|
||||
ourLog.info("Response has size: {}", FileUtil.formatFileSize(resp.length()));
|
||||
assertEquals(200, response.getStatusLine().getStatusCode());
|
||||
|
@ -144,24 +169,27 @@ public class GraphQLR4Test extends BaseResourceProviderR4Test {
|
|||
try (CloseableHttpResponse response = ourHttpClient.execute(httpGet)) {
|
||||
String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
|
||||
ourLog.info(resp);
|
||||
|
||||
@Language("json")
|
||||
String expected = """
|
||||
{
|
||||
"Patient":{
|
||||
"name":[{
|
||||
"family":"FAM",
|
||||
"given":["GIVEN1","GIVEN2"]
|
||||
},{
|
||||
"given":["GivenOnly1","GivenOnly2"]
|
||||
}]
|
||||
}
|
||||
}""";
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX +
|
||||
"{\n" +
|
||||
"\"Patient\":{\n" +
|
||||
"\"name\":[{\n" +
|
||||
"\"family\":\"FAM\",\n" +
|
||||
"\"given\":[\"GIVEN1\",\"GIVEN2\"]\n" +
|
||||
"},{\n" +
|
||||
"\"given\":[\"GivenOnly1\",\"GivenOnly2\"]\n" +
|
||||
"}]\n" +
|
||||
"}\n" +
|
||||
"}" +
|
||||
expected +
|
||||
DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
|
||||
|
||||
@Test
|
||||
public void testRoot_Search_Patient() throws IOException {
|
||||
initTestPatients();
|
||||
|
@ -172,20 +200,23 @@ public class GraphQLR4Test extends BaseResourceProviderR4Test {
|
|||
try (CloseableHttpResponse response = ourHttpClient.execute(httpGet)) {
|
||||
String resp = IOUtils.toString(response.getEntity().getContent(), StandardCharsets.UTF_8);
|
||||
ourLog.info(resp);
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX + "{\n" +
|
||||
" \"PatientList\":[{\n" +
|
||||
" \"name\":[{\n" +
|
||||
" \"family\":\"FAM\",\n" +
|
||||
" \"given\":[\"GIVEN1\",\"GIVEN2\"]\n" +
|
||||
" },{\n" +
|
||||
" \"given\":[\"GivenOnly1\",\"GivenOnly2\"]\n" +
|
||||
" }]\n" +
|
||||
" },{\n" +
|
||||
" \"name\":[{\n" +
|
||||
" \"given\":[\"GivenOnlyB1\",\"GivenOnlyB2\"]\n" +
|
||||
" }]\n" +
|
||||
" }]\n" +
|
||||
"}" + DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
@Language("json")
|
||||
String expected = """
|
||||
{
|
||||
"PatientList":[{
|
||||
"name":[{
|
||||
"family":"FAM",
|
||||
"given":["GIVEN1","GIVEN2"]
|
||||
},{
|
||||
"given":["GivenOnly1","GivenOnly2"]
|
||||
}]
|
||||
},{
|
||||
"name":[{
|
||||
"given":["GivenOnlyB1","GivenOnlyB2"]
|
||||
}]
|
||||
}]
|
||||
}""";
|
||||
assertEquals(TestUtil.stripWhitespace(DATA_PREFIX + expected + DATA_SUFFIX), TestUtil.stripWhitespace(resp));
|
||||
}
|
||||
|
||||
}
|
||||
|
@ -214,6 +245,7 @@ public class GraphQLR4Test extends BaseResourceProviderR4Test {
|
|||
p.addName()
|
||||
.addGiven("GivenOnly1")
|
||||
.addGiven("GivenOnly2");
|
||||
p.setBirthDateElement(new DateType("1965-08-09"));
|
||||
myPatientId0 = myClient.create().resource(p).execute().getId().toUnqualifiedVersionless();
|
||||
|
||||
p = new Patient();
|
||||
|
|
|
@ -14,6 +14,7 @@ import ca.uhn.fhir.jpa.model.util.JpaConstants;
|
|||
import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
|
||||
import ca.uhn.fhir.rest.api.server.RequestDetails;
|
||||
import ca.uhn.fhir.rest.api.server.bulk.BulkDataExportOptions;
|
||||
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
|
||||
import ca.uhn.fhir.rest.server.exceptions.MethodNotAllowedException;
|
||||
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
|
||||
import ca.uhn.fhir.test.utilities.ITestDataBuilder;
|
||||
|
@ -270,6 +271,20 @@ public class MultitenantServerR4Test extends BaseMultitenantResourceProviderR4Te
|
|||
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testPartitionInRequestDetails_UpdateWithWrongTenantId() {
|
||||
IIdType idA = createPatient(withTenant(TENANT_A), withActiveTrue()).toVersionless();
|
||||
IBaseResource patientA = buildPatient(withId(idA), withActiveTrue());
|
||||
RequestDetails requestDetails = new SystemRequestDetails();
|
||||
requestDetails.setTenantId(TENANT_B);
|
||||
try {
|
||||
myPatientDao.update((Patient) patientA, requestDetails);
|
||||
fail();
|
||||
} catch (InvalidRequestException e) {
|
||||
assertEquals(Msg.code(2079) + "Resource " + ((Patient) patientA).getResourceType() + "/" + ((Patient) patientA).getIdElement().getIdPart() + " is not known", e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testDirectDaoAccess_PartitionInRequestDetails_Update() {
|
||||
|
||||
|
|
|
@ -204,7 +204,7 @@ public class MdmRulesJson implements IModelJson {
|
|||
return myVectorMatchResultMap.getFieldMatchNames(theVector);
|
||||
}
|
||||
|
||||
public String getDetailedFieldMatchResultForUnmatchedVector(long theVector) {
|
||||
public String getDetailedFieldMatchResultWithSuccessInformation(long theVector) {
|
||||
List<String> fieldMatchResult = new ArrayList<>();
|
||||
for (int i = 0; i < myMatchFieldJsonList.size(); ++i) {
|
||||
if ((theVector & (1 << i)) == 0) {
|
||||
|
|
|
@ -92,10 +92,9 @@ public class MdmResourceMatcherSvc {
|
|||
MdmMatchResultEnum matchResultEnum = myMdmRulesJson.getMatchResult(matchResult.vector);
|
||||
matchResult.setMatchResultEnum(matchResultEnum);
|
||||
if (ourLog.isDebugEnabled()) {
|
||||
if (matchResult.isMatch() || matchResult.isPossibleMatch()) {
|
||||
ourLog.debug("{} {} with field matchers {}", matchResult, theRightResource.getIdElement().toUnqualifiedVersionless(), myMdmRulesJson.getFieldMatchNamesForVector(matchResult.vector));
|
||||
} else if (ourLog.isTraceEnabled()) {
|
||||
ourLog.trace("{} {}. Field matcher results: {}", matchResult, theRightResource.getIdElement().toUnqualifiedVersionless(), myMdmRulesJson.getDetailedFieldMatchResultForUnmatchedVector(matchResult.vector));
|
||||
ourLog.debug("{} {}: {}", matchResult.getMatchResultEnum(), theRightResource.getIdElement().toUnqualifiedVersionless(), matchResult);
|
||||
if (ourLog.isTraceEnabled()) {
|
||||
ourLog.trace("Field matcher results:\n{}", myMdmRulesJson.getDetailedFieldMatchResultWithSuccessInformation(matchResult.vector));
|
||||
}
|
||||
}
|
||||
return matchResult;
|
||||
|
@ -135,6 +134,9 @@ public class MdmResourceMatcherSvc {
|
|||
MdmMatchEvaluation matchEvaluation = fieldComparator.match(theLeftResource, theRightResource);
|
||||
if (matchEvaluation.match) {
|
||||
vector |= (1 << i);
|
||||
ourLog.trace("Match: Successfully matched matcher {} with score {}.", fieldComparator.getName(), matchEvaluation.score);
|
||||
} else {
|
||||
ourLog.trace("No match: Matcher {} did not match (score: {}).", fieldComparator.getName(), matchEvaluation.score);
|
||||
}
|
||||
score += matchEvaluation.score;
|
||||
appliedRuleCount += 1;
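// Illustrative only (not part of this diff): with the two-field rule set exercised in
// MdmResourceMatcherSvcLoggingTest (patient-given, patient-last), comparing two patients who share
// only a last name would produce troubleshooting output roughly like:
//
//   DEBUG NO_MATCH Patient/3: <match outcome details>
//   TRACE Field matcher results:
//         patient-given: NO
//         patient-last: YES
//
// The wording follows the log statements above; the resource ID is taken from the test's
// buildSomeoneElse() fixture and is otherwise an assumption.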
|
||||
|
|
|
@ -42,6 +42,7 @@ import org.slf4j.Logger;
|
|||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import javax.annotation.Nonnull;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
import java.util.Objects;
|
||||
|
@ -80,6 +81,7 @@ public class GoldenResourceHelper {
|
|||
* @param theIncomingResource The resource that will be used as the starting point for the MDM linking.
|
||||
* @param theMdmTransactionContext The MDM transaction context for the current matching operation.
|
||||
*/
|
||||
@Nonnull
|
||||
public <T extends IAnyResource> T createGoldenResourceFromMdmSourceResource(T theIncomingResource, MdmTransactionContext theMdmTransactionContext) {
|
||||
validateContextSupported();
|
||||
|
||||
|
|
|
@ -0,0 +1,107 @@
|
|||
package ca.uhn.fhir.mdm.rules.svc;
|
||||
|
||||
import ca.uhn.fhir.context.RuntimeSearchParam;
|
||||
import ca.uhn.fhir.mdm.api.MdmMatchOutcome;
|
||||
import ca.uhn.fhir.mdm.log.Logs;
|
||||
import ch.qos.logback.classic.Level;
|
||||
import ch.qos.logback.classic.Logger;
|
||||
import ch.qos.logback.classic.spi.ILoggingEvent;
|
||||
import ch.qos.logback.core.read.ListAppender;
|
||||
import org.hl7.fhir.r4.model.Patient;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.assertNotNull;
|
||||
import static org.junit.jupiter.api.Assertions.assertTrue;
|
||||
import static org.mockito.Mockito.mock;
|
||||
import static org.mockito.Mockito.when;
|
||||
|
||||
public class MdmResourceMatcherSvcLoggingTest extends BaseMdmRulesR4Test {
|
||||
private MdmResourceMatcherSvc myMdmResourceMatcherSvc;
|
||||
private Patient myJohn;
|
||||
private Patient myJohny;
|
||||
|
||||
@Override
|
||||
@BeforeEach
|
||||
public void before() {
|
||||
super.before();
|
||||
|
||||
when(mySearchParamRetriever.getActiveSearchParam("Patient", "birthdate")).thenReturn(mock(RuntimeSearchParam.class));
|
||||
when(mySearchParamRetriever.getActiveSearchParam("Patient", "identifier")).thenReturn(mock(RuntimeSearchParam.class));
|
||||
when(mySearchParamRetriever.getActiveSearchParam("Practitioner", "identifier")).thenReturn(mock(RuntimeSearchParam.class));
|
||||
when(mySearchParamRetriever.getActiveSearchParam("Medication", "identifier")).thenReturn(mock(RuntimeSearchParam.class));
|
||||
when(mySearchParamRetriever.getActiveSearchParam("Patient", "active")).thenReturn(mock(RuntimeSearchParam.class));
|
||||
|
||||
myMdmResourceMatcherSvc = buildMatcher(buildActiveBirthdateIdRules());
|
||||
|
||||
myJohn = buildJohn();
|
||||
myJohny = buildJohny();
|
||||
|
||||
myJohn.addName().setFamily("LastName");
|
||||
myJohny.addName().setFamily("DifferentLastName");
|
||||
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testMatchWillProvideLogsAboutSuccessOnTraceLevel() {
|
||||
Logger logger = (Logger) Logs.getMdmTroubleshootingLog();
|
||||
logger.setLevel(Level.TRACE);
|
||||
|
||||
MemoryAppender memoryAppender = createAndAssignMemoryAppender(logger);
|
||||
|
||||
MdmMatchOutcome result = myMdmResourceMatcherSvc.match(myJohn, myJohny);
|
||||
assertNotNull(result);
|
||||
|
||||
// This test assumes that the configured algorithm for calculating match scores doesn't change
|
||||
assertTrue(memoryAppender.contains("No match: Matcher patient-last did not match (score: 0.4", Level.TRACE));
|
||||
assertTrue(memoryAppender.contains("Match: Successfully matched matcher patient-given with score 0.8", Level.TRACE));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void testMatchWillProvideSummaryOnMatchingSuccessForEachField() {
|
||||
Patient someoneElse = buildSomeoneElse();
|
||||
Logger logger = (Logger) Logs.getMdmTroubleshootingLog();
|
||||
logger.setLevel(Level.TRACE);
|
||||
|
||||
MemoryAppender memoryAppender = createAndAssignMemoryAppender(logger);
|
||||
|
||||
MdmMatchOutcome result = myMdmResourceMatcherSvc.match(myJohn, someoneElse);
|
||||
assertNotNull(result);
|
||||
|
||||
assertTrue(memoryAppender.contains("NO_MATCH Patient/", Level.DEBUG));
|
||||
assertTrue(memoryAppender.contains("Field matcher results:\npatient-given: NO\npatient-last: YES", Level.TRACE));
|
||||
}
|
||||
|
||||
protected Patient buildSomeoneElse() {
|
||||
Patient patient = new Patient();
|
||||
patient.addName().addGiven("SomeOneElse");
|
||||
patient.addName().setFamily("LastName");
|
||||
patient.setId("Patient/3");
|
||||
return patient;
|
||||
}
|
||||
|
||||
|
||||
protected MemoryAppender createAndAssignMemoryAppender(Logger theLogger) {
|
||||
|
||||
MemoryAppender memoryAppender = new MemoryAppender();
|
||||
memoryAppender.setContext(theLogger.getLoggerContext());
|
||||
theLogger.addAppender(memoryAppender);
|
||||
memoryAppender.start();
|
||||
|
||||
return memoryAppender;
|
||||
}
|
||||
|
||||
public static class MemoryAppender extends ListAppender<ILoggingEvent> {
|
||||
public void reset() {
|
||||
this.list.clear();
|
||||
}
|
||||
|
||||
public boolean contains(String string, Level level) {
|
||||
return this.list.stream()
|
||||
.anyMatch(event -> event.toString().contains(string)
|
||||
&& event.getLevel().equals(level));
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
}
|
|
@ -52,7 +52,10 @@ public interface IJobCoordinator {
|
|||
*/
|
||||
List<JobInstance> getInstances(int thePageSize, int thePageIndex);
|
||||
|
||||
List<JobInstance> getRecentInstances(int thePageSize, int thePageIndex);
|
||||
/**
|
||||
* Fetch recently created job instances, ordered from most recently created to least.
|
||||
*/
|
||||
List<JobInstance> getRecentInstances(int theCount, int theStart);
|
||||
|
||||
void cancelInstance(String theInstanceId) throws ResourceNotFoundException;
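// Illustrative only (not part of this change): a minimal sketch of paging through recent instances
// and cancelling any that are still running. The myJobCoordinator field and the page size of 20 are
// assumptions made for the example.
//
//   List<JobInstance> recent = myJobCoordinator.getRecentInstances(20, 0);
//   for (JobInstance next : recent) {
//      if (next.getStatus() == StatusEnum.IN_PROGRESS) {
//         myJobCoordinator.cancelInstance(next.getInstanceId());
//      }
//   }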
|
||||
|
||||
|
|
|
@ -24,7 +24,6 @@ import ca.uhn.fhir.batch2.impl.BatchWorkChunk;
|
|||
import ca.uhn.fhir.batch2.model.JobInstance;
|
||||
import ca.uhn.fhir.batch2.model.WorkChunk;
|
||||
|
||||
import java.util.Collection;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
|
||||
|
@ -70,9 +69,9 @@ public interface IJobPersistence {
|
|||
List<JobInstance> fetchInstances(int thePageSize, int thePageIndex);
|
||||
|
||||
/**
|
||||
* Fetch instance in 'myCreateTime' descending order
|
||||
* Fetch instances ordered by myCreateTime DESC
|
||||
*/
|
||||
Collection<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex);
|
||||
List<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex);
|
||||
|
||||
/**
|
||||
* Fetch a given instance and update the stored status
|
||||
|
|
|
@ -162,9 +162,9 @@ public class JobCoordinatorImpl extends BaseJobService implements IJobCoordinato
|
|||
}
|
||||
|
||||
@Override
|
||||
public List<JobInstance> getRecentInstances(int thePageSize, int thePageIndex) {
|
||||
return myJobPersistence.fetchRecentInstances(thePageSize, thePageIndex).stream()
|
||||
.map(this::massageInstanceForUserAccess).collect(Collectors.toList());
|
||||
public List<JobInstance> getRecentInstances(int theCount, int theStart) {
|
||||
return myJobPersistence.fetchRecentInstances(theCount, theStart)
|
||||
.stream().map(this::massageInstanceForUserAccess).collect(Collectors.toList());
|
||||
}
|
||||
|
||||
@Override
|
||||
|
|
|
@ -71,6 +71,7 @@ import static org.apache.commons.lang3.StringUtils.isBlank;
|
|||
* <li>For instances that are COMPLETE, purges chunk data</li>
|
||||
* <li>For instances that are IN_PROGRESS where at least one chunk is FAILED, marks instance as FAILED and propagates the error message to the instance, and purges chunk data</li>
|
||||
* <li>For instances that are IN_PROGRESS with an error message set where no chunks are ERRORED or FAILED, clears the error message in the instance (meaning presumably there was an error but it cleared)</li>
|
||||
* <li>For instances that are QUEUED or IN_PROGRESS with the isCancelled flag set, marks them as CANCELLED and records the currently running step, if any</li>
|
||||
* <li>For instances that are COMPLETE or FAILED and are old, delete them entirely</li>
|
||||
* </ul>
|
||||
* </p>
|
||||
|
@ -125,6 +126,7 @@ public class JobMaintenanceServiceImpl extends BaseJobService implements IJobMai
|
|||
|
||||
for (JobInstance instance : instances) {
|
||||
if (processedInstanceIds.add(instance.getInstanceId())) {
|
||||
handleCancellation(instance);
|
||||
cleanupInstance(instance, progressAccumulator);
|
||||
triggerGatedExecutions(instance, progressAccumulator);
|
||||
}
|
||||
|
@ -136,6 +138,21 @@ public class JobMaintenanceServiceImpl extends BaseJobService implements IJobMai
|
|||
}
|
||||
}
|
||||
|
||||
private void handleCancellation(JobInstance theInstance) {
|
||||
if (!theInstance.isCancelled()) { return; }
|
||||
|
||||
if (theInstance.getStatus() == StatusEnum.QUEUED || theInstance.getStatus() == StatusEnum.IN_PROGRESS) {
|
||||
String msg = "Job instance cancelled";
|
||||
if (theInstance.getCurrentGatedStepId() != null) {
|
||||
msg += " while running step " + theInstance.getCurrentGatedStepId();
|
||||
}
|
||||
theInstance.setErrorMessage(msg);
|
||||
theInstance.setStatus(StatusEnum.CANCELLED);
|
||||
myJobPersistence.updateInstance(theInstance);
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
private void cleanupInstance(JobInstance theInstance, JobChunkProgressAccumulator theProgressAccumulator) {
|
||||
switch (theInstance.getStatus()) {
|
||||
case QUEUED:
|
||||
|
@ -146,6 +163,7 @@ public class JobMaintenanceServiceImpl extends BaseJobService implements IJobMai
|
|||
break;
|
||||
case COMPLETED:
|
||||
case FAILED:
|
||||
case CANCELLED:
|
||||
if (theInstance.getEndTime() != null) {
|
||||
long cutoff = System.currentTimeMillis() - PURGE_THRESHOLD;
|
||||
if (theInstance.getEndTime().getTime() < cutoff) {
|
||||
|
@ -157,7 +175,8 @@ public class JobMaintenanceServiceImpl extends BaseJobService implements IJobMai
|
|||
break;
|
||||
}
|
||||
|
||||
if ((theInstance.getStatus() == StatusEnum.COMPLETED || theInstance.getStatus() == StatusEnum.FAILED) && !theInstance.isWorkChunksPurged()) {
|
||||
if ((theInstance.getStatus() == StatusEnum.COMPLETED || theInstance.getStatus() == StatusEnum.FAILED
|
||||
|| theInstance.getStatus() == StatusEnum.CANCELLED) && !theInstance.isWorkChunksPurged()) {
|
||||
theInstance.setWorkChunksPurged(true);
|
||||
myJobPersistence.deleteChunks(theInstance.getInstanceId());
|
||||
myJobPersistence.updateInstance(theInstance);
|
||||
|
@ -214,6 +233,8 @@ public class JobMaintenanceServiceImpl extends BaseJobService implements IJobMai
|
|||
failedChunkCount++;
|
||||
errorMessage = chunk.getErrorMessage();
|
||||
break;
|
||||
case CANCELLED:
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
@ -24,7 +24,6 @@ import ca.uhn.fhir.batch2.api.IJobPersistence;
|
|||
import ca.uhn.fhir.batch2.model.JobInstance;
|
||||
import ca.uhn.fhir.batch2.model.WorkChunk;
|
||||
|
||||
import java.util.Collection;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
|
||||
|
@ -65,7 +64,7 @@ public class SynchronizedJobPersistenceWrapper implements IJobPersistence {
|
|||
}
|
||||
|
||||
@Override
|
||||
public Collection<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex) {
|
||||
public List<JobInstance> fetchRecentInstances(int thePageSize, int thePageIndex) {
|
||||
return myWrap.fetchRecentInstances(thePageSize, thePageIndex);
|
||||
}
|
||||
|
||||
|
|
|
@ -51,7 +51,12 @@ public enum StatusEnum {
|
|||
* Task has failed and is known to be unrecoverable. There is no reason to believe that retrying will
|
||||
* result in a different outcome.
|
||||
*/
|
||||
FAILED(true);
|
||||
FAILED(true),
|
||||
|
||||
/**
|
||||
* Task has been cancelled.
|
||||
*/
|
||||
CANCELLED(true);
|
||||
|
||||
private final boolean myIncomplete;
|
||||
private static Set<StatusEnum> ourIncompleteStatuses;
|
||||
|
|
|
@ -3,6 +3,7 @@ package ca.uhn.fhir.batch2.impl;
|
|||
import ca.uhn.fhir.batch2.api.IJobCompletionHandler;
|
||||
import ca.uhn.fhir.batch2.api.IJobPersistence;
|
||||
import ca.uhn.fhir.batch2.api.JobCompletionDetails;
|
||||
import ca.uhn.fhir.batch2.model.JobDefinition;
|
||||
import ca.uhn.fhir.batch2.model.JobInstance;
|
||||
import ca.uhn.fhir.batch2.model.JobWorkNotification;
|
||||
import ca.uhn.fhir.batch2.model.StatusEnum;
|
||||
|
@ -11,6 +12,7 @@ import ca.uhn.fhir.jpa.subscription.channel.api.IChannelProducer;
|
|||
import com.google.common.collect.Lists;
|
||||
import org.hl7.fhir.r4.model.DateTimeType;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Nested;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.ArgumentCaptor;
|
||||
|
@ -28,6 +30,7 @@ import static ca.uhn.fhir.batch2.impl.JobCoordinatorImplTest.createWorkChunkStep
|
|||
import static org.junit.jupiter.api.Assertions.assertEquals;
|
||||
import static org.junit.jupiter.api.Assertions.assertNotNull;
|
||||
import static org.junit.jupiter.api.Assertions.assertNull;
|
||||
import static org.junit.jupiter.api.Assertions.assertTrue;
|
||||
import static org.mockito.ArgumentMatchers.any;
|
||||
import static org.mockito.ArgumentMatchers.anyInt;
|
||||
import static org.mockito.ArgumentMatchers.eq;
|
||||
|
@ -244,6 +247,62 @@ public class JobMaintenanceServiceImplTest extends BaseBatch2Test {
|
|||
verifyNoMoreInteractions(myJobPersistence);
|
||||
}
|
||||
|
||||
|
||||
@Nested
|
||||
public class CancellationTests {
|
||||
|
||||
@Test
|
||||
public void afterFirstMaintenancePass() {
|
||||
// Setup
|
||||
myJobDefinitionRegistry.addJobDefinition(createJobDefinition(JobDefinition.Builder::gatedExecution));
|
||||
when(myJobPersistence.fetchWorkChunksWithoutData(eq(INSTANCE_ID), eq(100), eq(0))).thenReturn(Lists.newArrayList(
|
||||
createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID),
|
||||
createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID_2)
|
||||
));
|
||||
JobInstance instance1 = createInstance();
|
||||
instance1.setCurrentGatedStepId(STEP_1);
|
||||
when(myJobPersistence.fetchInstances(anyInt(), eq(0))).thenReturn(Lists.newArrayList(instance1));
|
||||
|
||||
mySvc.runMaintenancePass();
|
||||
|
||||
// Execute
|
||||
instance1.setCancelled(true);
|
||||
|
||||
mySvc.runMaintenancePass();
|
||||
|
||||
// Verify
|
||||
assertEquals(StatusEnum.CANCELLED, instance1.getStatus());
|
||||
assertTrue(instance1.getErrorMessage().startsWith("Job instance cancelled"));
|
||||
}
|
||||
|
||||
@Test
|
||||
public void afterSecondMaintenancePass() {
|
||||
// Setup
|
||||
myJobDefinitionRegistry.addJobDefinition(createJobDefinition(JobDefinition.Builder::gatedExecution));
|
||||
when(myJobPersistence.fetchWorkChunksWithoutData(eq(INSTANCE_ID), eq(100), eq(0))).thenReturn(Lists.newArrayList(
|
||||
createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID),
|
||||
createWorkChunkStep2().setStatus(StatusEnum.QUEUED).setId(CHUNK_ID_2)
|
||||
));
|
||||
JobInstance instance1 = createInstance();
|
||||
instance1.setCurrentGatedStepId(STEP_1);
|
||||
when(myJobPersistence.fetchInstances(anyInt(), eq(0))).thenReturn(Lists.newArrayList(instance1));
|
||||
|
||||
mySvc.runMaintenancePass();
|
||||
mySvc.runMaintenancePass();
|
||||
|
||||
// Execute
|
||||
instance1.setCancelled(true);
|
||||
|
||||
mySvc.runMaintenancePass();
|
||||
|
||||
// Verify
|
||||
assertEquals(StatusEnum.CANCELLED, instance1.getStatus());
|
||||
assertTrue(instance1.getErrorMessage().startsWith("Job instance cancelled"));
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
|
||||
private static Date parseTime(String theDate) {
|
||||
return new DateTimeType(theDate).getValue();
|
||||
}
|
||||
|
|
|
@ -74,6 +74,13 @@ public interface ITestDataBuilder {
|
|||
};
|
||||
}
|
||||
|
||||
|
||||
/** Patient.name.given */
|
||||
default <T extends IBaseResource> Consumer<T> withGiven(String theName) {
|
||||
return withPrimitiveAttribute("name.given", theName);
|
||||
}
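// Illustrative usage, mirroring the Elasticsearch case-folding test in this change (assumes a test
// class that exposes the builder as myTestDataBuilder):
//
//   IIdType kelly = myTestDataBuilder.createPatient(myTestDataBuilder.withGiven("Kelly"));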
|
||||
|
||||
|
||||
/**
|
||||
* Set Patient.birthdate
|
||||
*/
|
||||
|
@ -293,5 +300,4 @@ public interface ITestDataBuilder {
|
|||
booleanType.setValueAsString(theValue);
|
||||
activeChild.getMutator().addValue(theTarget, booleanType);
|
||||
}
|
||||
|
||||
}
|
||||
|
|