Combo SearchParameter improvements (#5885)

* Start supporting combo refs

* Work on error codes

* Add tests

* Work on tests

* Fixes

* Cleanup

* Add changelog

* Test fixes

* Test fix

* Clean up tests

* Test cleanup

* Work on tests

* Revert change

* Optimize storage

* Add tests

* Work on tests

* Update docs

* Spotless

* Test fixes

* Revert breaking change

* Merge

* Bump guava

* Address review comments

* Test fixes

* Test fix

* Fix tests

* Address review comments

* Spotless

* Test fix

* Revert change
James Agnew 2024-06-27 08:42:04 -04:00 committed by GitHub
parent bd83bc17cd
commit 7224245217
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
44 changed files with 3384 additions and 690 deletions


@ -58,6 +58,7 @@ public class RuntimeSearchParam {
private final Map<String, String> myUpliftRefchains = new HashMap<>();
private final ComboSearchParamType myComboSearchParamType;
private final List<Component> myComponents;
private final IIdType myIdUnqualifiedVersionless;
private IPhoneticEncoder myPhoneticEncoder;
/**
@ -127,6 +128,7 @@ public class RuntimeSearchParam {
super();
myId = theId;
myIdUnqualifiedVersionless = theId != null ? theId.toUnqualifiedVersionless() : null;
myUri = theUri;
myName = theName;
myDescription = theDescription;
@ -214,6 +216,10 @@ public class RuntimeSearchParam {
return myId;
}
public IIdType getIdUnqualifiedVersionless() {
return myIdUnqualifiedVersionless;
}
public String getUri() {
return myUri;
}


@ -0,0 +1,6 @@
---
type: perf
issue: 5885
title: "When unique and non-unique combo parameters are in use on a server, FHIR Transaction and Reindex Job
performance has been optimized by pre-fetching all existing combo index rows for a large batch of resources
in a single database operation. This should yield a meaningful performance improvement on such systems."


@ -0,0 +1,6 @@
---
type: perf
issue: 5885
title: "Indexing for non-unique combo Search Parameters has been improved,
using a new hash-based index that should perform significantly better in
many circumstances."


@ -0,0 +1,7 @@
---
type: perf
issue: 5885
title: "Indexing for unique combo Search Parameters has been modified so that a hash
value is now stored. This hash value is not yet used in searching or enforcing uniqueness,
but will be in the future in order to reduce the space required to store the indexes and the
current size limitation on unique indexes."


@ -443,7 +443,9 @@ In some configurations, the partition ID is also factored into the hashes.
<img src="/hapi-fhir/docs/images/jpa_erd_search_indexes.svg" alt="Search Indexes" style="width: 100%; max-width: 900px;"/>
## Columns
<a name="HFJ_SPIDX_common"/>
## Common Search Index Columns
The following columns are common to **all HFJ_SPIDX_xxx tables**.
@ -556,6 +558,8 @@ Sorting is done by the SP_VALUE_LOW column.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -625,6 +629,8 @@ Range queries and sorting use the HASH_IDENTITY and SP_VALUE columns.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -660,6 +666,8 @@ Sorting is done via the HASH_IDENTITY and SP_VALUE columns.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -753,6 +761,8 @@ Sorting is done via the HASH_IDENTITY and SP_VALUE_NORMALIZED columns.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -806,6 +816,8 @@ Sorting is done via the HASH_IDENTITY and SP_VALUE columns.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -876,6 +888,8 @@ Sorting is done via the HASH_IDENTITY and SP_URI columns.
## Columns
Note: This table has the columns listed below, but it also has all common columns listed above in [Common Search Index Columns](#HFJ_SPIDX_common).
<table class="table table-striped table-condensed">
<thead>
<tr>
@ -908,3 +922,135 @@ Sorting is done via the HASH_IDENTITY and SP_URI columns.
</tbody>
</table>
# HFJ_IDX_CMB_TOK_NU: Combo Non-Unique Search Param
This table is used to index [Non-Unique Combo Search Parameters](https://smilecdr.com/docs/fhir_standard/fhir_search_custom_search_parameters.html#combo-search-index-parameters).
## Columns
<table class="table table-striped table-condensed">
<thead>
<tr>
<th>Name</th>
<th>Relationships</th>
<th>Datatype</th>
<th>Nullable</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>PID</td>
<td></td>
<td>Long</td>
<td></td>
<td>
A unique persistent identifier for the given index row.
</td>
</tr>
<tr>
<td>RES_ID</td>
<td>FK to <a href="#HFJ_RESOURCE">HFJ_RESOURCE</a></td>
<td>Long</td>
<td></td>
<td>
Contains the PID of the resource being indexed.
</td>
</tr>
<tr>
<td>IDX_STRING</td>
<td></td>
<td>String</td>
<td></td>
<td>
This column contains a FHIR search expression indicating what is being indexed. For example, if a
non-unique combo search parameter is present which indexes a combination of Observation#code and
Observation#status, this column might contain a value such as
<code>Observation?code=http://loinc.org|1234-5&status=final</code>
</td>
</tr>
<tr>
<td>HASH_COMPLETE</td>
<td></td>
<td>Long</td>
<td></td>
<td>
This column contains a hash of the value in column <code>IDX_STRING</code>.
</td>
</tr>
</tbody>
</table>
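To make the lookup concrete, the sketch below shows how a canonical index string like the one described above could be built and reduced to a single 64-bit value for the <code>HASH_COMPLETE</code> column. This is an illustrative sketch only: the helper names and the CRC-based hash are placeholders, not HAPI FHIR's actual implementation.

```java
// Hypothetical sketch of deriving a non-unique combo index key: build a
// canonical search expression (parameter names sorted), then hash it so
// queries can match on one long column instead of a large string.
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;
import java.util.zip.CRC32;

public class ComboIndexSketch {

    /** Canonical form: resourceType?name1=value1&name2=value2 with names sorted. */
    static String buildIndexString(String resourceType, Map<String, String> params) {
        String query = new TreeMap<>(params).entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining("&"));
        return resourceType + "?" + query;
    }

    /** Stand-in 64-bit hash; the real implementation may use a different function. */
    static long hashComplete(String indexString) {
        CRC32 crc = new CRC32();
        crc.update(indexString.getBytes(StandardCharsets.UTF_8));
        return crc.getValue();
    }

    public static void main(String[] args) {
        Map<String, String> params = new TreeMap<>();
        params.put("status", "final");
        params.put("code", "http://loinc.org|1234-5");
        String idx = buildIndexString("Observation", params);
        // Sorting parameter names yields the same string regardless of input order
        System.out.println(idx); // Observation?code=http://loinc.org|1234-5&status=final
        System.out.println(hashComplete(idx));
    }
}
```

Because the parameter names are sorted before the string is assembled, the same combination of values always produces the same index string, and therefore the same hash, which is what makes the hash column usable as a lookup key.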
<a name="HFJ_IDX_CMP_STRING_UNIQ"/>
# HFJ_IDX_CMP_STRING_UNIQ: Combo Unique Search Param
This table is used to index [Unique Combo Search Parameters](https://smilecdr.com/docs/fhir_standard/fhir_search_custom_search_parameters.html#combo-search-index-parameters).
## Columns
<table class="table table-striped table-condensed">
<thead>
<tr>
<th>Name</th>
<th>Relationships</th>
<th>Datatype</th>
<th>Nullable</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td>PID</td>
<td></td>
<td>Long</td>
<td></td>
<td>
A unique persistent identifier for the given index row.
</td>
</tr>
<tr>
<td>RES_ID</td>
<td>FK to <a href="#HFJ_RESOURCE">HFJ_RESOURCE</a></td>
<td>Long</td>
<td></td>
<td>
Contains the PID of the resource being indexed.
</td>
</tr>
<tr>
<td>IDX_STRING</td>
<td></td>
<td>String</td>
<td></td>
<td>
This column contains a FHIR search expression indicating what is being indexed. For example, if a
unique combo search parameter is present which indexes a combination of Observation#code and
Observation#status, this column might contain a value such as
<code>Observation?code=http://loinc.org|1234-5&status=final</code>
</td>
</tr>
<tr>
<td>HASH_COMPLETE</td>
<td></td>
<td>Long</td>
<td></td>
<td>
This column contains a hash of the value in column <code>IDX_STRING</code>.
</td>
</tr>
<tr>
<td>HASH_COMPLETE_2</td>
<td></td>
<td>Long</td>
<td></td>
<td>
This column contains an additional hash of the value in column <code>IDX_STRING</code>, calculated
by applying a static salt to the value before hashing. This increases the effective number
of bits used to hash the index string from 64 to 128.
</td>
</tr>
</tbody>
</table>
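The two-hash scheme described for <code>HASH_COMPLETE_2</code> can be sketched as follows: the same index string is hashed twice, the second time with a fixed salt prepended, so the pair of columns behaves like a single 128-bit hash and a collision would have to occur in both columns simultaneously. The salt value and hash function below are placeholders, not the ones HAPI FHIR actually uses.

```java
// Illustrative sketch of storing two independent 64-bit hashes of one string.
// STATIC_SALT and the SHA-256 folding are assumptions for demonstration only.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class SaltedHashSketch {

    private static final String STATIC_SALT = "example-salt"; // placeholder value

    static long hash64(String input) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256")
                    .digest(input.getBytes(StandardCharsets.UTF_8));
            long h = 0;
            for (int i = 0; i < 8; i++) {
                h = (h << 8) | (digest[i] & 0xFFL); // fold first 8 bytes into a long
            }
            return h;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    /** Value stored in HASH_COMPLETE: hash of the raw index string. */
    static long hashComplete(String indexString) {
        return hash64(indexString);
    }

    /** Value stored in HASH_COMPLETE_2: same input, salted, so the result is independent. */
    static long hashComplete2(String indexString) {
        return hash64(STATIC_SALT + indexString);
    }

    public static void main(String[] args) {
        String idx = "Observation?code=http://loinc.org|1234-5&status=final";
        System.out.println(hashComplete(idx));
        System.out.println(hashComplete2(idx));
    }
}
```

Widening the hash this way is what makes it feasible to eventually enforce uniqueness on the hashes instead of the full <code>IDX_STRING</code>, reducing both storage and the current size limitation on unique indexes.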


@ -1284,10 +1284,6 @@ public abstract class BaseHapiFhirDao<T extends IBaseResource> extends BaseStora
myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
}
}
// Synchronize composite params
mySearchParamWithInlineReferencesExtractor.storeUniqueComboParameters(
newParams, entity, existingParams);
}
}


@ -1628,6 +1628,7 @@ public abstract class BaseHapiFhirResourceDao<T extends IBaseResource> extends B
T resource = (T) myJpaStorageResourceParser.toResource(entity, false);
reindexSearchParameters(resource, entity, theTransactionDetails);
} catch (Exception e) {
ourLog.warn("Failure during reindex: {}", e.toString());
theReindexOutcome.addWarning("Failed to reindex resource " + entity.getIdDt() + ": " + e);
myResourceTableDao.updateIndexStatus(entity.getId(), INDEX_STATUS_INDEXING_FAILED);
}


@ -224,6 +224,17 @@ public abstract class BaseHapiFhirSystemDao<T extends IBaseBundle, MT> extends B
BaseHasResource::isHasTags,
entityChunk);
prefetchByField(
"comboStringUnique",
"myParamsComboStringUnique",
ResourceTable::isParamsComboStringUniquePresent,
entityChunk);
prefetchByField(
"comboTokenNonUnique",
"myParamsComboTokensNonUnique",
ResourceTable::isParamsComboTokensNonUniquePresent,
entityChunk);
if (myStorageSettings.getIndexMissingFields() == JpaStorageSettings.IndexEnabledEnum.ENABLED) {
prefetchByField("searchParamPresence", "mySearchParamPresents", r -> true, entityChunk);
}


@ -19,18 +19,24 @@
*/
package ca.uhn.fhir.jpa.dao.index;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.dao.BaseHapiFhirDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboStringUniqueDao;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndex;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndexedSearchParam;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams;
import ca.uhn.fhir.jpa.util.AddRemoveCount;
import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
import com.google.common.annotations.VisibleForTesting;
import jakarta.annotation.Nullable;
import jakarta.persistence.EntityManager;
import jakarta.persistence.PersistenceContext;
import jakarta.persistence.PersistenceContextType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
@ -44,30 +50,46 @@ import java.util.Set;
@Service
public class DaoSearchParamSynchronizer {
private static final Logger ourLog = LoggerFactory.getLogger(DaoSearchParamSynchronizer.class);
@Autowired
private StorageSettings myStorageSettings;
@PersistenceContext(type = PersistenceContextType.TRANSACTION)
protected EntityManager myEntityManager;
@Autowired
private JpaStorageSettings myStorageSettings;
@Autowired
private IResourceIndexedComboStringUniqueDao myResourceIndexedCompositeStringUniqueDao;
@Autowired
private FhirContext myFhirContext;
public AddRemoveCount synchronizeSearchParamsToDatabase(
ResourceIndexedSearchParams theParams,
ResourceTable theEntity,
ResourceIndexedSearchParams existingParams) {
AddRemoveCount retVal = new AddRemoveCount();
synchronize(theEntity, retVal, theParams.myStringParams, existingParams.myStringParams);
synchronize(theEntity, retVal, theParams.myTokenParams, existingParams.myTokenParams);
synchronize(theEntity, retVal, theParams.myNumberParams, existingParams.myNumberParams);
synchronize(theEntity, retVal, theParams.myQuantityParams, existingParams.myQuantityParams);
synchronize(theEntity, retVal, theParams.myQuantityNormalizedParams, existingParams.myQuantityNormalizedParams);
synchronize(theEntity, retVal, theParams.myDateParams, existingParams.myDateParams);
synchronize(theEntity, retVal, theParams.myUriParams, existingParams.myUriParams);
synchronize(theEntity, retVal, theParams.myCoordsParams, existingParams.myCoordsParams);
synchronize(theEntity, retVal, theParams.myLinks, existingParams.myLinks);
synchronize(theEntity, retVal, theParams.myComboTokenNonUnique, existingParams.myComboTokenNonUnique);
synchronize(theEntity, retVal, theParams.myStringParams, existingParams.myStringParams, null);
synchronize(theEntity, retVal, theParams.myTokenParams, existingParams.myTokenParams, null);
synchronize(theEntity, retVal, theParams.myNumberParams, existingParams.myNumberParams, null);
synchronize(theEntity, retVal, theParams.myQuantityParams, existingParams.myQuantityParams, null);
synchronize(
theEntity,
retVal,
theParams.myQuantityNormalizedParams,
existingParams.myQuantityNormalizedParams,
null);
synchronize(theEntity, retVal, theParams.myDateParams, existingParams.myDateParams, null);
synchronize(theEntity, retVal, theParams.myUriParams, existingParams.myUriParams, null);
synchronize(theEntity, retVal, theParams.myCoordsParams, existingParams.myCoordsParams, null);
synchronize(theEntity, retVal, theParams.myLinks, existingParams.myLinks, null);
synchronize(theEntity, retVal, theParams.myComboTokenNonUnique, existingParams.myComboTokenNonUnique, null);
synchronize(
theEntity,
retVal,
theParams.myComboStringUniques,
existingParams.myComboStringUniques,
new UniqueIndexPreExistenceChecker());
// make sure links are indexed
theEntity.setResourceLinks(theParams.myLinks);
@ -76,20 +98,21 @@ public class DaoSearchParamSynchronizer {
}
@VisibleForTesting
public void setStorageSettings(StorageSettings theStorageSettings) {
this.myStorageSettings = theStorageSettings;
public void setEntityManager(EntityManager theEntityManager) {
myEntityManager = theEntityManager;
}
@VisibleForTesting
public void setEntityManager(EntityManager theEntityManager) {
myEntityManager = theEntityManager;
public void setStorageSettings(JpaStorageSettings theStorageSettings) {
myStorageSettings = theStorageSettings;
}
private <T extends BaseResourceIndex> void synchronize(
ResourceTable theEntity,
AddRemoveCount theAddRemoveCount,
Collection<T> theNewParams,
Collection<T> theExistingParams) {
Collection<T> theExistingParams,
@Nullable IPreSaveHook<T> theAddParamPreSaveHook) {
Collection<T> newParams = theNewParams;
for (T next : newParams) {
next.setPartitionId(theEntity.getPartitionId());
@ -112,6 +135,7 @@ public class DaoSearchParamSynchronizer {
Set<T> existingParamsAsSet = new HashSet<>(theExistingParams.size());
for (Iterator<T> iterator = theExistingParams.iterator(); iterator.hasNext(); ) {
T next = iterator.next();
next.setPlaceholderHashesIfMissing();
if (!existingParamsAsSet.add(next)) {
iterator.remove();
myEntityManager.remove(next);
@ -126,6 +150,11 @@ public class DaoSearchParamSynchronizer {
List<T> paramsToRemove = subtract(theExistingParams, newParams);
List<T> paramsToAdd = subtract(newParams, theExistingParams);
if (theAddParamPreSaveHook != null) {
theAddParamPreSaveHook.preSave(paramsToRemove, paramsToAdd);
}
tryToReuseIndexEntities(paramsToRemove, paramsToAdd);
updateExistingParamsIfRequired(theExistingParams, paramsToAdd, newParams, paramsToRemove);
@ -138,6 +167,7 @@ public class DaoSearchParamSynchronizer {
}
myEntityManager.remove(next);
}
for (T next : paramsToAdd) {
myEntityManager.merge(next);
}
@ -249,4 +279,64 @@ public class DaoSearchParamSynchronizer {
}
return retVal;
}
private interface IPreSaveHook<T> {
void preSave(Collection<T> theParamsToRemove, Collection<T> theParamsToAdd);
}
private class UniqueIndexPreExistenceChecker implements IPreSaveHook<ResourceIndexedComboStringUnique> {
@Override
public void preSave(
Collection<ResourceIndexedComboStringUnique> theParamsToRemove,
Collection<ResourceIndexedComboStringUnique> theParamsToAdd) {
if (myStorageSettings.isUniqueIndexesCheckedBeforeSave()) {
for (ResourceIndexedComboStringUnique theIndex : theParamsToAdd) {
ResourceIndexedComboStringUnique existing =
myResourceIndexedCompositeStringUniqueDao.findByQueryString(theIndex.getIndexString());
if (existing != null) {
/*
* If we're reindexing, and the previous index row is being updated
* to add previously missing hashes, we may falsely detect that the index
* creation is going to fail.
*/
boolean existingIndexIsScheduledForRemoval = false;
for (var next : theParamsToRemove) {
if (existing == next) {
existingIndexIsScheduledForRemoval = true;
break;
}
}
if (existingIndexIsScheduledForRemoval) {
continue;
}
String searchParameterId = "(unknown)";
if (theIndex.getSearchParameterId() != null) {
searchParameterId = theIndex.getSearchParameterId().getValue();
}
String msg = myFhirContext
.getLocalizer()
.getMessage(
BaseHapiFhirDao.class,
"uniqueIndexConflictFailure",
existing.getResource().getResourceType(),
theIndex.getIndexString(),
existing.getResource()
.getIdDt()
.toUnqualifiedVersionless()
.getValue(),
searchParameterId);
// Use ResourceVersionConflictException here because the HapiTransactionService
// catches this and can retry it if needed
throw new ResourceVersionConflictException(Msg.code(1093) + msg);
}
}
}
}
}
}


@ -19,15 +19,8 @@
*/
package ca.uhn.fhir.jpa.dao.index;
import ca.uhn.fhir.context.RuntimeSearchParam;
import ca.uhn.fhir.i18n.Msg;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.dao.BaseHapiFhirDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboStringUniqueDao;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.dao.JpaPid;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndexedSearchParam;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.searchparam.extractor.BaseSearchParamWithInlineReferencesExtractor;
import ca.uhn.fhir.jpa.searchparam.extractor.ISearchParamExtractor;
@ -36,10 +29,7 @@ import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams;
import ca.uhn.fhir.jpa.searchparam.extractor.SearchParamExtractorService;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.api.server.storage.TransactionDetails;
import ca.uhn.fhir.rest.server.exceptions.ResourceVersionConflictException;
import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import com.google.common.annotations.VisibleForTesting;
import jakarta.annotation.Nullable;
import jakarta.persistence.EntityManager;
import jakarta.persistence.PersistenceContext;
import jakarta.persistence.PersistenceContextType;
@ -48,49 +38,22 @@ import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Lazy;
import org.springframework.stereotype.Service;
import java.util.Collection;
import java.util.stream.Collectors;
@Service
@Lazy
public class SearchParamWithInlineReferencesExtractor extends BaseSearchParamWithInlineReferencesExtractor<JpaPid>
implements ISearchParamWithInlineReferencesExtractor {
private static final org.slf4j.Logger ourLog =
org.slf4j.LoggerFactory.getLogger(SearchParamWithInlineReferencesExtractor.class);
@PersistenceContext(type = PersistenceContextType.TRANSACTION)
protected EntityManager myEntityManager;
@Autowired
private ISearchParamRegistry mySearchParamRegistry;
@Autowired
private SearchParamExtractorService mySearchParamExtractorService;
@Autowired
private DaoSearchParamSynchronizer myDaoSearchParamSynchronizer;
@Autowired
private IResourceIndexedComboStringUniqueDao myResourceIndexedCompositeStringUniqueDao;
@Autowired
private PartitionSettings myPartitionSettings;
@VisibleForTesting
public void setPartitionSettings(PartitionSettings thePartitionSettings) {
myPartitionSettings = thePartitionSettings;
}
@VisibleForTesting
public void setSearchParamExtractorService(SearchParamExtractorService theSearchParamExtractorService) {
mySearchParamExtractorService = theSearchParamExtractorService;
}
@VisibleForTesting
public void setSearchParamRegistry(ISearchParamRegistry theSearchParamRegistry) {
mySearchParamRegistry = theSearchParamRegistry;
}
public void populateFromResource(
RequestPartitionId theRequestPartitionId,
ResourceIndexedSearchParams theParams,
@ -116,103 +79,4 @@ public class SearchParamWithInlineReferencesExtractor extends BaseSearchParamWit
thePerformIndexing,
ISearchParamExtractor.ALL_PARAMS);
}
@Nullable
private Collection<? extends BaseResourceIndexedSearchParam> findParameterIndexes(
ResourceIndexedSearchParams theParams, RuntimeSearchParam nextCompositeOf) {
Collection<? extends BaseResourceIndexedSearchParam> paramsListForCompositePart = null;
switch (nextCompositeOf.getParamType()) {
case NUMBER:
paramsListForCompositePart = theParams.myNumberParams;
break;
case DATE:
paramsListForCompositePart = theParams.myDateParams;
break;
case STRING:
paramsListForCompositePart = theParams.myStringParams;
break;
case TOKEN:
paramsListForCompositePart = theParams.myTokenParams;
break;
case QUANTITY:
paramsListForCompositePart = theParams.myQuantityParams;
break;
case URI:
paramsListForCompositePart = theParams.myUriParams;
break;
case REFERENCE:
case SPECIAL:
case COMPOSITE:
case HAS:
break;
}
if (paramsListForCompositePart != null) {
paramsListForCompositePart = paramsListForCompositePart.stream()
.filter(t -> t.getParamName().equals(nextCompositeOf.getName()))
.collect(Collectors.toList());
}
return paramsListForCompositePart;
}
@VisibleForTesting
public void setDaoSearchParamSynchronizer(DaoSearchParamSynchronizer theDaoSearchParamSynchronizer) {
myDaoSearchParamSynchronizer = theDaoSearchParamSynchronizer;
}
public void storeUniqueComboParameters(
ResourceIndexedSearchParams theParams,
ResourceTable theEntity,
ResourceIndexedSearchParams theExistingParams) {
/*
* String Uniques
*/
if (myStorageSettings.isUniqueIndexesEnabled()) {
for (ResourceIndexedComboStringUnique next : DaoSearchParamSynchronizer.subtract(
theExistingParams.myComboStringUniques, theParams.myComboStringUniques)) {
ourLog.debug("Removing unique index: {}", next);
myEntityManager.remove(next);
theEntity.getParamsComboStringUnique().remove(next);
}
boolean haveNewStringUniqueParams = false;
for (ResourceIndexedComboStringUnique next : DaoSearchParamSynchronizer.subtract(
theParams.myComboStringUniques, theExistingParams.myComboStringUniques)) {
if (myStorageSettings.isUniqueIndexesCheckedBeforeSave()) {
ResourceIndexedComboStringUnique existing =
myResourceIndexedCompositeStringUniqueDao.findByQueryString(next.getIndexString());
if (existing != null) {
String searchParameterId = "(unknown)";
if (next.getSearchParameterId() != null) {
searchParameterId = next.getSearchParameterId()
.toUnqualifiedVersionless()
.getValue();
}
String msg = myFhirContext
.getLocalizer()
.getMessage(
BaseHapiFhirDao.class,
"uniqueIndexConflictFailure",
theEntity.getResourceType(),
next.getIndexString(),
existing.getResource()
.getIdDt()
.toUnqualifiedVersionless()
.getValue(),
searchParameterId);
// Use ResourceVersionConflictException here because the HapiTransactionService
// catches this and can retry it if needed
throw new ResourceVersionConflictException(Msg.code(1093) + msg);
}
}
ourLog.debug("Persisting unique index: {}", next);
myEntityManager.persist(next);
haveNewStringUniqueParams = true;
}
theEntity.setParamsComboStringUniquePresent(
theParams.myComboStringUniques.size() > 0 || haveNewStringUniqueParams);
}
}
}


@ -37,6 +37,7 @@ import ca.uhn.fhir.jpa.migrate.tasks.api.TaskFlagEnum;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndexedSearchParam;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamDate;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamQuantity;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamString;
@ -404,6 +405,41 @@ public class HapiFhirJpaMigrationTasks extends BaseMigrationTasks<VersionEnum> {
new ColumnAndNullable("RES_ID", false),
new ColumnAndNullable("PARTITION_ID", true));
}
/*
* Add hash columns to the combo param index tables
*/
{
version.onTable("HFJ_IDX_CMB_TOK_NU")
.addIndex("20240625.10", "IDX_IDXCMBTOKNU_HASHC")
.unique(false)
.withColumns("HASH_COMPLETE", "RES_ID", "PARTITION_ID");
version.onTable("HFJ_IDX_CMP_STRING_UNIQ")
.addColumn("20240625.20", "HASH_COMPLETE")
.nullable()
.type(ColumnTypeEnum.LONG);
version.onTable("HFJ_IDX_CMP_STRING_UNIQ")
.addColumn("20240625.30", "HASH_COMPLETE_2")
.nullable()
.type(ColumnTypeEnum.LONG);
version.onTable("HFJ_IDX_CMP_STRING_UNIQ")
.addTask(
new CalculateHashesTask(VersionEnum.V7_4_0, "20240625.40") {
@Override
protected boolean shouldSkipTask() {
return false;
}
}.setPidColumnName("PID")
.addCalculator(
"HASH_COMPLETE",
t -> ResourceIndexedComboStringUnique.calculateHashComplete(
t.getString("IDX_STRING")))
.addCalculator(
"HASH_COMPLETE_2",
t -> ResourceIndexedComboStringUnique.calculateHashComplete2(
t.getString("IDX_STRING")))
.setColumnName("HASH_COMPLETE"));
}
}
protected void init720() {


@ -74,6 +74,7 @@ import ca.uhn.fhir.jpa.util.SqlQueryList;
import ca.uhn.fhir.model.api.IQueryParameterType;
import ca.uhn.fhir.model.api.Include;
import ca.uhn.fhir.model.api.ResourceMetadataKeyEnum;
import ca.uhn.fhir.model.api.TemporalPrecisionEnum;
import ca.uhn.fhir.model.valueset.BundleEntrySearchModeEnum;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
@ -82,6 +83,7 @@ import ca.uhn.fhir.rest.api.SortOrderEnum;
import ca.uhn.fhir.rest.api.SortSpec;
import ca.uhn.fhir.rest.api.server.IPreResourceAccessDetails;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.DateRangeParam;
import ca.uhn.fhir.rest.param.ParameterUtil;
import ca.uhn.fhir.rest.param.ReferenceParam;
@ -1879,12 +1881,12 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
private void attemptComboUniqueSpProcessing(
QueryStack theQueryStack3, @Nonnull SearchParameterMap theParams, RequestDetails theRequest) {
QueryStack theQueryStack, @Nonnull SearchParameterMap theParams, RequestDetails theRequest) {
RuntimeSearchParam comboParam = null;
List<String> comboParamNames = null;
List<RuntimeSearchParam> exactMatchParams =
mySearchParamRegistry.getActiveComboSearchParams(myResourceName, theParams.keySet());
if (exactMatchParams.size() > 0) {
if (!exactMatchParams.isEmpty()) {
comboParam = exactMatchParams.get(0);
comboParamNames = new ArrayList<>(theParams.keySet());
}
@ -1906,98 +1908,138 @@ public class SearchBuilder implements ISearchBuilder<JpaPid> {
}
if (comboParam != null) {
// Since we're going to remove elements below
theParams.values().forEach(this::ensureSubListsAreWritable);
StringBuilder sb = new StringBuilder();
sb.append(myResourceName);
sb.append("?");
boolean first = true;
Collections.sort(comboParamNames);
for (String nextParamName : comboParamNames) {
List<List<IQueryParameterType>> nextValues = theParams.get(nextParamName);
// TODO Hack to fix weird IOOB on the next stanza until James comes back and makes sense of this.
if (nextValues.isEmpty()) {
ourLog.error(
"query parameter {} is unexpectedly empty. Encountered while considering {} index for {}",
nextParamName,
comboParam.getName(),
theRequest.getCompleteUrl());
sb = null;
break;
}
if (nextValues.get(0).size() != 1) {
sb = null;
break;
}
// Reference params are only eligible for using a composite index if they
// are qualified
RuntimeSearchParam nextParamDef =
mySearchParamRegistry.getActiveSearchParam(myResourceName, nextParamName);
if (nextParamDef.getParamType() == RestSearchParameterTypeEnum.REFERENCE) {
ReferenceParam param = (ReferenceParam) nextValues.get(0).get(0);
if (isBlank(param.getResourceType())) {
sb = null;
break;
}
}
List<? extends IQueryParameterType> nextAnd = nextValues.remove(0);
IQueryParameterType nextOr = nextAnd.remove(0);
String nextOrValue = nextOr.getValueAsQueryToken(myContext);
if (comboParam.getComboSearchParamType() == ComboSearchParamType.NON_UNIQUE) {
if (nextParamDef.getParamType() == RestSearchParameterTypeEnum.STRING) {
nextOrValue = StringUtil.normalizeStringForSearchIndexing(nextOrValue);
}
}
if (first) {
first = false;
} else {
sb.append('&');
}
nextParamName = UrlUtil.escapeUrlParam(nextParamName);
nextOrValue = UrlUtil.escapeUrlParam(nextOrValue);
sb.append(nextParamName).append('=').append(nextOrValue);
if (!validateParamValuesAreValidForComboParam(theParams, comboParamNames)) {
return;
}
if (sb != null) {
String indexString = sb.toString();
ourLog.debug(
"Checking for {} combo index for query: {}", comboParam.getComboSearchParamType(), indexString);
applyComboSearchParam(theQueryStack, theParams, theRequest, comboParamNames, comboParam);
}
}
// Interceptor broadcast: JPA_PERFTRACE_INFO
StorageProcessingMessage msg = new StorageProcessingMessage()
.setMessage("Using " + comboParam.getComboSearchParamType() + " index for query for search: "
+ indexString);
HookParams params = new HookParams()
.add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest)
.add(StorageProcessingMessage.class, msg);
CompositeInterceptorBroadcaster.doCallHooks(
myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
private void applyComboSearchParam(
QueryStack theQueryStack,
@Nonnull SearchParameterMap theParams,
RequestDetails theRequest,
List<String> theComboParamNames,
RuntimeSearchParam theComboParam) {
// Since we're going to remove elements below
theParams.values().forEach(this::ensureSubListsAreWritable);
switch (comboParam.getComboSearchParamType()) {
case UNIQUE:
theQueryStack3.addPredicateCompositeUnique(indexString, myRequestPartitionId);
break;
case NON_UNIQUE:
theQueryStack3.addPredicateCompositeNonUnique(indexString, myRequestPartitionId);
break;
StringBuilder theSearchBuilder = new StringBuilder();
theSearchBuilder.append(myResourceName);
theSearchBuilder.append("?");
boolean first = true;
for (String nextParamName : theComboParamNames) {
List<List<IQueryParameterType>> nextValues = theParams.get(nextParamName);
// This should never happen, but this safety check was added along the way and
// presumably must save us in some specific race condition. I am preserving it
// in a refactor of this code base. 20240429
if (nextValues.isEmpty()) {
ourLog.error(
"query parameter {} is unexpectedly empty. Encountered while considering {} index for {}",
nextParamName,
theComboParam.getName(),
theRequest.getCompleteUrl());
continue;
}
List<? extends IQueryParameterType> nextAnd = nextValues.remove(0);
IQueryParameterType nextOr = nextAnd.remove(0);
String nextOrValue = nextOr.getValueAsQueryToken(myContext);
RuntimeSearchParam nextParamDef = mySearchParamRegistry.getActiveSearchParam(myResourceName, nextParamName);
if (theComboParam.getComboSearchParamType() == ComboSearchParamType.NON_UNIQUE) {
if (nextParamDef.getParamType() == RestSearchParameterTypeEnum.STRING) {
nextOrValue = StringUtil.normalizeStringForSearchIndexing(nextOrValue);
}
}
// Remove any empty parameters remaining after this
theParams.clean();
if (first) {
first = false;
} else {
theSearchBuilder.append('&');
}
nextParamName = UrlUtil.escapeUrlParam(nextParamName);
nextOrValue = UrlUtil.escapeUrlParam(nextOrValue);
theSearchBuilder.append(nextParamName).append('=').append(nextOrValue);
}
if (theSearchBuilder != null) {
String indexString = theSearchBuilder.toString();
ourLog.debug(
"Checking for {} combo index for query: {}", theComboParam.getComboSearchParamType(), indexString);
// Interceptor broadcast: JPA_PERFTRACE_INFO
StorageProcessingMessage msg = new StorageProcessingMessage()
.setMessage("Using " + theComboParam.getComboSearchParamType() + " index for query for search: "
+ indexString);
HookParams params = new HookParams()
.add(RequestDetails.class, theRequest)
.addIfMatchesType(ServletRequestDetails.class, theRequest)
.add(StorageProcessingMessage.class, msg);
CompositeInterceptorBroadcaster.doCallHooks(
myInterceptorBroadcaster, theRequest, Pointcut.JPA_PERFTRACE_INFO, params);
switch (theComboParam.getComboSearchParamType()) {
case UNIQUE:
theQueryStack.addPredicateCompositeUnique(indexString, myRequestPartitionId);
break;
case NON_UNIQUE:
theQueryStack.addPredicateCompositeNonUnique(indexString, myRequestPartitionId);
break;
}
// Remove any empty parameters remaining after this
theParams.clean();
}
}
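The method above reduces each eligible combo parameter to a single canonical `name=value` pair (normalizing string values, URL-escaping both sides) and joins the pairs with `&` into the index string that the UNIQUE or NON_UNIQUE predicate is matched against. A minimal, self-contained sketch of that canonicalization — the class name and the `normalize`/`escape` stand-ins here are illustrative, not the real HAPI FHIR utilities:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Locale;
import java.util.Map;

public class ComboIndexStringSketch {

    // Simplified stand-in for StringUtil.normalizeStringForSearchIndexing;
    // the real normalization does more than case-folding (e.g. accent stripping)
    static String normalize(String theValue) {
        return theValue.toUpperCase(Locale.ROOT);
    }

    // Stand-in for UrlUtil.escapeUrlParam
    static String escape(String theValue) {
        return URLEncoder.encode(theValue, StandardCharsets.UTF_8);
    }

    // Joins pre-escaped name/value pairs with '&', mirroring the
    // StringBuilder loop in the method above
    static String buildIndexString(Map<String, String> theEscapedParams) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : theEscapedParams.entrySet()) {
            if (sb.length() > 0) {
                sb.append('&');
            }
            sb.append(e.getKey()).append('=').append(e.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("family", escape(normalize("Simpson")));        // STRING params are normalized first
        params.put("organization", escape("Organization/my-org")); // other types are escaped only
        String indexString = buildIndexString(params);
        if (!"family=SIMPSON&organization=Organization%2Fmy-org".equals(indexString)) {
            throw new AssertionError(indexString);
        }
        System.out.println(indexString);
    }
}
```

Because both the extractor and the query builder produce this same canonical form, a simple equality match (or hash match) on the stored index string is enough to satisfy the combo search.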
private boolean validateParamValuesAreValidForComboParam(
@Nonnull SearchParameterMap theParams, List<String> comboParamNames) {
boolean paramValuesAreValidForCombo = true;
for (String nextParamName : comboParamNames) {
List<List<IQueryParameterType>> nextValues = theParams.get(nextParamName);
// Multiple AND parameters are not supported for unique combo params
if (nextValues.get(0).size() != 1) {
ourLog.debug(
"Search is not a candidate for unique combo searching - Multiple AND expressions found for the same parameter");
paramValuesAreValidForCombo = false;
break;
}
List<IQueryParameterType> nextAndValue = nextValues.get(0);
for (IQueryParameterType nextOrValue : nextAndValue) {
if (nextOrValue instanceof DateParam) {
if (((DateParam) nextOrValue).getPrecision() != TemporalPrecisionEnum.DAY) {
ourLog.debug(
"Search is not a candidate for unique combo searching - Date search with non-DAY precision");
paramValuesAreValidForCombo = false;
break;
}
}
}
// Reference params are only eligible for using a composite index if they
// are qualified
RuntimeSearchParam nextParamDef = mySearchParamRegistry.getActiveSearchParam(myResourceName, nextParamName);
if (nextParamDef.getParamType() == RestSearchParameterTypeEnum.REFERENCE) {
ReferenceParam param = (ReferenceParam) nextValues.get(0).get(0);
if (isBlank(param.getResourceType())) {
ourLog.debug(
"Search is not a candidate for unique combo searching - Reference with no type specified");
paramValuesAreValidForCombo = false;
break;
}
}
}
return paramValuesAreValidForCombo;
}
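The three disqualifying conditions checked above can be summarized in a simplified, standalone sketch — the `ParamValue` record here is a hypothetical stand-in for the real `IQueryParameterType` implementations (`DateParam`, `ReferenceParam`, etc.):

```java
import java.util.List;

public class ComboEligibilitySketch {

    // Minimal stand-in for one query parameter value
    record ParamValue(boolean isDate, String datePrecision, boolean isReference, String refResourceType) {
        static ParamValue date(String thePrecision) { return new ParamValue(true, thePrecision, false, null); }
        static ParamValue reference(String theResourceType) { return new ParamValue(false, null, true, theResourceType); }
        static ParamValue token() { return new ParamValue(false, null, false, null); }
    }

    // Mirrors the rules enforced by validateParamValuesAreValidForComboParam:
    // no multiple AND clauses, DAY-precision dates only, and only typed references
    static boolean isEligible(List<List<ParamValue>> theAndOrValues) {
        if (theAndOrValues.size() != 1) {
            return false; // multiple AND expressions for the same parameter
        }
        for (ParamValue value : theAndOrValues.get(0)) {
            if (value.isDate() && !"DAY".equals(value.datePrecision())) {
                return false; // date search with non-DAY precision
            }
            if (value.isReference() && value.refResourceType() == null) {
                return false; // reference with no resource type qualifier
            }
        }
        return true;
    }

    public static void main(String[] args) {
        check(isEligible(List.of(List.of(ParamValue.token(), ParamValue.date("DAY")))));
        check(isEligible(List.of(List.of(ParamValue.reference("Organization")))));
        check(!isEligible(List.of(List.of(ParamValue.date("SECOND")))));
        check(!isEligible(List.of(List.of(ParamValue.reference(null)))));
        check(!isEligible(List.of(List.of(ParamValue.token()), List.of(ParamValue.token()))));
        System.out.println("ok");
    }

    static void check(boolean theCondition) {
        if (!theCondition) throw new AssertionError();
    }
}
```

In other words, a search like `Patient?organization=Organization/123&birthdate=2021-01-01` is a candidate, while `Patient?organization=123` (untyped reference) or `birthdate=2021-01-01T10:00:00Z` (sub-day precision) falls back to the regular indexes.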
private <T> void ensureSubListsAreWritable(List<List<T>> theListOfLists) {

View File

@ -20,6 +20,8 @@
package ca.uhn.fhir.jpa.search.builder.predicate;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.model.entity.PartitionablePartitionId;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboTokenNonUnique;
import ca.uhn.fhir.jpa.search.builder.sql.SearchQueryBuilder;
import com.healthmarketscience.sqlbuilder.BinaryCondition;
import com.healthmarketscience.sqlbuilder.Condition;
@ -27,7 +29,7 @@ import com.healthmarketscience.sqlbuilder.dbspec.basic.DbColumn;
public class ComboNonUniqueSearchParameterPredicateBuilder extends BaseSearchParamPredicateBuilder {
private final DbColumn myColumnIndexString;
private final DbColumn myColumnHashComplete;
/**
* Constructor
@ -35,11 +37,15 @@ public class ComboNonUniqueSearchParameterPredicateBuilder extends BaseSearchPar
public ComboNonUniqueSearchParameterPredicateBuilder(SearchQueryBuilder theSearchSqlBuilder) {
super(theSearchSqlBuilder, theSearchSqlBuilder.addTable("HFJ_IDX_CMB_TOK_NU"));
myColumnIndexString = getTable().addColumn("IDX_STRING");
myColumnHashComplete = getTable().addColumn("HASH_COMPLETE");
}
public Condition createPredicateHashComplete(RequestPartitionId theRequestPartitionId, String theIndexString) {
BinaryCondition predicate = BinaryCondition.equalTo(myColumnIndexString, generatePlaceholder(theIndexString));
PartitionablePartitionId partitionId =
PartitionablePartitionId.toStoragePartition(theRequestPartitionId, getPartitionSettings());
long hash = ResourceIndexedComboTokenNonUnique.calculateHashComplete(
getPartitionSettings(), partitionId, theIndexString);
BinaryCondition predicate = BinaryCondition.equalTo(myColumnHashComplete, generatePlaceholder(hash));
return combineWithRequestPartitionIdPredicate(theRequestPartitionId, predicate);
}
}

View File

@ -370,7 +370,8 @@ public class ResourceLinkPredicateBuilder extends BaseJoiningPredicateBuilder im
.collect(Collectors.joining(" or ")));
} else {
builder.append("If you know what you're looking for, try qualifying it using the form: '");
builder.append(theParamName).append(":[resourceType]");
builder.append(theParamName).append(":[resourceType]=[id] or ");
builder.append(theParamName).append("=[resourceType]/[id]");
builder.append("'");
}
String message = builder.toString();

View File

@ -1,10 +1,10 @@
package ca.uhn.fhir.jpa.dao.index;
import ca.uhn.fhir.jpa.api.config.JpaStorageSettings;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.entity.BaseResourceIndex;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamNumber;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.model.entity.StorageSettings;
import ca.uhn.fhir.jpa.searchparam.extractor.ResourceIndexedSearchParams;
import ca.uhn.fhir.jpa.util.AddRemoveCount;
import jakarta.persistence.EntityManager;
@ -62,7 +62,7 @@ public class DaoSearchParamSynchronizerTest {
THE_SEARCH_PARAM_NUMBER.setResource(resourceTable);
subject.setEntityManager(entityManager);
subject.setStorageSettings(new StorageSettings());
subject.setStorageSettings(new JpaStorageSettings());
}
@Test

View File

@ -56,4 +56,14 @@ public abstract class BaseResourceIndex extends BasePartitionable implements Ser
public abstract boolean equals(Object obj);
public abstract <T extends BaseResourceIndex> void copyMutableValuesFrom(T theSource);
/**
* This is called when reindexing a resource, on the previously existing index rows. This method
* should set zero (0) values for the hashes, so that hash calculation on the existing
* rows cannot fail. This matters only when hashes are not present on the existing rows,
* which is the case only if new hash columns have been added.
*/
public void setPlaceholderHashesIfMissing() {
// nothing by default
}
}
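The placeholder behaviour described in the javadoc above can be sketched with a hypothetical entity — `IndexRow` is illustrative, not the real `BaseResourceIndex` subclass:

```java
public class PlaceholderHashSketch {

    // Simplified stand-in for an index row entity whose hash column was
    // added in an upgrade, after the row was originally written
    static class IndexRow {
        Long hashComplete; // null on rows written before the column existed

        void setPlaceholderHashesIfMissing() {
            if (hashComplete == null) {
                hashComplete = 0L; // placeholder; recalculated when the row is rewritten
            }
        }
    }

    public static void main(String[] args) {
        IndexRow legacyRow = new IndexRow(); // simulates a pre-upgrade row
        legacyRow.setPlaceholderHashesIfMissing();
        if (legacyRow.hashComplete != 0L) {
            throw new AssertionError();
        }

        IndexRow currentRow = new IndexRow();
        currentRow.hashComplete = 12345L;
        currentRow.setPlaceholderHashesIfMissing(); // already-set hashes are left alone
        if (currentRow.hashComplete != 12345L) {
            throw new AssertionError();
        }
        System.out.println("ok");
    }
}
```

The zero placeholder keeps equality and hash-code logic from tripping over nulls during reindexing, without pretending the real hash is known.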

View File

@ -0,0 +1,44 @@
/*-
* #%L
* HAPI FHIR JPA Model
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.model.entity;
import jakarta.annotation.Nonnull;
import jakarta.persistence.MappedSuperclass;
import jakarta.persistence.Transient;
import org.hl7.fhir.instance.model.api.IIdType;
@MappedSuperclass
public abstract class BaseResourceIndexedCombo extends BaseResourceIndex implements IResourceIndexComboSearchParameter {
@Transient
private IIdType mySearchParameterId;
@Override
public IIdType getSearchParameterId() {
return mySearchParameterId;
}
@Override
public void setSearchParameterId(@Nonnull IIdType theSearchParameterId) {
assert theSearchParameterId.hasResourceType();
assert theSearchParameterId.hasIdPart();
mySearchParameterId = theSearchParameterId.toUnqualifiedVersionless();
}
}

View File

@ -66,7 +66,7 @@ public abstract class BaseResourceIndexedSearchParam extends BaseResourceIndex {
protected Long myHashIdentity;
@GenericField
@Column(name = "SP_UPDATED")
@Column(name = "SP_UPDATED", nullable = true)
@Temporal(TemporalType.TIMESTAMP)
private Date myUpdated;

View File

@ -19,6 +19,8 @@
*/
package ca.uhn.fhir.jpa.model.entity;
import jakarta.annotation.Nonnull;
import jakarta.annotation.Nullable;
import org.hl7.fhir.instance.model.api.IIdType;
/**
@ -27,9 +29,13 @@ import org.hl7.fhir.instance.model.api.IIdType;
*/
public interface IResourceIndexComboSearchParameter {
/**
* Will be in the exact form <code>[resourceType]/[id]</code>
*/
@Nullable // if it never got set, e.g. on a row pulled from the DB
IIdType getSearchParameterId();
void setSearchParameterId(IIdType theSearchParameterId);
void setSearchParameterId(@Nonnull IIdType theSearchParameterId);
String getIndexString();

View File

@ -19,6 +19,7 @@
*/
package ca.uhn.fhir.jpa.model.entity;
import ca.uhn.fhir.jpa.model.util.SearchParamHash;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.ForeignKey;
@ -30,7 +31,6 @@ import jakarta.persistence.JoinColumn;
import jakarta.persistence.ManyToOne;
import jakarta.persistence.SequenceGenerator;
import jakarta.persistence.Table;
import jakarta.persistence.Transient;
import org.apache.commons.lang3.Validate;
import org.apache.commons.lang3.builder.CompareToBuilder;
import org.apache.commons.lang3.builder.EqualsBuilder;
@ -39,6 +39,23 @@ import org.apache.commons.lang3.builder.ToStringBuilder;
import org.apache.commons.lang3.builder.ToStringStyle;
import org.hl7.fhir.instance.model.api.IIdType;
/**
* NOTE ON LIMITATIONS HERE
* <p>
* This table does not include the partition ID in the uniqueness check. This was the case
* when this table was originally created. In other words, the uniqueness constraint does not
* include the partition column, and would therefore not be able to guarantee uniqueness
* local to a partition.
* </p>
* <p>
* TODO: HAPI FHIR 7.4.0 introduced hashes to this table - In a future release we should
* move the uniqueness constraint over to using them instead of the long string. At that
* time we could probably decide whether it makes sense to include the partition ID in
* the uniqueness check. Null values will be an issue there; we may need to introduce
* a rule that enforcing uniqueness on a partitioned system requires a
* non-null default partition ID.
* </p>
*/
@Entity()
@Table(
name = "HFJ_IDX_CMP_STRING_UNIQ",
@ -52,7 +69,7 @@ import org.hl7.fhir.instance.model.api.IIdType;
columnList = "RES_ID",
unique = false)
})
public class ResourceIndexedComboStringUnique extends BasePartitionable
public class ResourceIndexedComboStringUnique extends BaseResourceIndexedCombo
implements Comparable<ResourceIndexedComboStringUnique>, IResourceIndexComboSearchParameter {
public static final int MAX_STRING_LENGTH = 500;
@ -75,6 +92,39 @@ public class ResourceIndexedComboStringUnique extends BasePartitionable
@Column(name = "RES_ID", insertable = false, updatable = false)
private Long myResourceId;
// TODO: These hashes were added in 7.4.0 - They aren't used or indexed yet, but
// eventually we should replace the string index with a hash index in order to
// reduce the space usage.
@Column(name = "HASH_COMPLETE")
private Long myHashComplete;
/**
* Because these hashes are used to enforce uniqueness, collisions are
* costly: a false hash collision would make it impossible to insert a
* legitimate row. To reduce that risk, we store two hashes, effectively
* widening the hash from 64 bits to 128.
* <p>
* If we have a value we want to guarantee uniqueness on of
* <code>Observation?code=A</code>, say it hashes to <code>12345</code>.
* And suppose we have another value of <code>Observation?code=B</code> which
* also hashes to <code>12345</code>. This is unlikely but not impossible.
* And if this happens, it will be impossible to add a resource with
* code B if there is already a resource with code A.
* </p><p>
* Adding a second, salted hash reduces the likelihood of this happening,
* since it's unlikely the second hash would also collide. Not impossible
* of course, but orders of magnitude less likely still.
* </p>
*
* @see #calculateHashComplete2(String) to see how this is calculated
*/
@Column(name = "HASH_COMPLETE_2")
private Long myHashComplete2;
@Column(name = "IDX_STRING", nullable = false, length = MAX_STRING_LENGTH)
private String myIndexString;
@ -85,9 +135,6 @@ public class ResourceIndexedComboStringUnique extends BasePartitionable
@Column(name = PartitionablePartitionId.PARTITION_ID, insertable = false, updatable = false, nullable = true)
private Integer myPartitionIdValue;
@Transient
private IIdType mySearchParameterId;
/**
* Constructor
*/
@ -121,9 +168,22 @@ public class ResourceIndexedComboStringUnique extends BasePartitionable
return false;
}
calculateHashes();
ResourceIndexedComboStringUnique that = (ResourceIndexedComboStringUnique) theO;
return new EqualsBuilder().append(myIndexString, that.myIndexString).isEquals();
EqualsBuilder b = new EqualsBuilder();
b.append(myHashComplete, that.myHashComplete);
b.append(myHashComplete2, that.myHashComplete2);
return b.isEquals();
}
@Override
public <T extends BaseResourceIndex> void copyMutableValuesFrom(T theSource) {
ResourceIndexedComboStringUnique source = (ResourceIndexedComboStringUnique) theSource;
myIndexString = source.myIndexString;
myHashComplete = source.myHashComplete;
myHashComplete2 = source.myHashComplete2;
}
@Override
@ -146,9 +206,76 @@ public class ResourceIndexedComboStringUnique extends BasePartitionable
myResource = theResource;
}
@Override
public Long getId() {
return myId;
}
@Override
public void setId(Long theId) {
myId = theId;
}
public Long getHashComplete() {
return myHashComplete;
}
public void setHashComplete(Long theHashComplete) {
myHashComplete = theHashComplete;
}
public Long getHashComplete2() {
return myHashComplete2;
}
public void setHashComplete2(Long theHashComplete2) {
myHashComplete2 = theHashComplete2;
}
@Override
public void setPlaceholderHashesIfMissing() {
super.setPlaceholderHashesIfMissing();
if (myHashComplete == null) {
myHashComplete = 0L;
}
if (myHashComplete2 == null) {
myHashComplete2 = 0L;
}
}
@Override
public void calculateHashes() {
if (myHashComplete == null) {
setHashComplete(calculateHashComplete(myIndexString));
setHashComplete2(calculateHashComplete2(myIndexString));
}
}
public static long calculateHashComplete(String theQueryString) {
return SearchParamHash.hashSearchParam(theQueryString);
}
public static long calculateHashComplete2(String theQueryString) {
// Just add a constant salt to the query string in order to hopefully
// further avoid collisions
String newQueryString = theQueryString + "ABC123";
return calculateHashComplete(newQueryString);
}
@Override
public void clearHashes() {
myHashComplete = null;
myHashComplete2 = null;
}
@Override
public int hashCode() {
return new HashCodeBuilder(17, 37).append(myIndexString).toHashCode();
calculateHashes();
HashCodeBuilder b = new HashCodeBuilder(17, 37);
b.append(myHashComplete);
b.append(myHashComplete2);
return b.toHashCode();
}
@Override
@ -157,23 +284,9 @@ public class ResourceIndexedComboStringUnique extends BasePartitionable
.append("id", myId)
.append("resourceId", myResourceId)
.append("indexString", myIndexString)
.append("hashComplete", myHashComplete)
.append("hashComplete2", myHashComplete2)
.append("partition", getPartitionId())
.toString();
}
/**
* Note: This field is not persisted, so it will only be populated for new indexes
*/
@Override
public void setSearchParameterId(IIdType theSearchParameterId) {
mySearchParameterId = theSearchParameterId;
}
/**
* Note: This field is not persisted, so it will only be populated for new indexes
*/
@Override
public IIdType getSearchParameterId() {
return mySearchParameterId;
}
}
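The double-hash scheme from the javadoc above can be illustrated with a small standalone sketch. Note the stand-in hash function: the real code uses `SearchParamHash.hashSearchParam` (a proper 64-bit hash), not `String.hashCode()`:

```java
public class DoubleHashSketch {

    // Same constant salt as calculateHashComplete2 above
    static final String SALT = "ABC123";

    // Stand-in for SearchParamHash.hashSearchParam, for demonstration only
    static long hashComplete(String theIndexString) {
        return theIndexString.hashCode();
    }

    static long hashComplete2(String theIndexString) {
        // Salting the input makes it very unlikely that two strings which
        // collide on the first hash also collide on the second
        return hashComplete(theIndexString + SALT);
    }

    public static void main(String[] args) {
        String a = "Patient?birthdate=2021-01-01&family=SIMPSON";
        String b = "Patient?birthdate=2021-01-01&family=FLANDERS";

        // Uniqueness is enforced on the (hash1, hash2) pair, so a false
        // conflict requires a simultaneous collision on both hashes
        boolean falseConflict = hashComplete(a) == hashComplete(b)
                && hashComplete2(a) == hashComplete2(b);
        if (falseConflict) {
            throw new AssertionError("unexpected double collision");
        }
        System.out.println("ok");
    }
}
```

With two independent 64-bit hashes, a false uniqueness conflict needs both hashes to collide at once, which is orders of magnitude less likely than a single 64-bit collision.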

View File

@ -21,6 +21,7 @@ package ca.uhn.fhir.jpa.model.entity;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.jpa.model.config.PartitionSettings;
import ca.uhn.fhir.jpa.model.util.SearchParamHash;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.ForeignKey;
@ -37,18 +38,17 @@ import org.apache.commons.lang3.builder.CompareToBuilder;
import org.apache.commons.lang3.builder.EqualsBuilder;
import org.apache.commons.lang3.builder.HashCodeBuilder;
import org.apache.commons.lang3.builder.ToStringBuilder;
import org.hl7.fhir.instance.model.api.IIdType;
import static ca.uhn.fhir.jpa.model.util.SearchParamHash.hashSearchParam;
@Entity
@Table(
name = "HFJ_IDX_CMB_TOK_NU",
indexes = {
// TODO: The hash index was added in 7.4.0 - In 7.6.0 we should drop the string index
@Index(name = "IDX_IDXCMBTOKNU_STR", columnList = "IDX_STRING", unique = false),
@Index(name = "IDX_IDXCMBTOKNU_HASHC", columnList = "HASH_COMPLETE,RES_ID,PARTITION_ID", unique = false),
@Index(name = "IDX_IDXCMBTOKNU_RES", columnList = "RES_ID", unique = false)
})
public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndexedCombo
implements Comparable<ResourceIndexedComboTokenNonUnique>, IResourceIndexComboSearchParameter {
@SequenceGenerator(name = "SEQ_IDXCMBTOKNU_ID", sequenceName = "SEQ_IDXCMBTOKNU_ID")
@ -76,9 +76,6 @@ public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
@Transient
private transient PartitionSettings myPartitionSettings;
@Transient
private IIdType mySearchParameterId;
/**
* Constructor
*/
@ -105,6 +102,8 @@ public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
@Override
public boolean equals(Object theO) {
calculateHashes();
if (this == theO) {
return true;
}
@ -116,7 +115,7 @@ public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
ResourceIndexedComboTokenNonUnique that = (ResourceIndexedComboTokenNonUnique) theO;
EqualsBuilder b = new EqualsBuilder();
b.append(myIndexString, that.myIndexString);
b.append(getHashComplete(), that.getHashComplete());
return b.isEquals();
}
@ -157,7 +156,11 @@ public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
@Override
public int hashCode() {
return new HashCodeBuilder(17, 37).append(myIndexString).toHashCode();
calculateHashes();
HashCodeBuilder builder = new HashCodeBuilder(17, 37);
builder.append(getHashComplete());
return builder.toHashCode();
}
public PartitionSettings getPartitionSettings() {
@ -206,27 +209,6 @@ public class ResourceIndexedComboTokenNonUnique extends BaseResourceIndex
public static long calculateHashComplete(
PartitionSettings partitionSettings, PartitionablePartitionId thePartitionId, String queryString) {
RequestPartitionId requestPartitionId = PartitionablePartitionId.toRequestPartitionId(thePartitionId);
return hashSearchParam(partitionSettings, requestPartitionId, queryString);
}
public static long calculateHashComplete(
PartitionSettings partitionSettings, RequestPartitionId partitionId, String queryString) {
return hashSearchParam(partitionSettings, partitionId, queryString);
}
/**
* Note: This field is not persisted, so it will only be populated for new indexes
*/
@Override
public void setSearchParameterId(IIdType theSearchParameterId) {
mySearchParameterId = theSearchParameterId;
}
/**
* Note: This field is not persisted, so it will only be populated for new indexes
*/
@Override
public IIdType getSearchParameterId() {
return mySearchParameterId;
return SearchParamHash.hashSearchParam(partitionSettings, requestPartitionId, queryString);
}
}

View File

@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR JPA Model
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.model.search;
import ca.uhn.fhir.rest.server.util.IndexedSearchParam;

View File

@ -29,6 +29,8 @@ import com.google.common.hash.HashCode;
import com.google.common.hash.HashFunction;
import com.google.common.hash.Hasher;
import com.google.common.hash.Hashing;
import jakarta.annotation.Nonnull;
import jakarta.annotation.Nullable;
/**
* Utility class for calculating hashes of SearchParam entity fields.
@ -51,12 +53,29 @@ public class SearchParamHash {
* Applies a fast and consistent hashing algorithm to a set of strings
*/
public static long hashSearchParam(
PartitionSettings thePartitionSettings, RequestPartitionId theRequestPartitionId, String... theValues) {
@Nonnull PartitionSettings thePartitionSettings,
@Nonnull RequestPartitionId theRequestPartitionId,
@Nonnull String... theValues) {
return doHashSearchParam(thePartitionSettings, theRequestPartitionId, theValues);
}
/**
* Applies a fast and consistent hashing algorithm to a set of strings
*/
public static long hashSearchParam(@Nonnull String... theValues) {
return doHashSearchParam(null, null, theValues);
}
private static long doHashSearchParam(
@Nullable PartitionSettings thePartitionSettings,
@Nullable RequestPartitionId theRequestPartitionId,
@Nonnull String[] theValues) {
Hasher hasher = HASH_FUNCTION.newHasher();
if (thePartitionSettings.isPartitioningEnabled()
&& thePartitionSettings.isIncludePartitionInSearchHashes()
&& theRequestPartitionId != null) {
if (thePartitionSettings != null
&& theRequestPartitionId != null
&& thePartitionSettings.isPartitioningEnabled()
&& thePartitionSettings.isIncludePartitionInSearchHashes()) {
if (theRequestPartitionId.getPartitionIds().size() > 1) {
throw new InternalErrorException(Msg.code(1527)
+ "Can not search multiple partitions when partitions are included in search hashes");

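The reordering in `doHashSearchParam` above matters because the new partition-less overload passes `null` for both the settings and the partition ID, so the null checks must short-circuit before any dereference. A hedged sketch of that pattern with hypothetical stand-in types:

```java
public class NullSafeHashSketch {

    // Minimal stand-in for ca.uhn.fhir.jpa.model.config.PartitionSettings
    record Settings(boolean partitioningEnabled, boolean includePartitionInSearchHashes) {}

    // Mirrors the fix above: null checks run before any dereference of
    // theSettings, because the partition-less overload passes null through
    static long hash(Settings theSettings, Integer thePartitionId, String... theValues) {
        StringBuilder input = new StringBuilder();
        if (theSettings != null
                && thePartitionId != null
                && theSettings.partitioningEnabled()
                && theSettings.includePartitionInSearchHashes()) {
            input.append(thePartitionId).append('|');
        }
        for (String value : theValues) {
            input.append(value).append('|');
        }
        return input.toString().hashCode(); // stand-in for the real hash function
    }

    public static void main(String[] args) {
        // Partition-less call: would throw NullPointerException if the
        // settings were dereferenced before the null check
        long h1 = hash(null, null, "Patient", "family", "SIMPSON");
        long h2 = hash(null, null, "Patient", "family", "SIMPSON");
        if (h1 != h2) {
            throw new AssertionError("hash must be deterministic");
        }
        // Partition-aware call still works as before
        hash(new Settings(true, true), 3, "Patient", "family", "SIMPSON");
        System.out.println("ok");
    }
}
```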
View File

@ -48,9 +48,11 @@ import ca.uhn.fhir.jpa.searchparam.SearchParamConstants;
import ca.uhn.fhir.jpa.searchparam.util.JpaParamUtil;
import ca.uhn.fhir.jpa.searchparam.util.RuntimeSearchParamHelper;
import ca.uhn.fhir.model.api.IQueryParameterType;
import ca.uhn.fhir.model.api.TemporalPrecisionEnum;
import ca.uhn.fhir.model.primitive.BoundCodeDt;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.RestSearchParameterTypeEnum;
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import ca.uhn.fhir.util.FhirTerser;
@ -563,6 +565,14 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
if (paramsListForCompositePart != null) {
for (BaseResourceIndexedSearchParam nextParam : paramsListForCompositePart) {
IQueryParameterType nextParamAsClientParam = nextParam.toQueryParameterType();
if (nextParamAsClientParam instanceof DateParam) {
DateParam date = (DateParam) nextParamAsClientParam;
if (date.getPrecision() != TemporalPrecisionEnum.DAY) {
continue;
}
}
String value = nextParamAsClientParam.getValueAsQueryToken(myContext);
RuntimeSearchParam param = mySearchParamRegistry.getActiveSearchParam(theResourceType, key);
@ -578,6 +588,7 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
}
}
}
if (linksForCompositePart != null) {
for (ResourceLink nextLink : linksForCompositePart) {
if (linksForCompositePartWantPaths.contains(nextLink.getSourcePath())) {
@ -942,7 +953,7 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
for (int i = 0; i < values.size(); i++) {
IBase nextObject = values.get(i);
if (nextObject instanceof IBaseExtension) {
IBaseExtension nextExtension = (IBaseExtension) nextObject;
IBaseExtension<?, ?> nextExtension = (IBaseExtension<?, ?>) nextObject;
nextObject = nextExtension.getValue();
values.set(i, nextObject);
}
@ -1740,7 +1751,7 @@ public abstract class BaseSearchParamExtractor implements ISearchParamExtractor
}
}
@SuppressWarnings({"unchecked", "UnnecessaryLocalVariable"})
@SuppressWarnings({"UnnecessaryLocalVariable"})
private void createStringIndexIfNotBlank(
String theResourceType,
Set<? extends BaseResourceIndexedSearchParam> theParams,

View File

@ -89,8 +89,9 @@ public class JpaSearchParamCache {
}
public Optional<RuntimeSearchParam> getActiveComboSearchParamById(String theResourceName, IIdType theId) {
IIdType idToFind = theId.toUnqualifiedVersionless();
return getActiveComboSearchParams(theResourceName).stream()
.filter((param) -> Objects.equals(theId, param.getId()))
.filter((param) -> Objects.equals(idToFind, param.getIdUnqualifiedVersionless()))
.findFirst();
}
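The fix above normalizes both sides of the comparison with `toUnqualifiedVersionless()`, so a versioned or qualified registry ID still matches a plain lookup key. A simplified illustration — the `Id` record is a hypothetical stand-in for `org.hl7.fhir.instance.model.api.IIdType`:

```java
public class VersionlessIdSketch {

    // Minimal stand-in for IIdType
    record Id(String resourceType, String idPart, String versionId) {
        Id toUnqualifiedVersionless() {
            return new Id(resourceType, idPart, null);
        }
    }

    public static void main(String[] args) {
        // A registry entry may carry a version, while the lookup key does not
        Id stored = new Id("SearchParameter", "patient-birthdate-unique", "2");
        Id query = new Id("SearchParameter", "patient-birthdate-unique", null);

        if (stored.equals(query)) {
            throw new AssertionError("raw IDs differ by version");
        }
        // Normalizing both sides, as the lookup above now does, makes them match
        if (!stored.toUnqualifiedVersionless().equals(query.toUnqualifiedVersionless())) {
            throw new AssertionError();
        }
        System.out.println("ok");
    }
}
```

This is also why `RuntimeSearchParam` now pre-computes `myIdUnqualifiedVersionless` in its constructor: the normalized form is needed on every cache lookup, so it is cheaper to derive it once.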

View File

@ -76,17 +76,17 @@ public class JpaSearchParamCacheTest {
@Test
public void testGetActiveComboParamByIdPresent(){
IIdType id1 = new IdType(1);
IIdType id1 = new IdType("SearchParameter/1");
RuntimeSearchParam sp1 = createSearchParam(id1, ComboSearchParamType.NON_UNIQUE);
IIdType id2 = new IdType(2);
IIdType id2 = new IdType("SearchParameter/2");
RuntimeSearchParam sp2 = createSearchParam(id2, ComboSearchParamType.NON_UNIQUE);
setActiveComboSearchParams(RESOURCE_TYPE, List.of(sp1, sp2));
Optional<RuntimeSearchParam> found = myJpaSearchParamCache.getActiveComboSearchParamById(RESOURCE_TYPE, id1);
assertThat(found).isPresent();
assertEquals(id1, found.get().getId());
assertEquals(id1, found.get().getIdUnqualifiedVersionless());
}
@Test
@ -143,7 +143,7 @@ public class JpaSearchParamCacheTest {
private RuntimeSearchParam createSearchParam(IIdType theId, ComboSearchParamType theType){
RuntimeSearchParam sp = mock(RuntimeSearchParam.class);
when(sp.getId()).thenReturn(theId);
when(sp.getIdUnqualifiedVersionless()).thenReturn(theId);
when(sp.getComboSearchParamType()).thenReturn(theType);
return sp;
}
@ -154,7 +154,7 @@ public class JpaSearchParamCacheTest {
myJpaSearchParamCache.setActiveComboSearchParams(activeComboParams);
}
private class TestableJpaSearchParamCache extends JpaSearchParamCache {
private static class TestableJpaSearchParamCache extends JpaSearchParamCache {
public void setActiveComboSearchParams(Map<String, List<RuntimeSearchParam>> theActiveComboSearchParams){
myActiveComboSearchParams = theActiveComboSearchParams;
}

View File

@ -20,6 +20,7 @@ import org.apache.commons.lang3.Validate;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.SearchParameter;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
@ -45,6 +46,8 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
protected LocalDate myPartitionDate2;
protected int myPartitionId;
protected int myPartitionId2;
protected int myPartitionId3;
protected int myPartitionId4;
private boolean myHaveDroppedForcedIdUniqueConstraint;
@Autowired
private IPartitionLookupSvc myPartitionConfigSvc;
@ -81,14 +84,16 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
myPartitionDate2 = LocalDate.of(2020, Month.FEBRUARY, 15);
myPartitionId = 1;
myPartitionId2 = 2;
myPartitionId3 = 3;
myPartitionId4 = 4;
myPartitionInterceptor = new MyReadWriteInterceptor();
mySrdInterceptorService.registerInterceptor(myPartitionInterceptor);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(1).setName(PARTITION_1), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(2).setName(PARTITION_2), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(3).setName(PARTITION_3), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(4).setName(PARTITION_4), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId).setName(PARTITION_1), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId2).setName(PARTITION_2), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId3).setName(PARTITION_3), null);
myPartitionConfigSvc.createPartition(new PartitionEntity().setId(myPartitionId4).setName(PARTITION_4), null);
myStorageSettings.setIndexMissingFields(JpaStorageSettings.IndexEnabledEnum.ENABLED);
@ -96,6 +101,10 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
myPartitionInterceptor.addReadPartition(RequestPartitionId.fromPartitionNames(JpaConstants.DEFAULT_PARTITION_NAME, PARTITION_1, PARTITION_2, PARTITION_3, PARTITION_4));
myPatientDao.search(new SearchParameterMap().setLoadSynchronous(true), mySrd);
// Pre-fetch the partitions by ID
for (int i = 1; i <= 4; i++) {
myPartitionConfigSvc.getPartitionById(i);
}
}
@Override
@ -110,7 +119,8 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
});
}
}
protected void createUniqueCompositeSp() {
protected void createUniqueComboSp() {
addCreateDefaultPartition();
addReadDefaultPartition(); // one for search param validation
SearchParameter sp = new SearchParameter();
@ -122,6 +132,17 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
addCreateDefaultPartition();
addReadDefaultPartition(); // one for search param validation
sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name[0].family");
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
addCreateDefaultPartition();
sp = new SearchParameter();
sp.setId("SearchParameter/patient-birthdate-unique");
@ -131,6 +152,9 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-birthdate");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-family");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(true));
@ -139,6 +163,49 @@ public abstract class BasePartitioningR4Test extends BaseJpaR4SystemTest {
mySearchParamRegistry.forceRefresh();
}
protected void createNonUniqueComboSp() {
addCreateDefaultPartition();
addReadDefaultPartition(); // one for search param validation
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name.family");
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
addCreateDefaultPartition();
addReadDefaultPartition(); // one for search param validation
sp = new SearchParameter();
sp.setId("SearchParameter/patient-managingorg");
sp.setType(Enumerations.SearchParamType.REFERENCE);
sp.setCode(Patient.SP_ORGANIZATION);
sp.setExpression("Patient.managingOrganization");
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
addCreateDefaultPartition();
sp = new SearchParameter();
sp.setId("SearchParameter/patient-family-and-org");
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase("Patient");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-family");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-managingorg");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(false));
mySearchParameterDao.update(sp, mySrd);
mySearchParamRegistry.forceRefresh();
}
protected void dropForcedIdUniqueConstraint() {
runInTransaction(() -> {
myEntityManager.createNativeQuery("alter table " + ResourceTable.HFJ_RESOURCE + " drop constraint " + IDX_RES_TYPE_FHIR_ID).executeUpdate();

View File

@ -1,6 +1,5 @@
package ca.uhn.fhir.jpa.dao.r4;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import ca.uhn.fhir.interceptor.api.IInterceptorService;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboTokenNonUnique;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
@ -8,6 +7,7 @@ import ca.uhn.fhir.jpa.searchparam.submit.interceptor.SearchParamValidatingInter
import ca.uhn.fhir.jpa.util.SqlQuery;
import ca.uhn.fhir.rest.api.server.IBundleProvider;
import ca.uhn.fhir.rest.param.DateParam;
import ca.uhn.fhir.rest.param.ReferenceParam;
import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.util.HapiExtensions;
@ -16,24 +16,24 @@ import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.DateType;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.Enumerations.PublicationStatus;
import org.hl7.fhir.r4.model.Organization;
import org.hl7.fhir.r4.model.Patient;
import org.hl7.fhir.r4.model.SearchParameter;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4Test {
private static final Logger ourLog = LoggerFactory.getLogger(FhirResourceDaoR4ComboNonUniqueParamTest.class);
public static final String ORG_ID_UNQUALIFIED = "my-org";
public static final String ORG_ID_QUALIFIED = "Organization/" + ORG_ID_UNQUALIFIED;
@Autowired
SearchParamValidatingInterceptor mySearchParamValidatingInterceptor;
@@ -53,36 +53,28 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
}
}
private void createNamesAndGenderSp() {
@Test
public void testTokenFromCodeableConcept_Create() {
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name.family + '|'");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-given");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("given");
sp.setExpression("Patient.name.given");
sp.setExpression("Patient.name.family");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp);
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-gender");
sp.setId("SearchParameter/patient-maritalstatus");
sp.setType(Enumerations.SearchParamType.TOKEN);
sp.setCode("gender");
sp.setExpression("Patient.gender");
sp.setExpression("Patient.maritalStatus");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp);
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-names-and-gender");
sp.setId("SearchParameter/patient-names-and-maritalstatus");
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
@@ -91,35 +83,58 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
.setDefinition("SearchParameter/patient-family");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-given");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-gender");
.setDefinition("SearchParameter/patient-maritalstatus");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(false));
mySearchParameterDao.update(sp);
mySearchParameterDao.update(sp, mySrd);
mySearchParamRegistry.forceRefresh();
myMessages.clear();
Patient pt = new Patient();
pt.addName().setFamily("FAMILY1");
pt.addName().setFamily("FAMILY2");
pt.getMaritalStatus().addCoding().setSystem("http://foo1").setCode("bar1");
pt.getMaritalStatus().addCoding().setSystem("http://foo2").setCode("bar2");
myPatientDao.create(pt, mySrd);
createPatient1(null);
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
indexedTokens.sort(Comparator.comparing(ResourceIndexedComboTokenNonUnique::getIndexString));
assertEquals(4, indexedTokens.size());
String expected;
expected = "Patient?gender=http%3A%2F%2Ffoo1%7Cbar1&given=FAMILY1";
assertEquals(expected, indexedTokens.get(0).getIndexString());
expected = "Patient?gender=http%3A%2F%2Ffoo1%7Cbar1&given=FAMILY2";
assertEquals(expected, indexedTokens.get(1).getIndexString());
expected = "Patient?gender=http%3A%2F%2Ffoo2%7Cbar2&given=FAMILY1";
assertEquals(expected, indexedTokens.get(2).getIndexString());
expected = "Patient?gender=http%3A%2F%2Ffoo2%7Cbar2&given=FAMILY2";
assertEquals(expected, indexedTokens.get(3).getIndexString());
});
}
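The four rows asserted above arise because each repeated component value multiplies the generated index strings (2 name values x 2 codings). A minimal standalone sketch of that cross-product expansion, using hypothetical helper names rather than HAPI's actual implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch, not HAPI's actual code: every repeated component value
// multiplies the number of combo index rows, so 2 family values x 2 codings
// produce the 4 index strings asserted in the test above.
class ComboCrossProduct {
	static List<String> expand(String resourceType, String p1, List<String> v1, String p2, List<String> v2) {
		List<String> result = new ArrayList<>();
		for (String a : v1) {
			for (String b : v2) {
				// parameter names appear in alphabetical order in the index string
				result.add(resourceType + "?" + p1 + "=" + a + "&" + p2 + "=" + b);
			}
		}
		return result;
	}
}
```

For the test's two names and two codings, `expand` yields exactly four combinations, which is why the assertion expects four `ResourceIndexedComboTokenNonUnique` rows.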
@Test
public void testCreateAndUse() {
createNamesAndGenderSp();
IIdType id1 = createPatient1();
@Test
public void testStringAndToken_Create() {
createStringAndTokenCombo_NameAndGender();
IIdType id1 = createPatient1(null);
assertNotNull(id1);
IIdType id2 = createPatient2();
IIdType id2 = createPatient2(null);
assertNotNull(id2);
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
indexedTokens.sort(Comparator.comparing(t -> t.getId()));
indexedTokens.sort(Comparator.comparing(ResourceIndexedComboTokenNonUnique::getId));
assertEquals(2, indexedTokens.size());
String expected = "Patient?family=FAMILY1%5C%7C&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale&given=GIVEN1";
assertEquals(expected, indexedTokens.get(0).getIndexString());
assertEquals(-7504889232313729794L, indexedTokens.get(0).getHashComplete().longValue());
});
@@ -134,14 +149,9 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
myCaptureQueriesListener.logSelectQueries();
assertThat(actual).containsExactlyInAnyOrder(id1.toUnqualifiedVersionless().getValue());
boolean found = false;
for (SqlQuery query : myCaptureQueriesListener.getSelectQueries()) {
String sql = query.getSql(true, false);
if ("SELECT t0.RES_ID FROM HFJ_IDX_CMB_TOK_NU t0 WHERE (t0.IDX_STRING = 'Patient?family=FAMILY1%5C%7C&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale&given=GIVEN1')".equals(sql)) {
found = true;
}
}
assertThat(found).as("Found expected sql").isTrue();
assertThat(myCaptureQueriesListener.getSelectQueries().stream().map(t->t.getSql(true, false)).toList()).contains(
"SELECT t0.RES_ID FROM HFJ_IDX_CMB_TOK_NU t0 WHERE (t0.HASH_COMPLETE = '-7504889232313729794')"
);
logCapturedMessages();
assertThat(myMessages.toString()).contains("[INFO Using NON_UNIQUE index for query for search: Patient?family=FAMILY1%5C%7C&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale&given=GIVEN1]");
@@ -149,9 +159,9 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
// Remove 1, add another
myPatientDao.delete(id1);
myPatientDao.delete(id1, mySrd);
IIdType id3 = createPatient1();
IIdType id3 = createPatient1(null);
assertNotNull(id3);
params = SearchParameterMap.newSynchronous();
@@ -166,15 +176,15 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
}
@Test
public void testCreateAndUpdateResource() {
createNamesAndGenderSp();
public void testStringAndToken_CreateAndUpdate() {
createStringAndTokenCombo_NameAndGender();
// Create a resource patching the unique SP
myCaptureQueriesListener.clear();
IIdType id1 = createPatient1();
IIdType id1 = createPatient1(null);
assertNotNull(id1);
assertThat(myCaptureQueriesListener.countSelectQueries()).as(String.join(",", "\n" + myCaptureQueriesListener.getSelectQueries().stream().map(q -> q.getThreadName()).collect(Collectors.toList()))).isEqualTo(0);
assertThat(myCaptureQueriesListener.countSelectQueries()).as(String.join(",", "\n" + myCaptureQueriesListener.getSelectQueries().stream().map(SqlQuery::getThreadName).toList())).isEqualTo(0);
assertEquals(12, myCaptureQueriesListener.countInsertQueries());
assertEquals(0, myCaptureQueriesListener.countUpdateQueries());
assertEquals(0, myCaptureQueriesListener.countDeleteQueries());
@@ -219,19 +229,19 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
}
@Test
public void testSearchWithExtraParameters() {
createNamesAndGenderSp();
public void testStringAndToken_SearchWithExtraParameters() {
createStringAndTokenCombo_NameAndGender();
IIdType id1 = createPatient1();
IIdType id1 = createPatient1(null);
assertNotNull(id1);
IIdType id2 = createPatient2();
IIdType id2 = createPatient2(null);
assertNotNull(id2);
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
indexedTokens.sort(Comparator.comparing(t -> t.getId()));
indexedTokens.sort(Comparator.comparing(ResourceIndexedComboTokenNonUnique::getId));
assertEquals(2, indexedTokens.size());
assertEquals(-7504889232313729794L, indexedTokens.get(0).getHashComplete().longValue());
});
@@ -249,7 +259,8 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
assertThat(actual).containsExactlyInAnyOrder(id1.toUnqualifiedVersionless().getValue());
String sql = myCaptureQueriesListener.getSelectQueries().get(0).getSql(true, false);
assertEquals("SELECT t1.RES_ID FROM HFJ_RESOURCE t1 INNER JOIN HFJ_IDX_CMB_TOK_NU t0 ON (t1.RES_ID = t0.RES_ID) INNER JOIN HFJ_SPIDX_DATE t2 ON (t1.RES_ID = t2.RES_ID) WHERE ((t0.IDX_STRING = 'Patient?family=FAMILY1%5C%7C&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale&given=GIVEN1') AND ((t2.HASH_IDENTITY = '5247847184787287691') AND ((t2.SP_VALUE_LOW_DATE_ORDINAL >= '20210202') AND (t2.SP_VALUE_HIGH_DATE_ORDINAL <= '20210202'))))", sql);
String expected = "SELECT t1.RES_ID FROM HFJ_RESOURCE t1 INNER JOIN HFJ_IDX_CMB_TOK_NU t0 ON (t1.RES_ID = t0.RES_ID) INNER JOIN HFJ_SPIDX_DATE t2 ON (t1.RES_ID = t2.RES_ID) WHERE ((t0.HASH_COMPLETE = '-7504889232313729794') AND ((t2.HASH_IDENTITY = '5247847184787287691') AND ((t2.SP_VALUE_LOW_DATE_ORDINAL >= '20210202') AND (t2.SP_VALUE_HIGH_DATE_ORDINAL <= '20210202'))))";
assertEquals(expected, sql);
logCapturedMessages();
assertThat(myMessages.toString()).contains("[INFO Using NON_UNIQUE index for query for search: Patient?family=FAMILY1%5C%7C&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale&given=GIVEN1]");
@@ -258,21 +269,296 @@ public class FhirResourceDaoR4ComboNonUniqueParamTest extends BaseComboParamsR4T
}
private IIdType createPatient2() {
Patient pt2 = new Patient();
pt2.getNameFirstRep().setFamily("Family2").addGiven("Given2");
pt2.setGender(Enumerations.AdministrativeGender.MALE);
pt2.setBirthDateElement(new DateType("2021-02-02"));
IIdType id2 = myPatientDao.create(pt2).getId().toUnqualified();
return id2;
@Test
public void testStringAndDate_Create() {
createStringAndTokenCombo_NameAndBirthdate();
IIdType id1 = createPatient1(null);
assertNotNull(id1);
IIdType id2 = createPatient2(null);
assertNotNull(id2);
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
indexedTokens.sort(Comparator.comparing(ResourceIndexedComboTokenNonUnique::getId));
assertEquals(2, indexedTokens.size());
String expected = "Patient?birthdate=2021-02-02&family=FAMILY1";
assertEquals(expected, indexedTokens.get(0).getIndexString());
assertEquals(7196518367857292879L, indexedTokens.get(0).getHashComplete().longValue());
});
myMessages.clear();
SearchParameterMap params = SearchParameterMap.newSynchronous();
params.add("family", new StringParam("family1"));
params.add("birthdate", new DateParam("2021-02-02"));
myCaptureQueriesListener.clear();
IBundleProvider results = myPatientDao.search(params, mySrd);
List<String> actual = toUnqualifiedVersionlessIdValues(results);
myCaptureQueriesListener.logSelectQueries();
assertThat(actual).contains(id1.toUnqualifiedVersionless().getValue());
String expected = "SELECT t0.RES_ID FROM HFJ_IDX_CMB_TOK_NU t0 WHERE (t0.HASH_COMPLETE = '7196518367857292879')";
assertEquals(expected, myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false));
logCapturedMessages();
assertThat(myMessages.toString()).contains("[INFO Using NON_UNIQUE index for query for search: Patient?birthdate=2021-02-02&family=FAMILY1]");
myMessages.clear();
}
private IIdType createPatient1() {
/**
* Can't create or search for combo params with partial dates
*/
@Test
public void testStringAndDate_Create_PartialDate() {
createStringAndTokenCombo_NameAndBirthdate();
Patient pt1 = new Patient();
pt1.getNameFirstRep().setFamily("Family1").addGiven("Given1");
pt1.setBirthDateElement(new DateType("2021-02"));
IIdType id1 = myPatientDao.create(pt1, mySrd).getId().toUnqualified();
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
assertEquals(0, indexedTokens.size());
});
myMessages.clear();
SearchParameterMap params = SearchParameterMap.newSynchronous();
params.add("family", new StringParam("family1"));
params.add("birthdate", new DateParam("2021-02"));
myCaptureQueriesListener.clear();
IBundleProvider results = myPatientDao.search(params, mySrd);
List<String> actual = toUnqualifiedVersionlessIdValues(results);
myCaptureQueriesListener.logSelectQueries();
assertThat(actual).contains(id1.toUnqualifiedVersionless().getValue());
String sql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false);
assertThat(sql).doesNotContain("HFJ_IDX_CMB_TOK_NU");
logCapturedMessages();
assertThat(myMessages).isEmpty();
}
@Test
public void testStringAndReference_Create() {
createStringAndReferenceCombo_FamilyAndOrganization();
createOrg();
IIdType id1 = createPatient1(ORG_ID_QUALIFIED);
createPatient2(ORG_ID_QUALIFIED);
logAllNonUniqueIndexes();
runInTransaction(() -> {
List<ResourceIndexedComboTokenNonUnique> indexedTokens = myResourceIndexedComboTokensNonUniqueDao.findAll();
indexedTokens.sort(Comparator.comparing(ResourceIndexedComboTokenNonUnique::getId));
assertEquals(2, indexedTokens.size());
assertEquals("Patient?family=FAMILY1%5C%7C&organization=Organization%2Fmy-org", indexedTokens.get(0).getIndexString());
});
myMessages.clear();
SearchParameterMap params = SearchParameterMap.newSynchronous();
params.add("family", new StringParam("fAmIlY1|")); // weird casing to test normalization
params.add("organization", new ReferenceParam(ORG_ID_QUALIFIED));
myCaptureQueriesListener.clear();
IBundleProvider results = myPatientDao.search(params, mySrd);
List<String> actual = toUnqualifiedVersionlessIdValues(results);
myCaptureQueriesListener.logSelectQueries();
assertThat(actual).contains(id1.toUnqualifiedVersionless().getValue());
String expected = "SELECT t0.RES_ID FROM HFJ_IDX_CMB_TOK_NU t0 WHERE (t0.HASH_COMPLETE = '2277801301223576208')";
assertEquals(expected, myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false));
}
@Test
public void testStringAndReference_SearchByUnqualifiedReference() {
createStringAndReferenceCombo_FamilyAndOrganization();
createOrg();
IIdType id1 = createPatient1(ORG_ID_QUALIFIED);
createPatient2(ORG_ID_QUALIFIED);
myMessages.clear();
SearchParameterMap params = SearchParameterMap.newSynchronous();
params.add("family", new StringParam("family1"));
// "orgid" instead of "Organization/orgid"
params.add("organization", new ReferenceParam(ORG_ID_UNQUALIFIED));
myCaptureQueriesListener.clear();
IBundleProvider results = myPatientDao.search(params, mySrd);
List<String> actual = toUnqualifiedVersionlessIdValues(results);
assertThat(actual).contains(id1.toUnqualifiedVersionless().getValue());
myCaptureQueriesListener.logSelectQueries();
String expected;
expected = "select rt1_0.RES_ID,rt1_0.RES_TYPE,rt1_0.FHIR_ID from HFJ_RESOURCE rt1_0 where rt1_0.FHIR_ID='my-org'";
assertEquals(expected, myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false));
String sql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(1).getSql(true, false);
assertThat(sql).contains("SP_VALUE_NORMALIZED LIKE 'FAMILY1%'");
assertThat(sql).contains("t1.TARGET_RESOURCE_ID");
assertThat(myMessages.get(0)).contains("This search uses an unqualified resource");
}
private void createOrg() {
Organization org = new Organization();
org.setName("Some Org");
org.setId(ORG_ID_QUALIFIED);
myOrganizationDao.update(org, mySrd);
}
private void createStringAndTokenCombo_NameAndBirthdate() {
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name.family");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-birthdate");
sp.setType(Enumerations.SearchParamType.DATE);
sp.setCode("birthdate");
sp.setExpression("Patient.birthDate");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-names-and-birthdate");
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-family");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-birthdate");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(false));
mySearchParameterDao.update(sp, mySrd);
mySearchParamRegistry.forceRefresh();
myMessages.clear();
}
private void createStringAndTokenCombo_NameAndGender() {
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name.family + '|'");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-given");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("given");
sp.setExpression("Patient.name.given");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-gender");
sp.setType(Enumerations.SearchParamType.TOKEN);
sp.setCode("gender");
sp.setExpression("Patient.gender");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-names-and-gender");
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-family");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-given");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-gender");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(false));
mySearchParameterDao.update(sp, mySrd);
mySearchParamRegistry.forceRefresh();
myMessages.clear();
}
private void createStringAndReferenceCombo_FamilyAndOrganization() {
SearchParameter sp = new SearchParameter();
sp.setId("SearchParameter/patient-family");
sp.setType(Enumerations.SearchParamType.STRING);
sp.setCode("family");
sp.setExpression("Patient.name.family + '|'");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-managingorg");
sp.setType(Enumerations.SearchParamType.REFERENCE);
sp.setCode(Patient.SP_ORGANIZATION);
sp.setExpression("Patient.managingOrganization");
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-family-and-org");
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(PublicationStatus.ACTIVE);
sp.addBase("Patient");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-family");
sp.addComponent()
.setExpression("Patient")
.setDefinition("SearchParameter/patient-managingorg");
sp.addExtension()
.setUrl(HapiExtensions.EXT_SP_UNIQUE)
.setValue(new BooleanType(false));
mySearchParameterDao.update(sp, mySrd);
mySearchParamRegistry.forceRefresh();
myMessages.clear();
}
private IIdType createPatient1(String theOrgId) {
Patient pt1 = new Patient();
pt1.getNameFirstRep().setFamily("Family1").addGiven("Given1");
pt1.setGender(Enumerations.AdministrativeGender.MALE);
pt1.setBirthDateElement(new DateType("2021-02-02"));
return myPatientDao.create(pt1).getId().toUnqualified();
pt1.getManagingOrganization().setReference(theOrgId);
return myPatientDao.create(pt1, mySrd).getId().toUnqualified();
}
private IIdType createPatient2(String theOrgId) {
Patient pt2 = new Patient();
pt2.getNameFirstRep().setFamily("Family2").addGiven("Given2");
pt2.setGender(Enumerations.AdministrativeGender.MALE);
pt2.setBirthDateElement(new DateType("2021-02-02"));
pt2.getManagingOrganization().setReference(theOrgId);
return myPatientDao.create(pt2, mySrd).getId().toUnqualified();
}
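For reference, the canonical index strings these tests assert against (e.g. `Patient?birthdate=2021-02-02&family=FAMILY1`) follow a simple recipe: sort the component parameter names, URL-encode the values, and join them into a search URL. A sketch under those assumptions (the 64-bit HASH_COMPLETE hashing step is omitted, since the hash function is not shown in this diff):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Illustrative sketch, not HAPI's implementation: build the canonical combo
// index string from component parameters. The real code then hashes this
// string into the 64-bit HASH_COMPLETE column that queries match against.
class ComboIndexString {
	static String canonical(String resourceType, Map<String, String> params) {
		// TreeMap orders parameter names alphabetically, matching the
		// index strings asserted in the tests above
		String query = new TreeMap<>(params).entrySet().stream()
			.map(e -> e.getKey() + "=" + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
			.collect(Collectors.joining("&"));
		return resourceType + "?" + query;
	}
}
```

Note how token values like `http://foo1|bar1` become `http%3A%2F%2Ffoo1%7Cbar1`, which is exactly the escaped form seen in the expected index strings throughout these tests.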


@@ -559,7 +559,7 @@ public class FhirResourceDaoR4ConcurrentWriteTest extends BaseJpaR4Test {
}
runInTransaction(() -> {
ourLog.info("Uniques:\n * " + myResourceIndexedCompositeStringUniqueDao.findAll().stream().map(t -> t.toString()).collect(Collectors.joining("\n * ")));
ourLog.info("Uniques:\n * " + myResourceIndexedComboStringUniqueDao.findAll().stream().map(t -> t.toString()).collect(Collectors.joining("\n * ")));
});
// Make sure we saved the object


@@ -19,6 +19,7 @@ import ca.uhn.fhir.jpa.api.model.DeleteMethodOutcome;
import ca.uhn.fhir.jpa.api.model.ExpungeOptions;
import ca.uhn.fhir.jpa.api.model.HistoryCountModeEnum;
import ca.uhn.fhir.jpa.dao.data.ISearchParamPresentDao;
import ca.uhn.fhir.jpa.delete.job.ReindexTestHelper;
import ca.uhn.fhir.jpa.entity.TermValueSet;
import ca.uhn.fhir.jpa.entity.TermValueSetPreExpansionStatusEnum;
import ca.uhn.fhir.jpa.interceptor.ForceOffsetSearchModeInterceptor;
@@ -27,7 +28,6 @@ import ca.uhn.fhir.jpa.model.util.JpaConstants;
import ca.uhn.fhir.jpa.provider.BaseResourceProviderR4Test;
import ca.uhn.fhir.jpa.search.PersistedJpaSearchFirstPageBundleProvider;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.subscription.submit.svc.ResourceModifiedSubmitterSvc;
import ca.uhn.fhir.jpa.subscription.triggering.ISubscriptionTriggeringSvc;
import ca.uhn.fhir.jpa.subscription.triggering.SubscriptionTriggeringSvcImpl;
import ca.uhn.fhir.jpa.term.TermReadSvcImpl;
@@ -148,13 +148,12 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
@Autowired
private ISubscriptionTriggeringSvc mySubscriptionTriggeringSvc;
@Autowired
private ResourceModifiedSubmitterSvc myResourceModifiedSubmitterSvc;
@Autowired
private ReindexStep myReindexStep;
@Autowired
private DeleteExpungeStep myDeleteExpungeStep;
@Autowired
protected SubscriptionTestUtil mySubscriptionTestUtil;
private ReindexTestHelper myReindexTestHelper;
@AfterEach
@@ -166,7 +165,6 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
myStorageSettings.setDeleteEnabled(new JpaStorageSettings().isDeleteEnabled());
myStorageSettings.setHistoryCountMode(JpaStorageSettings.DEFAULT_HISTORY_COUNT_MODE);
myStorageSettings.setIndexMissingFields(new JpaStorageSettings().getIndexMissingFields());
myStorageSettings.setInlineResourceTextBelowSize(new JpaStorageSettings().getInlineResourceTextBelowSize());
myStorageSettings.setMassIngestionMode(new JpaStorageSettings().isMassIngestionMode());
myStorageSettings.setMatchUrlCacheEnabled(new JpaStorageSettings().isMatchUrlCacheEnabled());
myStorageSettings.setPopulateIdentifierInAutoCreatedPlaceholderReferenceTargets(new JpaStorageSettings().isPopulateIdentifierInAutoCreatedPlaceholderReferenceTargets());
@@ -175,6 +173,8 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
myStorageSettings.setRespectVersionsForSearchIncludes(new JpaStorageSettings().isRespectVersionsForSearchIncludes());
myStorageSettings.setTagStorageMode(new JpaStorageSettings().getTagStorageMode());
myStorageSettings.setExpungeEnabled(false);
myStorageSettings.setUniqueIndexesEnabled(new JpaStorageSettings().isUniqueIndexesEnabled());
myStorageSettings.setUniqueIndexesCheckedBeforeSave(new JpaStorageSettings().isUniqueIndexesCheckedBeforeSave());
myFhirContext.getParserOptions().setStripVersionsFromReferences(true);
TermReadSvcImpl.setForceDisableHibernateSearchForUnitTest(false);
@@ -190,6 +190,8 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
// Pre-cache all StructureDefinitions so that query doesn't affect other counts
myValidationSupport.invalidateCaches();
myValidationSupport.fetchAllStructureDefinitions();
myReindexTestHelper = new ReindexTestHelper(myFhirContext, myDaoRegistry, mySearchParamRegistry);
}
/**
@@ -1068,6 +1070,45 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
}
@Test
public void testReindexJob_ComboParamIndexesInUse() {
myStorageSettings.setUniqueIndexesEnabled(true);
myReindexTestHelper.createUniqueCodeSearchParameter();
myReindexTestHelper.createNonUniqueStatusAndCodeSearchParameter();
Bundle inputBundle = myReindexTestHelper.createTransactionBundleWith20Observation(false);
Bundle transactionResponse = mySystemDao.transaction(mySrd, inputBundle);
ResourceIdListWorkChunkJson data = new ResourceIdListWorkChunkJson();
transactionResponse
.getEntry()
.stream()
.map(t->new IdType(t.getResponse().getLocation()))
.forEach(t->data.addTypedPid("Observation", t.getIdPartAsLong()));
runInTransaction(() -> {
assertEquals(24L, myResourceTableDao.count());
assertEquals(20L, myResourceIndexedComboStringUniqueDao.count());
assertEquals(20L, myResourceIndexedComboTokensNonUniqueDao.count());
});
ReindexJobParameters params = new ReindexJobParameters()
.setOptimizeStorage(ReindexParameters.OptimizeStorageModeEnum.NONE)
.setReindexSearchParameters(ReindexParameters.ReindexSearchParametersEnum.ALL)
.setOptimisticLock(false);
// execute
myCaptureQueriesListener.clear();
RunOutcome outcome = myReindexStep.doReindex(data, mock(IJobDataSink.class), "123", "456", params);
assertEquals(20, outcome.getRecordsProcessed());
// validate
assertEquals(4, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size());
}
public void assertNoPartitionSelectors() {
List<SqlQuery> selectQueries = myCaptureQueriesListener.getSelectQueriesForCurrentThread();
@@ -3102,6 +3143,63 @@ public class FhirResourceDaoR4QueryCountTest extends BaseResourceProviderR4Test
}
@Test
public void testTransaction_ComboParamIndexesInUse() {
myStorageSettings.setUniqueIndexesEnabled(true);
myReindexTestHelper.createUniqueCodeSearchParameter();
myReindexTestHelper.createNonUniqueStatusAndCodeSearchParameter();
// Create resources for the first time
myCaptureQueriesListener.clear();
Bundle inputBundle = myReindexTestHelper.createTransactionBundleWith20Observation(true);
mySystemDao.transaction(mySrd, inputBundle);
assertEquals(21, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size());
assertEquals(78, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size());
// Now run the transaction again - it should not need many SELECTs
myCaptureQueriesListener.clear();
inputBundle = myReindexTestHelper.createTransactionBundleWith20Observation(true);
mySystemDao.transaction(mySrd, inputBundle);
assertEquals(4, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size());
}
@Test
public void testTransaction_ComboParamIndexesInUse_NoPreCheck() {
myStorageSettings.setUniqueIndexesEnabled(true);
myStorageSettings.setUniqueIndexesCheckedBeforeSave(false);
myReindexTestHelper.createUniqueCodeSearchParameter();
myReindexTestHelper.createNonUniqueStatusAndCodeSearchParameter();
// Create resources for the first time
myCaptureQueriesListener.clear();
Bundle inputBundle = myReindexTestHelper.createTransactionBundleWith20Observation(true);
mySystemDao.transaction(mySrd, inputBundle);
myCaptureQueriesListener.logSelectQueries();
assertEquals(1, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size());
assertEquals(7, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size());
// Now run the transaction again - it should not need many SELECTs
myCaptureQueriesListener.clear();
inputBundle = myReindexTestHelper.createTransactionBundleWith20Observation(true);
mySystemDao.transaction(mySrd, inputBundle);
assertEquals(4, myCaptureQueriesListener.getSelectQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getUpdateQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getInsertQueriesForCurrentThread().size());
assertEquals(0, myCaptureQueriesListener.getDeleteQueriesForCurrentThread().size());
}
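The low SELECT counts asserted in these transaction tests reflect the batching described in the changelog: existing combo index rows are pre-fetched for the whole batch of resources in a single statement. An illustrative sketch of building such a batched query (table and column names are taken from the SQL asserted in these tests; the statement is only constructed here, not executed, and real code would use bind parameters rather than string concatenation):

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative only: instead of one SELECT per resource, fetch the existing
// combo index rows for a whole batch of resource PIDs with one IN-clause query.
class ComboPrefetchSql {
	static String buildPrefetch(List<Long> pids) {
		String in = pids.stream().map(String::valueOf).collect(Collectors.joining(","));
		return "SELECT RES_ID, HASH_COMPLETE FROM HFJ_IDX_CMB_TOK_NU WHERE RES_ID IN (" + in + ")";
	}
}
```

One round trip for N resources replaces N round trips, which is why the second identical transaction above completes with only a handful of SELECT queries.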
/**
* See the class javadoc before changing the counts in this test!
*/


@@ -4487,7 +4487,7 @@ public class FhirResourceDaoR4SearchNoFtTest extends BaseJpaR4Test {
ArgumentCaptor<HookParams> captor = ArgumentCaptor.forClass(HookParams.class);
verify(interceptor, times(1)).invoke(ArgumentMatchers.eq(Pointcut.JPA_PERFTRACE_WARNING), captor.capture());
StorageProcessingMessage message = captor.getValue().get(StorageProcessingMessage.class);
assertEquals("This search uses an unqualified resource(a parameter in a chain without a resource type). This is less efficient than using a qualified type. If you know what you're looking for, try qualifying it using the form: 'entity:[resourceType]'", message.getMessage());
assertEquals("This search uses an unqualified resource(a parameter in a chain without a resource type). This is less efficient than using a qualified type. If you know what you're looking for, try qualifying it using the form: 'entity:[resourceType]=[id] or entity=[resourceType]/[id]'", message.getMessage());
}
@Test


@@ -21,6 +21,7 @@ import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTag;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamDate;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamString;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamToken;
import ca.uhn.fhir.jpa.model.entity.ResourceLink;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.model.entity.ResourceTag;
@@ -54,6 +55,7 @@ import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.CodeSystem;
import org.hl7.fhir.r4.model.DateTimeType;
import org.hl7.fhir.r4.model.DateType;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.IdType;
import org.hl7.fhir.r4.model.Observation;
@@ -68,6 +70,8 @@ import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;
import org.mockito.ArgumentCaptor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -165,16 +169,15 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
IIdType obsId = myObservationDao.create(obs, mySrd).getId().toUnqualifiedVersionless();
List<SqlQuery> selectQueries = myCaptureQueriesListener.getSelectQueriesForCurrentThread();
assertThat(selectQueries).hasSize(2);
// Look up the partition
assertThat(selectQueries.get(0).getSql(true, false).toLowerCase()).contains(" from hfj_partition ");
assertEquals(1, selectQueries.size());
// Look up the referenced subject/patient
assertThat(selectQueries.get(1).getSql(true, false).toLowerCase()).contains(" from hfj_resource ");
assertEquals(0, StringUtils.countMatches(selectQueries.get(1).getSql(true, false).toLowerCase(), "partition"));
String sql = selectQueries.get(0).getSql(true, false).toLowerCase();
assertThat(sql).contains(" from hfj_resource ");
assertEquals(0, StringUtils.countMatches(selectQueries.get(0).getSql(true, false).toLowerCase(), "partition"));
runInTransaction(() -> {
List<ResourceLink> resLinks = myResourceLinkDao.findAll();
ourLog.info("Resource links:\n{}", resLinks.toString());
ourLog.info("Resource links:\n{}", resLinks);
assertEquals(2, resLinks.size());
assertEquals(obsId.getIdPartAsLong(), resLinks.get(0).getSourceResourcePid());
assertEquals(patientId.getIdPartAsLong(), resLinks.get(0).getTargetResourcePid());
@@ -223,7 +226,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
runInTransaction(() -> {
List<ResourceLink> resLinks = myResourceLinkDao.findAll();
ourLog.info("Resource links:\n{}", resLinks.toString());
ourLog.info("Resource links:\n{}", resLinks);
assertEquals(2, resLinks.size());
assertEquals(obsId.getIdPartAsLong(), resLinks.get(0).getSourceResourcePid());
assertEquals(patientId.getIdPart(), resLinks.get(0).getTargetResourceId());
@@ -270,7 +273,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
runInTransaction(() -> {
List<ResourceLink> resLinks = myResourceLinkDao.findAll();
ourLog.info("Resource links:\n{}", resLinks.toString());
ourLog.info("Resource links:\n{}", resLinks);
assertEquals(2, resLinks.size());
assertEquals(obsId.getIdPartAsLong(), resLinks.get(0).getSourceResourcePid());
assertEquals(patientId.getIdPartAsLong(), resLinks.get(0).getTargetResourcePid());
@@ -294,7 +297,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
runInTransaction(() -> {
List<ResourceLink> resLinks = myResourceLinkDao.findAll();
ourLog.info("Resource links:\n{}", resLinks.toString());
ourLog.info("Resource links:\n{}", resLinks);
assertEquals(2, resLinks.size());
assertEquals(obsId.getIdPartAsLong(), resLinks.get(0).getSourceResourcePid());
assertEquals(patientId.getIdPart(), resLinks.get(0).getTargetResourceId());
@@ -396,7 +399,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testCreate_ServerId_WithPartition() {
createUniqueCompositeSp();
createUniqueComboSp();
createRequestId();
addCreatePartition(myPartitionId, myPartitionDate);
@@ -409,7 +412,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
p.getMeta().addTag("http://system", "code", "display");
p.addName().setFamily("FAM");
p.addIdentifier().setSystem("system").setValue("value");
p.setBirthDate(new Date());
p.setBirthDateElement(new DateType("2020-01-01"));
p.getManagingOrganization().setReferenceElement(orgId);
Long patientId = myPatientDao.create(p, mySrd).getId().getIdPartAsLong();
@@ -470,17 +473,20 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
assertLocalDateFromDbMatches(myPartitionDate, presents.get(0).getPartitionId().getPartitionDate());
// HFJ_IDX_CMP_STRING_UNIQ
List<ResourceIndexedComboStringUnique> uniques = myResourceIndexedCompositeStringUniqueDao.findAllForResourceIdForUnitTest(patientId);
List<ResourceIndexedComboStringUnique> uniques = myResourceIndexedComboStringUniqueDao.findAllForResourceIdForUnitTest(patientId);
assertEquals(1, uniques.size());
assertEquals(myPartitionId, uniques.get(0).getPartitionId().getPartitionId().intValue());
assertLocalDateFromDbMatches(myPartitionDate, uniques.get(0).getPartitionId().getPartitionDate());
});
myCaptureQueriesListener.clear();
}
@Test
public void testCreate_ServerId_DefaultPartition() {
createUniqueCompositeSp();
createUniqueComboSp();
createRequestId();
addCreateDefaultPartition(myPartitionDate);
@@ -500,29 +506,29 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
runInTransaction(() -> {
// HFJ_RESOURCE
ResourceTable resourceTable = myResourceTableDao.findById(patientId).orElseThrow(IllegalArgumentException::new);
assertEquals(null, resourceTable.getPartitionId().getPartitionId());
assertNull(resourceTable.getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, resourceTable.getPartitionId().getPartitionDate());
// HFJ_RES_TAG
List<ResourceTag> tags = myResourceTagDao.findAll();
assertEquals(1, tags.size());
assertEquals(null, tags.get(0).getPartitionId().getPartitionId());
assertNull(tags.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, tags.get(0).getPartitionId().getPartitionDate());
// HFJ_RES_VER
ResourceHistoryTable version = myResourceHistoryTableDao.findForIdAndVersionAndFetchProvenance(patientId, 1L);
assertEquals(null, version.getPartitionId().getPartitionId());
assertNull(version.getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, version.getPartitionId().getPartitionDate());
// HFJ_HISTORY_TAG
List<ResourceHistoryTag> historyTags = myResourceHistoryTagDao.findAll();
assertEquals(1, historyTags.size());
assertEquals(null, historyTags.get(0).getPartitionId().getPartitionId());
assertNull(historyTags.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, historyTags.get(0).getPartitionId().getPartitionDate());
// HFJ_RES_VER_PROV
assertNotNull(version.getProvenance());
assertEquals(null, version.getProvenance().getPartitionId().getPartitionId());
assertNull(version.getProvenance().getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, version.getProvenance().getPartitionId().getPartitionDate());
// HFJ_SPIDX_STRING
@@ -532,34 +538,34 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
assertThat(stringsDesc).doesNotContain("_text");
assertThat(stringsDesc).doesNotContain("_content");
assertEquals(9, strings.size(), stringsDesc);
assertEquals(null, strings.get(0).getPartitionId().getPartitionId());
assertNull(strings.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, strings.get(0).getPartitionId().getPartitionDate());
// HFJ_SPIDX_DATE
List<ResourceIndexedSearchParamDate> dates = myResourceIndexedSearchParamDateDao.findAllForResourceId(patientId);
ourLog.info("\n * {}", dates.stream().map(ResourceIndexedSearchParamDate::toString).collect(Collectors.joining("\n * ")));
assertEquals(2, dates.size());
assertEquals(null, dates.get(0).getPartitionId().getPartitionId());
assertNull(dates.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, dates.get(0).getPartitionId().getPartitionDate());
assertEquals(null, dates.get(1).getPartitionId().getPartitionId());
assertNull(dates.get(1).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, dates.get(1).getPartitionId().getPartitionDate());
// HFJ_RES_LINK
List<ResourceLink> resourceLinks = myResourceLinkDao.findAllForSourceResourceId(patientId);
assertEquals(1, resourceLinks.size());
assertEquals(null, resourceLinks.get(0).getPartitionId().getPartitionId());
assertNull(resourceLinks.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, resourceLinks.get(0).getPartitionId().getPartitionDate());
// HFJ_RES_PARAM_PRESENT
List<SearchParamPresentEntity> presents = mySearchParamPresentDao.findAllForResource(resourceTable);
assertEquals(3, presents.size());
assertEquals(null, presents.get(0).getPartitionId().getPartitionId());
assertNull(presents.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, presents.get(0).getPartitionId().getPartitionDate());
// HFJ_IDX_CMP_STRING_UNIQ
List<ResourceIndexedComboStringUnique> uniques = myResourceIndexedCompositeStringUniqueDao.findAllForResourceIdForUnitTest(patientId);
List<ResourceIndexedComboStringUnique> uniques = myResourceIndexedComboStringUniqueDao.findAllForResourceIdForUnitTest(patientId);
assertEquals(1, uniques.size());
assertEquals(null, uniques.get(0).getPartitionId().getPartitionId());
assertNull(uniques.get(0).getPartitionId().getPartitionId());
assertLocalDateFromDbMatches(myPartitionDate, uniques.get(0).getPartitionId().getPartitionDate());
});
@@ -645,7 +651,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testCreateInTransaction_ServerId_WithPartition() {
createUniqueCompositeSp();
createUniqueComboSp();
createRequestId();
addCreatePartition(myPartitionId, myPartitionDate);
@@ -716,7 +722,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
startRequest.setJobDefinitionId(DeleteExpungeAppCtx.JOB_DELETE_EXPUNGE);
// execute
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
// Validate
JobInstance outcome = myBatch2JobHelper.awaitJobCompletion(startResponse);
@@ -818,7 +824,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
assertLocalDateFromDbMatches(expected, actual);
// HFJ_SPIDX_TOKEN
ourLog.info("Tokens:\n * {}", myResourceIndexedSearchParamTokenDao.findAll().stream().map(t -> t.toString()).collect(Collectors.joining("\n * ")));
ourLog.info("Tokens:\n * {}", myResourceIndexedSearchParamTokenDao.findAll().stream().map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * ")));
assertEquals(3, myResourceIndexedSearchParamTokenDao.countForResourceId(patientId));
});
@@ -840,7 +846,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
assertLocalDateFromDbMatches(myPartitionDate, resourceTable.getPartitionId().getPartitionDate());
// HFJ_SPIDX_TOKEN
ourLog.info("Tokens:\n * {}", myResourceIndexedSearchParamTokenDao.findAll().stream().map(t -> t.toString()).collect(Collectors.joining("\n * ")));
ourLog.info("Tokens:\n * {}", myResourceIndexedSearchParamTokenDao.findAll().stream().map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * ")));
assertEquals(3, myResourceIndexedSearchParamTokenDao.countForResourceId(patientId));
// HFJ_RES_VER
@@ -1869,7 +1875,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
// Date param
runInTransaction(() -> {
ourLog.info("Date indexes:\n * {}", myResourceIndexedSearchParamDateDao.findAll().stream().map(t -> t.toString()).collect(Collectors.joining("\n * ")));
ourLog.info("Date indexes:\n * {}", myResourceIndexedSearchParamDateDao.findAll().stream().map(ResourceIndexedSearchParamDate::toString).collect(Collectors.joining("\n * ")));
});
addReadPartition(1);
myCaptureQueriesListener.clear();
@@ -2522,14 +2528,15 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testSearch_UniqueParam_SearchAllPartitions() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType id = createPatient(withPartition(1), withBirthdate("2020-01-01"));
IIdType id = createPatient(withPartition(1), withBirthdate("2020-01-01"), withFamily("FAM"));
addReadAllPartitions();
myCaptureQueriesListener.clear();
SearchParameterMap map = new SearchParameterMap();
map.add(Patient.SP_FAMILY, new StringParam("FAM"));
map.add(Patient.SP_BIRTHDATE, new DateParam("2020-01-01"));
map.setLoadSynchronous(true);
IBundleProvider results = myPatientDao.search(map, mySrd);
@@ -2539,20 +2546,21 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, true);
ourLog.info("Search SQL:\n{}", searchSql);
assertEquals(0, StringUtils.countMatches(searchSql, "PARTITION_ID"));
assertEquals(1, StringUtils.countMatches(searchSql, "IDX_STRING = 'Patient?birthdate=2020-01-01'"));
assertThat(searchSql).doesNotContain("PARTITION_ID");
assertThat(searchSql).containsOnlyOnce("IDX_STRING = 'Patient?birthdate=2020-01-01&family=FAM'");
}
@Test
public void testSearch_UniqueParam_SearchOnePartition() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType id = createPatient(withPartition(1), withBirthdate("2020-01-01"));
IIdType id = createPatient(withPartition(1), withBirthdate("2020-01-01"), withFamily("FAM"));
addReadPartition(1);
myCaptureQueriesListener.clear();
SearchParameterMap map = new SearchParameterMap();
map.add(Patient.SP_FAMILY, new StringParam("FAM"));
map.add(Patient.SP_BIRTHDATE, new DateParam("2020-01-01"));
map.setLoadSynchronous(true);
IBundleProvider results = myPatientDao.search(map, mySrd);
@@ -2562,8 +2570,8 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, true);
ourLog.info("Search SQL:\n{}", searchSql);
assertEquals(1, StringUtils.countMatches(searchSql, "PARTITION_ID"));
assertEquals(1, StringUtils.countMatches(searchSql, "IDX_STRING = 'Patient?birthdate=2020-01-01'"));
assertThat(searchSql).containsOnlyOnce( "PARTITION_ID = '1'");
assertThat(searchSql).containsOnlyOnce("IDX_STRING = 'Patient?birthdate=2020-01-01&family=FAM'");
// Same query, different partition
addReadPartition(2);
@@ -2578,9 +2586,69 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
}
@ParameterizedTest
@ValueSource(strings = {"ALL", "ONE", "MANY"})
public void testSearch_NonUniqueComboParam(String theReadPartitions) {
createNonUniqueComboSp();
IIdType orgId = createOrganization(withId("A"), withPartition(myPartitionId), withName("My Org")).toUnqualifiedVersionless();
IIdType orgId2 = createOrganization(withId("B"), withPartition(myPartitionId2), withName("My Org")).toUnqualifiedVersionless();
// Matching
IIdType patientId = createPatient(withPartition(myPartitionId), withFamily("FAMILY"), withOrganization(orgId));
// Non matching
createPatient(withPartition(myPartitionId), withFamily("WRONG"), withOrganization(orgId));
createPatient(withPartition(myPartitionId2), withFamily("FAMILY"), withOrganization(orgId2));
logAllNonUniqueIndexes();
switch (theReadPartitions) {
case "ALL":
addReadAllPartitions();
break;
case "ONE":
addReadPartition(myPartitionId);
break;
case "MANY":
addReadPartition(myPartitionId, myPartitionId4);
break;
default:
throw new IllegalStateException();
}
myCaptureQueriesListener.clear();
SearchParameterMap map = new SearchParameterMap();
map.add(Patient.SP_FAMILY, new StringParam("FAMILY"));
map.add(Patient.SP_ORGANIZATION, new ReferenceParam(orgId));
map.setLoadSynchronous(true);
IBundleProvider results = myPatientDao.search(map, mySrd);
List<IIdType> ids = toUnqualifiedVersionlessIds(results);
myCaptureQueriesListener.logSelectQueriesForCurrentThread();
assertThat(ids).containsExactly(patientId);
ourLog.info("Search SQL:\n{}", myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, true));
String searchSql = myCaptureQueriesListener.getSelectQueriesForCurrentThread().get(0).getSql(true, false);
switch (theReadPartitions) {
case "ALL":
assertThat(searchSql).doesNotContain("t0.PARTITION_ID");
break;
case "ONE":
assertThat(searchSql).contains("t0.PARTITION_ID = '1'");
break;
case "MANY":
assertThat(searchSql).contains("t0.PARTITION_ID IN ('1','4')");
break;
default:
throw new IllegalStateException();
}
assertThat(searchSql).containsOnlyOnce("t0.HASH_COMPLETE = '-2879121558074554863'");
}
@Test
public void testSearch_RefParam_TargetPid_SearchOnePartition() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType patientId = createPatient(withPartition(myPartitionId), withBirthdate("2020-01-01"));
IIdType observationId = createObservation(withPartition(myPartitionId), withSubject(patientId));
@@ -2617,7 +2685,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testSearch_RefParam_TargetPid_SearchDefaultPartition() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType patientId = createPatient(withPartition(null), withBirthdate("2020-01-01"));
IIdType observationId = createObservation(withPartition(null), withSubject(patientId));
@@ -2654,7 +2722,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testSearch_RefParam_TargetForcedId_SearchOnePartition() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType patientId = createPatient(withPartition(myPartitionId), withId("ONE"), withBirthdate("2020-01-01"));
IIdType observationId = createObservation(withPartition(myPartitionId), withSubject(patientId));
@@ -2724,7 +2792,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
@Test
public void testSearch_RefParam_TargetForcedId_SearchDefaultPartition() {
createUniqueCompositeSp();
createUniqueComboSp();
IIdType patientId = createPatient(withPartition(null), withId("ONE"), withBirthdate("2020-01-01"));
IIdType observationId = createObservation(withPartition(null), withSubject(patientId));
@@ -2890,7 +2958,7 @@ public class PartitioningSqlR4Test extends BasePartitioningR4Test {
Bundle output = mySystemDao.transaction(requestDetails, input);
myCaptureQueriesListener.logSelectQueries();
assertEquals(18, myCaptureQueriesListener.countSelectQueriesForCurrentThread());
assertEquals(17, myCaptureQueriesListener.countSelectQueriesForCurrentThread());
assertEquals(6189, myCaptureQueriesListener.countInsertQueriesForCurrentThread());
assertEquals(418, myCaptureQueriesListener.countUpdateQueriesForCurrentThread());
assertEquals(0, myCaptureQueriesListener.countDeleteQueriesForCurrentThread());


@@ -15,6 +15,8 @@ import ca.uhn.fhir.jpa.api.dao.ReindexParameters;
import ca.uhn.fhir.jpa.api.model.DaoMethodOutcome;
import ca.uhn.fhir.jpa.batch.models.Batch2JobStartResponse;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboTokenNonUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceTable;
import ca.uhn.fhir.jpa.searchparam.SearchParameterMap;
import ca.uhn.fhir.jpa.test.BaseJpaR4Test;
@@ -44,6 +46,7 @@ import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertNull;
@SuppressWarnings("SqlDialectInspection")
public class ReindexJobTest extends BaseJpaR4Test {
@Autowired
@@ -65,7 +68,6 @@ public class ReindexJobTest extends BaseJpaR4Test {
@AfterEach
public void after() {
myInterceptorRegistry.unregisterAllAnonymousInterceptors();
myStorageSettings.setInlineResourceTextBelowSize(new JpaStorageSettings().getInlineResourceTextBelowSize());
myStorageSettings.setStoreMetaSourceInformation(new JpaStorageSettings().getStoreMetaSourceInformation());
myStorageSettings.setPreserveRequestIdInResourceBody(new JpaStorageSettings().isPreserveRequestIdInResourceBody());
}
@@ -98,8 +100,6 @@ public class ReindexJobTest extends BaseJpaR4Test {
}
});
myStorageSettings.setInlineResourceTextBelowSize(10000);
// execute
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
@@ -108,7 +108,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
.setOptimizeStorage(ReindexParameters.OptimizeStorageModeEnum.CURRENT_VERSION)
.setReindexSearchParameters(ReindexParameters.ReindexSearchParametersEnum.NONE)
);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(startResponse);
// validate
@@ -165,7 +165,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
.setOptimizeStorage(ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS)
.setReindexSearchParameters(ReindexParameters.ReindexSearchParametersEnum.NONE)
);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(startResponse);
// validate
@@ -224,7 +224,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
.setOptimizeStorage(ReindexParameters.OptimizeStorageModeEnum.ALL_VERSIONS)
.setReindexSearchParameters(ReindexParameters.ReindexSearchParametersEnum.NONE)
);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(startResponse);
// validate
@@ -251,8 +251,6 @@ public class ReindexJobTest extends BaseJpaR4Test {
myPatientDao.delete(nextId, mySrd);
}
myStorageSettings.setInlineResourceTextBelowSize(10000);
// execute
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
@@ -299,7 +297,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(parameters);
Batch2JobStartResponse res = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse res = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(res);
// validate
@@ -338,7 +336,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(parameters);
Batch2JobStartResponse res = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse res = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(res);
// then
@@ -369,7 +367,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(new ReindexJobParameters());
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
myBatch2JobHelper.awaitJobCompletion(startResponse);
// validate
@@ -380,8 +378,8 @@ public class ReindexJobTest extends BaseJpaR4Test {
@Test
public void testReindex_DuplicateResourceBeforeEnforceUniqueShouldSaveWarning() {
myReindexTestHelper.createObservationWithCode();
myReindexTestHelper.createObservationWithCode();
myReindexTestHelper.createObservationWithStatusAndCode();
myReindexTestHelper.createObservationWithStatusAndCode();
DaoMethodOutcome searchParameter = myReindexTestHelper.createUniqueCodeSearchParameter();
@@ -396,6 +394,73 @@ public class ReindexJobTest extends BaseJpaR4Test {
assertThat(myJob.getWarningMessages()).contains("Failed to reindex resource because unique search parameter " + searchParameter.getEntity().getIdDt().toVersionless().toString());
}
/**
* This test will fail and can be deleted if we make the hash columns on
* the unique index table non-nullable.
*/
@Test
public void testReindex_ComboUnique_HashesShouldBePopulated() {
myReindexTestHelper.createUniqueCodeSearchParameter();
myReindexTestHelper.createObservationWithStatusAndCode();
logAllUniqueIndexes();
// Clear hashes
runInTransaction(()->{
assertEquals(1, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMP_STRING_UNIQ SET HASH_COMPLETE = null WHERE HASH_COMPLETE IS NOT NULL").executeUpdate());
assertEquals(0, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMP_STRING_UNIQ SET HASH_COMPLETE = null WHERE HASH_COMPLETE IS NOT NULL").executeUpdate());
assertEquals(1, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMP_STRING_UNIQ SET HASH_COMPLETE_2 = null WHERE HASH_COMPLETE_2 IS NOT NULL").executeUpdate());
assertEquals(0, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMP_STRING_UNIQ SET HASH_COMPLETE_2 = null WHERE HASH_COMPLETE_2 IS NOT NULL").executeUpdate());
});
// Run a reindex
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(new ReindexJobParameters());
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(new SystemRequestDetails(), startRequest);
JobInstance myJob = myBatch2JobHelper.awaitJobCompletion(startResponse.getInstanceId(), 999);
assertEquals(StatusEnum.COMPLETED, myJob.getStatus());
// Verify that hashes are repopulated
runInTransaction(()->{
List<ResourceIndexedComboStringUnique> indexes = myResourceIndexedComboStringUniqueDao.findAll();
assertEquals(1, indexes.size());
assertThat(indexes.get(0).getHashComplete()).isNotNull().isNotZero();
assertThat(indexes.get(0).getHashComplete2()).isNotNull().isNotZero();
});
}
/**
* This test will fail and can be deleted if we make the hash columns on
* the unique index table non-nullable.
*/
@Test
public void testReindex_ComboNonUnique_HashesShouldBePopulated() {
myReindexTestHelper.createNonUniqueStatusAndCodeSearchParameter();
myReindexTestHelper.createObservationWithStatusAndCode();
logAllNonUniqueIndexes();
// Set hash wrong
runInTransaction(()->{
assertEquals(1, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMB_TOK_NU SET HASH_COMPLETE = 0 WHERE HASH_COMPLETE != 0").executeUpdate());
assertEquals(0, myEntityManager.createNativeQuery("UPDATE HFJ_IDX_CMB_TOK_NU SET HASH_COMPLETE = 0 WHERE HASH_COMPLETE != 0").executeUpdate());
});
// Run a reindex
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(new ReindexJobParameters());
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(new SystemRequestDetails(), startRequest);
JobInstance myJob = myBatch2JobHelper.awaitJobCompletion(startResponse.getInstanceId(), 999);
assertEquals(StatusEnum.COMPLETED, myJob.getStatus());
// Verify that hashes are repopulated
runInTransaction(()->{
List<ResourceIndexedComboTokenNonUnique> indexes = myResourceIndexedComboTokensNonUniqueDao.findAll();
assertEquals(1, indexes.size());
assertEquals(-4763890811650597657L, indexes.get(0).getHashComplete());
});
}
@Test
public void testReindex_ExceptionThrownDuringWrite() {
// setup
@@ -415,7 +480,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
JobInstanceStartRequest startRequest = new JobInstanceStartRequest();
startRequest.setJobDefinitionId(ReindexAppCtx.JOB_REINDEX);
startRequest.setParameters(new ReindexJobParameters());
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(startRequest);
Batch2JobStartResponse startResponse = myJobCoordinator.startInstance(mySrd, startRequest);
JobInstance outcome = myBatch2JobHelper.awaitJobCompletion(startResponse);
// Verify
@@ -461,7 +526,7 @@ public class ReindexJobTest extends BaseJpaR4Test {
myStorageSettings.setMarkResourcesForReindexingUponSearchParameterChange(true);
// create an Observation resource and SearchParameter for it to trigger re-indexing
myReindexTestHelper.createObservationWithCode();
myReindexTestHelper.createObservationWithStatusAndCode();
myReindexTestHelper.createCodeSearchParameter();
// check that reindex job was created


@@ -11,11 +11,13 @@ import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.gclient.StringClientParam;
import ca.uhn.fhir.rest.param.TokenParam;
import ca.uhn.fhir.rest.server.util.ISearchParamRegistry;
import ca.uhn.fhir.util.BundleBuilder;
import ca.uhn.fhir.util.BundleUtil;
import org.hl7.fhir.instance.model.api.IBaseBundle;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.Bundle;
import org.hl7.fhir.r4.model.CodeableConcept;
import org.hl7.fhir.r4.model.Coding;
import org.hl7.fhir.r4.model.Enumerations;
@@ -124,6 +126,25 @@ public class ReindexTestHelper {
return daoMethodOutcome;
}
public void createNonUniqueStatusAndCodeSearchParameter() {
createCodeSearchParameter();
createStatusSearchParameter();
SearchParameter uniqueCodeSp = new SearchParameter();
uniqueCodeSp.setId("SearchParameter/nonunique-status-code");
uniqueCodeSp.addExtension(new Extension().setUrl("http://hapifhir.io/fhir/StructureDefinition/sp-unique").setValue(new BooleanType(false)));
uniqueCodeSp.setStatus(Enumerations.PublicationStatus.ACTIVE);
uniqueCodeSp.setCode("observation-status-and-code");
uniqueCodeSp.addBase("Observation");
uniqueCodeSp.setType(Enumerations.SearchParamType.COMPOSITE);
uniqueCodeSp.setExpression("Observation");
uniqueCodeSp.addComponent(new SearchParameter.SearchParameterComponentComponent().setDefinition("SearchParameter/clinical-code").setExpression("Observation"));
uniqueCodeSp.addComponent(new SearchParameter.SearchParameterComponentComponent().setDefinition("SearchParameter/clinical-status").setExpression("Observation"));
mySearchParameterDao.update(uniqueCodeSp);
mySearchParamRegistry.forceRefresh();
}
public DaoMethodOutcome createCodeSearchParameter() {
SearchParameter codeSp = new SearchParameter();
codeSp.setId("SearchParameter/clinical-code");
@@ -138,6 +159,20 @@ public class ReindexTestHelper {
return daoMethodOutcome;
}
public DaoMethodOutcome createStatusSearchParameter() {
SearchParameter codeSp = new SearchParameter();
codeSp.setId("SearchParameter/clinical-status");
codeSp.setStatus(Enumerations.PublicationStatus.ACTIVE);
codeSp.setCode("status");
codeSp.addBase("Observation");
codeSp.setType(Enumerations.SearchParamType.TOKEN);
codeSp.setExpression("Observation.status");
DaoMethodOutcome daoMethodOutcome = mySearchParameterDao.update(codeSp);
mySearchParamRegistry.forceRefresh();
return daoMethodOutcome;
}
public IIdType createObservationWithAlleleExtension(Observation.ObservationStatus theStatus) {
Observation observation = buildObservationWithAlleleExtension(theStatus);
return myObservationDao.create(observation).getId();
@@ -151,13 +186,14 @@ public class ReindexTestHelper {
return observation;
}
public IIdType createObservationWithCode() {
Observation observation = buildObservationWithCode();
public IIdType createObservationWithStatusAndCode() {
Observation observation = buildObservationWithStatusAndCode();
return myObservationDao.create(observation).getId();
}
public Observation buildObservationWithCode() {
public Observation buildObservationWithStatusAndCode() {
Observation observation = new Observation();
observation.setStatus(Observation.ObservationStatus.FINAL);
CodeableConcept codeableConcept = new CodeableConcept();
codeableConcept.addCoding(new Coding().setCode("29463-7").setSystem("http://loinc.org").setDisplay("Body Weight"));
observation.setCode(codeableConcept);
@@ -206,4 +242,28 @@ public class ReindexTestHelper {
.execute();
return BundleUtil.toListOfResourceIds(myFhirContext, result);
}
/**
* Creates a transaction bundle with 20 Observations which will create rows for indexes
* created by {@link #createNonUniqueStatusAndCodeSearchParameter()} and
* {@link #createUniqueCodeSearchParameter()}.
*/
public Bundle createTransactionBundleWith20Observation(boolean theUseClientAssignedIds) {
BundleBuilder bb = new BundleBuilder(myFhirContext);
for (int i = 0; i < 20; i++) {
Observation observation = new Observation();
if (theUseClientAssignedIds) {
observation.setId("OBS" + i);
}
observation.addIdentifier().setSystem("http://foo").setValue("ident" + i);
observation.setStatus(Observation.ObservationStatus.FINAL);
observation.getCode().addCoding().setSystem("http://foo").setCode("" + i);
if (theUseClientAssignedIds) {
bb.addTransactionUpdateEntry(observation);
} else {
bb.addTransactionCreateEntry(observation);
}
}
return bb.getBundleTyped();
}
}


@ -246,14 +246,12 @@ public class GiantTransactionPerfTest {
myDaoSearchParamSynchronizer = new DaoSearchParamSynchronizer();
myDaoSearchParamSynchronizer.setEntityManager(myEntityManager);
myDaoSearchParamSynchronizer.setStorageSettings(myStorageSettings);
mySearchParamWithInlineReferencesExtractor = new SearchParamWithInlineReferencesExtractor();
mySearchParamWithInlineReferencesExtractor.setStorageSettings(myStorageSettings);
mySearchParamWithInlineReferencesExtractor.setContext(ourFhirContext);
mySearchParamWithInlineReferencesExtractor.setPartitionSettings(this.myPartitionSettings);
mySearchParamWithInlineReferencesExtractor.setSearchParamExtractorService(mySearchParamExtractorSvc);
mySearchParamWithInlineReferencesExtractor.setSearchParamRegistry(mySearchParamRegistry);
mySearchParamWithInlineReferencesExtractor.setDaoSearchParamSynchronizer(myDaoSearchParamSynchronizer);
myEobDao = new JpaResourceDao<>();
myEobDao.setContext(ourFhirContext);


@ -13,16 +13,18 @@ import ca.uhn.fhir.util.HapiExtensions;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r5.model.BooleanType;
import org.hl7.fhir.r5.model.Enumerations;
import org.hl7.fhir.r5.model.SearchParameter;
import org.hl7.fhir.r5.model.IdType;
import org.hl7.fhir.r5.model.Patient;
import org.hl7.fhir.r5.model.Reference;
import org.hl7.fhir.r5.model.SearchParameter;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
public class DuplicateIndexR5Test extends BaseJpaR5Test {
public static final String SEARCH_PARAMETER_PATIENT_NAMES_AND_GENDER = "SearchParameter/patient-names-and-gender";
@Test
public void testDuplicateTokensClearedOnUpdate() {
// Setup
@ -168,7 +170,7 @@ public class DuplicateIndexR5Test extends BaseJpaR5Test {
dupe0.setResource(param.getResource());
dupe0.setHashComplete(param.getHashComplete());
dupe0.setIndexString(param.getIndexString());
dupe0.setSearchParameterId(param.getSearchParameterId());
dupe0.setSearchParameterId(new IdType(SEARCH_PARAMETER_PATIENT_NAMES_AND_GENDER));
dupe0.calculateHashes();
myResourceIndexedComboTokensNonUniqueDao.save(dupe0);
@ -178,7 +180,7 @@ public class DuplicateIndexR5Test extends BaseJpaR5Test {
dupe1.setResource(param.getResource());
dupe1.setHashComplete(param.getHashComplete());
dupe1.setIndexString(param.getIndexString());
dupe1.setSearchParameterId(param.getSearchParameterId());
dupe1.setSearchParameterId(new IdType(SEARCH_PARAMETER_PATIENT_NAMES_AND_GENDER));
dupe1.calculateHashes();
myResourceIndexedComboTokensNonUniqueDao.save(dupe1);
});
@ -289,7 +291,7 @@ public class DuplicateIndexR5Test extends BaseJpaR5Test {
mySearchParameterDao.update(sp, mySrd);
sp = new SearchParameter();
sp.setId("SearchParameter/patient-names-and-gender");
sp.setId(SEARCH_PARAMETER_PATIENT_NAMES_AND_GENDER);
sp.setType(Enumerations.SearchParamType.COMPOSITE);
sp.setStatus(Enumerations.PublicationStatus.ACTIVE);
sp.addBase(Enumerations.VersionIndependentResourceTypesAll.PATIENT);


@ -50,7 +50,6 @@ import ca.uhn.fhir.jpa.dao.data.IPartitionDao;
import ca.uhn.fhir.jpa.dao.data.IResourceHistoryProvenanceDao;
import ca.uhn.fhir.jpa.dao.data.IResourceHistoryTableDao;
import ca.uhn.fhir.jpa.dao.data.IResourceHistoryTagDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboStringUniqueDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboTokensNonUniqueDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedSearchParamCoordsDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedSearchParamDateDao;
@ -193,7 +192,6 @@ import org.hl7.fhir.r5.utils.validation.IResourceValidator;
import org.hl7.fhir.r5.utils.validation.IValidationPolicyAdvisor;
import org.hl7.fhir.r5.utils.validation.constants.BestPracticeWarningLevel;
import org.hl7.fhir.r5.utils.validation.constants.BindingKind;
import org.hl7.fhir.r5.utils.validation.constants.CodedContentValidationPolicy;
import org.hl7.fhir.r5.utils.validation.constants.ContainedReferenceValidationPolicy;
import org.hl7.fhir.r5.utils.validation.constants.ReferenceValidationPolicy;
import org.hl7.fhir.utilities.validation.ValidationMessage;
@ -277,8 +275,6 @@ public abstract class BaseJpaR4Test extends BaseJpaTest implements ITestDataBuil
@Autowired
protected IResourceIndexedSearchParamDateDao myResourceIndexedSearchParamDateDao;
@Autowired
protected IResourceIndexedComboStringUniqueDao myResourceIndexedCompositeStringUniqueDao;
@Autowired
protected IResourceIndexedComboTokensNonUniqueDao myResourceIndexedComboTokensNonUniqueDao;
@Autowired
@Qualifier("myAllergyIntoleranceDaoR4")


@ -37,6 +37,7 @@ import ca.uhn.fhir.jpa.dao.BaseHapiFhirDao;
import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
import ca.uhn.fhir.jpa.dao.JpaPersistedResourceValidationSupport;
import ca.uhn.fhir.jpa.dao.data.IResourceHistoryTableDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboStringUniqueDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedComboTokensNonUniqueDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedSearchParamCoordsDao;
import ca.uhn.fhir.jpa.dao.data.IResourceIndexedSearchParamDateDao;
@ -61,6 +62,7 @@ import ca.uhn.fhir.jpa.entity.TermValueSet;
import ca.uhn.fhir.jpa.entity.TermValueSetConcept;
import ca.uhn.fhir.jpa.entity.TermValueSetConceptDesignation;
import ca.uhn.fhir.jpa.model.entity.ResourceHistoryTable;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboStringUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedComboTokenNonUnique;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamCoords;
import ca.uhn.fhir.jpa.model.entity.ResourceIndexedSearchParamDate;
@ -232,6 +234,8 @@ public abstract class BaseJpaTest extends BaseTest {
protected IResourceIndexedSearchParamCoordsDao myResourceIndexedSearchParamCoordsDao;
@Autowired
protected IResourceIndexedComboTokensNonUniqueDao myResourceIndexedComboTokensNonUniqueDao;
@Autowired
protected IResourceIndexedComboStringUniqueDao myResourceIndexedComboStringUniqueDao;
@Autowired(required = false)
protected IFulltextSearchSvc myFulltestSearchSvc;
@Autowired(required = false)
@ -535,6 +539,12 @@ public abstract class BaseJpaTest extends BaseTest {
});
}
protected void logAllUniqueIndexes() {
runInTransaction(() -> {
ourLog.info("Unique indexes:\n * {}", myResourceIndexedComboStringUniqueDao.findAll().stream().map(ResourceIndexedComboStringUnique::toString).collect(Collectors.joining("\n * ")));
});
}
protected void logAllTokenIndexes() {
runInTransaction(() -> {
ourLog.info("Token indexes:\n * {}", myResourceIndexedSearchParamTokenDao.findAll().stream().map(ResourceIndexedSearchParamToken::toString).collect(Collectors.joining("\n * ")));


@ -1,13 +1,184 @@
package ca.uhn.fhir.jpa.migrate.tasks;
import ca.uhn.fhir.jpa.migrate.DriverTypeEnum;
import ca.uhn.fhir.jpa.migrate.HapiMigrator;
import ca.uhn.fhir.jpa.migrate.MigrationResult;
import ca.uhn.fhir.jpa.migrate.MigrationTaskList;
import ca.uhn.fhir.jpa.migrate.taskdef.InitializeSchemaTask;
import ca.uhn.fhir.util.VersionEnum;
import jakarta.annotation.Nonnull;
import org.apache.commons.dbcp2.BasicDataSource;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.core.ColumnMapRowMapper;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.support.AbstractLobCreatingPreparedStatementCallback;
import org.springframework.jdbc.support.lob.DefaultLobHandler;
import org.springframework.jdbc.support.lob.LobCreator;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.sql.Types;
import java.time.Duration;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import static org.assertj.core.api.AssertionsForClassTypes.assertThat;
import static org.junit.jupiter.api.Assertions.assertEquals;
public class HapiFhirJpaMigrationTasksTest {
private static final Logger ourLog = LoggerFactory.getLogger(HapiFhirJpaMigrationTasksTest.class);
private static final String MIGRATION_TABLE_NAME = "HFJ_FLY_MIGRATOR";
private final BasicDataSource myDataSource = newDataSource();
private final JdbcTemplate myJdbcTemplate = new JdbcTemplate(myDataSource);
@Test
public void testCreate() {
new HapiFhirJpaMigrationTasks(Collections.emptySet());
}
/**
* Verify migration task 20240617.4 which creates hashes on the unique combo
* search param table if they aren't already present. Hash columns were only
* added in 7.4.0 so this backfills them.
*/
@Test
public void testCreateUniqueComboParamHashes() {
/*
* Setup
*/
// Create migrator and initialize schema using a static version
// of the schema from the 7.2.0 release
HapiFhirJpaMigrationTasks tasks = new HapiFhirJpaMigrationTasks(Set.of());
HapiMigrator migrator = new HapiMigrator(MIGRATION_TABLE_NAME, myDataSource, DriverTypeEnum.H2_EMBEDDED);
migrator.addTask(new InitializeSchemaTask("7.2.0", "20180115.0",
new SchemaInitializationProvider(
"HAPI FHIR", "/jpa_h2_schema_720", "HFJ_RESOURCE", true)));
migrator.createMigrationTableIfRequired();
migrator.migrate();
// Run a second time to run the 7.4.0 migrations
MigrationTaskList allTasks = tasks.getAllTasks(VersionEnum.V7_3_0, VersionEnum.V7_4_0);
migrator.addTasks(allTasks);
migrator.migrate();
// Create a unique index row with no hashes populated
insertRow_ResourceTable();
insertRow_ResourceIndexedComboStringUnique();
/*
* Execute
*/
// Remove the task we're testing from the migrator history, so it runs again
assertEquals(1, myJdbcTemplate.update("DELETE FROM " + MIGRATION_TABLE_NAME + " WHERE version = ?", "7.4.0.20240625.40"));
// Run the migrator
ourLog.info("About to run the migrator a second time");
MigrationResult migrationResult = migrator.migrate();
assertEquals(1, migrationResult.succeededTasks.size());
assertEquals(0, migrationResult.failedTasks.size());
/*
* Verify
*/
List<Map<String, Object>> rows = myJdbcTemplate.query("SELECT * FROM HFJ_IDX_CMP_STRING_UNIQ", new ColumnMapRowMapper());
assertEquals(1, rows.size());
Map<String, Object> row = rows.get(0);
assertThat(row.get("HASH_COMPLETE")).as(row::toString).isEqualTo(-5443017569618195896L);
assertThat(row.get("HASH_COMPLETE_2")).as(row::toString).isEqualTo(-1513800680307323438L);
}
private void insertRow_ResourceIndexedComboStringUnique() {
myJdbcTemplate.execute(
"""
insert into
HFJ_IDX_CMP_STRING_UNIQ (
PID,
RES_ID,
IDX_STRING)
values (1, 1, 'Patient?foo=bar')
""");
}
private void insertRow_ResourceTable() {
myJdbcTemplate.execute(
"""
insert into
HFJ_RESOURCE (
RES_DELETED_AT,
RES_VERSION,
FHIR_ID,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
RES_LANGUAGE,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
SP_QUANTITY_NRML_PRESENT,
RES_TYPE,
RES_VER,
RES_ID)
values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
new AbstractLobCreatingPreparedStatementCallback(new DefaultLobHandler()) {
@Override
protected void setValues(@Nonnull PreparedStatement thePs, @Nonnull LobCreator theLobCreator) throws SQLException {
int i = 1;
thePs.setNull(i++, Types.TIMESTAMP);
thePs.setString(i++, "R4");
thePs.setString(i++, "ABC"); // FHIR_ID
thePs.setBoolean(i++, false);
thePs.setTimestamp(i++, new Timestamp(System.currentTimeMillis()));
thePs.setTimestamp(i++, new Timestamp(System.currentTimeMillis()));
thePs.setBoolean(i++, false);
thePs.setNull(i++, Types.VARCHAR);
thePs.setLong(i++, 1L);
thePs.setNull(i++, Types.VARCHAR);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false);
thePs.setBoolean(i++, false); // SP_QUANTITY_NRML_PRESENT
thePs.setString(i++, "Patient");
thePs.setLong(i++, 1L);
thePs.setLong(i, 1L); // RES_ID
}
});
}
static BasicDataSource newDataSource() {
BasicDataSource retVal = new BasicDataSource();
retVal.setDriver(new org.h2.Driver());
retVal.setUrl("jdbc:h2:mem:test_migration-" + UUID.randomUUID() + ";CASE_INSENSITIVE_IDENTIFIERS=TRUE;");
retVal.setMaxWait(Duration.ofMillis(30000));
retVal.setUsername("");
retVal.setPassword("");
retVal.setMaxTotal(5);
return retVal;
}
}
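The migration test above verifies the exact HASH_COMPLETE / HASH_COMPLETE_2 values produced by HAPI's own hashing utilities. As a hedged sketch of the general backfill idea only, deriving deterministic, salted long hashes from the stored IDX_STRING, the following standalone example uses SHA-256 truncated to 8 bytes purely for illustration; HAPI's actual algorithm and the expected values in the test are different:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ComboHashSketch {
    // Illustration only: derives a long from a salt plus the index string.
    // The real HASH_COMPLETE / HASH_COMPLETE_2 columns are populated by
    // HAPI's own hashing code, not by SHA-256 as shown here.
    static long hashOf(String salt, String indexString) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest((salt + "|" + indexString).getBytes(StandardCharsets.UTF_8));
            long h = 0;
            // Fold the first 8 digest bytes into a long
            for (int i = 0; i < 8; i++) {
                h = (h << 8) | (digest[i] & 0xFF);
            }
            return h;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        String indexString = "Patient?foo=bar";
        // Deterministic: re-running a backfill over the same row yields the same hash
        System.out.println("HASH_COMPLETE   = " + hashOf("1", indexString));
        // A second, differently-salted hash gives an independent column value
        System.out.println("HASH_COMPLETE_2 = " + hashOf("2", indexString));
    }
}
```

The two salts stand in for however the real implementation distinguishes the two hash columns; the point is only that both values are pure functions of the stored index string, which is what makes an idempotent backfill migration possible.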

File diff suppressed because it is too large


@ -118,7 +118,7 @@ public abstract class BaseColumnCalculatorTask extends BaseTableColumnTask {
try {
next.get();
} catch (Exception e) {
throw new SQLException(Msg.code(69) + e);
throw new SQLException(Msg.code(69) + e, e);
}
}
}
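The hunk above changes the rethrow to pass the caught exception as the cause rather than only folding it into the message string. A small standalone illustration of why that matters (the "HAPI-0069: " prefix here just stands in for HAPI's Msg.code; it is not the real call):

```java
import java.sql.SQLException;

public class CauseChainDemo {
    public static void main(String[] args) {
        Exception root = new IllegalStateException("worker thread failed");

        // Before the fix: the cause is only string-concatenated, so the
        // original stack trace is unreachable from the thrown exception
        SQLException lossy = new SQLException("HAPI-0069: " + root);

        // After the fix: passing the cause preserves the full exception
        // chain for logs and debuggers
        SQLException chained = new SQLException("HAPI-0069: " + root, root);

        System.out.println(lossy.getCause());   // null
        System.out.println(chained.getCause()); // java.lang.IllegalStateException: worker thread failed
    }
}
```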
@ -168,8 +168,9 @@ public abstract class BaseColumnCalculatorTask extends BaseTableColumnTask {
rejectedExecutionHandler);
}
public void setPidColumnName(String thePidColumnName) {
public BaseColumnCalculatorTask setPidColumnName(String thePidColumnName) {
myPidColumnName = thePidColumnName;
return this;
}
private Future<?> updateRows(List<Map<String, Object>> theRows) {


@ -1,3 +1,22 @@
/*-
* #%L
* HAPI FHIR Server - SQL Migration
* %%
* Copyright (C) 2014 - 2024 Smile CDR, Inc.
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
package ca.uhn.fhir.jpa.migrate.tasks.api;
/**


@ -973,7 +973,7 @@
<error_prone_core_version>2.23.0</error_prone_core_version>
<mockito_version>5.8.0</mockito_version>
<nullaway_version>0.7.9</nullaway_version>
<guava_version>32.1.1-jre</guava_version>
<guava_version>33.2.1-jre</guava_version>
<gson_version>2.8.9</gson_version>
<jaxb_bundle_version>2.2.11_1</jaxb_bundle_version>
<jaxb_api_version>2.3.1</jaxb_api_version>