Automated migration test data (#5218)

* version bump

* Bump to core release 6.0.22 (#5028)

* Bump to core release 6.0.16

* Bump to core version 6.0.20

* Fix errors thrown as a result of VersionSpecificWorkerContextWrapper

* Bump to core 6.0.22

* Resolve 5126: HFJ_RES_VER_PROV might cause migration error on DB that automatically indexes the primary key (#5127)

* dropped old index FK_RESVERPROV_RES_PID on RES_PID column before adding IDX_RESVERPROV_RES_PID
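
For context, the reordering amounts to the following hedged sketch of HAPI's migration-task DSL (the builder methods exist in HapiFhirJpaMigrationTasks; the version constant and task IDs are placeholders, not the ones actually shipped):

// Sketch only, inside HapiFhirJpaMigrationTasks. Some databases create an
// index automatically for the FK_RESVERPROV_RES_PID foreign key, so it must
// be dropped before IDX_RESVERPROV_RES_PID is added, or the add collides.
Builder version = forVersion(VersionEnum.V6_7_0); // placeholder version

version.onTable("HFJ_RES_VER_PROV")
        .dropIndex("20230726.1", "FK_RESVERPROV_RES_PID"); // placeholder task ID

version.onTable("HFJ_RES_VER_PROV")
        .addIndex("20230726.2", "IDX_RESVERPROV_RES_PID") // placeholder task ID
        .unique(false)
        .withColumns("RES_PID");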

* added changelog

* changed to a valid version number

* changed to a valid version number; changelog entries need to be ordered by version number

* 5123 - Use DEFAULT partition for server-based requests if none specified (#5124)

5123 - Use DEFAULT partition for server-based requests if none specified
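
A minimal sketch of the idea, assuming a custom partition-identification interceptor (Pointcut.STORAGE_PARTITION_IDENTIFY_READ and RequestPartitionId are real HAPI APIs; this interceptor itself is illustrative, not the shipped change):

import ca.uhn.fhir.interceptor.api.Hook;
import ca.uhn.fhir.interceptor.api.Interceptor;
import ca.uhn.fhir.interceptor.api.Pointcut;
import ca.uhn.fhir.interceptor.model.RequestPartitionId;
import ca.uhn.fhir.rest.api.server.RequestDetails;

@Interceptor
public class DefaultPartitionFallbackInterceptor {

    // Illustrative: a server-originated request carries no tenant, so fall
    // back to the DEFAULT partition instead of rejecting the request.
    @Hook(Pointcut.STORAGE_PARTITION_IDENTIFY_READ)
    public RequestPartitionId identifyForRead(RequestDetails theRequestDetails) {
        if (theRequestDetails == null || theRequestDetails.getTenantId() == null) {
            return RequestPartitionId.defaultPartition();
        }
        return RequestPartitionId.fromPartitionName(theRequestDetails.getTenantId());
    }
}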

* consent remove all suppresses next link in bundle (#5119)

* added FIXME with source of issue

* added FIXME with root cause

* added FIXME with root cause

* Providing solution to the issue and removing fixmes.

* Providing changelog

* auto-formatting.

* Adding new test.

* Adding a new test for standard paging

* let's try this and see if it works...?

* fix tests

* cleanup to trigger a new run

* fixing tests

---------

Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* 5117 MDM Score for No Match Fields Should Not Be Included in Total Score (#5118)

* fix, test, changelog

* fix, test, changelog

---------

Co-authored-by: justindar <justin.dar@smilecdr.com>

* wip run 5.2.0 migrations and then add test data

* Fix h2

* Fix postgres

* Complete until v5.4

* add sql migration scripts

* Temp

* Add v6_0_0

* Add v6_8_0

* Final changes

* clean up

* _source search parameter needs to support modifiers (#5095)

_source search parameter needs to support modifiers - added support for :contains, :missing, and :above modifiers
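
To make the new modifiers concrete, a hedged client-side illustration (byUrl and returnBundle are real IGenericClient calls; the base URL and values are made up):

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.Bundle;

public class SourceModifierSearches {
    public static void main(String[] args) {
        IGenericClient client = FhirContext.forR4()
                .newRestfulGenericClient("http://localhost:8080/fhir"); // made-up URL

        // :contains - substring match on Resource.meta.source
        Bundle contains = client.search()
                .byUrl("Patient?_source:contains=acme")
                .returnBundle(Bundle.class)
                .execute();

        // :missing - resources with no meta.source populated
        Bundle missing = client.search()
                .byUrl("Patient?_source:missing=true")
                .returnBundle(Bundle.class)
                .execute();

        // :above - hierarchical URI matching against meta.source
        Bundle above = client.search()
                .byUrl("Patient?_source:above=http://example.com/fhir/a/b")
                .returnBundle(Bundle.class)
                .execute();
    }
}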

* Fix HFQL docs (#5151)

* Expunge operation on CodeSystem may throw 500 internal error with a precondition-failed message. (#5156)

* Initial failing test.

* Solution with changelog.

* fixing format.

* Addressing comment from code review.

* fixing failing test.

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* documentation update (#5154)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* Fix hsql jdbc driver deps (#5168)

Avoid non-included classes in jdbc driver dependencies.

* $delete-expunge over 10k resources will now delete all resources (#5144)

* First commit with very rough fix and unit test.

* Refinements to ResourceIdListStep and Batch2DaoSvcImpl. Make LoadIdsStepTest pass. Enhance Batch2DaoSvcImplTest.

* Spotless

* Fix checkstyle errors.

* Fix test failures.

* Minor refactoring.  New unit test.  Finalize changelist.

* Spotless fix.

* Delete now useless code from unit test.

* Delete more useless code.

* Test pre-commit hook

* More spotless fixes.

* Address most code review feedback.

* Remove use of pageSize parameter and see if this breaks the pipeline.

* Remove use of pageSize parameter and see if this breaks the pipeline.

* Fix the noUrl case by passing an unlimited Pageable instead. Effectively stop using page size for most databases.

* Deprecate the old method and have it call the new one by default.

* updating documentation (#5170)

Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>

* _source search parameter modifiers for Subscription matching (#5159)

* _source search parameter modifiers for Subscription matching - test, implementation and changelog

* Complete Oracle

* Remove V6_8_0

* apply mvn: spotless

* Removal of meta tags during updates does not trigger subscription (#5181)

* Initial failing test.

* adding solution;
fixing documentation;

* spotless apply

* adding changelog

* modifying current test

---------

Co-authored-by: peartree <etienne.poirier@smilecdr.com>

* Issue 5173: Gateway $everything doesn't return all patients (#5174)

* Failing test

* Also set offset and count in base DAO override

* Changelog

* Fix for specific case where count has been set in parameters

* spotless

* Improve checks

---------

Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>

* Do not 500 and continue IG ingestion when different IGs try to save different ValueSets with colliding FHIR IDs (#5175)

* First commit with failing unit test and small tweaks.

* Swallow resource version exceptions from colliding ValueSet OIDs and log a descriptive error instead.  Add more detailed unit testing.

* Tweaks to logic and update the changelog.  Reverse all changes to TermReadSvcImpl.

* Revert PackageResourceParsingSvc to release branch baseline.

* Accept code reviewer suggestion to change changelog description.

Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>

---------

Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>

* Update CDR insert tables

* Update CDR insert tables

---------

Co-authored-by: tadgh <garygrantgraham@gmail.com>
Co-authored-by: dotasek <david.otasek@smilecdr.com>
Co-authored-by: TynerGjs <132295567+TynerGjs@users.noreply.github.com>
Co-authored-by: Steve Corbett <137920358+steve-corbett-smilecdr@users.noreply.github.com>
Co-authored-by: Ken Stevens <khstevens@gmail.com>
Co-authored-by: Ken Stevens <ken@smilecdr.com>
Co-authored-by: peartree <etienne.poirier@smilecdr.com>
Co-authored-by: jdar8 <69840459+jdar8@users.noreply.github.com>
Co-authored-by: justindar <justin.dar@smilecdr.com>
Co-authored-by: nathaniel.doef <nathaniel.doef@smilecdr.com>
Co-authored-by: volodymyr-korzh <132366313+volodymyr-korzh@users.noreply.github.com>
Co-authored-by: Nathan Doef <n.doef@protonmail.com>
Co-authored-by: Etienne Poirier <33007955+epeartree@users.noreply.github.com>
Co-authored-by: TipzCM <leif.stawnyczy@gmail.com>
Co-authored-by: leif stawnyczy <leifstawnyczy@leifs-MacBook-Pro.local>
Co-authored-by: michaelabuckley <michaelabuckley@gmail.com>
Co-authored-by: Luke deGruchy <luke.degruchy@smilecdr.com>
Co-authored-by: jmarchionatto <60409882+jmarchionatto@users.noreply.github.com>
Co-authored-by: juan.marchionatto <juan.marchionatto@smilecdr.com>
Author: SouradeepSahaSmile
Committed: 2023-08-23 10:27:02 -04:00 (via GitHub)
Commit: ed616f7a6d
Parent: bbcc6148ae
30 changed files with 1485 additions and 22 deletions

View File

@@ -19,6 +19,7 @@
*/
package ca.uhn.fhir.jpa.embedded;
import ca.uhn.fhir.util.VersionEnum;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -37,10 +38,9 @@ public class DatabaseInitializerHelper {
theDatabase.executeSqlAsBatch(sql);
}
public void insertPersistenceTestData(JpaEmbeddedDatabase theDatabase) {
String fileName = String.format(
"migration/releases/%s/data/%s.sql",
HapiEmbeddedDatabasesExtension.FIRST_TESTED_VERSION, theDatabase.getDriverType());
public void insertPersistenceTestData(JpaEmbeddedDatabase theDatabase, VersionEnum theVersionEnum) {
String fileName =
String.format("migration/releases/%s/data/%s.sql", theVersionEnum, theDatabase.getDriverType());
String sql = getSqlFromResourceFile(fileName);
theDatabase.insertTestData(sql);
}
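
The net effect of this change: the caller now chooses which release's seed data to load, and the resource path is derived from it. A quick illustration of the resulting classpath locations, assuming the enum constants format as their names (which the String.format call above relies on):

import ca.uhn.fhir.jpa.migrate.DriverTypeEnum;
import ca.uhn.fhir.util.VersionEnum;

public class TestDataPathDemo {
    public static void main(String[] args) {
        // Prints "migration/releases/V5_2_0/data/H2_EMBEDDED.sql"
        System.out.println(String.format(
                "migration/releases/%s/data/%s.sql",
                VersionEnum.V5_2_0, DriverTypeEnum.H2_EMBEDDED));
    }
}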

View File

@@ -54,6 +54,17 @@ public class H2EmbeddedDatabase extends JpaEmbeddedDatabase {
deleteDatabaseDirectoryIfExists();
}
private List<String> getAllTableNames() {
List<String> allTableNames = new ArrayList<>();
List<Map<String, Object>> queryResults =
query("SELECT TABLE_NAME FROM information_schema.tables WHERE TABLE_SCHEMA = 'PUBLIC'");
for (Map<String, Object> row : queryResults) {
String tableName = row.get("TABLE_NAME").toString();
allTableNames.add(tableName);
}
return allTableNames;
}
@Override
public void disableConstraints() {
getJdbcTemplate().execute("SET REFERENTIAL_INTEGRITY = FALSE");
@@ -61,7 +72,11 @@ public class H2EmbeddedDatabase extends JpaEmbeddedDatabase {
@Override
public void enableConstraints() {
getJdbcTemplate().execute("SET REFERENTIAL_INTEGRITY = TRUE");
List<String> sql = new ArrayList<>();
for (String tableName : getAllTableNames()) {
sql.add(String.format("ALTER TABLE \"%s\" SET REFERENTIAL_INTEGRITY TRUE CHECK", tableName));
}
executeSqlAsBatch(sql);
}
@Override
@@ -83,10 +98,7 @@ public class H2EmbeddedDatabase extends JpaEmbeddedDatabase {
private void dropTables() {
List<String> sql = new ArrayList<>();
List<Map<String, Object>> tableResult =
query("SELECT TABLE_NAME FROM information_schema.tables WHERE TABLE_SCHEMA = 'PUBLIC'");
for (Map<String, Object> result : tableResult) {
String tableName = result.get("TABLE_NAME").toString();
for (String tableName : getAllTableNames()) {
sql.add(String.format("DROP TABLE %s CASCADE", tableName));
}
executeSqlAsBatch(sql);
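
The per-table "ALTER TABLE ... SET REFERENTIAL_INTEGRITY TRUE CHECK" is the important addition: unlike the plain global re-enable, the CHECK clause makes H2 re-validate rows that were inserted while integrity was off. A standalone sketch of that behaviour (needs only the H2 driver; the schema is made up):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Standalone demo of the H2 behaviour the new code relies on: a violating
// row slips in while integrity is off, the global re-enable does not notice,
// and only the per-table CHECK re-validation fails loudly.
public class H2IntegrityCheckDemo {
    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo");
                Statement s = c.createStatement()) {
            s.execute("CREATE TABLE PARENT (ID INT PRIMARY KEY)");
            s.execute("CREATE TABLE CHILD (ID INT PRIMARY KEY, PARENT_ID INT REFERENCES PARENT(ID))");
            s.execute("SET REFERENTIAL_INTEGRITY = FALSE");
            s.execute("INSERT INTO CHILD VALUES (1, 42)"); // no parent row 42
            s.execute("SET REFERENTIAL_INTEGRITY = TRUE"); // does not re-check old rows
            // Re-validate existing rows table by table; this line throws.
            s.execute("ALTER TABLE \"CHILD\" SET REFERENTIAL_INTEGRITY TRUE CHECK");
        }
    }
}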

View File

@@ -95,8 +95,8 @@ public class HapiEmbeddedDatabasesExtension implements AfterAllCallback {
myDatabaseInitializerHelper.initializePersistenceSchema(getEmbeddedDatabase(theDriverType));
}
public void insertPersistenceTestData(DriverTypeEnum theDriverType) {
myDatabaseInitializerHelper.insertPersistenceTestData(getEmbeddedDatabase(theDriverType));
public void insertPersistenceTestData(DriverTypeEnum theDriverType, VersionEnum theVersionEnum) {
myDatabaseInitializerHelper.insertPersistenceTestData(getEmbeddedDatabase(theDriverType), theVersionEnum);
}
public String getSqlFromResourceFile(String theFileName) {

View File

@@ -55,6 +55,7 @@ public class OracleEmbeddedDatabase extends JpaEmbeddedDatabase {
@Override
public void disableConstraints() {
purgeRecycleBin();
List<String> sql = new ArrayList<>();
List<Map<String, Object>> queryResults =
query("SELECT CONSTRAINT_NAME, TABLE_NAME FROM USER_CONSTRAINTS WHERE CONSTRAINT_TYPE != 'P'");
@@ -68,6 +69,7 @@ public class OracleEmbeddedDatabase extends JpaEmbeddedDatabase {
@Override
public void enableConstraints() {
purgeRecycleBin();
List<String> sql = new ArrayList<>();
List<Map<String, Object>> queryResults =
query("SELECT CONSTRAINT_NAME, TABLE_NAME FROM USER_CONSTRAINTS WHERE CONSTRAINT_TYPE != 'P'");

View File

@@ -63,13 +63,43 @@ public class PostgresEmbeddedDatabase extends JpaEmbeddedDatabase {
executeSqlAsBatch(sql);
}
public void validateConstraints() {
getJdbcTemplate()
.execute(
"""
do $$
declare r record;
BEGIN
FOR r IN (
SELECT FORMAT(
'UPDATE pg_constraint SET convalidated=false WHERE conname = ''%I''; ALTER TABLE %I VALIDATE CONSTRAINT %I;',
tc.constraint_name,
tc.table_name,
tc.constraint_name
) AS x
FROM information_schema.table_constraints AS tc
JOIN information_schema.tables t ON t.table_name = tc.table_name and t.table_type = 'BASE TABLE'
JOIN information_schema.key_column_usage AS kcu ON tc.constraint_name = kcu.constraint_name
JOIN information_schema.constraint_column_usage AS ccu ON ccu.constraint_name = tc.constraint_name
WHERE constraint_type = 'FOREIGN KEY'
AND tc.constraint_schema = 'public'
)
LOOP
EXECUTE (r.x);
END LOOP;
END;
$$;""");
}
@Override
public void enableConstraints() {
List<String> sql = new ArrayList<>();
for (String tableName : getAllTableNames()) {
sql.add(String.format("ALTER TABLE \"%s\" ENABLE TRIGGER ALL", tableName));
}
executeSqlAsBatch(sql);
validateConstraints();
}
@Override
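
Why the extra validateConstraints() pass: "ALTER TABLE ... ENABLE TRIGGER ALL" only restores enforcement for future writes, and rows inserted while triggers were disabled are never re-checked. The DO block forces a re-check by marking each foreign key as not-validated in pg_constraint and then running VALIDATE CONSTRAINT. The single-constraint version of the same trick, as a sketch (table and constraint names are made up; updating pg_constraint directly requires superuser rights):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Per-constraint version of the trick in validateConstraints() above.
public class PgValidateOneConstraint {
    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection(
                        "jdbc:postgresql://localhost:5432/test", "test", "test");
                Statement s = c.createStatement()) {
            // Mark the FK as not yet validated so Postgres will rescan it...
            s.execute("UPDATE pg_constraint SET convalidated = false "
                    + "WHERE conname = 'fk_child_parent'");
            // ...then force a full re-check of all existing rows.
            s.execute("ALTER TABLE child VALIDATE CONSTRAINT fk_child_parent");
        }
    }
}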

View File

@@ -0,0 +1,33 @@
INSERT INTO MPI_LINK (
PID,
CREATED,
EID_MATCH,
TARGET_TYPE,
LINK_SOURCE,
MATCH_RESULT,
NEW_PERSON,
PERSON_PID,
SCORE,
TARGET_PID,
UPDATED,
VECTOR,
VERSION,
GOLDEN_RESOURCE_PID,
RULE_COUNT
) VALUES (
1,
'2023-04-05 15:16:26.43',
1,
'PATIENT',
0,
2,
1,
1906,
NULL,
1905,
'2023-04-05 15:16:26.43',
NULL,
'1',
1906,
1
);

View File

@@ -0,0 +1,33 @@
INSERT INTO MPI_LINK (
PID,
CREATED,
EID_MATCH,
TARGET_TYPE,
LINK_SOURCE,
MATCH_RESULT,
NEW_PERSON,
PERSON_PID,
SCORE,
TARGET_PID,
UPDATED,
VECTOR,
VERSION,
GOLDEN_RESOURCE_PID,
RULE_COUNT
) VALUES (
1,
'2023-04-05 15:16:26.43',
1,
'PATIENT',
0,
2,
1,
1906,
NULL,
1905,
'2023-04-05 15:16:26.43',
NULL,
'1',
1906,
1
);

View File

@@ -0,0 +1,33 @@
INSERT INTO MPI_LINK (
PID,
CREATED,
EID_MATCH,
TARGET_TYPE,
LINK_SOURCE,
MATCH_RESULT,
NEW_PERSON,
PERSON_PID,
SCORE,
TARGET_PID,
UPDATED,
VECTOR,
VERSION,
GOLDEN_RESOURCE_PID,
RULE_COUNT
) VALUES (
1,
SYSDATE,
1,
'PATIENT',
0,
2,
1,
1906,
NULL,
1905,
SYSDATE,
NULL,
'1',
1906,
1
);

View File

@@ -0,0 +1,33 @@
INSERT INTO MPI_LINK (
PID,
CREATED,
EID_MATCH,
TARGET_TYPE,
LINK_SOURCE,
MATCH_RESULT,
NEW_PERSON,
PERSON_PID,
SCORE,
TARGET_PID,
UPDATED,
VECTOR,
VERSION,
GOLDEN_RESOURCE_PID,
RULE_COUNT
) VALUES (
1,
'2023-04-05 15:16:26.43',
true,
'PATIENT',
0,
2,
true,
1906,
NULL,
1905,
'2023-04-05 15:16:26.43',
NULL,
'1',
1906,
1
);
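
The four MPI_LINK seed files above differ only in their literals: two dialects take literal timestamp strings, the presumably-Oracle variant uses SYSDATE, and the boolean-typed columns are 0/1 in some files and true/false in another. The files are kept dialect-specific on purpose, since they replay what each database actually stores, but for comparison a parameterized insert lets the JDBC driver absorb those differences (illustrative only):

import java.sql.Timestamp;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

// Illustrative: the same MPI_LINK row written with bound parameters instead
// of dialect-specific literals.
public class MpiLinkSeedRow {
    public void insert(DataSource theDataSource) {
        new JdbcTemplate(theDataSource).update(
                "INSERT INTO MPI_LINK (PID, CREATED, EID_MATCH, TARGET_TYPE, LINK_SOURCE, "
                        + "MATCH_RESULT, NEW_PERSON, PERSON_PID, SCORE, TARGET_PID, UPDATED, "
                        + "VECTOR, VERSION, GOLDEN_RESOURCE_PID, RULE_COUNT) "
                        + "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
                1L,
                Timestamp.valueOf("2023-04-05 15:16:26.43"),
                true,      // EID_MATCH
                "PATIENT", // TARGET_TYPE
                0,         // LINK_SOURCE
                2,         // MATCH_RESULT
                true,      // NEW_PERSON
                1906L,     // PERSON_PID
                null,      // SCORE
                1905L,     // TARGET_PID
                Timestamp.valueOf("2023-04-05 15:16:26.43"),
                null,      // VECTOR
                "1",       // VERSION
                1906L,     // GOLDEN_RESOURCE_PID
                1);        // RULE_COUNT
    }
}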

View File

@@ -0,0 +1,68 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1702,
'R4',
0,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
0,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_SPIDX_QUANTITY_NRML (
RES_ID,
RES_TYPE,
SP_UPDATED,
SP_MISSING,
SP_NAME, SP_ID,
SP_SYSTEM,
SP_UNITS,
HASH_IDENTITY_AND_UNITS,
HASH_IDENTITY_SYS_UNITS,
HASH_IDENTITY,
SP_VALUE
) VALUES (
1702,
'Observation',
'2023-04-05 15:16:26.43',
0, 'value-quantity',
2,
'https://unitsofmeasure.org',
'g',
-864931808150710347,
6382255012744790145,
-1901136387361512731,
0.012
);

View File

@@ -0,0 +1,68 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1702,
'R4',
0,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
0,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_SPIDX_QUANTITY_NRML (
RES_ID,
RES_TYPE,
SP_UPDATED,
SP_MISSING,
SP_NAME, SP_ID,
SP_SYSTEM,
SP_UNITS,
HASH_IDENTITY_AND_UNITS,
HASH_IDENTITY_SYS_UNITS,
HASH_IDENTITY,
SP_VALUE
) VALUES (
1702,
'Observation',
'2023-04-05 15:16:26.43',
0, 'value-quantity',
2,
'https://unitsofmeasure.org',
'g',
-864931808150710347,
6382255012744790145,
-1901136387361512731,
0.012
);

View File

@@ -0,0 +1,69 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1702,
'R4',
0,
SYSDATE,
SYSDATE,
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
0,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_SPIDX_QUANTITY_NRML (
RES_ID,
RES_TYPE,
SP_UPDATED,
SP_MISSING,
SP_NAME, SP_ID,
SP_SYSTEM,
SP_UNITS,
HASH_IDENTITY_AND_UNITS,
HASH_IDENTITY_SYS_UNITS,
HASH_IDENTITY,
SP_VALUE
) VALUES (
1702,
'Observation',
SYSDATE,
0,
'value-quantity',
2,
'https://unitsofmeasure.org',
'g',
-864931808150710347,
6382255012744790145,
-1901136387361512731,
0.012
);

View File

@@ -0,0 +1,69 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1702,
'R4',
false,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
false,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
false,
false,
false,
false,
true,
false,
false,
true,
true,
'Observation',
1
);
INSERT INTO HFJ_SPIDX_QUANTITY_NRML (
RES_ID,
RES_TYPE,
SP_UPDATED,
SP_MISSING,
SP_NAME,
SP_ID,
SP_SYSTEM,
SP_UNITS,
HASH_IDENTITY_AND_UNITS,
HASH_IDENTITY_SYS_UNITS,
HASH_IDENTITY,
SP_VALUE
) VALUES (
1702,
'Observation',
'2023-04-05 15:16:26.43',
false,
'value-quantity',
2,
'https://unitsofmeasure.org',
'g',
-864931808150710347,
6382255012744790145,
-1901136387361512731,
0.012
);
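
HFJ_SPIDX_QUANTITY_NRML is the normalized-quantity index: values are stored converted to their UCUM canonical unit, which would explain a stored value of 0.012 'g' for an original Observation value of, say, 12 'mg'. A toy illustration of that normalization arithmetic (HAPI uses a real UCUM library for this; the factor table here is made up):

import java.math.BigDecimal;
import java.util.Map;

// Toy illustration of UCUM-style canonicalization as stored in
// HFJ_SPIDX_QUANTITY_NRML. Not HAPI's implementation.
public class QuantityNormalizationDemo {
    private static final Map<String, BigDecimal> TO_GRAMS = Map.of(
            "mg", new BigDecimal("0.001"),
            "g", BigDecimal.ONE,
            "kg", new BigDecimal("1000"));

    public static void main(String[] args) {
        BigDecimal value = new BigDecimal("12");
        String unit = "mg";
        // Prints "0.012 g", the canonical form that gets indexed.
        System.out.println(value.multiply(TO_GRAMS.get(unit)) + " g");
    }
}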

View File

@@ -0,0 +1,37 @@
INSERT INTO HFJ_BLK_IMPORT_JOB (
PID,
JOB_ID,
JOB_STATUS,
STATUS_TIME,
STATUS_MESSAGE,
JOB_DESC,
OPTLOCK,
FILE_COUNT,
ROW_PROCESSING_MODE,
BATCH_SIZE
) VALUES (
60,
'87145395-f9be-4a7b-abb3-6d41b6caf185',
'READY',
'2023-06-23 13:07:58.442',
'',
'ETL Import Job: (unnamed)',
2,
2,
'FHIR_TRANSACTION',
100
);
INSERT INTO HFJ_BLK_IMPORT_JOBFILE (
PID,
JOB_PID,
JOB_CONTENTS,
FILE_SEQ,
TENANT_NAME
) VALUES (
64,
60,
72995,
0,
''
);

View File

@@ -0,0 +1,37 @@
INSERT INTO HFJ_BLK_IMPORT_JOB (
PID,
JOB_ID,
JOB_STATUS,
STATUS_TIME,
STATUS_MESSAGE,
JOB_DESC,
OPTLOCK,
FILE_COUNT,
ROW_PROCESSING_MODE,
BATCH_SIZE
) VALUES (
60,
'87145395-f9be-4a7b-abb3-6d41b6caf185',
'READY',
'2023-06-23 13:07:58.442',
'',
'ETL Import Job: (unnamed)',
2,
2,
'FHIR_TRANSACTION',
100
);
INSERT INTO HFJ_BLK_IMPORT_JOBFILE (
PID,
JOB_PID,
JOB_CONTENTS,
FILE_SEQ,
TENANT_NAME
) VALUES (
64,
60,
72995,
0,
''
);

View File

@@ -0,0 +1,37 @@
INSERT INTO HFJ_BLK_IMPORT_JOB (
PID,
JOB_ID,
JOB_STATUS,
STATUS_TIME,
STATUS_MESSAGE,
JOB_DESC,
OPTLOCK,
FILE_COUNT,
ROW_PROCESSING_MODE,
BATCH_SIZE
) VALUES (
60,
'87145395-f9be-4a7b-abb3-6d41b6caf185',
'READY',
SYSDATE,
'',
'ETL Import Job: (unnamed)',
2,
2,
'FHIR_TRANSACTION',
100
);
INSERT INTO HFJ_BLK_IMPORT_JOBFILE (
PID,
JOB_PID,
JOB_CONTENTS,
FILE_SEQ,
TENANT_NAME
) VALUES (
64,
60,
HEXTORAW('453d7a34'),
0,
''
);

View File

@@ -0,0 +1,37 @@
INSERT INTO HFJ_BLK_IMPORT_JOB (
PID,
JOB_ID,
JOB_STATUS,
STATUS_TIME,
STATUS_MESSAGE,
JOB_DESC,
OPTLOCK,
FILE_COUNT,
ROW_PROCESSING_MODE,
BATCH_SIZE
) VALUES (
60,
'87145395-f9be-4a7b-abb3-6d41b6caf185',
'READY',
'2023-06-23 13:07:58.442',
'',
'ETL Import Job: (unnamed)',
2,
2,
'FHIR_TRANSACTION',
100
);
INSERT INTO HFJ_BLK_IMPORT_JOBFILE (
PID,
JOB_PID,
JOB_CONTENTS,
FILE_SEQ,
TENANT_NAME
) VALUES (
64,
60,
72995,
0,
''
);
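
JOB_CONTENTS is a binary column, which is why its literal differs per dialect above: a bare numeric identifier in some variants, HEXTORAW('453d7a34') in the SYSDATE (presumably Oracle) one. For comparison, binding the bytes through JDBC is dialect-neutral (illustrative only):

import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.sql.DataSource;

// Illustrative: write the binary JOB_CONTENTS with setBytes() instead of a
// per-dialect literal.
public class BlkImportJobFileSeed {
    public void insert(DataSource theDataSource, byte[] theContents) throws Exception {
        try (Connection c = theDataSource.getConnection();
                PreparedStatement ps = c.prepareStatement(
                        "INSERT INTO HFJ_BLK_IMPORT_JOBFILE "
                                + "(PID, JOB_PID, JOB_CONTENTS, FILE_SEQ, TENANT_NAME) "
                                + "VALUES (?, ?, ?, ?, ?)")) {
            ps.setLong(1, 64);
            ps.setLong(2, 60);
            ps.setBytes(3, theContents);
            ps.setInt(4, 0);
            ps.setString(5, "");
            ps.executeUpdate();
        }
    }
}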

View File

@@ -0,0 +1,55 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
)
VALUES (
1656,
'R4',
0,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_IDX_CMB_TOK_NU (
PID,
HASH_COMPLETE,
IDX_STRING,
RES_ID
) VALUES (
10,
'5570851350247697202',
'Patient?birthdate=1974-12-25&family=WINDSOR&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale',
1656
);

View File

@@ -0,0 +1,55 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
)
VALUES (
1653,
'R4',
'false',
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
'false',
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
'false',
'false',
'false',
'false',
'true',
'false',
'false',
'true',
'true',
'Observation',
1
);
INSERT INTO HFJ_IDX_CMB_TOK_NU (
PID,
HASH_COMPLETE,
IDX_STRING,
RES_ID
) VALUES (
10,
'5570851350247697202',
'Patient?birthdate=1974-12-25&family=WINDSOR&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale',
1653
);

View File

@@ -0,0 +1,55 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
)
VALUES (
1653,
'R4',
0,
SYSDATE,
SYSDATE,
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_IDX_CMB_TOK_NU (
PID,
HASH_COMPLETE,
IDX_STRING,
RES_ID
) VALUES (
10,
'5570851350247697202',
'Patient?birthdate=1974-12-25&family=WINDSOR&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale',
1653
);

View File

@@ -0,0 +1,55 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
)
VALUES (
1653,
'R4',
FALSE,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
FALSE,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
FALSE,
FALSE,
FALSE,
FALSE,
TRUE,
FALSE,
FALSE,
TRUE,
TRUE,
'Observation',
1
);
INSERT INTO HFJ_IDX_CMB_TOK_NU (
PID,
HASH_COMPLETE,
IDX_STRING,
RES_ID
) VALUES (
10,
'5570851350247697202',
'Patient?birthdate=1974-12-25&family=WINDSOR&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale',
1653
);
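
The IDX_STRING values above are canonical combo-search strings: the resource type plus the component parameters sorted by name, with percent-encoded values; HASH_COMPLETE is a hash of that string. A hedged sketch of building such a canonical string (it reproduces the stored format, but it is not HAPI's implementation):

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Rebuild the canonical combo-index string seen in IDX_STRING above:
// params sorted by name, values URL-encoded. Sketch only.
public class ComboIndexStringDemo {
    public static void main(String[] args) {
        Map<String, String> params = new TreeMap<>(Map.of(
                "birthdate", "1974-12-25",
                "family", "WINDSOR",
                "gender", "http://hl7.org/fhir/administrative-gender|male"));
        String idxString = "Patient?" + params.entrySet().stream()
                .map(e -> e.getKey() + "=" + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
        // Patient?birthdate=1974-12-25&family=WINDSOR&gender=http%3A%2F%2Fhl7.org%2Ffhir%2Fadministrative-gender%7Cmale
        System.out.println(idxString);
    }
}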

View File

@@ -0,0 +1,63 @@
INSERT INTO BT2_JOB_INSTANCE (
ID,
JOB_CANCELLED,
CMB_RECS_PROCESSED,
CMB_RECS_PER_SEC,
CREATE_TIME,
CUR_GATED_STEP_ID,
DEFINITION_ID,
DEFINITION_VER,
END_TIME,
ERROR_COUNT,
EST_REMAINING,
PARAMS_JSON,
PROGRESS_PCT,
START_TIME,
STAT,
WORK_CHUNKS_PURGED
) VALUES (
'00161699-bcfe-428e-9ca2-caceb9645f8a',
0,
0,
0,
'2023-07-06 14:24:10.845',
'WriteBundleForImportStep',
'bulkImportJob',
1,
'2023-07-06 14:25:11.098',
0,
'0ms',
'{"jobId":"42bfa0dd-ab7b-4991-8284-e4b2902c696b","batchSize":100}',
1,
'2023-07-06 14:24:10.875',
'COMPLETED',
1
);
INSERT INTO BT2_WORK_CHUNK (
ID,
CREATE_TIME,
END_TIME,
ERROR_COUNT,
INSTANCE_ID,
DEFINITION_ID,
DEFINITION_VER,
RECORDS_PROCESSED,
SEQ,
START_TIME,
STAT,
TGT_STEP_ID
) VALUES (
'01d26875-8d1a-4e37-b554-62a3219f009b',
'2023-07-06 15:20:20.797',
'2023-07-06 15:21:11.142',
0,
'00161699-bcfe-428e-9ca2-caceb9645f8a',
'bulkImportJob',
1,
0,
0,
'2023-07-06 15:21:11.14',
'COMPLETED',
'ReadInResourcesFromFileStep'
);

View File

@@ -0,0 +1,63 @@
INSERT INTO BT2_JOB_INSTANCE (
ID,
JOB_CANCELLED,
CMB_RECS_PROCESSED,
CMB_RECS_PER_SEC,
CREATE_TIME,
CUR_GATED_STEP_ID,
DEFINITION_ID,
DEFINITION_VER,
END_TIME,
ERROR_COUNT,
EST_REMAINING,
PARAMS_JSON,
PROGRESS_PCT,
START_TIME,
STAT,
WORK_CHUNKS_PURGED
) VALUES (
'00161699-bcfe-428e-9ca2-caceb9645f8a',
0,
0,
0,
'2023-07-06 14:24:10.845',
'WriteBundleForImportStep',
'bulkImportJob',
1,
'2023-07-06 14:25:11.098',
0,
'0ms',
'{"jobId":"42bfa0dd-ab7b-4991-8284-e4b2902c696b","batchSize":100}',
1,
'2023-07-06 14:24:10.875',
'COMPLETED',
1
);
INSERT INTO BT2_WORK_CHUNK (
ID,
CREATE_TIME,
END_TIME,
ERROR_COUNT,
INSTANCE_ID,
DEFINITION_ID,
DEFINITION_VER,
RECORDS_PROCESSED,
SEQ,
START_TIME,
STAT,
TGT_STEP_ID
) VALUES (
'01d26875-8d1a-4e37-b554-62a3219f009b',
'2023-07-06 15:20:20.797',
'2023-07-06 15:21:11.142',
0,
'00161699-bcfe-428e-9ca2-caceb9645f8a',
'bulkImportJob',
1,
0,
0,
'2023-07-06 15:21:11.14',
'COMPLETED',
'ReadInResourcesFromFileStep'
);

View File

@@ -0,0 +1,63 @@
INSERT INTO BT2_JOB_INSTANCE (
ID,
JOB_CANCELLED,
CMB_RECS_PROCESSED,
CMB_RECS_PER_SEC,
CREATE_TIME,
CUR_GATED_STEP_ID,
DEFINITION_ID,
DEFINITION_VER,
END_TIME,
ERROR_COUNT,
EST_REMAINING,
PARAMS_JSON,
PROGRESS_PCT,
START_TIME,
STAT,
WORK_CHUNKS_PURGED
) VALUES (
'00161699-bcfe-428e-9ca2-caceb9645f8a',
0,
0,
0,
SYSDATE,
'WriteBundleForImportStep',
'bulkImportJob',
1,
SYSDATE,
0,
'0ms',
'{"jobId":"42bfa0dd-ab7b-4991-8284-e4b2902c696b","batchSize":100}',
1,
SYSDATE,
'COMPLETED',
1
);
INSERT INTO BT2_WORK_CHUNK (
ID,
CREATE_TIME,
END_TIME,
ERROR_COUNT,
INSTANCE_ID,
DEFINITION_ID,
DEFINITION_VER,
RECORDS_PROCESSED,
SEQ,
START_TIME,
STAT,
TGT_STEP_ID
) VALUES (
'01d26875-8d1a-4e37-b554-62a3219f009b',
SYSDATE,
SYSDATE,
0,
'00161699-bcfe-428e-9ca2-caceb9645f8a',
'bulkImportJob',
1,
0,
0,
SYSDATE,
'COMPLETED',
'ReadInResourcesFromFileStep'
);

View File

@@ -0,0 +1,63 @@
INSERT INTO BT2_JOB_INSTANCE (
ID,
JOB_CANCELLED,
CMB_RECS_PROCESSED,
CMB_RECS_PER_SEC,
CREATE_TIME,
CUR_GATED_STEP_ID,
DEFINITION_ID,
DEFINITION_VER,
END_TIME,
ERROR_COUNT,
EST_REMAINING,
PARAMS_JSON,
PROGRESS_PCT,
START_TIME,
STAT,
WORK_CHUNKS_PURGED
) VALUES (
'00161699-bcfe-428e-9ca2-caceb9645f8a',
false,
0,
0,
'2023-07-06 14:24:10.845',
'WriteBundleForImportStep',
'bulkImportJob',
1,
'2023-07-06 14:25:11.098',
0,
'0ms',
'{"jobId":"42bfa0dd-ab7b-4991-8284-e4b2902c696b","batchSize":100}',
1,
'2023-07-06 14:24:10.875',
'COMPLETED',
true
);
INSERT INTO BT2_WORK_CHUNK (
ID,
CREATE_TIME,
END_TIME,
ERROR_COUNT,
INSTANCE_ID,
DEFINITION_ID,
DEFINITION_VER,
RECORDS_PROCESSED,
SEQ,
START_TIME,
STAT,
TGT_STEP_ID
) VALUES (
'01d26875-8d1a-4e37-b554-62a3219f009b',
'2023-07-06 15:20:20.797',
'2023-07-06 15:21:11.142',
0,
'00161699-bcfe-428e-9ca2-caceb9645f8a',
'bulkImportJob',
1,
0,
0,
'2023-07-06 15:21:11.14',
'COMPLETED',
'ReadInResourcesFromFileStep'
);
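
PARAMS_JSON carries the Batch2 job parameters as plain JSON, so the seeded value can be read back with any JSON parser. A quick sketch using Jackson (the JSON shape is taken straight from the rows above; this is not a Batch2 API):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Parse the PARAMS_JSON value from the seed rows above.
public class ParamsJsonDemo {
    public static void main(String[] args) throws Exception {
        String paramsJson =
                "{\"jobId\":\"42bfa0dd-ab7b-4991-8284-e4b2902c696b\",\"batchSize\":100}";
        JsonNode node = new ObjectMapper().readTree(paramsJson);
        System.out.println(node.get("jobId").asText());    // 42bfa0dd-...
        System.out.println(node.get("batchSize").asInt()); // 100
    }
}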

View File

@@ -0,0 +1,92 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1678,
'R4',
0,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_RES_SEARCH_URL (
RES_SEARCH_URL,
CREATED_TIME,
RES_ID
) VALUES (
'https://example.com',
'2023-06-29 10:14:39.69',
1678
);
INSERT INTO HFJ_REVINFO (
REV
) VALUES (
1
);
INSERT INTO MPI_LINK_AUD (
PID,
REV,
REVTYPE,
PERSON_PID,
GOLDEN_RESOURCE_PID,
TARGET_TYPE,
RULE_COUNT,
TARGET_PID,
MATCH_RESULT,
LINK_SOURCE,
VERSION,
EID_MATCH,
NEW_PERSON,
SCORE
) VALUES (
1,
1,
0,
1358,
1358,
'PATIENT',
0,
1357,
2,
0,
1,
0,
1,
1
);

View File

@@ -0,0 +1,92 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1678,
'R4',
0,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_RES_SEARCH_URL (
RES_SEARCH_URL,
CREATED_TIME,
RES_ID
) VALUES (
'https://example.com',
'2023-06-29 10:14:39.69',
1678
);
INSERT INTO HFJ_REVINFO (
REV
) VALUES (
1
);
INSERT INTO MPI_LINK_AUD (
PID,
REV,
REVTYPE,
PERSON_PID,
GOLDEN_RESOURCE_PID,
TARGET_TYPE,
RULE_COUNT,
TARGET_PID,
MATCH_RESULT,
LINK_SOURCE,
VERSION,
EID_MATCH,
NEW_PERSON,
SCORE
) VALUES (
1,
1,
0,
1358,
1358,
'PATIENT',
0,
1357,
2,
0,
1,
0,
1,
1
);

View File

@@ -0,0 +1,92 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1678,
'R4',
0,
SYSDATE,
SYSDATE,
0,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
0,
0,
0,
0,
1,
0,
0,
1,
1,
'Observation',
1
);
INSERT INTO HFJ_RES_SEARCH_URL (
RES_SEARCH_URL,
CREATED_TIME,
RES_ID
) VALUES (
'https://example.com',
SYSDATE,
1678
);
INSERT INTO HFJ_REVINFO (
REV
) VALUES (
1
);
INSERT INTO MPI_LINK_AUD (
PID,
REV,
REVTYPE,
PERSON_PID,
GOLDEN_RESOURCE_PID,
TARGET_TYPE,
RULE_COUNT,
TARGET_PID,
MATCH_RESULT,
LINK_SOURCE,
VERSION,
EID_MATCH,
NEW_PERSON,
SCORE
) VALUES (
1,
1,
0,
1358,
1358,
'PATIENT',
0,
1357,
2,
0,
1,
0,
1,
1
);

View File

@@ -0,0 +1,92 @@
INSERT INTO HFJ_RESOURCE (
RES_ID,
RES_VERSION,
HAS_TAGS,
RES_PUBLISHED,
RES_UPDATED,
SP_HAS_LINKS,
HASH_SHA256,
SP_INDEX_STATUS,
SP_CMPSTR_UNIQ_PRESENT,
SP_COORDS_PRESENT,
SP_DATE_PRESENT,
SP_NUMBER_PRESENT,
SP_QUANTITY_NRML_PRESENT,
SP_QUANTITY_PRESENT,
SP_STRING_PRESENT,
SP_TOKEN_PRESENT,
SP_URI_PRESENT,
RES_TYPE,
RES_VER
) VALUES (
1678,
'R4',
false,
'2023-06-15 09:58:42.92',
'2023-06-15 09:58:42.92',
false,
'6beed652b77f6c65d776e57341a0b5b0596ac9cfb0e8345a5a5cfbfaa59e2b62',
1,
false,
false,
false,
false,
true,
false,
false,
true,
true,
'Observation',
1
);
INSERT INTO HFJ_RES_SEARCH_URL (
RES_SEARCH_URL,
CREATED_TIME,
RES_ID
) VALUES (
'https://example.com',
'2023-06-29 10:14:39.69',
1678
);
INSERT INTO HFJ_REVINFO (
REV
) VALUES (
1
);
INSERT INTO MPI_LINK_AUD (
PID,
REV,
REVTYPE,
PERSON_PID,
GOLDEN_RESOURCE_PID,
TARGET_TYPE,
RULE_COUNT,
TARGET_PID,
MATCH_RESULT,
LINK_SOURCE,
VERSION,
EID_MATCH,
NEW_PERSON,
SCORE
) VALUES (
1,
1,
0,
1358,
1358,
'PATIENT',
0,
1357,
2,
0,
1,
false,
true,
1
);
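
MPI_LINK_AUD and HFJ_REVINFO follow the Hibernate Envers convention: each audited change writes a revision row, and the _AUD row points at it through REV. A small consistency check over the seeded rows (plain SQL via JdbcTemplate; assumes only the tables above):

import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

// Sanity check for the Envers-style seed data: every MPI_LINK_AUD row must
// reference an existing HFJ_REVINFO revision through its REV column.
public class AuditSeedCheck {
    public static int countOrphanAuditRows(DataSource theDataSource) {
        return new JdbcTemplate(theDataSource).queryForObject(
                "SELECT COUNT(*) FROM MPI_LINK_AUD aud "
                        + "LEFT JOIN HFJ_REVINFO rev ON rev.REV = aud.REV "
                        + "WHERE rev.REV IS NULL",
                Integer.class);
    }
}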

View File

@@ -22,6 +22,7 @@ import javax.sql.DataSource;
import java.sql.SQLException;
import java.util.Collections;
import java.util.Properties;
import java.util.Set;
import static ca.uhn.fhir.jpa.embedded.HapiEmbeddedDatabasesExtension.FIRST_TESTED_VERSION;
import static ca.uhn.fhir.jpa.migrate.SchemaMigrator.HAPI_FHIR_MIGRATION_TABLENAME;
@@ -61,26 +62,35 @@ public class HapiSchemaMigrationTest {
ourLog.info("Running hapi fhir migration tasks for {}", theDriverType);
myEmbeddedServersExtension.initializePersistenceSchema(theDriverType);
myEmbeddedServersExtension.insertPersistenceTestData(theDriverType);
myEmbeddedServersExtension.insertPersistenceTestData(theDriverType, FIRST_TESTED_VERSION);
JpaEmbeddedDatabase database = myEmbeddedServersExtension.getEmbeddedDatabase(theDriverType);
DataSource dataSource = database.getDataSource();
HapiMigrationDao hapiMigrationDao = new HapiMigrationDao(dataSource, theDriverType, HAPI_FHIR_MIGRATION_TABLENAME);
HapiMigrationStorageSvc hapiMigrationStorageSvc = new HapiMigrationStorageSvc(hapiMigrationDao);
VersionEnum[] allVersions = VersionEnum.values();
VersionEnum[] allVersions = VersionEnum.values();
int fromVersion = FIRST_TESTED_VERSION.ordinal() - 1;
Set<VersionEnum> dataVersions = Set.of(
VersionEnum.V5_2_0,
VersionEnum.V5_3_0,
VersionEnum.V5_4_0,
VersionEnum.V5_5_0,
VersionEnum.V6_0_0,
VersionEnum.V6_6_0
);
int fromVersion = 0;
VersionEnum from = allVersions[fromVersion];
VersionEnum toVersion;
int lastVersion = allVersions.length - 1;
VersionEnum to = allVersions[lastVersion];
MigrationTaskList migrationTasks = new HapiFhirJpaMigrationTasks(Collections.emptySet()).getTaskList(from, to);
SchemaMigrator schemaMigrator = new SchemaMigrator(TEST_SCHEMA_NAME, HAPI_FHIR_MIGRATION_TABLENAME, dataSource, new Properties(), migrationTasks, hapiMigrationStorageSvc);
schemaMigrator.setDriverType(theDriverType);
schemaMigrator.createMigrationTableIfRequired();
schemaMigrator.migrate();
for (int i = 0; i < allVersions.length; i++) {
toVersion = allVersions[i];
migrate(theDriverType, dataSource, hapiMigrationStorageSvc, toVersion);
if (dataVersions.contains(toVersion)) {
myEmbeddedServersExtension.insertPersistenceTestData(theDriverType, toVersion);
}
}
if (theDriverType == DriverTypeEnum.POSTGRES_9_4) {
// we only run this for postgres because:
@@ -93,6 +103,21 @@
}
}
private static void migrate(DriverTypeEnum theDriverType, DataSource dataSource, HapiMigrationStorageSvc hapiMigrationStorageSvc, VersionEnum from, VersionEnum to) throws SQLException {
MigrationTaskList migrationTasks = new HapiFhirJpaMigrationTasks(Collections.emptySet()).getTaskList(from, to);
SchemaMigrator schemaMigrator = new SchemaMigrator(TEST_SCHEMA_NAME, HAPI_FHIR_MIGRATION_TABLENAME, dataSource, new Properties(), migrationTasks, hapiMigrationStorageSvc);
schemaMigrator.setDriverType(theDriverType);
schemaMigrator.createMigrationTableIfRequired();
schemaMigrator.migrate();
}
private static void migrate(DriverTypeEnum theDriverType, DataSource dataSource, HapiMigrationStorageSvc hapiMigrationStorageSvc, VersionEnum to) throws SQLException {
MigrationTaskList migrationTasks = new HapiFhirJpaMigrationTasks(Collections.emptySet()).getAllTasks(new VersionEnum[]{to});
SchemaMigrator schemaMigrator = new SchemaMigrator(TEST_SCHEMA_NAME, HAPI_FHIR_MIGRATION_TABLENAME, dataSource, new Properties(), migrationTasks, hapiMigrationStorageSvc);
schemaMigrator.setDriverType(theDriverType);
schemaMigrator.createMigrationTableIfRequired();
schemaMigrator.migrate();
}
@Test
public void testCreateMigrationTableIfRequired() throws SQLException {
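
With this in place the test migrates one release at a time and replays the version-appropriate seed data between steps. Extending it for a future release would plausibly mean one new entry in dataVersions plus matching SQL files (a hypothetical sketch; V7_0_0 and its data files do not exist in this commit):

// Hypothetical future extension of the dataVersions set above, paired with
// new files under migration/releases/V7_0_0/data/<DriverType>.sql:
Set<VersionEnum> dataVersions = Set.of(
        VersionEnum.V5_2_0,
        VersionEnum.V5_3_0,
        VersionEnum.V5_4_0,
        VersionEnum.V5_5_0,
        VersionEnum.V6_0_0,
        VersionEnum.V6_6_0,
        VersionEnum.V7_0_0); // hypothetical new data version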