Merge branch 'loinc_loader_update' of github.com:jamesagnew/hapi-fhir into loinc_loader_update

jamesagnew 2018-03-29 06:04:36 -04:00
commit 3cd3fdcb79
142 changed files with 11937 additions and 1133 deletions


@@ -8,31 +8,57 @@ TODO:
Comments for Loinc:
Overall
- ValueSet and ConceptMap resources have a spot for copyright and contact information. Are there official values for these?
Answer Lists
- Per the notes, there is no way in FHIR currently to map answer lists to codes based on context. For this reason, I am ignoring any entries in LoincAnswerListLink_Beta_1.csv where the "ApplicableContext" context is not empty. Is this correct?
Parts
- Only parts with a status of "ACTIVE" are being imported; any others are ignored.
- The PartTypeName (e.g. "ADJUSTMENT") is ignored as there is no corresponding property in loinc.xml
- PartDisplayName is not mapped
- Part links are not currently processed (it's not clear to me how to model these in FHIR, as CodeSystem.hierarchyMeaning has to be only one of 'is-a' or 'part-of', and presumably the 'is-a' relationship is more important).
Part Mappings
- I have made LOINC the source and SCT the target for the mappings in the ConceptMap resource. Does this seem like the appropriate orientation?
- A canonical URI should be defined for the LOINC->SCT mapping ConceptMap resource. I have hardcoded "http://loinc.org/loinc-to-snomed" for now, but we should discuss what is appropriate.
RSNA Playbook
- A canonical URI should be defined for the "all RSNA playbook codes" ValueSet. I have hardcoded "http://loinc.org/rsna-codes" for now, but we should discuss what is appropriate.
- A name for the "RSNA Playbook" ValueSet is needed.
- Just to confirm, the "all RSNA playbook codes" ValueSet should contain the LOINC codes (e.g. "17787-3") and not the part codes (e.g. "LP199995-4")?
- A code system URI for RadLex RID and RPID codes is needed (currently "http://rid" and "http://rpid" are used as placeholders, since I'm assuming these exist somewhere).
- For mappings from LOINC part codes to RadLex RIDs, are the codes considered equivalent (or would they be wider/narrower)? They look equivalent to me.
Document Ontology
- Per the SOW, "A value set containing terms in the LOINC Document Ontology will be created". Just to confirm, entries in this ValueSet are therefore LOINC terms (such as "11488-4 / Consultation Note") as opposed to part codes?
- Need to define a URI for the document ontology ValueSet. Currently I am using "http://loinc.org/document-ontology-codes"
Top 2000
- Need to define a URI for both ValueSets. Currently I am using "http://loinc.org/top-2000-lab-results-us" and "http://loinc.org/top-2000-lab-results-si"
Universal Order Set
- Need to define a URI for this ValueSet - Currently using "http://loinc.org/fhir/loinc-universal-order-set"
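For reference alongside the ConceptMap and ValueSet questions above, the following is a minimal, illustrative sketch (not part of the files changed in this commit) of how a LOINC-to-SNOMED CT mapping and an "all RSNA playbook codes" ValueSet could be assembled with the HAPI FHIR DSTU3 model classes. The URIs are the hardcoded placeholders mentioned above, and the codes are examples only.

import org.hl7.fhir.dstu3.model.ConceptMap;
import org.hl7.fhir.dstu3.model.Enumerations;
import org.hl7.fhir.dstu3.model.ValueSet;

public class LoincMappingSketch {

	// Sketch of a ConceptMap with LOINC as the source and SNOMED CT as the target,
	// using the placeholder canonical URI discussed above.
	static ConceptMap loincToSnomedExample() {
		ConceptMap map = new ConceptMap();
		map.setUrl("http://loinc.org/loinc-to-snomed"); // placeholder canonical URI
		map.setStatus(Enumerations.PublicationStatus.DRAFT);
		ConceptMap.ConceptMapGroupComponent group = map.addGroup()
			.setSource("http://loinc.org")
			.setTarget("http://snomed.info/sct");
		group.addElement()
			.setCode("LP199995-4") // example LOINC part code from the notes
			.addTarget()
				.setCode("138875005") // SNOMED CT root concept, purely illustrative
				.setEquivalence(Enumerations.ConceptMapEquivalence.EQUIVALENT);
		return map;
	}

	// Sketch of the "all RSNA playbook codes" ValueSet using the placeholder URI,
	// composed of LOINC codes rather than part codes.
	static ValueSet rsnaPlaybookExample() {
		ValueSet vs = new ValueSet();
		vs.setUrl("http://loinc.org/rsna-codes"); // placeholder canonical URI
		vs.setStatus(Enumerations.PublicationStatus.DRAFT);
		vs.getCompose()
			.addInclude()
				.setSystem("http://loinc.org")
				.addConcept().setCode("17787-3"); // example LOINC code from the notes
		return vs;
	}
}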


@@ -0,0 +1,128 @@
/target
/jpaserver_derby_files
*.log
ca.uhn.fhir.jpa.entity.ResourceTable/
# Created by https://www.gitignore.io
### Java ###
*.class
# Mobile Tools for Java (J2ME)
.mtj.tmp/
# Package Files #
*.jar
*.war
*.ear
# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*
### Maven ###
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
release.properties
dependency-reduced-pom.xml
buildNumber.properties
### Vim ###
[._]*.s[a-w][a-z]
[._]s[a-w][a-z]
*.un~
Session.vim
.netrwhist
*~
### Intellij ###
# Covers JetBrains IDEs: IntelliJ, RubyMine, PhpStorm, AppCode, PyCharm
*.iml
## Directory-based project format:
.idea/
# if you remove the above rule, at least ignore the following:
# User-specific stuff:
# .idea/workspace.xml
# .idea/tasks.xml
# .idea/dictionaries
# Sensitive or high-churn files:
# .idea/dataSources.ids
# .idea/dataSources.xml
# .idea/sqlDataSources.xml
# .idea/dynamic.xml
# .idea/uiDesigner.xml
# Gradle:
# .idea/gradle.xml
# .idea/libraries
# Mongo Explorer plugin:
# .idea/mongoSettings.xml
## File-based project format:
*.ipr
*.iws
## Plugin-specific files:
# IntelliJ
/out/
# mpeltonen/sbt-idea plugin
.idea_modules/
# JIRA plugin
atlassian-ide-plugin.xml
# Crashlytics plugin (for Android Studio and IntelliJ)
com_crashlytics_export_strings.xml
crashlytics.properties
crashlytics-build.properties
### Eclipse ###
*.pydevproject
.metadata
.gradle
bin/
tmp/
*.tmp
*.bak
*.swp
*~.nib
local.properties
.loadpath
# Eclipse Core
.project
# External tool builders
.externalToolBuilders/
# Locally stored "Eclipse launch configurations"
*.launch
# CDT-specific
.cproject
# JDT-specific (Eclipse Java Development Tools)
# PDT-specific
.buildpath
# sbteclipse plugin
.target
# TeXlipse plugin
.texlipse


@@ -0,0 +1,286 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<!--
Note: HAPI projects use the "hapi-fhir" POM as their base to provide easy management.
You do not need to use this in your own projects, so the "parent" tag and its contents below may be removed
if you are using this file as a basis for your own project.
-->
<parent>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir</artifactId>
<version>3.3.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
<artifactId>hapi-fhir-jpaserver-cds-example</artifactId>
<packaging>war</packaging>
<name>HAPI FHIR JPA Clinical Decision Support Server - Example</name>
<repositories>
<repository>
<id>oss-snapshots</id>
<snapshots>
<enabled>true</enabled>
</snapshots>
<url>https://oss.sonatype.org/content/repositories/snapshots/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.opencds.cqf</groupId>
<artifactId>cqf-ruler</artifactId>
<version>0.1.0-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.eclipse.jetty.websocket</groupId>
<artifactId>websocket-api</artifactId>
<version>${jetty_version}</version>
</dependency>
<dependency>
<groupId>org.eclipse.jetty.websocket</groupId>
<artifactId>websocket-client</artifactId>
<version>${jetty_version}</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>6.0.5</version>
</dependency>
<!-- This dependency includes the core HAPI-FHIR classes -->
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-base</artifactId>
<version>${project.version}</version>
</dependency>
<!-- At least one "structures" JAR must also be included -->
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-structures-dstu3</artifactId>
<version>${project.version}</version>
</dependency>
<!-- This dependency includes the JPA server itself, which is packaged separately from the rest of HAPI FHIR -->
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-jpaserver-base</artifactId>
<version>${project.version}</version>
</dependency>
<!-- This dependency is used for the "FHIR Tester" web app overlay -->
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-testpage-overlay</artifactId>
<version>${project.version}</version>
<type>war</type>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-testpage-overlay</artifactId>
<version>${project.version}</version>
<classifier>classes</classifier>
<scope>provided</scope>
</dependency>
<!-- HAPI-FHIR uses Logback for logging support. The logback library is included automatically by Maven as a part of the hapi-fhir-base dependency, but you also need to include a logging library. Logback
is used here, but log4j would also be fine. -->
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</dependency>
<!-- Needed for JEE/Servlet support -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- If you are using HAPI narrative generation, you will need to include Thymeleaf as well. Otherwise the following can be omitted. -->
<dependency>
<groupId>org.thymeleaf</groupId>
<artifactId>thymeleaf</artifactId>
</dependency>
<!-- Used for CORS support -->
<dependency>
<groupId>org.ebaysf.web</groupId>
<artifactId>cors-filter</artifactId>
<exclusions>
<exclusion>
<artifactId>servlet-api</artifactId>
<groupId>javax.servlet</groupId>
</exclusion>
</exclusions>
</dependency>
<!-- Spring Web is used to deploy the server to a web container. -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<!-- You may not need this if you are deploying to an application server which provides database connection pools itself. -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
</dependency>
<!-- This example uses Derby embedded database. If you are using another database such as Mysql or Oracle, you may omit the following dependencies and replace them with an appropriate database client
dependency for your database platform. -->
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derbynet</artifactId>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derbyclient</artifactId>
</dependency>
<!-- The following dependencies are only needed for automated unit tests; you do not necessarily need them to run the example. -->
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-servlets</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-servlet</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty.websocket</groupId>
<artifactId>websocket-server</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-server</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-util</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-webapp</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.phloc</groupId>
<artifactId>phloc-schematron</artifactId>
<exclusions>
<exclusion>
<artifactId>Saxon-HE</artifactId>
<groupId>net.sf.saxon</groupId>
</exclusion>
</exclusions>
</dependency>
<!--
For some reason JavaDoc crashed during site generation unless we have this dependency
-->
<dependency>
<groupId>javax.interceptor</groupId>
<artifactId>javax.interceptor-api</artifactId>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<!-- Tells Maven to name the generated WAR file as hapi-fhir-jpaserver-cds.war -->
<finalName>hapi-fhir-jpaserver-cds</finalName>
<!-- The following is not required for the application to build, but allows you to test it by issuing "mvn jetty:run" from the command line. -->
<pluginManagement>
<plugins>
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
<configuration>
<webApp>
<contextPath>/hapi-fhir-jpaserver-cds</contextPath>
<allowDuplicateFragmentNames>true</allowDuplicateFragmentNames>
</webApp>
</configuration>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<!-- Tell Maven which Java source version you want to use -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<!-- The configuration here tells the WAR plugin to include the FHIR Tester overlay. You can omit it if you are not using that feature. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<archive>
<manifestEntries>
<Build-Time>${maven.build.timestamp}</Build-Time>
</manifestEntries>
</archive>
<overlays>
<overlay>
<groupId>ca.uhn.hapi.fhir</groupId>
<artifactId>hapi-fhir-testpage-overlay</artifactId>
</overlay>
</overlays>
<webXml>src/main/webapp/WEB-INF/web.xml</webXml>
</configuration>
</plugin>
<!-- This plugin is just a part of the HAPI internal build process, you do not need to include it in your own projects -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-deploy-plugin</artifactId>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
<!-- This is to run the integration tests -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<redirectTestOutputToFile>true</redirectTestOutputToFile>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>


@@ -0,0 +1,16 @@
package ca.uhn.fhir.jpa.cds.example;
import org.opencds.cqf.servlet.CdsServicesServlet;
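/**
 * Example CDS Hooks endpoint for this server. The cqf-ruler CdsServicesServlet that this
 * class extends provides the CDS Hooks discovery (GET) and service invocation (POST)
 * handling; the commented-out overrides below mark the points where that behaviour could
 * be customised. The servlet is mapped to /cds-services in web.xml.
 */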
public class CdsHooksServerExample extends CdsServicesServlet {
// @Override
// protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
// // Change how requests are handled
// }
//
// @Override
// protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
// // Change discovery response
// }
}


@@ -0,0 +1,165 @@
package ca.uhn.fhir.jpa.cds.example;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.jpa.dao.DaoConfig;
import ca.uhn.fhir.jpa.dao.IFhirSystemDao;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.jpa.provider.dstu3.JpaConformanceProviderDstu3;
import ca.uhn.fhir.jpa.provider.dstu3.JpaSystemProviderDstu3;
import ca.uhn.fhir.jpa.rp.dstu3.ActivityDefinitionResourceProvider;
import ca.uhn.fhir.jpa.rp.dstu3.MeasureResourceProvider;
import ca.uhn.fhir.jpa.rp.dstu3.PlanDefinitionResourceProvider;
import ca.uhn.fhir.jpa.search.DatabaseBackedPagingProvider;
import ca.uhn.fhir.narrative.DefaultThymeleafNarrativeGenerator;
import ca.uhn.fhir.rest.api.EncodingEnum;
import ca.uhn.fhir.rest.server.ETagSupportEnum;
import ca.uhn.fhir.rest.server.IResourceProvider;
import ca.uhn.fhir.rest.server.RestfulServer;
import ca.uhn.fhir.rest.server.interceptor.IServerInterceptor;
import org.hl7.fhir.dstu3.model.Bundle;
import org.hl7.fhir.dstu3.model.Meta;
import org.opencds.cqf.providers.FHIRActivityDefinitionResourceProvider;
import org.opencds.cqf.providers.FHIRMeasureResourceProvider;
import org.opencds.cqf.providers.FHIRPlanDefinitionResourceProvider;
import org.springframework.web.context.ContextLoaderListener;
import org.springframework.web.context.WebApplicationContext;
import javax.servlet.ServletException;
import java.util.Collection;
import java.util.List;
public class CdsServerExample extends RestfulServer {
@SuppressWarnings("unchecked")
@Override
protected void initialize() throws ServletException {
super.initialize();
FhirVersionEnum fhirVersion = FhirVersionEnum.DSTU3;
setFhirContext(new FhirContext(fhirVersion));
// Get the spring context from the web container (it's declared in web.xml)
WebApplicationContext myAppCtx = ContextLoaderListener.getCurrentWebApplicationContext();
if (myAppCtx == null) {
throw new ServletException("Error retrieving spring context from the web container");
}
String resourceProviderBeanName = "myResourceProvidersDstu3";
List<IResourceProvider> beans = myAppCtx.getBean(resourceProviderBeanName, List.class);
setResourceProviders(beans);
Object systemProvider = myAppCtx.getBean("mySystemProviderDstu3", JpaSystemProviderDstu3.class);
setPlainProviders(systemProvider);
/*
* The conformance provider exports the supported resources, search parameters, etc for
* this server. The JPA version adds resource counts to the exported statement, so it
* is a nice addition.
*/
IFhirSystemDao<Bundle, Meta> systemDao = myAppCtx.getBean("mySystemDaoDstu3", IFhirSystemDao.class);
JpaConformanceProviderDstu3 confProvider =
new JpaConformanceProviderDstu3(this, systemDao, myAppCtx.getBean(DaoConfig.class));
confProvider.setImplementationDescription("Example Server");
setServerConformanceProvider(confProvider);
/*
* Enable ETag Support (this is already the default)
*/
setETagSupport(ETagSupportEnum.ENABLED);
/*
* This server tries to dynamically generate narratives
*/
FhirContext ctx = getFhirContext();
ctx.setNarrativeGenerator(new DefaultThymeleafNarrativeGenerator());
/*
* Default to JSON and pretty printing
*/
setDefaultPrettyPrint(true);
setDefaultResponseEncoding(EncodingEnum.JSON);
/*
* -- New in HAPI FHIR 1.5 --
* This configures the server to page search results to and from
* the database, instead of only paging them to memory. This may mean
* a performance hit when performing searches that return lots of results,
* but makes the server much more scalable.
*/
setPagingProvider(myAppCtx.getBean(DatabaseBackedPagingProvider.class));
/*
* Load interceptors for the server from Spring (these are defined in FhirServerConfig.java)
*/
Collection<IServerInterceptor> interceptorBeans = myAppCtx.getBeansOfType(IServerInterceptor.class).values();
for (IServerInterceptor interceptor : interceptorBeans) {
this.registerInterceptor(interceptor);
}
/*
* Adding resource providers from the cqf-ruler
*/
// Measure processing
FHIRMeasureResourceProvider measureProvider = new FHIRMeasureResourceProvider(getResourceProviders());
MeasureResourceProvider jpaMeasureProvider = (MeasureResourceProvider) getProvider("Measure");
measureProvider.setDao(jpaMeasureProvider.getDao());
measureProvider.setContext(jpaMeasureProvider.getContext());
// PlanDefinition processing
FHIRPlanDefinitionResourceProvider planDefProvider = new FHIRPlanDefinitionResourceProvider(getResourceProviders());
PlanDefinitionResourceProvider jpaPlanDefProvider =
(PlanDefinitionResourceProvider) getProvider("PlanDefinition");
planDefProvider.setDao(jpaPlanDefProvider.getDao());
planDefProvider.setContext(jpaPlanDefProvider.getContext());
// ActivityDefinition processing
FHIRActivityDefinitionResourceProvider actDefProvider = new FHIRActivityDefinitionResourceProvider(getResourceProviders());
ActivityDefinitionResourceProvider jpaActDefProvider =
(ActivityDefinitionResourceProvider) getProvider("ActivityDefinition");
actDefProvider.setDao(jpaActDefProvider.getDao());
actDefProvider.setContext(jpaActDefProvider.getContext());
try {
unregisterProvider(jpaMeasureProvider);
unregisterProvider(jpaPlanDefProvider);
unregisterProvider(jpaActDefProvider);
} catch (Exception e) {
throw new ServletException("Unable to unregister provider: " + e.getMessage());
}
registerProvider(measureProvider);
registerProvider(planDefProvider);
registerProvider(actDefProvider);
/*
* If you are hosting this server at a specific DNS name, the server will try to
* figure out the FHIR base URL based on what the web container tells it, but
* this doesn't always work. If you are setting links in your search bundles that
* just refer to "localhost", you might want to use a server address strategy:
*/
//setServerAddressStrategy(new HardcodedServerAddressStrategy("http://mydomain.com/fhir/baseDstu3"));
/*
* If you are using DSTU3+, you may want to add a terminology uploader, which allows
* uploading of external terminologies such as Snomed CT. Note that this uploader
* does not have any security attached (any anonymous user may use it by default)
* so it is a potential security vulnerability. Consider using an AuthorizationInterceptor
* with this feature.
*/
registerProvider(myAppCtx.getBean(TerminologyUploaderProvider.class));
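/*
 * Illustrative sketch only (not part of this example as committed): the kind of
 * AuthorizationInterceptor suggested above could be registered as shown in the
 * commented-out code below. The rule set is a placeholder; a real deployment would
 * derive the rules from the credentials on the incoming request.
 */
// registerInterceptor(new ca.uhn.fhir.rest.server.interceptor.auth.AuthorizationInterceptor() {
//    @Override
//    public java.util.List<ca.uhn.fhir.rest.server.interceptor.auth.IAuthRule> buildRuleList(
//          ca.uhn.fhir.rest.api.server.RequestDetails theRequestDetails) {
//       return new ca.uhn.fhir.rest.server.interceptor.auth.RuleBuilder()
//          .allow().metadata().andThen()
//          .allow().read().allResources().withAnyId().andThen()
//          .denyAll("writes and terminology upload require authorization")
//          .build();
//    }
// });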
}
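/**
 * Returns the JPA resource provider registered for the given FHIR resource type name
 * (for example "Measure"), so that its DAO and context can be handed to the
 * corresponding cqf-ruler provider above.
 */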
public IResourceProvider getProvider(String name) {
for (IResourceProvider res : getResourceProviders()) {
if (res.getResourceType().getSimpleName().equals(name)) {
return res;
}
}
throw new IllegalArgumentException("No resource provider found for resource type: " + name);
}
}


@@ -0,0 +1,125 @@
package ca.uhn.fhir.jpa.cds.example;
import ca.uhn.fhir.jpa.config.BaseJavaConfigDstu3;
import ca.uhn.fhir.jpa.dao.DaoConfig;
import ca.uhn.fhir.jpa.search.LuceneSearchMappingFactory;
import ca.uhn.fhir.jpa.util.SubscriptionsRequireManualActivationInterceptorDstu3;
import ca.uhn.fhir.rest.server.interceptor.IServerInterceptor;
import ca.uhn.fhir.rest.server.interceptor.LoggingInterceptor;
import ca.uhn.fhir.rest.server.interceptor.ResponseHighlighterInterceptor;
import org.apache.commons.dbcp2.BasicDataSource;
import org.apache.commons.lang3.time.DateUtils;
import org.hibernate.jpa.HibernatePersistenceProvider;
import org.springframework.beans.factory.annotation.Autowire;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import java.util.Properties;
/**
* This is the primary configuration file for the example server
*/
@Configuration
@EnableTransactionManagement()
public class FhirServerConfig extends BaseJavaConfigDstu3 {
/**
* Configure FHIR properties around the JPA server via this bean
*/
@Bean()
public DaoConfig daoConfig() {
DaoConfig retVal = new DaoConfig();
retVal.setSubscriptionEnabled(true);
retVal.setSubscriptionPollDelay(5000);
retVal.setSubscriptionPurgeInactiveAfterMillis(DateUtils.MILLIS_PER_HOUR);
retVal.setAllowMultipleDelete(true);
return retVal;
}
/**
* The following bean configures the database connection. The 'url' property value of "jdbc:derby:directory:jpaserver_derby_files;create=true" indicates that the server should save resources in a
* directory called "jpaserver_derby_files".
*
* A URL to a remote database could also be placed here, along with login credentials and other properties supported by BasicDataSource.
*/
@Bean(destroyMethod = "close")
public DataSource dataSource() {
BasicDataSource retVal = new BasicDataSource();
retVal.setDriver(new org.apache.derby.jdbc.EmbeddedDriver());
retVal.setUrl("jdbc:derby:directory:target/jpaserver_derby_files;create=true");
retVal.setUsername("");
retVal.setPassword("");
return retVal;
}
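/*
 * Illustrative sketch only (not part of this example as committed): the comment above
 * notes that a remote database could be used instead of embedded Derby. With the MySQL
 * driver already declared in this module's pom.xml, the bean might look like the
 * commented-out code below; the host, schema name and credentials are placeholders.
 */
// @Bean(destroyMethod = "close")
// public DataSource dataSource() {
//    BasicDataSource retVal = new BasicDataSource();
//    retVal.setDriverClassName("com.mysql.cj.jdbc.Driver");
//    retVal.setUrl("jdbc:mysql://localhost:3306/hapi_fhir");
//    retVal.setUsername("hapi");
//    retVal.setPassword("changeme");
//    return retVal;
// }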
@Bean()
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean retVal = new LocalContainerEntityManagerFactoryBean();
retVal.setPersistenceUnitName("HAPI_PU");
retVal.setDataSource(dataSource());
retVal.setPackagesToScan("ca.uhn.fhir.jpa.entity");
retVal.setPersistenceProvider(new HibernatePersistenceProvider());
retVal.setJpaProperties(jpaProperties());
return retVal;
}
private Properties jpaProperties() {
Properties extraProperties = new Properties();
extraProperties.put("hibernate.dialect", org.hibernate.dialect.DerbyTenSevenDialect.class.getName());
extraProperties.put("hibernate.format_sql", "true");
extraProperties.put("hibernate.show_sql", "false");
extraProperties.put("hibernate.hbm2ddl.auto", "update");
extraProperties.put("hibernate.jdbc.batch_size", "20");
extraProperties.put("hibernate.cache.use_query_cache", "false");
extraProperties.put("hibernate.cache.use_second_level_cache", "false");
extraProperties.put("hibernate.cache.use_structured_entries", "false");
extraProperties.put("hibernate.cache.use_minimal_puts", "false");
extraProperties.put("hibernate.search.model_mapping", LuceneSearchMappingFactory.class.getName());
extraProperties.put("hibernate.search.default.directory_provider", "filesystem");
extraProperties.put("hibernate.search.default.indexBase", "target/lucenefiles");
extraProperties.put("hibernate.search.lucene_version", "LUCENE_CURRENT");
// extraProperties.put("hibernate.search.default.worker.execution", "async");
return extraProperties;
}
/**
* Do some fancy logging to create a nice access log that has details about each incoming request.
*/
public IServerInterceptor loggingInterceptor() {
LoggingInterceptor retVal = new LoggingInterceptor();
retVal.setLoggerName("fhirtest.access");
retVal.setMessageFormat(
"Path[${servletPath}] Source[${requestHeader.x-forwarded-for}] Operation[${operationType} ${operationName} ${idOrResourceName}] UA[${requestHeader.user-agent}] Params[${requestParameters}] ResponseEncoding[${responseEncodingNoDefault}]");
retVal.setLogExceptions(true);
retVal.setErrorMessageFormat("ERROR - ${requestVerb} ${requestUrl}");
return retVal;
}
/**
* This interceptor adds some pretty syntax highlighting in responses when a browser is detected
*/
@Bean(autowire = Autowire.BY_TYPE)
public IServerInterceptor responseHighlighterInterceptor() {
ResponseHighlighterInterceptor retVal = new ResponseHighlighterInterceptor();
return retVal;
}
@Bean(autowire = Autowire.BY_TYPE)
public IServerInterceptor subscriptionSecurityInterceptor() {
SubscriptionsRequireManualActivationInterceptorDstu3 retVal = new SubscriptionsRequireManualActivationInterceptorDstu3();
return retVal;
}
@Bean()
public JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
JpaTransactionManager retVal = new JpaTransactionManager();
retVal.setEntityManagerFactory(entityManagerFactory);
return retVal;
}
}


@@ -0,0 +1,56 @@
package ca.uhn.fhir.jpa.cds.example;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.to.FhirTesterMvcConfig;
import ca.uhn.fhir.to.TesterConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
//@formatter:off
/**
* This spring config file configures the web testing module. It serves two
* purposes:
* 1. It imports FhirTesterMvcConfig, which is the spring config for the
* tester itself
* 2. It tells the tester which server(s) to talk to, via the testerConfig()
* method below
*/
@Configuration
@Import(FhirTesterMvcConfig.class)
public class FhirTesterConfig {
/**
* This bean tells the testing webpage which servers it should configure itself
* to communicate with. In this example we configure it to talk to the local
* server, as well as one public server. If you are creating a project to
* deploy somewhere else, you might choose to only put your own server's
* address here.
*
* Note the use of the ${serverBase} variable below. This will be replaced with
* the base URL as reported by the server itself. Often for a simple Tomcat
* (or other container) installation, this will end up being something
* like "http://localhost:8080/hapi-fhir-jpaserver-example". If you are
* deploying your server to a place with a fully qualified domain name,
* you might want to use that instead of using the variable.
*/
@Bean
public TesterConfig testerConfig() {
TesterConfig retVal = new TesterConfig();
retVal
.addServer()
.withId("home")
.withFhirVersion(FhirVersionEnum.DSTU3)
.withBaseUrl("${serverBase}/baseDstu3")
.withName("Local Tester")
.addServer()
.withId("hapi")
.withFhirVersion(FhirVersionEnum.DSTU3)
.withBaseUrl("http://fhirtest.uhn.ca/baseDstu3")
.withName("Public HAPI Test Server");
return retVal;
}
}
//@formatter:on


@@ -0,0 +1,67 @@
<!DOCTYPE html>
<html lang="en">
<head th:include="tmpl-head :: head">
<title>About This Server</title>
</head>
<body>
<form action="" method="get" id="outerForm">
<div th:replace="tmpl-navbar-top :: top" ></div>
<div class="container-fluid">
<div class="row">
<div th:replace="tmpl-navbar-left :: left" ></div>
<div class="col-sm-9 col-sm-offset-3 col-md-9 col-md-offset-3 main">
<div th:replace="tmpl-banner :: banner"></div>
<div class="panel panel-default">
<div class="panel-heading">
<h3 class="panel-title">About This Server</h3>
</div>
<div class="panel-body">
<div class="pull-right">
<object data="img/fhirtest-architecture.svg" width="383" height="369" type="image/svg+xml"></object>
</div>
<p>
This server provides a nearly complete implementation of the FHIR Specification
using a 100% open source software stack. It is hosted by University Health Network.
</p>
<p>
The architecture in use here is shown in the image on the right. This server is built
from a number of modules of the
<a href="https://github.com/jamesagnew/hapi-fhir/">HAPI FHIR</a>
project, which is a 100% open-source (Apache 2.0 Licensed) Java based
implementation of the FHIR specification.
</p>
<p>
</p>
</div>
</div>
<div class="panel panel-default">
<div class="panel-heading">
<h3 class="panel-title">Data On This Server</h3>
</div>
<div class="panel-body">
<p>
This server is regularly loaded with a standard set of test data sourced
from UHN's own testing environment. Do not use this server to store any data
that you will need later, as we will be regularly resetting it.
</p>
<p>
This is not a production server and it provides no privacy. Do not store any
confidential data here.
</p>
</div>
</div>
</div>
</div>
</div>
<div th:replace="tmpl-footer :: footer" ></div>
</form>
</body>
</html>


@@ -0,0 +1,16 @@
<!DOCTYPE html>
<html lang="en">
<div th:fragment="footer">
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-1395874-6', 'auto');
ga('require', 'displayfeatures');
ga('require', 'linkid', 'linkid.js');
ga('send', 'pageview');
</script>
</div>
</html>


@@ -0,0 +1,52 @@
<!DOCTYPE html>
<html lang="en">
<div th:fragment="banner" class="well">
<th:block th:if="${serverId} == 'home'">
<p>
This is the home for the FHIR test server operated by
<a href="http://uhn.ca">University Health Network</a>. This server
(and the testing application you are currently using to access it)
is entirely built using
<a href="https://github.com/jamesagnew/hapi-fhir">HAPI-FHIR</a>,
a 100% open-source Java implementation of the
<a href="http://hl7.org/implement/standards/fhir/">FHIR specification</a>.
</p>
<p>
Here are some things you might wish to try:
</p>
<ul>
<li>
View a
<a href="http://fhirtest.uhn.ca/search?serverId=home&amp;encoding=json&amp;pretty=true&amp;resource=Patient&amp;param.0.type=string&amp;param.0.name=_id&amp;param.0.0=&amp;resource-search-limit=">list of patients</a>
on this server.
</li>
<li>
Construct a
<a href="http://fhirtest.uhn.ca/resource?serverId=home&amp;encoding=json&amp;pretty=true&amp;resource=Patient">search query</a>
on this server.
</li>
<li>
Access a
<a href="http://fhirtest.uhn.ca/home?serverId=furore">different server</a>
(use the <b>Server</b> menu at the top of the page to see a list of public FHIR servers)
</li>
</ul>
</th:block>
<th:block th:if="${serverId} != 'home'">
<p>
You are accessing the public FHIR server
<b th:text="${baseName}"/>. This server is hosted elsewhere on the internet
but is being accessed using the HAPI client implementation.
</p>
</th:block>
<p>
<b style="color: red;">
<span class="glyphicon glyphicon-warning-sign"/>
This is not a production server!
</b>
Do not store any information here that contains personal health information
or any other confidential information. This server will be regularly purged
and reloaded with fixed test data.
</p>
</div>
</html>


@@ -0,0 +1,124 @@
<web-app xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="3.0"
xsi:schemaLocation="http://java.sun.com/xml/ns/javaee ./xsd/web-app_3_0.xsd">
<listener>
<listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>
<context-param>
<param-name>contextClass</param-name>
<param-value>
org.springframework.web.context.support.AnnotationConfigWebApplicationContext
</param-value>
</context-param>
<context-param>
<param-name>contextConfigLocation</param-name>
<param-value>
ca.uhn.fhir.jpa.cds.example.FhirServerConfig
</param-value>
</context-param>
<!-- Servlets -->
<servlet>
<servlet-name>cdsServicesServlet</servlet-name>
<servlet-class>ca.uhn.fhir.jpa.cds.example.CdsHooksServerExample</servlet-class>
<load-on-startup>3</load-on-startup>
</servlet>
<servlet>
<servlet-name>spring</servlet-name>
<servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
<init-param>
<param-name>contextClass</param-name>
<param-value>org.springframework.web.context.support.AnnotationConfigWebApplicationContext</param-value>
</init-param>
<init-param>
<param-name>contextConfigLocation</param-name>
<param-value>ca.uhn.fhir.jpa.cds.example.FhirTesterConfig</param-value>
</init-param>
<load-on-startup>2</load-on-startup>
</servlet>
<servlet>
<servlet-name>fhirServlet</servlet-name>
<servlet-class>ca.uhn.fhir.jpa.cds.example.CdsServerExample</servlet-class>
<init-param>
<param-name>ImplementationDescription</param-name>
<param-value>FHIR JPA Server</param-value>
</init-param>
<init-param>
<param-name>FhirVersion</param-name>
<param-value>DSTU3</param-value>
</init-param>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>cdsServicesServlet</servlet-name>
<url-pattern>/cds-services</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>cdsServicesServlet</servlet-name>
<url-pattern>/cds-services/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>fhirServlet</servlet-name>
<url-pattern>/baseDstu3/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>spring</servlet-name>
<url-pattern>/</url-pattern>
</servlet-mapping>
<!-- This filter provides support for Cross-Origin Resource Sharing (CORS) -->
<filter>
<filter-name>CORS Filter</filter-name>
<filter-class>org.ebaysf.web.cors.CORSFilter</filter-class>
<init-param>
<description>A comma separated list of allowed origins. Note: An '*' cannot be used for an allowed origin when using credentials.</description>
<param-name>cors.allowed.origins</param-name>
<param-value>*</param-value>
</init-param>
<init-param>
<description>A comma separated list of HTTP verbs, using which a CORS request can be made.</description>
<param-name>cors.allowed.methods</param-name>
<param-value>GET,POST,PUT,DELETE,OPTIONS</param-value>
</init-param>
<init-param>
<description>A comma separated list of allowed headers when making a non simple CORS request.</description>
<param-name>cors.allowed.headers</param-name>
<param-value>X-FHIR-Starter,Origin,Accept,X-Requested-With,Content-Type,Access-Control-Request-Method,Access-Control-Request-Headers,Prefer</param-value>
</init-param>
<init-param>
<description>A comma separated list of non-standard response headers that will be exposed to the XHR2 object.</description>
<param-name>cors.exposed.headers</param-name>
<param-value>Location,Content-Location</param-value>
</init-param>
<init-param>
<description>A flag that suggests if CORS is supported with cookies</description>
<param-name>cors.support.credentials</param-name>
<param-value>true</param-value>
</init-param>
<init-param>
<description>A flag to control logging</description>
<param-name>cors.logging.enabled</param-name>
<param-value>true</param-value>
</init-param>
<init-param>
<description>Indicates how long (in seconds) the results of a preflight request can be cached in a preflight result cache.</description>
<param-name>cors.preflight.maxage</param-name>
<param-value>300</param-value>
</init-param>
</filter>
<filter-mapping>
<filter-name>CORS Filter</filter-name>
<url-pattern>/*</url-pattern>
</filter-mapping>
</web-app>


@@ -0,0 +1,389 @@
<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns="http://www.w3.org/2001/XMLSchema"
targetNamespace="http://java.sun.com/xml/ns/javaee"
xmlns:javaee="http://java.sun.com/xml/ns/javaee"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
elementFormDefault="qualified"
attributeFormDefault="unqualified"
version="2.2">
<xsd:annotation>
<xsd:documentation>
DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS HEADER.
Copyright 2003-2009 Sun Microsystems, Inc. All rights reserved.
The contents of this file are subject to the terms of either the
GNU General Public License Version 2 only ("GPL") or the Common
Development and Distribution License("CDDL") (collectively, the
"License"). You may not use this file except in compliance with
the License. You can obtain a copy of the License at
https://glassfish.dev.java.net/public/CDDL+GPL.html or
glassfish/bootstrap/legal/LICENSE.txt. See the License for the
specific language governing permissions and limitations under the
License.
When distributing the software, include this License Header
Notice in each file and include the License file at
glassfish/bootstrap/legal/LICENSE.txt. Sun designates this
particular file as subject to the "Classpath" exception as
provided by Sun in the GPL Version 2 section of the License file
that accompanied this code. If applicable, add the following
below the License Header, with the fields enclosed by brackets []
replaced by your own identifying information:
"Portions Copyrighted [year] [name of copyright owner]"
Contributor(s):
If you wish your version of this file to be governed by only the
CDDL or only the GPL Version 2, indicate your decision by adding
"[Contributor] elects to include this software in this
distribution under the [CDDL or GPL Version 2] license." If you
don't indicate a single choice of license, a recipient has the
option to distribute your version of this file under either the
CDDL, the GPL Version 2 or to extend the choice of license to its
licensees as provided above. However, if you add GPL Version 2
code and therefore, elected the GPL Version 2 license, then the
option applies only if the new code is made subject to such
option by the copyright holder.
</xsd:documentation>
</xsd:annotation>
<xsd:annotation>
<xsd:documentation>
This is the XML Schema for the JSP 2.2 deployment descriptor
types. The JSP 2.2 schema contains all the special
structures and datatypes that are necessary to use JSP files
from a web application.
The contents of this schema is used by the web-common_3_0.xsd
file to define JSP specific content.
</xsd:documentation>
</xsd:annotation>
<xsd:annotation>
<xsd:documentation>
The following conventions apply to all Java EE
deployment descriptor elements unless indicated otherwise.
- In elements that specify a pathname to a file within the
same JAR file, relative filenames (i.e., those not
starting with "/") are considered relative to the root of
the JAR file's namespace. Absolute filenames (i.e., those
starting with "/") also specify names in the root of the
JAR file's namespace. In general, relative names are
preferred. The exception is .war files where absolute
names are preferred for consistency with the Servlet API.
</xsd:documentation>
</xsd:annotation>
<xsd:include schemaLocation="javaee_6.xsd"/>
<!-- **************************************************** -->
<xsd:complexType name="jsp-configType">
<xsd:annotation>
<xsd:documentation>
The jsp-configType is used to provide global configuration
information for the JSP files in a web application. It has
two subelements, taglib and jsp-property-group.
</xsd:documentation>
</xsd:annotation>
<xsd:sequence>
<xsd:element name="taglib"
type="javaee:taglibType"
minOccurs="0"
maxOccurs="unbounded"/>
<xsd:element name="jsp-property-group"
type="javaee:jsp-property-groupType"
minOccurs="0"
maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="id"
type="xsd:ID"/>
</xsd:complexType>
<!-- **************************************************** -->
<xsd:complexType name="jsp-fileType">
<xsd:annotation>
<xsd:documentation>
The jsp-file element contains the full path to a JSP file
within the web application beginning with a `/'.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleContent>
<xsd:restriction base="javaee:pathType"/>
</xsd:simpleContent>
</xsd:complexType>
<!-- **************************************************** -->
<xsd:complexType name="jsp-property-groupType">
<xsd:annotation>
<xsd:documentation>
The jsp-property-groupType is used to group a number of
files so they can be given global property information.
All files so described are deemed to be JSP files. The
following additional properties can be described:
- Control whether EL is ignored.
- Control whether scripting elements are invalid.
- Indicate pageEncoding information.
- Indicate that a resource is a JSP document (XML).
- Prelude and Coda automatic includes.
- Control whether the character sequence #{ is allowed
when used as a String literal.
- Control whether template text containing only
whitespaces must be removed from the response output.
- Indicate the default contentType information.
- Indicate the default buffering model for JspWriter
- Control whether error should be raised for the use of
undeclared namespaces in a JSP page.
</xsd:documentation>
</xsd:annotation>
<xsd:sequence>
<xsd:group ref="javaee:descriptionGroup"/>
<xsd:element name="url-pattern"
type="javaee:url-patternType"
maxOccurs="unbounded"/>
<xsd:element name="el-ignored"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
Can be used to easily set the isELIgnored
property of a group of JSP pages. By default, the
EL evaluation is enabled for Web Applications using
a Servlet 2.4 or greater web.xml, and disabled
otherwise.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="page-encoding"
type="javaee:string"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
The valid values of page-encoding are those of the
pageEncoding page directive. It is a
translation-time error to name different encodings
in the pageEncoding attribute of the page directive
of a JSP page and in a JSP configuration element
matching the page. It is also a translation-time
error to name different encodings in the prolog
or text declaration of a document in XML syntax and
in a JSP configuration element matching the document.
It is legal to name the same encoding through
mulitple mechanisms.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="scripting-invalid"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
Can be used to easily disable scripting in a
group of JSP pages. By default, scripting is
enabled.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="is-xml"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
If true, denotes that the group of resources
that match the URL pattern are JSP documents,
and thus must be interpreted as XML documents.
If false, the resources are assumed to not
be JSP documents, unless there is another
property group that indicates otherwise.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="include-prelude"
type="javaee:pathType"
minOccurs="0"
maxOccurs="unbounded">
<xsd:annotation>
<xsd:documentation>
The include-prelude element is a context-relative
path that must correspond to an element in the
Web Application. When the element is present,
the given path will be automatically included (as
in an include directive) at the beginning of each
JSP page in this jsp-property-group.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="include-coda"
type="javaee:pathType"
minOccurs="0"
maxOccurs="unbounded">
<xsd:annotation>
<xsd:documentation>
The include-coda element is a context-relative
path that must correspond to an element in the
Web Application. When the element is present,
the given path will be automatically included (as
in an include directive) at the end of each
JSP page in this jsp-property-group.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="deferred-syntax-allowed-as-literal"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
The character sequence #{ is reserved for EL expressions.
Consequently, a translation error occurs if the #{
character sequence is used as a String literal, unless
this element is enabled (true). Disabled (false) by
default.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="trim-directive-whitespaces"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
Indicates that template text containing only whitespaces
must be removed from the response output. It has no
effect on JSP documents (XML syntax). Disabled (false)
by default.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="default-content-type"
type="javaee:string"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
The valid values of default-content-type are those of the
contentType page directive. It specifies the default
response contentType if the page directive does not include
a contentType attribute.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="buffer"
type="javaee:string"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
The valid values of buffer are those of the
buffer page directive. It specifies if buffering should be
used for the output to response, and if so, the size of the
buffer to use.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="error-on-undeclared-namespace"
type="javaee:true-falseType"
minOccurs="0">
<xsd:annotation>
<xsd:documentation>
The default behavior when a tag with unknown namespace is used
in a JSP page (regular syntax) is to silently ignore it. If
set to true, then an error must be raised during the translation
time when an undeclared tag is used in a JSP page. Disabled
(false) by default.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id"
type="xsd:ID"/>
</xsd:complexType>
<!-- **************************************************** -->
<xsd:complexType name="taglibType">
<xsd:annotation>
<xsd:documentation>
The taglibType defines the syntax for declaring in
the deployment descriptor that a tag library is
available to the application. This can be done
to override implicit map entries from TLD files and
from the container.
</xsd:documentation>
</xsd:annotation>
<xsd:sequence>
<xsd:element name="taglib-uri"
type="javaee:string">
<xsd:annotation>
<xsd:documentation>
A taglib-uri element describes a URI identifying a
tag library used in the web application. The body
of the taglib-uri element may be either an
absolute URI specification, or a relative URI.
There should be no entries in web.xml with the
same taglib-uri value.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
<xsd:element name="taglib-location"
type="javaee:pathType">
<xsd:annotation>
<xsd:documentation>
the taglib-location element contains the location
(as a resource relative to the root of the web
application) where to find the Tag Library
Description file for the tag library.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id"
type="xsd:ID"/>
</xsd:complexType>
</xsd:schema>


@@ -0,0 +1,272 @@
<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns="http://www.w3.org/2001/XMLSchema"
targetNamespace="http://java.sun.com/xml/ns/javaee"
xmlns:javaee="http://java.sun.com/xml/ns/javaee"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
elementFormDefault="qualified"
attributeFormDefault="unqualified"
version="3.0">
<xsd:annotation>
<xsd:documentation>
DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS HEADER.
Copyright 2003-2009 Sun Microsystems, Inc. All rights reserved.
The contents of this file are subject to the terms of either the
GNU General Public License Version 2 only ("GPL") or the Common
Development and Distribution License("CDDL") (collectively, the
"License"). You may not use this file except in compliance with
the License. You can obtain a copy of the License at
https://glassfish.dev.java.net/public/CDDL+GPL.html or
glassfish/bootstrap/legal/LICENSE.txt. See the License for the
specific language governing permissions and limitations under the
License.
When distributing the software, include this License Header
Notice in each file and include the License file at
glassfish/bootstrap/legal/LICENSE.txt. Sun designates this
particular file as subject to the "Classpath" exception as
provided by Sun in the GPL Version 2 section of the License file
that accompanied this code. If applicable, add the following
below the License Header, with the fields enclosed by brackets []
replaced by your own identifying information:
"Portions Copyrighted [year] [name of copyright owner]"
Contributor(s):
If you wish your version of this file to be governed by only the
CDDL or only the GPL Version 2, indicate your decision by adding
"[Contributor] elects to include this software in this
distribution under the [CDDL or GPL Version 2] license." If you
don't indicate a single choice of license, a recipient has the
option to distribute your version of this file under either the
CDDL, the GPL Version 2 or to extend the choice of license to its
licensees as provided above. However, if you add GPL Version 2
code and therefore, elected the GPL Version 2 license, then the
option applies only if the new code is made subject to such
option by the copyright holder.
</xsd:documentation>
</xsd:annotation>
<xsd:annotation>
<xsd:documentation>
<![CDATA[[
This is the XML Schema for the Servlet 3.0 deployment descriptor.
The deployment descriptor must be named "WEB-INF/web.xml" in the
web application's war file. All Servlet deployment descriptors
must indicate the web application schema by using the Java EE
namespace:
http://java.sun.com/xml/ns/javaee
and by indicating the version of the schema by
using the version element as shown below:
<web-app xmlns="http://java.sun.com/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="..."
version="3.0">
...
</web-app>
The instance documents may indicate the published version of
the schema using the xsi:schemaLocation attribute for Java EE
namespace with the following location:
http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd
]]>
</xsd:documentation>
</xsd:annotation>
<xsd:annotation>
<xsd:documentation>
The following conventions apply to all Java EE
deployment descriptor elements unless indicated otherwise.
- In elements that specify a pathname to a file within the
same JAR file, relative filenames (i.e., those not
starting with "/") are considered relative to the root of
the JAR file's namespace. Absolute filenames (i.e., those
starting with "/") also specify names in the root of the
JAR file's namespace. In general, relative names are
preferred. The exception is .war files where absolute
names are preferred for consistency with the Servlet API.
</xsd:documentation>
</xsd:annotation>
<xsd:include schemaLocation="web-common_3_0.xsd"/>
<!-- **************************************************** -->
<xsd:element name="web-app"
type="javaee:web-appType">
<xsd:annotation>
<xsd:documentation>
The web-app element is the root of the deployment
descriptor for a web application. Note that the sub-elements
of this element can be in the arbitrary order. Because of
that, the multiplicity of the elements of distributable,
session-config, welcome-file-list, jsp-config, login-config,
and locale-encoding-mapping-list was changed from "?" to "*"
in this schema. However, the deployment descriptor instance
file must not contain multiple elements of session-config,
jsp-config, and login-config. When there are multiple elements of
welcome-file-list or locale-encoding-mapping-list, the container
must concatenate the element contents. The multiple occurence
of the element distributable is redundant and the container
treats that case exactly in the same way when there is only
one distributable.
</xsd:documentation>
</xsd:annotation>
<xsd:unique name="web-common-servlet-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The servlet element contains the name of a servlet.
The name must be unique within the web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:servlet"/>
<xsd:field xpath="javaee:servlet-name"/>
</xsd:unique>
<xsd:unique name="web-common-filter-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The filter element contains the name of a filter.
The name must be unique within the web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:filter"/>
<xsd:field xpath="javaee:filter-name"/>
</xsd:unique>
<xsd:unique name="web-common-ejb-local-ref-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The ejb-local-ref-name element contains the name of an EJB
reference. The EJB reference is an entry in the web
application's environment and is relative to the
java:comp/env context. The name must be unique within
the web application.
It is recommended that name is prefixed with "ejb/".
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:ejb-local-ref"/>
<xsd:field xpath="javaee:ejb-ref-name"/>
</xsd:unique>
<xsd:unique name="web-common-ejb-ref-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The ejb-ref-name element contains the name of an EJB
reference. The EJB reference is an entry in the web
application's environment and is relative to the
java:comp/env context. The name must be unique within
the web application.
It is recommended that name is prefixed with "ejb/".
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:ejb-ref"/>
<xsd:field xpath="javaee:ejb-ref-name"/>
</xsd:unique>
<xsd:unique name="web-common-resource-env-ref-uniqueness">
<xsd:annotation>
<xsd:documentation>
The resource-env-ref-name element specifies the name of
a resource environment reference; its value is the
environment entry name used in the web application code.
The name is a JNDI name relative to the java:comp/env
context and must be unique within a web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:resource-env-ref"/>
<xsd:field xpath="javaee:resource-env-ref-name"/>
</xsd:unique>
<xsd:unique name="web-common-message-destination-ref-uniqueness">
<xsd:annotation>
<xsd:documentation>
The message-destination-ref-name element specifies the name of
a message destination reference; its value is the
environment entry name used in the web application code.
The name is a JNDI name relative to the java:comp/env
context and must be unique within a web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:message-destination-ref"/>
<xsd:field xpath="javaee:message-destination-ref-name"/>
</xsd:unique>
<xsd:unique name="web-common-res-ref-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The res-ref-name element specifies the name of a
resource manager connection factory reference. The name
is a JNDI name relative to the java:comp/env context.
The name must be unique within a web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:resource-ref"/>
<xsd:field xpath="javaee:res-ref-name"/>
</xsd:unique>
<xsd:unique name="web-common-env-entry-name-uniqueness">
<xsd:annotation>
<xsd:documentation>
The env-entry-name element contains the name of a web
application's environment entry. The name is a JNDI
name relative to the java:comp/env context. The name
must be unique within a web application.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:env-entry"/>
<xsd:field xpath="javaee:env-entry-name"/>
</xsd:unique>
<xsd:key name="web-common-role-name-key">
<xsd:annotation>
<xsd:documentation>
A role-name-key is specified to allow the references
from the security-role-refs.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:security-role"/>
<xsd:field xpath="javaee:role-name"/>
</xsd:key>
<xsd:keyref name="web-common-role-name-references"
refer="javaee:web-common-role-name-key">
<xsd:annotation>
<xsd:documentation>
The keyref indicates the references from
security-role-ref to a specified role-name.
</xsd:documentation>
</xsd:annotation>
<xsd:selector xpath="javaee:servlet/javaee:security-role-ref"/>
<xsd:field xpath="javaee:role-link"/>
</xsd:keyref>
</xsd:element>
</xsd:schema>

View File

@ -0,0 +1,287 @@
<?xml version='1.0'?>
<?xml-stylesheet href="../2008/09/xsd.xsl" type="text/xsl"?>
<xs:schema targetNamespace="http://www.w3.org/XML/1998/namespace"
xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns ="http://www.w3.org/1999/xhtml"
xml:lang="en">
<xs:annotation>
<xs:documentation>
<div>
<h1>About the XML namespace</h1>
<div class="bodytext">
<p>
This schema document describes the XML namespace, in a form
suitable for import by other schema documents.
</p>
<p>
See <a href="http://www.w3.org/XML/1998/namespace.html">
http://www.w3.org/XML/1998/namespace.html</a> and
<a href="http://www.w3.org/TR/REC-xml">
http://www.w3.org/TR/REC-xml</a> for information
about this namespace.
</p>
<p>
Note that local names in this namespace are intended to be
defined only by the World Wide Web Consortium or its subgroups.
The names currently defined in this namespace are listed below.
They should not be used with conflicting semantics by any Working
Group, specification, or document instance.
</p>
<p>
See further below in this document for more information about <a
href="#usage">how to refer to this schema document from your own
XSD schema documents</a> and about <a href="#nsversioning">the
namespace-versioning policy governing this schema document</a>.
</p>
</div>
</div>
</xs:documentation>
</xs:annotation>
<xs:attribute name="lang">
<xs:annotation>
<xs:documentation>
<div>
<h3>lang (as an attribute name)</h3>
<p>
denotes an attribute whose value
is a language code for the natural language of the content of
any element; its value is inherited. This name is reserved
by virtue of its definition in the XML specification.</p>
</div>
<div>
<h4>Notes</h4>
<p>
Attempting to install the relevant ISO 2- and 3-letter
codes as the enumerated possible values is probably never
going to be a realistic possibility.
</p>
<p>
See BCP 47 at <a href="http://www.rfc-editor.org/rfc/bcp/bcp47.txt">
http://www.rfc-editor.org/rfc/bcp/bcp47.txt</a>
and the IANA language subtag registry at
<a href="http://www.iana.org/assignments/language-subtag-registry">
http://www.iana.org/assignments/language-subtag-registry</a>
for further information.
</p>
<p>
The union allows for the 'un-declaration' of xml:lang with
the empty string.
</p>
</div>
</xs:documentation>
</xs:annotation>
<xs:simpleType>
<xs:union memberTypes="xs:language">
<xs:simpleType>
<xs:restriction base="xs:string">
<xs:enumeration value=""/>
</xs:restriction>
</xs:simpleType>
</xs:union>
</xs:simpleType>
</xs:attribute>
<xs:attribute name="space">
<xs:annotation>
<xs:documentation>
<div>
<h3>space (as an attribute name)</h3>
<p>
denotes an attribute whose
value is a keyword indicating what whitespace processing
discipline is intended for the content of the element; its
value is inherited. This name is reserved by virtue of its
definition in the XML specification.</p>
</div>
</xs:documentation>
</xs:annotation>
<xs:simpleType>
<xs:restriction base="xs:NCName">
<xs:enumeration value="default"/>
<xs:enumeration value="preserve"/>
</xs:restriction>
</xs:simpleType>
</xs:attribute>
<xs:attribute name="base" type="xs:anyURI"> <xs:annotation>
<xs:documentation>
<div>
<h3>base (as an attribute name)</h3>
<p>
denotes an attribute whose value
provides a URI to be used as the base for interpreting any
relative URIs in the scope of the element on which it
appears; its value is inherited. This name is reserved
by virtue of its definition in the XML Base specification.</p>
<p>
See <a
href="http://www.w3.org/TR/xmlbase/">http://www.w3.org/TR/xmlbase/</a>
for information about this attribute.
</p>
</div>
</xs:documentation>
</xs:annotation>
</xs:attribute>
<xs:attribute name="id" type="xs:ID">
<xs:annotation>
<xs:documentation>
<div>
<h3>id (as an attribute name)</h3>
<p>
denotes an attribute whose value
should be interpreted as if declared to be of type ID.
This name is reserved by virtue of its definition in the
xml:id specification.</p>
<p>
See <a
href="http://www.w3.org/TR/xml-id/">http://www.w3.org/TR/xml-id/</a>
for information about this attribute.
</p>
</div>
</xs:documentation>
</xs:annotation>
</xs:attribute>
<xs:attributeGroup name="specialAttrs">
<xs:attribute ref="xml:base"/>
<xs:attribute ref="xml:lang"/>
<xs:attribute ref="xml:space"/>
<xs:attribute ref="xml:id"/>
</xs:attributeGroup>
<xs:annotation>
<xs:documentation>
<div>
<h3>Father (in any context at all)</h3>
<div class="bodytext">
<p>
denotes Jon Bosak, the chair of
the original XML Working Group. This name is reserved by
the following decision of the W3C XML Plenary and
XML Coordination groups:
</p>
<blockquote>
<p>
In appreciation for his vision, leadership and
dedication the W3C XML Plenary on this 10th day of
February, 2000, reserves for Jon Bosak in perpetuity
the XML name "xml:Father".
</p>
</blockquote>
</div>
</div>
</xs:documentation>
</xs:annotation>
<xs:annotation>
<xs:documentation>
<div xml:id="usage" id="usage">
<h2><a name="usage">About this schema document</a></h2>
<div class="bodytext">
<p>
This schema defines attributes and an attribute group suitable
for use by schemas wishing to allow <code>xml:base</code>,
<code>xml:lang</code>, <code>xml:space</code> or
<code>xml:id</code> attributes on elements they define.
</p>
<p>
To enable this, such a schema must import this schema for
the XML namespace, e.g. as follows:
</p>
<pre>
&lt;schema . . .>
. . .
&lt;import namespace="http://www.w3.org/XML/1998/namespace"
schemaLocation="http://www.w3.org/2001/xml.xsd"/>
</pre>
<p>
or
</p>
<pre>
&lt;import namespace="http://www.w3.org/XML/1998/namespace"
schemaLocation="http://www.w3.org/2009/01/xml.xsd"/>
</pre>
<p>
Subsequently, qualified reference to any of the attributes or the
group defined below will have the desired effect, e.g.
</p>
<pre>
&lt;type . . .>
. . .
&lt;attributeGroup ref="xml:specialAttrs"/>
</pre>
<p>
will define a type which will schema-validate an instance element
with any of those attributes.
</p>
</div>
</div>
</xs:documentation>
</xs:annotation>
<xs:annotation>
<xs:documentation>
<div id="nsversioning" xml:id="nsversioning">
<h2><a name="nsversioning">Versioning policy for this schema document</a></h2>
<div class="bodytext">
<p>
In keeping with the XML Schema WG's standard versioning
policy, this schema document will persist at
<a href="http://www.w3.org/2009/01/xml.xsd">
http://www.w3.org/2009/01/xml.xsd</a>.
</p>
<p>
At the date of issue it can also be found at
<a href="http://www.w3.org/2001/xml.xsd">
http://www.w3.org/2001/xml.xsd</a>.
</p>
<p>
The schema document at that URI may however change in the future,
in order to remain compatible with the latest version of XML
Schema itself, or with the XML namespace itself. In other words,
if the XML Schema or XML namespaces change, the version of this
document at <a href="http://www.w3.org/2001/xml.xsd">
http://www.w3.org/2001/xml.xsd
</a>
will change accordingly; the version at
<a href="http://www.w3.org/2009/01/xml.xsd">
http://www.w3.org/2009/01/xml.xsd
</a>
will not change.
</p>
<p>
Previous dated (and unchanging) versions of this schema
document are at:
</p>
<ul>
<li><a href="http://www.w3.org/2009/01/xml.xsd">
http://www.w3.org/2009/01/xml.xsd</a></li>
<li><a href="http://www.w3.org/2007/08/xml.xsd">
http://www.w3.org/2007/08/xml.xsd</a></li>
<li><a href="http://www.w3.org/2004/10/xml.xsd">
http://www.w3.org/2004/10/xml.xsd</a></li>
<li><a href="http://www.w3.org/2001/03/xml.xsd">
http://www.w3.org/2001/03/xml.xsd</a></li>
</ul>
</div>
</div>
</xs:documentation>
</xs:annotation>
</xs:schema>

View File

@ -0,0 +1,260 @@
package ca.uhn.fhir.jpa.cds.example;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.client.api.ServerValidationModeEnum;
import ca.uhn.fhir.rest.client.interceptor.LoggingInterceptor;
import ca.uhn.fhir.rest.server.IResourceProvider;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;
import org.hl7.fhir.dstu3.model.*;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;
import java.io.*;
import java.net.HttpURLConnection;
import java.net.ServerSocket;
import java.net.URL;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Scanner;
public class CdsExampleTests {
private static IGenericClient ourClient;
private static FhirContext ourCtx = FhirContext.forDstu3();
protected static int ourPort;
private static Server ourServer;
private static String ourServerBase;
private static Collection<IResourceProvider> providers;
@BeforeClass
public static void beforeClass() throws Exception {
String path = Paths.get("").toAbsolutePath().toString();
ourPort = RandomServerPortProvider.findFreePort();
ourServer = new Server(ourPort);
WebAppContext webAppContext = new WebAppContext();
webAppContext.setContextPath("/hapi-fhir-jpaserver-cds");
webAppContext.setDescriptor(path + "/src/main/webapp/WEB-INF/web.xml");
webAppContext.setResourceBase(path + "/target/hapi-fhir-jpaserver-cds");
webAppContext.setParentLoaderPriority(true);
ourServer.setHandler(webAppContext);
ourServer.start();
ourCtx.getRestfulClientFactory().setServerValidationMode(ServerValidationModeEnum.NEVER);
ourCtx.getRestfulClientFactory().setSocketTimeout(1200 * 1000);
ourServerBase = "http://localhost:" + ourPort + "/hapi-fhir-jpaserver-cds/baseDstu3";
ourClient = ourCtx.newRestfulGenericClient(ourServerBase);
ourClient.registerInterceptor(new LoggingInterceptor(true));
// Load test data
// Normally, I would use a transaction bundle, but issues with the random ports prevent that...
// So, doing it the old-fashioned way =)
// General
putResource("general-practitioner.json", "Practitioner-12208");
putResource("general-patient.json", "Patient-12214");
putResource("general-fhirhelpers-3.json", "FHIRHelpers");
}
@AfterClass
public static void afterClass() throws Exception {
ourServer.stop();
}
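// Helper: reads a JSON resource file from the test classpath, parses it with the DSTU3
// JSON parser, and uploads it to the running test server as a client-assigned update (PUT)
// under the given id.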
private static void putResource(String resourceFileName, String id) {
InputStream is = CdsExampleTests.class.getResourceAsStream(resourceFileName);
Scanner scanner = new Scanner(is).useDelimiter("\\A");
String json = scanner.hasNext() ? scanner.next() : "";
IBaseResource resource = ourCtx.newJsonParser().parseResource(json);
ourClient.update(id, resource);
}
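// Loads the COL (colorectal cancer screening) library, measure, clinical data and value sets,
// then invokes Measure/col/$evaluate via HTTP GET and checks the group counts in the
// returned MeasureReport.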
@Test
public void MeasureProcessingTest() {
putResource("measure-processing-library.json", "col-logic");
putResource("measure-processing-measure.json", "col");
putResource("measure-processing-procedure.json", "Procedure-9");
putResource("measure-processing-condition.json", "Condition-13");
putResource("measure-processing-valueset-1.json", "2.16.840.1.113883.3.464.1003.108.11.1001");
putResource("measure-processing-valueset-2.json", "2.16.840.1.113883.3.464.1003.198.12.1019");
putResource("measure-processing-valueset-3.json", "2.16.840.1.113883.3.464.1003.108.12.1020");
putResource("measure-processing-valueset-4.json", "2.16.840.1.113883.3.464.1003.198.12.1010");
putResource("measure-processing-valueset-5.json", "2.16.840.1.113883.3.464.1003.198.12.1011");
Parameters inParams = new Parameters();
inParams.addParameter().setName("patient").setValue(new StringType("Patient-12214"));
inParams.addParameter().setName("startPeriod").setValue(new DateType("2001-01-01"));
inParams.addParameter().setName("endPeriod").setValue(new DateType("2015-03-01"));
Parameters outParams = ourClient
.operation()
.onInstance(new IdDt("Measure", "col"))
.named("$evaluate")
.withParameters(inParams)
.useHttpGet()
.execute();
List<Parameters.ParametersParameterComponent> response = outParams.getParameter();
Assert.assertTrue(!response.isEmpty());
Parameters.ParametersParameterComponent component = response.get(0);
Assert.assertTrue(component.getResource() instanceof MeasureReport);
MeasureReport report = (MeasureReport) component.getResource();
Assert.assertTrue(report.getEvaluatedResources() != null);
for (MeasureReport.MeasureReportGroupComponent group : report.getGroup()) {
if (group.getIdentifier().getValue().equals("history-of-colorectal-cancer")) {
Assert.assertTrue(group.getPopulation().get(0).getCount() > 0);
}
if (group.getIdentifier().getValue().equals("history-of-total-colectomy")) {
Assert.assertTrue(group.getPopulation().get(0).getCount() > 0);
}
}
}
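// Invokes PlanDefinition/apply-example/$apply for the test patient and verifies the title
// of the resulting CarePlan.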
@Test
public void PlanDefinitionApplyTest() throws ClassNotFoundException {
putResource("plandefinition-apply-library.json", "plandefinitionApplyTest");
putResource("plandefinition-apply.json", "apply-example");
Parameters inParams = new Parameters();
inParams.addParameter().setName("patient").setValue(new StringType("Patient-12214"));
Parameters outParams = ourClient
.operation()
.onInstance(new IdDt("PlanDefinition", "apply-example"))
.named("$apply")
.withParameters(inParams)
.useHttpGet()
.execute();
List<Parameters.ParametersParameterComponent> response = outParams.getParameter();
Assert.assertTrue(!response.isEmpty());
Resource resource = response.get(0).getResource();
Assert.assertTrue(resource instanceof CarePlan);
CarePlan carePlan = (CarePlan) resource;
Assert.assertTrue(carePlan.getTitle().equals("This is a dynamic definition!"));
}
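// Invokes ActivityDefinition/ad-apply-example/$apply and verifies that the resulting
// ProcedureRequest has doNotPerform set (populated via the dynamicValue CQL expression
// in the referenced library).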
@Test
public void ActivityDefinitionApplyTest() {
putResource("activitydefinition-apply-library.json", "activityDefinitionApplyTest");
putResource("activitydefinition-apply.json", "ad-apply-example");
Parameters inParams = new Parameters();
inParams.addParameter().setName("patient").setValue(new StringType("Patient-12214"));
Parameters outParams = ourClient
.operation()
.onInstance(new IdDt("ActivityDefinition", "ad-apply-example"))
.named("$apply")
.withParameters(inParams)
.useHttpGet()
.execute();
List<Parameters.ParametersParameterComponent> response = outParams.getParameter();
Assert.assertTrue(!response.isEmpty());
Resource resource = response.get(0).getResource();
Assert.assertTrue(resource instanceof ProcedureRequest);
ProcedureRequest procedureRequest = (ProcedureRequest) resource;
Assert.assertTrue(procedureRequest.getDoNotPerform());
}
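// Currently disabled: posts a patient-view CDS Hooks request to the bcs-decision-support
// service endpoint and compares the returned card JSON against the expected response.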
//@Test
public void CdsHooksPatientViewTest() throws IOException {
putResource("cds-bcs-library.json", "patient-view");
putResource("cds-bcs-patient.json", "Patient-6532");
putResource("cds-bcs-plandefinition.json", "bcs-decision-support");
putResource("cds-bcs-activitydefinition.json", "mammogram-service-request");
// Get the CDS Hooks request
InputStream is = this.getClass().getResourceAsStream("cds-bcs-request.json");
Scanner scanner = new Scanner(is).useDelimiter("\\A");
String cdsHooksRequest = scanner.hasNext() ? scanner.next() : "";
byte[] data = cdsHooksRequest.getBytes("UTF-8");
URL url = new URL("http://localhost:" + ourPort + "/hapi-fhir-jpaserver-cds/cds-services/bcs-decision-support");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setRequestProperty("Content-Type", "application/json");
conn.setRequestProperty("Content-Length", String.valueOf(data.length));
conn.setDoOutput(true);
conn.getOutputStream().write(data);
StringBuilder response = new StringBuilder();
try(Reader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8")))
{
for (int i; (i = in.read()) >= 0;) {
response.append((char) i);
}
}
String expected = "{\n" +
" \"cards\": [\n" +
" {\n" +
" \"summary\": \"High risk for opioid overdose - taper now\",\n" +
" \"indicator\": \"warning\",\n" +
" \"detail\": \"Total morphine milligram equivalent (MME) is 20200.700mg/d. Taper to less than 50.\"\n" +
" }\n" +
" ]\n" +
"}";
Assert.assertTrue(
response.toString().replaceAll("\\s+", "")
.equals(expected.replaceAll("\\s+", ""))
);
}
}
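// Test utility: asks the OS for a free ephemeral port by binding a ServerSocket to port 0,
// records it, and returns it so each test run can start Jetty on an unused port.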
class RandomServerPortProvider {
private static List<Integer> ourPorts = new ArrayList<>();
static int findFreePort() {
ServerSocket server;
try {
server = new ServerSocket(0);
int port = server.getLocalPort();
ourPorts.add(port);
server.close();
Thread.sleep(500);
return port;
} catch (IOException | InterruptedException e) {
throw new Error(e);
}
}
public static List<Integer> list() {
return ourPorts;
}
}

View File

@ -0,0 +1,19 @@
{
"resourceType": "Library",
"id": "activityDefinitionApplyTest",
"version": "1.0",
"status": "draft",
"type": {
"coding": [
{
"code": "logic-library"
}
]
},
"content": [
{
"contentType": "text/cql",
"data": "bGlicmFyeSBhY3Rpdml0eURlZmluaXRpb25BcHBseVRlc3QgdmVyc2lvbiAnMS4wJw0KDQpkZWZpbmUgIkR5bmFtaWMgZG9Ob3RQZXJmb3JtIFNldHRpbmciOg0KICAgIHRydWU="
}
]
}

View File

@ -0,0 +1,32 @@
{
"resourceType": "ActivityDefinition",
"id": "ad-apply-example",
"text": {
"status": "generated",
"div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">ActivityDefinition $apply operation example.</div>"
},
"status": "draft",
"description": "This is a test.",
"library": [
{
"reference": "Library/activityDefinitionApplyTest"
}
],
"kind": "ProcedureRequest",
"code": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "303653007",
"display": "Computed tomography of head"
}
]
},
"dynamicValue": [
{
"description": "Set ProcedureRequest doNotPerform property",
"path": "doNotPerform",
"expression": "activityDefinitionApplyTest.\"Dynamic doNotPerform Setting\""
}
]
}

View File

@ -0,0 +1,37 @@
{
"resourceType": "ActivityDefinition",
"id": "mammogram-service-request",
"text": {
"status": "generated",
"div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">Create ServiceRequest for Mammogrm Procedure</div>"
},
"status": "draft",
"description": "Create ServiceRequest for Mammogram Procedure",
"kind": "ProcedureRequest",
"code": {
"coding": [
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "77056",
"display": "Mammography; bilateral"
}
]
},
"timingTiming": {
"_event": [
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/cqif-basic-cqlExpression",
"valueString": "Now()"
}
]
}
]
},
"participant": [
{
"type": "practitioner"
}
]
}

View File

@ -0,0 +1,96 @@
{
"resourceType": "Patient",
"id": "Patient-6532",
"extension": [
{
"url": "http://hl7.org/fhir/us/core/StructureDefinition/us-core-race",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Race",
"code": "2106-3",
"display": "White"
}
]
}
},
{
"url": "http://hl7.org/fhir/us/core/StructureDefinition/us-core-ethnicity",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Ethnicity",
"code": "2186-5",
"display": "Not Hispanic or Latino"
}
]
}
},
{
"url": "http://hl7.org/fhir/us/core/StructureDefinition/us-core-religion",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/ReligiousAffiliation",
"code": "1041",
"display": "Roman Catholic Church"
}
]
}
}
],
"identifier": [
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/identifier-type",
"code": "SB",
"display": "Social Beneficiary Identifier"
}
],
"text": "US Social Security Number"
},
"system": "http://hl7.org/fhir/sid/us-ssn",
"value": "000006532"
}
],
"active": true,
"name": [
{
"family": "Brandt",
"given": [
"Edith",
"Elaine"
]
}
],
"telecom": [
{
"system": "phone",
"value": "616-555-1082",
"use": "home"
},
{
"system": "phone",
"value": "616-555-1211",
"use": "mobile"
}
],
"gender": "female",
"birthDate": "1987-07-16",
"address": [
{
"use": "home",
"type": "postal",
"line": [
"893 N Elm Drive"
],
"city": "Grand Rapids",
"district": "Kent County",
"state": "MI",
"postalCode": "49504"
}
]
}

View File

@ -0,0 +1,33 @@
{
"resourceType": "PlanDefinition",
"id": "bcs-decision-support",
"status": "draft",
"library": {
"reference": "Library/patient-view"
},
"action": [
{
"condition": [
{
"kind": "applicability",
"language": "text/cql",
"expression": "Does Patient Qualify?"
}
],
"action": [
{
"condition": [
{
"kind": "applicability",
"language": "text/cql",
"expression": "Needs Mammogram"
}
],
"definition": {
"reference": "ActivityDefinition/mammogram-service-request"
}
}
]
}
]
}

View File

@ -0,0 +1,9 @@
{
"hookInstance": "d1577c69-dfbe-44ad-ba6d-3e05e953b2ea",
"fhirServer": "https://sb-fhir-dstu2.smarthealthit.org/smartdstu2/open",
"hook": "patient-view",
"user": "Practitioner/example",
"context": [],
"patient": "Patient/Patient-6535",
"prefetch": {}
}

View File

@ -0,0 +1,114 @@
{
"resourceType": "Patient",
"id": "Patient-12214",
"meta": {
"versionId": "1",
"lastUpdated": "2017-07-17T16:34:10.814+00:00"
},
"text": {
"status": "generated",
"div": "<div xmlns=\"http://www.w3.org/1999/xhtml\"><div class=\"hapiHeaderText\">2 <b>N GERIATRIC </b>Jr</div><table class=\"hapiPropertyTable\"><tbody><tr><td>Identifier</td><td>7f3672feb3b54789953e012d8aef5246</td></tr><tr><td>Address</td><td><span>202 Burlington Rd. </span><br/><span>Bedford </span><span>MA </span></td></tr><tr><td>Date of birth</td><td><span>07 May 1946</span></td></tr></tbody></table></div>"
},
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-race",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Race",
"code": "2106-3",
"display": "White"
}
]
}
},
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-ethnicity",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Ethnicity",
"code": "2186-5",
"display": "Not Hispanic or Latino"
}
]
}
},
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-religion",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/ReligiousAffiliation",
"code": "1007",
"display": "Atheism"
}
]
}
}
],
"identifier": [
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/identifier-type",
"code": "SB",
"display": "Social Beneficiary Identifier"
}
],
"text": "Michigan Common Key Service Identifier"
},
"system": "http://mihin.org/fhir/cks",
"value": "7f3672feb3b54789953e012d8aef5246"
}
],
"active": false,
"name": [
{
"family": "N Geriatric",
"given": [
"2"
],
"suffix": [
"Jr"
]
}
],
"telecom": [
{
"system": "phone",
"value": "586-555-7576",
"use": "home"
},
{
"system": "phone",
"value": "586-555-0297",
"use": "work"
},
{
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-direct",
"valueBoolean": true
}
],
"system": "email",
"value": "2.N.Geriatric@direct.mihintest.org",
"use": "home"
}
],
"gender": "male",
"birthDate": "1946-05-07",
"address": [
{
"line": [
"202 Burlington Rd."
],
"city": "Bedford",
"state": "MA",
"postalCode": "01730"
}
]
}

View File

@ -0,0 +1,173 @@
{
"resourceType": "Practitioner",
"id": "Practitioner-12208",
"meta": {
"versionId": "1",
"lastUpdated": "2017-07-17T16:34:10.814+00:00"
},
"extension": [
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-race",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Race",
"code": "2056-0",
"display": "Black"
}
]
}
},
{
"url": "http://hl7.org/fhir/StructureDefinition/us-core-ethnicity",
"valueCodeableConcept": {
"coding": [
{
"system": "http://hl7.org/fhir/v3/Ethnicity",
"code": "2186-5",
"display": "Not Hispanic or Latino"
}
]
}
},
{
"url": "http://gov.onc.fhir.extension.taxonomy",
"valueCodeableConcept": {
"coding": [
{
"system": "http://org.nucc.taxonomy",
"code": "208D00000X",
"display": "General Practice"
}
]
}
},
{
"url": "http://org.mihin.fhir.extension.electronic-service",
"valueReference": {
"reference": "ElectronicService/ElectronicService-2415",
"display": "Jay.M.Sawyer@direct.mihintest.org"
}
}
],
"identifier": [
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/identifier-type",
"code": "SB",
"display": "Social Beneficiary Identifier"
}
],
"text": "US Social Security Number"
},
"system": "http://hl7.org/fhir/sid/us-ssn",
"value": "000012208"
},
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/v2/0203",
"code": "PRN",
"display": "Provider number"
}
],
"text": "US National Provider Identifier"
},
"system": "http://hl7.org/fhir/sid/us-npi",
"value": "9999912208"
},
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/identifier-type",
"code": "SB",
"display": "Social Beneficiary Identifier"
}
],
"text": "Michigan Common Key Service Identifier"
},
"system": "http://mihin.org/fhir/cks",
"value": "c6cc1bbaf5ea41c5a0d267e3a655def1"
}
],
"name": [
{
"family": "Sawyer",
"given": [
"Jay",
"McCann"
],
"suffix": [
"MD"
]
}
],
"telecom": [
{
"system": "phone",
"value": "989-555-8443",
"use": "home"
},
{
"system": "phone",
"value": "989-555-5764",
"use": "work"
}
],
"address": [
{
"line": [
"77 S Pine Place"
],
"city": "Beaverton",
"state": "MI",
"postalCode": "48612"
}
],
"gender": "male",
"birthDate": "1970-08-07",
"qualification": [
{
"identifier": [
{
"use": "official",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/v2/0203",
"code": "MD",
"display": "Medical License number"
}
],
"text": "Michigan Medical License"
},
"system": "http://michigan.gov/fhir/medical-license",
"value": "LARA-12208",
"assigner": {
"display": "State of Michigan"
}
}
],
"code": {
"coding": [
{
"system": "http://michigan.gov/lara/license-type",
"code": "4305",
"display": "Medical Doctor"
}
]
},
"issuer": {
"reference": "Organization/Organization-2000",
"display": "Michigan Department of Licensing and Regulatory Affairs"
}
}
]
}

View File

@ -0,0 +1,319 @@
<?xml version="1.0" encoding="UTF-8"?>
<library xmlns="urn:hl7-org:elm:r1" xmlns:t="urn:hl7-org:elm-types:r1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:fhir="http://hl7.org/fhir" xmlns:a="urn:hl7-org:cql-annotations:r1">
<identifier id="COL" version="1"/>
<schemaIdentifier id="urn:hl7-org:elm" version="r1"/>
<usings>
<def localIdentifier="System" uri="urn:hl7-org:elm-types:r1"/>
<def localIdentifier="FHIR" uri="http://hl7.org/fhir" version="1.6"/>
</usings>
<parameters>
<def name="MeasurementPeriod" accessLevel="Public">
<parameterTypeSpecifier xsi:type="IntervalTypeSpecifier">
<pointType name="t:DateTime" xsi:type="NamedTypeSpecifier"/>
</parameterTypeSpecifier>
</def>
</parameters>
<codeSystems>
<def name="CPT" id="urn:oid:2.16.840.1.113883.6.12" accessLevel="Public"/>
<def name="SNOMED-CT" id="urn:oid:2.16.840.1.113883.6.96" accessLevel="Public"/>
<def name="LOINC" id="http://loinc.org" accessLevel="Public"/>
</codeSystems>
<valueSets>
<def name="Malignant Neoplasm of Colon" id="2.16.840.1.113883.3.464.1003.108.11.1001" accessLevel="Public"/>
<def name="Total Colectomy" id="2.16.840.1.113883.3.464.1003.198.12.1019" accessLevel="Public"/>
<def name="Colonoscopy" id="2.16.840.1.113883.3.464.1003.108.12.1020" accessLevel="Public"/>
<def name="Flexible Sigmoidoscopy" id="2.16.840.1.113883.3.464.1003.198.12.1010" accessLevel="Public"/>
<def name="Fecal Occult Blood Test (FOBT)" id="2.16.840.1.113883.3.464.1003.198.12.1011" accessLevel="Public"/>
</valueSets>
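<!-- Patient-context expression definitions; these names (e.g. "In Demographic",
     "Hx Colorectal Cancer") are referenced as population criteria by the COL
     screening Measure. -->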
<statements>
<def name="Patient" context="Patient">
<expression xsi:type="SingletonFrom">
<operand dataType="fhir:Patient" xsi:type="Retrieve"/>
</expression>
</def>
<def name="Lookback Interval One Year" context="Patient" accessLevel="Public">
<expression lowClosed="true" highClosed="true" xsi:type="Interval">
<low xsi:type="Subtract">
<operand xsi:type="Start">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</operand>
<operand value="1" unit="years" xsi:type="Quantity"/>
</low>
<high xsi:type="End">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</high>
</expression>
</def>
<def name="Lookback Interval Five Years" context="Patient" accessLevel="Public">
<expression lowClosed="true" highClosed="true" xsi:type="Interval">
<low xsi:type="Subtract">
<operand xsi:type="Start">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</operand>
<operand value="5" unit="years" xsi:type="Quantity"/>
</low>
<high xsi:type="End">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</high>
</expression>
</def>
<def name="Lookback Interval Ten Years" context="Patient" accessLevel="Public">
<expression lowClosed="true" highClosed="true" xsi:type="Interval">
<low xsi:type="Subtract">
<operand xsi:type="Start">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</operand>
<operand value="10" unit="years" xsi:type="Quantity"/>
</low>
<high xsi:type="End">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</high>
</expression>
</def>
<def name="In Demographic" context="Patient" accessLevel="Public">
<expression xsi:type="GreaterOrEqual">
<operand precision="Year" xsi:type="CalculateAgeAt">
<operand path="birthDate.value" xsi:type="Property">
<source name="Patient" xsi:type="ExpressionRef"/>
</operand>
<operand xsi:type="Start">
<operand name="MeasurementPeriod" xsi:type="ParameterRef"/>
</operand>
</operand>
<operand valueType="t:Integer" value="50" xsi:type="Literal"/>
</expression>
</def>
<def name="Hx Colorectal Cancer" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="C">
<expression dataType="fhir:Condition" codeProperty="code" xsi:type="Retrieve">
<codes name="Malignant Neoplasm of Colon" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="clinicalStatus" scope="C" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="active" xsi:type="Literal"/>
</operand>
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="verificationStatus" scope="C" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="confirmed" xsi:type="Literal"/>
</operand>
</where>
</expression>
</def>
<def name="Hx Total Colectomy" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="T">
<expression dataType="fhir:Procedure" codeProperty="code" xsi:type="Retrieve">
<codes name="Total Colectomy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="T" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="completed" xsi:type="Literal"/>
</where>
</expression>
</def>
<def name="Colonoscopy Performed" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="C">
<expression dataType="fhir:Procedure" codeProperty="code" xsi:type="Retrieve">
<codes name="Colonoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="C" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="completed" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="end" xsi:type="Property">
<source path="performedPeriod" scope="C" xsi:type="Property"/>
</source>
</operand>
<operand name="Lookback Interval Ten Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="Colonoscopy Results" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="C">
<expression dataType="fhir:Observation" codeProperty="code" xsi:type="Retrieve">
<codes name="Colonoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="C" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="final" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="effectiveDateTime" scope="C" xsi:type="Property"/>
</operand>
<operand name="Lookback Interval Ten Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="Sigmoidoscopy Procedure" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="S">
<expression dataType="fhir:Procedure" codeProperty="code" xsi:type="Retrieve">
<codes name="Flexible Sigmoidoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="S" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="completed" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="end" xsi:type="Property">
<source path="performedPeriod" scope="S" xsi:type="Property"/>
</source>
</operand>
<operand name="Lookback Interval Five Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="Sigmoidoscopy Observation" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="O">
<expression dataType="fhir:Observation" codeProperty="code" xsi:type="Retrieve">
<codes name="Flexible Sigmoidoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="O" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="final" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="effectiveDateTime" scope="O" xsi:type="Property"/>
</operand>
<operand name="Lookback Interval Five Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="FOBT Procedure" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="F">
<expression dataType="fhir:Procedure" codeProperty="code" xsi:type="Retrieve">
<codes name="Fecal Occult Blood Test (FOBT)" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="F" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="completed" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="end" xsi:type="Property">
<source path="performedPeriod" scope="F" xsi:type="Property"/>
</source>
</operand>
<operand name="Lookback Interval One Year" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="FOBT Observation" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="O">
<expression dataType="fhir:Observation" codeProperty="code" xsi:type="Retrieve">
<codes name="Fecal Occult Blood Test (FOBT)" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="O" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="final" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="effectiveDateTime" scope="O" xsi:type="Property"/>
</operand>
<operand name="Lookback Interval One Year" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="Colonoscopy Procedure" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="C">
<expression dataType="fhir:Procedure" codeProperty="code" xsi:type="Retrieve">
<codes name="Colonoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="C" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="completed" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="end" xsi:type="Property">
<source path="performedPeriod" scope="C" xsi:type="Property"/>
</source>
</operand>
<operand name="Lookback Interval Ten Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
<def name="Colonoscopy Observation" context="Patient" accessLevel="Public">
<expression xsi:type="Query">
<source alias="O">
<expression dataType="fhir:Observation" codeProperty="code" xsi:type="Retrieve">
<codes name="Colonoscopy" xsi:type="ValueSetRef"/>
</expression>
</source>
<where xsi:type="And">
<operand xsi:type="Equal">
<operand path="value" xsi:type="Property">
<source path="status" scope="O" xsi:type="Property"/>
</operand>
<operand valueType="t:String" value="final" xsi:type="Literal"/>
</operand>
<operand xsi:type="In">
<operand path="value" xsi:type="Property">
<source path="effectiveDateTime" scope="O" xsi:type="Property"/>
</operand>
<operand name="Lookback Interval Ten Years" xsi:type="ExpressionRef"/>
</operand>
</where>
</expression>
</def>
</statements>
</library>

View File

@ -0,0 +1,138 @@
<?xml version="1.0" encoding="UTF-8"?>
<Measure xmlns="http://hl7.org/fhir" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://hl7.org/fhir ../../schema/measure.xsd">
<id value="col"/>
<text>
<status value="additional"/>
<div xmlns="http://www.w3.org/1999/xhtml">
Cohort definition for Colorectal Cancer Screening.
</div>
</text>
<identifier>
<use value="official"/>
<system value="http://hl7.org/fhir/cqi/ecqm/Measure/Identifier/payer-extract"/>
<value value="COL"/>
</identifier>
<version value="1.0.0"/>
<title value="Colorectal Cancer Screening. Cohort Definition"/>
<status value="active"/>
<experimental value="true"/>
<description value="Colorectal Cancer Screening. Cohort Definition"/>
<topic>
<coding>
<system value="http://hl7.org/fhir/c80-doc-typecodes"/>
<code value="57024-2"/>
</coding>
</topic>
<library>
<reference value="Library/col-logic"/>
</library>
<scoring value="cohort"/>
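<!-- Each group below defines one cohort population; its criteria value names an
     expression defined in the referenced COL logic library. -->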
<group>
<identifier>
<value value="in-demographic"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="in-demographic"/>
</identifier>
<criteria value="In Demographic"/>
</population>
</group>
<group>
<identifier>
<value value="history-of-colorectal-cancer"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="history-of-colorectal-cancer"/>
</identifier>
<criteria value="Hx Colorectal Cancer"/>
</population>
</group>
<group>
<identifier>
<value value="history-of-total-colectomy"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="history-of-total-colectomy"/>
</identifier>
<criteria value="Hx Total Colectomy"/>
</population>
</group>
<group>
<identifier>
<value value="colonoscopy-performed"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="colonoscopy-performed"/>
</identifier>
<criteria value="Colonoscopy Performed"/>
</population>
</group>
<group>
<identifier>
<value value="colonoscopy-results"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="colonoscopy-results"/>
</identifier>
<criteria value="Colonoscopy Results"/>
</population>
</group>
<group>
<identifier>
<value value="sigmoidoscopy-procedure"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="sigmoidoscopy-procedure"/>
</identifier>
<criteria value="Sigmoidoscopy Procedure"/>
</population>
</group>
<group>
<identifier>
<value value="sigmoidoscopy-observation"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="sigmoidoscopy-observation"/>
</identifier>
<criteria value="Sigmoidoscopy Observation"/>
</population>
</group>
<group>
<identifier>
<value value="fobt-procedure"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="fobt-procedure"/>
</identifier>
<criteria value="FOBT Procedure"/>
</population>
</group>
<group>
<identifier>
<value value="fobt-observation"/>
</identifier>
<population>
<type value="initial-population"/>
<identifier>
<value value="fobt-observation"/>
</identifier>
<criteria value="FOBT Observation"/>
</population>
</group>
</Measure>

View File

@ -0,0 +1,49 @@
{
"resourceType": "Condition",
"id": "Condition-13",
"meta": {
"versionId": "1",
"lastUpdated": "2017-09-09T21:52:17.035-06:00"
},
"extension": [
{
"url": "http://mihin.org/fhir/templateId",
"valueString": "2.16.840.1.113883.10.20.22.4.3"
},
{
"url": "http://mihin.org/fhir/templateId",
"valueString": "2.16.840.1.113883.10.20.24.3.137"
}
],
"clinicalStatus": "active",
"verificationStatus": "confirmed",
"category": [
{
"coding": [
{
"system": "http://hl7.org/fhir/condition-category",
"code": "diagnosis",
"display": "Diagnosis"
}
],
"text": "This is a judgment made by a healthcare provider that the patient has a particular disease or condition"
}
],
"code": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "363414004"
}
],
"text": "Diagnosis: Malignant Neoplasm Of Colon"
},
"subject": {
"reference": "Patient/Patient-12214",
"display": "2 N Geriatric Jr"
},
"asserter": {
"reference": "Practitioner/Practitioner-12208",
"display": "Jay McCann Sawyer MD"
}
}

View File

@ -0,0 +1,158 @@
{
"resourceType": "Measure",
"id": "col",
"meta": {
"versionId": "1",
"lastUpdated": "2017-09-09T21:26:03.890-06:00"
},
"text": {
"status": "additional",
"div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">\n Cohort definition for Colorectal Cancer Screening.\n </div>"
},
"identifier": [
{
"use": "official",
"system": "http://hl7.org/fhir/cqi/ecqm/Measure/Identifier/payer-extract",
"value": "COL"
}
],
"version": "1.0.0",
"title": "Colorectal Cancer Screening. Cohort Definition",
"status": "active",
"experimental": true,
"description": "Colorectal Cancer Screening. Cohort Definition",
"topic": [
{
"coding": [
{
"system": "http://hl7.org/fhir/c80-doc-typecodes",
"code": "57024-2"
}
]
}
],
"library": [
{
"reference": "Library/col-logic"
}
],
"group": [
{
"identifier": {
"value": "in-demographic"
},
"population": [
{
"identifier": {
"value": "in-demographic"
},
"criteria": "In Demographic"
}
]
},
{
"identifier": {
"value": "history-of-colorectal-cancer"
},
"population": [
{
"identifier": {
"value": "history-of-colorectal-cancer"
},
"criteria": "Hx Colorectal Cancer"
}
]
},
{
"identifier": {
"value": "history-of-total-colectomy"
},
"population": [
{
"identifier": {
"value": "history-of-total-colectomy"
},
"criteria": "Hx Total Colectomy"
}
]
},
{
"identifier": {
"value": "colonoscopy-performed"
},
"population": [
{
"identifier": {
"value": "colonoscopy-performed"
},
"criteria": "Colonoscopy Performed"
}
]
},
{
"identifier": {
"value": "colonoscopy-results"
},
"population": [
{
"identifier": {
"value": "colonoscopy-results"
},
"criteria": "Colonoscopy Results"
}
]
},
{
"identifier": {
"value": "sigmoidoscopy-procedure"
},
"population": [
{
"identifier": {
"value": "sigmoidoscopy-procedure"
},
"criteria": "Sigmoidoscopy Procedure"
}
]
},
{
"identifier": {
"value": "sigmoidoscopy-observation"
},
"population": [
{
"identifier": {
"value": "sigmoidoscopy-observation"
},
"criteria": "Sigmoidoscopy Observation"
}
]
},
{
"identifier": {
"value": "fobt-procedure"
},
"population": [
{
"identifier": {
"value": "fobt-procedure"
},
"criteria": "FOBT Procedure"
}
]
},
{
"identifier": {
"value": "fobt-observation"
},
"population": [
{
"identifier": {
"value": "fobt-observation"
},
"criteria": "FOBT Observation"
}
]
}
]
}

View File

@ -0,0 +1,68 @@
{
"resourceType": "Procedure",
"id": "Procedure-9",
"meta": {
"versionId": "1",
"lastUpdated": "2017-09-09T21:52:35.933-06:00"
},
"extension": [
{
"url": "http://mihin.org/fhir/templateId",
"valueString": "2.16.840.1.113883.10.20.24.3.64"
},
{
"url": "http://mihin.org/fhir/templateId",
"valueString": "2.16.840.1.113883.10.20.22.4.14"
}
],
"identifier": [
{
"system": "http://hl7.org/fhir/identifier",
"value": "1.3.6.1.4.1.115:579f4eb5aeac500a550c5c7b"
}
],
"status": "completed",
"category": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "387713003",
"display": "Surgical Procedure"
}
]
},
"code": {
"coding": [
{
"system": "http://snomed.info/sct",
"code": "36192008"
}
],
"text": "Procedure, Performed: Total Colectomy"
},
"subject": {
"reference": "Patient/Patient-12214",
"display": "2 N Geriatric Jr"
},
"performedPeriod": {
"start": "2010-10-12T06:00:00-04:00",
"end": "2010-10-12T08:15:00-04:00"
},
"performer": [
{
"role": {
"coding": [
{
"system": "http://hl7.org/fhir/ValueSet/performer-role",
"code": "112247003",
"display": "Medical doctor (occupation)"
}
]
},
"actor": {
"reference": "Practitioner/Practitioner-12208",
"display": "Jay McCann Sawyer MD"
}
}
]
}

View File

@ -0,0 +1,416 @@
{
"resourceType": "ValueSet",
"id": "2.16.840.1.113883.3.464.1003.108.11.1001",
"meta": {
"versionId": "3",
"lastUpdated": "2017-07-25T09:54:33.579+00:00"
},
"url": "http://measure.eval.kanvix.com/cqf-ruler/baseDstu3/Valueset/2.16.840.1.113883.3.464.1003.108.11.1001",
"name": "Malignant Neoplasm of Colon (SNOMED CT) eCQM",
"status": "active",
"compose": {
"include": [
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"concept": [
{
"code": "187758006"
},
{
"code": "109838007"
},
{
"code": "1701000119104"
},
{
"code": "187757001"
},
{
"code": "269533000"
},
{
"code": "269544008"
},
{
"code": "285312008"
},
{
"code": "285611007"
},
{
"code": "301756000"
},
{
"code": "312111009"
},
{
"code": "312112002"
},
{
"code": "312113007"
},
{
"code": "312114001"
},
{
"code": "312115000"
},
{
"code": "314965007"
},
{
"code": "315058005"
},
{
"code": "363406005"
},
{
"code": "363407001"
},
{
"code": "363408006"
},
{
"code": "363409003"
},
{
"code": "363410008"
},
{
"code": "363412000"
},
{
"code": "363413005"
},
{
"code": "363414004"
},
{
"code": "363510005"
},
{
"code": "425178004"
},
{
"code": "449218003"
},
{
"code": "93683002"
},
{
"code": "93761005"
},
{
"code": "93771007"
},
{
"code": "93826009"
},
{
"code": "93980002"
},
{
"code": "94006002"
},
{
"code": "94072004"
},
{
"code": "94105000"
},
{
"code": "94179005"
},
{
"code": "94260004"
},
{
"code": "94271003"
},
{
"code": "94328005"
},
{
"code": "94509004"
},
{
"code": "94538001"
},
{
"code": "94604000"
},
{
"code": "94643001"
}
]
}
]
},
"expansion": {
"identifier": "http://open-api2.hspconsortium.org/payerextract/data/ValueSet/2.16.840.1.113883.3.464.1003.108.11.1001",
"timestamp": "2016-09-19T14:05:21.939-04:00",
"total": 43,
"offset": 0,
"contains": [
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "425178004",
"display": "Adenocarcinoma of rectosigmoid junction"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "301756000",
"display": "Adenocarcinoma of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "312111009",
"display": "Carcinoma of ascending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "269533000",
"display": "Carcinoma of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "312113007",
"display": "Carcinoma of descending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "312114001",
"display": "Carcinoma of hepatic flexure"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "285312008",
"display": "Carcinoma of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "312115000",
"display": "Carcinoma of splenic flexure"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "269544008",
"display": "Carcinoma of the rectosigmoid junction"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "312112002",
"display": "Carcinoma of transverse colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "315058005",
"display": "Lynch syndrome"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "314965007",
"display": "Local recurrence of malignant tumor of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "449218003",
"display": "Lymphoma of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "187758006",
"display": "Malignant neoplasm of other specified sites of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "187757001",
"display": "Malignant neoplasm, overlapping lesion of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363412000",
"display": "Malignant tumor of ascending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363406005",
"display": "Malignant tumor of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363409003",
"display": "Malignant tumor of descending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363407001",
"display": "Malignant tumor of hepatic flexure"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363510005",
"display": "Malignant tumor of large intestine"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363414004",
"display": "Malignant tumor of rectosigmoid junction"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363410008",
"display": "Malignant tumor of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363413005",
"display": "Malignant tumor of splenic flexure"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "363408006",
"display": "Malignant tumor of transverse colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "285611007",
"display": "Metastasis to colon of unknown primary"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "109838007",
"display": "Overlapping malignant neoplasm of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "1701000119104",
"display": "Primary adenocarcinoma of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "93683002",
"display": "Primary malignant neoplasm of ascending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "93761005",
"display": "Primary malignant neoplasm of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "93771007",
"display": "Primary malignant neoplasm of descending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "93826009",
"display": "Primary malignant neoplasm of hepatic flexure of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "93980002",
"display": "Primary malignant neoplasm of rectosigmoid junction"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94006002",
"display": "Primary malignant neoplasm of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94072004",
"display": "Primary malignant neoplasm of splenic flexure of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94105000",
"display": "Primary malignant neoplasm of transverse colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94179005",
"display": "Secondary malignant neoplasm of ascending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94260004",
"display": "Secondary malignant neoplasm of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94271003",
"display": "Secondary malignant neoplasm of descending colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94328005",
"display": "Secondary malignant neoplasm of hepatic flexure of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94509004",
"display": "Secondary malignant neoplasm of rectosigmoid junction"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94538001",
"display": "Secondary malignant neoplasm of sigmoid colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94604000",
"display": "Secondary malignant neoplasm of splenic flexure of colon"
},
{
"system": "http://snomed.info/sct",
"version": "2015.03.14AB",
"code": "94643001",
"display": "Secondary malignant neoplasm of transverse colon"
}
]
}
}

View File

@ -0,0 +1,181 @@
{
"resourceType": "ValueSet",
"id": "2.16.840.1.113883.3.464.1003.198.12.1019",
"meta": {
"versionId": "3",
"lastUpdated": "2017-07-25T09:54:33.579+00:00"
},
"url": "http://measure.eval.kanvix.com/cql-measure-processor/baseDstu3/Valueset/2.16.840.1.113883.3.464.1003.198.12.1019 ",
"name": "Total Colectomy eMeasure",
"compose": {
"include": [
{
"system": "http://www.ama-assn.org/go/cpt",
"version": "2016.1.15AA",
"concept": [
{
"code": "44156"
},
{
"code": "44158"
},
{
"code": "44157"
},
{
"code": "44155"
},
{
"code": "44151"
},
{
"code": "44150"
},
{
"code": "44211"
},
{
"code": "44212"
},
{
"code": "44210"
},
{
"code": "44153"
},
{
"code": "44152"
}
]
},
{
"system": "http://snomed.info/sct",
"version": "2015.09.15AA",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "26390003"
}
]
}
]
},
"expansion": {
"timestamp": "2016-09-20T12:32:19.296-04:00",
"total": 22,
"offset": 0,
"contains": [
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44156",
"display": "Colectomy, total, abdominal, with proctectomy; with continent ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44158",
"display": "Colectomy, total, abdominal, with proctectomy; with ileoanal anastomosis, creation of ileal reservoir (S or J), includes loop ileostomy, and rectal mucosectomy, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44157",
"display": "Colectomy, total, abdominal, with proctectomy; with ileoanal anastomosis, includes loop ileostomy, and rectal mucosectomy, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44155",
"display": "Colectomy, total, abdominal, with proctectomy; with ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44151",
"display": "Colectomy, total, abdominal, without proctectomy; with continent ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44150",
"display": "Colectomy, total, abdominal, without proctectomy; with ileostomy or ileoproctostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44153",
"display": "Colectomy, total, abdominal, without proctectomy; with rectal mucosectomy, ileoanal anastomosis, creation of ileal reservoir (S or J), with or without loop ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44152",
"display": "Colectomy, total, abdominal, without proctectomy; with rectal mucosectomy, ileoanal anastomosis, with or without loop ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44211",
"display": "Laparoscopy, surgical; colectomy, total, abdominal, with proctectomy, with ileoanal anastomosis, creation of ileal reservoir (S or J), with loop ileostomy, includes rectal mucosectomy, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44212",
"display": "Laparoscopy, surgical; colectomy, total, abdominal, with proctectomy, with ileostomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44210",
"display": "Laparoscopy, surgical; colectomy, total, abdominal, without proctectomy, with ileostomy or ileoproctostomy"
},
{
"system": "http://snomed.info/sct",
"code": "303401008",
"display": "Parks panproctocolectomy, anastomosis of ileum to anus and creation of pouch"
},
{
"system": "http://snomed.info/sct",
"code": "235331003",
"display": "Restorative proctocolectomy"
},
{
"system": "http://snomed.info/sct",
"code": "36192008",
"display": "Total abdominal colectomy with ileoproctostomy"
},
{
"system": "http://snomed.info/sct",
"code": "456004",
"display": "Total abdominal colectomy with ileostomy"
},
{
"system": "http://snomed.info/sct",
"code": "44751009",
"display": "Total abdominal colectomy with proctectomy and continent ileostomy"
},
{
"system": "http://snomed.info/sct",
"code": "31130001",
"display": "Total abdominal colectomy with proctectomy and ileostomy"
},
{
"system": "http://snomed.info/sct",
"code": "80294005",
"display": "Total abdominal colectomy with rectal mucosectomy and ileoanal anastomosis"
},
{
"system": "http://snomed.info/sct",
"code": "26390003",
"display": "Total colectomy"
},
{
"system": "http://snomed.info/sct",
"code": "307666008",
"display": "Total colectomy and ileostomy"
},
{
"system": "http://snomed.info/sct",
"code": "307669001",
"display": "Total colectomy, ileostomy and closure of rectal stump"
},
{
"system": "http://snomed.info/sct",
"code": "307667004",
"display": "Total colectomy, ileostomy and rectal mucous fistula"
}
]
}
}
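
Note: this ValueSet (and the ones that follow) carries both a "compose" definition (explicit CPT codes plus a SNOMED CT is-a filter) and a pre-calculated "expansion". A server can recompute the expansion at any time through the $expand operation. The sketch below is illustrative only and is not part of this change set; it assumes the standard HAPI DSTU3 generic client API and uses a placeholder base URL.

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.dstu3.model.IdType;
import org.hl7.fhir.dstu3.model.Parameters;
import org.hl7.fhir.dstu3.model.ValueSet;

public class ExpandValueSetExample {
    public static void main(String[] args) {
        FhirContext ctx = FhirContext.forDstu3();
        IGenericClient client = ctx.newRestfulGenericClient("http://example.com/baseDstu3");

        // Ask the server to re-expand the stored ValueSet ($expand returns a ValueSet)
        ValueSet expanded = client
            .operation()
            .onInstance(new IdType("ValueSet/2.16.840.1.113883.3.464.1003.198.12.1019"))
            .named("$expand")
            .withNoParameters(Parameters.class)
            .returnResourceType(ValueSet.class)
            .execute();

        System.out.println("Expansion contains " + expanded.getExpansion().getContains().size() + " codes");
    }
}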

View File

@ -0,0 +1,421 @@
{
"resourceType": "ValueSet",
"id": "2.16.840.1.113883.3.464.1003.108.12.1020",
"meta": {
"versionId": "3",
"lastUpdated": "2017-07-25T09:54:33.579+00:00"
},
"url": "http://measure.eval.kanvix.com/cql-measure-processor/baseDstu3/Valueset/2.16.840.1.113883.3.464.1003.108.12.1020",
"name": "Colonoscopy eMeasure",
"compose": {
"include": [
{
"system": "http://www.ama-assn.org/go/cpt",
"version": "2015.1.14AB",
"concept": [
{
"code": "44388"
},
{
"code": "44393"
},
{
"code": "44389"
},
{
"code": "44391"
},
{
"code": "44390"
},
{
"code": "44392"
},
{
"code": "44394"
},
{
"code": "44397"
},
{
"code": "45378"
},
{
"code": "45383"
},
{
"code": "45380"
},
{
"code": "45382"
},
{
"code": "45386"
},
{
"code": "45381"
},
{
"code": "45391"
},
{
"code": "45379"
},
{
"code": "45384"
},
{
"code": "45385"
},
{
"code": "45387"
},
{
"code": "45392"
},
{
"code": "45355"
},
{
"code": "44401"
},
{
"code": "44402"
},
{
"code": "44403"
},
{
"code": "44404"
},
{
"code": "44405"
},
{
"code": "44406"
},
{
"code": "44407"
},
{
"code": "44408"
},
{
"code": "45388"
},
{
"code": "45389"
},
{
"code": "45390"
},
{
"code": "45393"
},
{
"code": "45398"
}
]
},
{
"system": "http://snomed.info/sct",
"version": "2014.07.14AA",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "73761001"
}
]
},
{
"system": "http://snomed.info/sct",
"version": "2014.07.14AA",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "174184006"
}
]
}
]
},
"expansion": {
"timestamp": "2016-09-20T13:07:55.271-04:00",
"total": 54,
"offset": 0,
"contains": [
{
"system": "http://snomed.info/sct",
"code": "310634005",
"display": "Check colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "73761001",
"display": "Colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "446745002",
"display": "Colonoscopy and biopsy of colon"
},
{
"system": "http://snomed.info/sct",
"code": "446521004",
"display": "Colonoscopy and excision of mucosa of colon"
},
{
"system": "http://snomed.info/sct",
"code": "447021001",
"display": "Colonoscopy and tattooing"
},
{
"system": "http://snomed.info/sct",
"code": "443998000",
"display": "Colonoscopy through colostomy with endoscopic biopsy of colon"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44388",
"display": "Colonoscopy through stoma; diagnostic, including collection of specimen(s) by brushing or washing, when performed (separate procedure)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44401",
"display": "Colonoscopy through stoma; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre-and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44393",
"display": "Colonoscopy through stoma; with ablation of tumor(s), polyp(s), or other lesion(s) not amenable to removal by hot biopsy forceps, bipolar cautery or snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44389",
"display": "Colonoscopy through stoma; with biopsy, single or multiple"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44391",
"display": "Colonoscopy through stoma; with control of bleeding, any method"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44408",
"display": "Colonoscopy through stoma; with decompression (for pathologic distention) (eg, volvulus, megacolon), including placement of decompression tube, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44404",
"display": "Colonoscopy through stoma; with directed submucosal injection(s), any substance"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44403",
"display": "Colonoscopy through stoma; with endoscopic mucosal resection"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44402",
"display": "Colonoscopy through stoma; with endoscopic stent placement (including pre- and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44406",
"display": "Colonoscopy through stoma; with endoscopic ultrasound examination, limited to the sigmoid, descending, transverse, or ascending colon and cecum and adjacent structures"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44390",
"display": "Colonoscopy through stoma; with removal of foreign body(s)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44392",
"display": "Colonoscopy through stoma; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44394",
"display": "Colonoscopy through stoma; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44405",
"display": "Colonoscopy through stoma; with transendoscopic balloon dilation"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44397",
"display": "Colonoscopy through stoma; with transendoscopic stent placement (includes predilation)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "44407",
"display": "Colonoscopy through stoma; with transendoscopic ultrasound guided intramural or transmural fine needle aspiration/biopsy(s), includes endoscopic ultrasound examination limited to the sigmoid, descending, transverse, or ascending colon and cecum and adjacent structures"
},
{
"system": "http://snomed.info/sct",
"code": "12350003",
"display": "Colonoscopy with rigid sigmoidoscope through colotomy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45383",
"display": "Colonoscopy, flexible, proximal to splenic flexure; with ablation of tumor(s), polyp(s), or other lesion(s) not amenable to removal by hot biopsy forceps, bipolar cautery or snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45387",
"display": "Colonoscopy, flexible, proximal to splenic flexure; with transendoscopic stent placement (includes predilation)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45378",
"display": "Colonoscopy, flexible; diagnostic, including collection of specimen(s) by brushing or washing, when performed (separate procedure)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45388",
"display": "Colonoscopy, flexible; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45398",
"display": "Colonoscopy, flexible; with band ligation(s) (eg, hemorrhoids)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45380",
"display": "Colonoscopy, flexible; with biopsy, single or multiple"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45382",
"display": "Colonoscopy, flexible; with control of bleeding, any method"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45393",
"display": "Colonoscopy, flexible; with decompression (for pathologic distention) (eg, volvulus, megacolon), including placement of decompression tube, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45381",
"display": "Colonoscopy, flexible; with directed submucosal injection(s), any substance"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45390",
"display": "Colonoscopy, flexible; with endoscopic mucosal resection"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45389",
"display": "Colonoscopy, flexible; with endoscopic stent placement (includes pre- and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45391",
"display": "Colonoscopy, flexible; with endoscopic ultrasound examination limited to the rectum, sigmoid, descending, transverse, or ascending colon and cecum, and adjacent structures"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45379",
"display": "Colonoscopy, flexible; with removal of foreign body(s)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45384",
"display": "Colonoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45385",
"display": "Colonoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45386",
"display": "Colonoscopy, flexible; with transendoscopic balloon dilation"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45392",
"display": "Colonoscopy, flexible; with transendoscopic ultrasound guided intramural or transmural fine needle aspiration/biopsy(s), includes endoscopic ultrasound examination limited to the rectum, sigmoid, descending, transverse, or ascending colon and cecum, and adjacent structures"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45355",
"display": "Colonoscopy, rigid or flexible, transabdominal via colotomy, single or multiple"
},
{
"system": "https://www.cms.gov/Medicare/Coding/MedHCPCSGenInfo/index.html",
"code": "G0105",
"display": "Colorectal cancer screening; colonoscopy on individual at high risk"
},
{
"system": "https://www.cms.gov/Medicare/Coding/MedHCPCSGenInfo/index.html",
"code": "G0121",
"display": "Colorectal cancer screening; colonoscopy on individual not meeting criteria for high risk"
},
{
"system": "http://snomed.info/sct",
"code": "427459009",
"display": "Diagnostic endoscopic examination of colonic pouch and biopsy of colonic pouch using colonoscope"
},
{
"system": "http://snomed.info/sct",
"code": "174184006",
"display": "Diagnostic endoscopic examination on colon"
},
{
"system": "http://snomed.info/sct",
"code": "367535003",
"display": "Fiberoptic colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "8180007",
"display": "Fiberoptic colonoscopy through colostomy"
},
{
"system": "http://snomed.info/sct",
"code": "25732003",
"display": "Fiberoptic colonoscopy with biopsy"
},
{
"system": "http://snomed.info/sct",
"code": "34264006",
"display": "Intraoperative colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "235151005",
"display": "Limited colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "174158000",
"display": "Open colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "444783004",
"display": "Screening colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "303587008",
"display": "Therapeutic colonoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "235150006",
"display": "Total colonoscopy"
}
]
}
}

View File

@ -0,0 +1,208 @@
{
"resourceType": "ValueSet",
"id": "2.16.840.1.113883.3.464.1003.198.12.1010",
"meta": {
"versionId": "6",
"lastUpdated": "2017-07-25T09:54:33.579+00:00"
},
"url": "http://measure.eval.kanvix.com/cql-measure-processor/baseDstu3/Valueset/2.16.840.1.113883.3.464.1003.198.12.1010",
"name": "Flexible Sigmoidoscopy eMeasure",
"compose": {
"include": [
{
"system": "http://www.ama-assn.org/go/cpt",
"version": "2015.1.14AB",
"concept": [
{
"code": "45330"
},
{
"code": "45339"
},
{
"code": "45331"
},
{
"code": "45334"
},
{
"code": "45337"
},
{
"code": "45340"
},
{
"code": "45335"
},
{
"code": "45341"
},
{
"code": "45332"
},
{
"code": "45333"
},
{
"code": "45338"
},
{
"code": "45345"
},
{
"code": "45342"
},
{
"code": "45346"
},
{
"code": "45347"
},
{
"code": "45349"
},
{
"code": "45350"
}
]
},
{
"system": "https://www.cms.gov/Medicare/Coding/MedHCPCSGenInfo/index.html",
"version": "2016.1.15AB",
"concept": [
{
"code": "G0104"
}
]
},
{
"system": "http://snomed.info/sct",
"version": "2014.07.14AA",
"filter": [
{
"property": "concept",
"op": "is-a",
"value": "44441009"
}
]
}
]
},
"expansion": {
"timestamp": "2016-09-20T13:20:03.237-04:00",
"total": 22,
"offset": 0,
"contains": [
{
"system": "https://www.cms.gov/Medicare/Coding/MedHCPCSGenInfo/index.html",
"code": "G0104",
"display": "Colorectal cancer screening; flexible sigmoidoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "425634007",
"display": "Diagnostic endoscopic examination of lower bowel and sampling for bacterial overgrowth using fiberoptic sigmoidoscope"
},
{
"system": "http://snomed.info/sct",
"code": "44441009",
"display": "Flexible fiberoptic sigmoidoscopy"
},
{
"system": "http://snomed.info/sct",
"code": "112870002",
"display": "Flexible fiberoptic sigmoidoscopy for removal of foreign body"
},
{
"system": "http://snomed.info/sct",
"code": "396226005",
"display": "Flexible fiberoptic sigmoidoscopy with biopsy"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45330",
"display": "Sigmoidoscopy, flexible; diagnostic, including collection of specimen(s) by brushing or washing, when performed (separate procedure)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45346",
"display": "Sigmoidoscopy, flexible; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45339",
"display": "Sigmoidoscopy, flexible; with ablation of tumor(s), polyp(s), or other lesion(s) not amenable to removal by hot biopsy forceps, bipolar cautery or snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45350",
"display": "Sigmoidoscopy, flexible; with band ligation(s) (eg, hemorrhoids)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45331",
"display": "Sigmoidoscopy, flexible; with biopsy, single or multiple"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45334",
"display": "Sigmoidoscopy, flexible; with control of bleeding, any method"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45337",
"display": "Sigmoidoscopy, flexible; with decompression (for pathologic distention) (eg, volvulus, megacolon), including placement of decompression tube, when performed"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45335",
"display": "Sigmoidoscopy, flexible; with directed submucosal injection(s), any substance"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45349",
"display": "Sigmoidoscopy, flexible; with endoscopic mucosal resection"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45341",
"display": "Sigmoidoscopy, flexible; with endoscopic ultrasound examination"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45347",
"display": "Sigmoidoscopy, flexible; with placement of endoscopic stent (includes pre- and post-dilation and guide wire passage, when performed)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45332",
"display": "Sigmoidoscopy, flexible; with removal of foreign body(s)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45333",
"display": "Sigmoidoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45338",
"display": "Sigmoidoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45340",
"display": "Sigmoidoscopy, flexible; with transendoscopic balloon dilation"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45345",
"display": "Sigmoidoscopy, flexible; with transendoscopic stent placement (includes predilation)"
},
{
"system": "http://www.ama-assn.org/go/cpt",
"code": "45342",
"display": "Sigmoidoscopy, flexible; with transendoscopic ultrasound guided intramural or transmural fine needle aspiration/biopsy(s)"
}
]
}
}

View File

@ -0,0 +1,147 @@
{
"resourceType": "ValueSet",
"id": "2.16.840.1.113883.3.464.1003.198.12.1011",
"meta": {
"versionId": "3",
"lastUpdated": "2017-07-25T09:54:33.579+00:00"
},
"url": "http://measure.eval.kanvix.com/cql-measure-processor/baseDstu3/Valueset/2.16.840.1.113883.3.464.1003.198.12.1011",
"name": "Fecal Occult Blood Test (FOBT) eMeasure",
"compose": {
"include": [
{
"system": "http://loinc.org",
"version": "2.44.13AA",
"concept": [
{
"code": "27396-1"
},
{
"code": "58453-2"
},
{
"code": "2335-8"
},
{
"code": "14563-1"
},
{
"code": "14564-9"
},
{
"code": "14565-6"
},
{
"code": "12503-9"
},
{
"code": "12504-7"
},
{
"code": "27401-9"
},
{
"code": "27925-7"
},
{
"code": "27926-5"
},
{
"code": "29771-3"
},
{
"code": "57905-2"
},
{
"code": "56490-6"
},
{
"code": "56491-4"
}
]
}
]
},
"expansion": {
"timestamp": "2016-09-20T13:32:34.390-04:00",
"total": 15,
"offset": 0,
"contains": [
{
"system": "http://loinc.org",
"code": "27396-1",
"display": "Hemoglobin.gastrointestinal [Mass/mass] in Stool"
},
{
"system": "http://loinc.org",
"code": "58453-2",
"display": "Hemoglobin.gastrointestinal [Mass/volume] in Stool by Immunologic method"
},
{
"system": "http://loinc.org",
"code": "2335-8",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool"
},
{
"system": "http://loinc.org",
"code": "14563-1",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --1st specimen"
},
{
"system": "http://loinc.org",
"code": "14564-9",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --2nd specimen"
},
{
"system": "http://loinc.org",
"code": "14565-6",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --3rd specimen"
},
{
"system": "http://loinc.org",
"code": "12503-9",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --4th specimen"
},
{
"system": "http://loinc.org",
"code": "12504-7",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --5th specimen"
},
{
"system": "http://loinc.org",
"code": "27401-9",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --6th specimen"
},
{
"system": "http://loinc.org",
"code": "27925-7",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --7th specimen"
},
{
"system": "http://loinc.org",
"code": "27926-5",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool --8th specimen"
},
{
"system": "http://loinc.org",
"code": "29771-3",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool by Immunologic method"
},
{
"system": "http://loinc.org",
"code": "57905-2",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool by Immunologic method --1st specimen"
},
{
"system": "http://loinc.org",
"code": "56490-6",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool by Immunologic method --2nd specimen"
},
{
"system": "http://loinc.org",
"code": "56491-4",
"display": "Hemoglobin.gastrointestinal [Presence] in Stool by Immunologic method --3rd specimen"
}
]
}
}

View File

@ -0,0 +1,19 @@
{
"resourceType": "Library",
"id": "plandefinitionApplyTest",
"version": "1.0",
"status": "draft",
"type": {
"coding": [
{
"code": "logic-library"
}
]
},
"content": [
{
"contentType": "text/cql",
"data": "bGlicmFyeSBwbGFuZGVmaW5pdGlvbkFwcGx5VGVzdCB2ZXJzaW9uICcxLjAnDQoNCmRlZmluZSBSZXN1bHRzOg0KICAgIHRydWUNCg0KZGVmaW5lICJEeW5hbWljIERldGFpbCBEZWZpbml0aW9uIjoNCiAgICAnVGhpcyBpcyBhIGR5bmFtaWMgZGVmaW5pdGlvbiEn"
}
]
}

View File

@ -0,0 +1,59 @@
{
"resourceType": "PlanDefinition",
"id": "apply-example",
"text": {
"status": "generated",
"div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">General PlanDefinition $apply example resource</div>"
},
"identifier": [
{
"use": "official",
"value": "apply-example"
}
],
"version": "1.0",
"name": "Example",
"title": "Example for PlanDefinition $apply operation",
"type": {
"coding": [
{
"system": "http://hl7.org/fhir/plan-definition-type",
"code": "eca-rule",
"display": "ECA Rule"
}
]
},
"status": "draft",
"date": "2017-09-18",
"purpose": "Testing",
"usage": "This resource is to be used only for testing",
"topic": [
{
"text": "Testing $apply operation"
}
],
"library": [
{
"reference": "Library/plandefinitionApplyTest"
}
],
"action": [
{
"condition": [
{
"kind": "applicability",
"description": "Simple test",
"language": "text/cql",
"expression": "plandefinitionApplyTest.Results"
}
],
"dynamicValue": [
{
"description": "Set CarePlan detail definition",
"path": "title",
"expression": "plandefinitionApplyTest.\"Dynamic Detail Definition\""
}
]
}
]
}
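
Note: the base64 "data" in the Library above decodes to a small CQL library (library plandefinitionApplyTest version '1.0', defining Results as true and "Dynamic Detail Definition" as a string), which this PlanDefinition references from its applicability condition and dynamicValue. A hedged sketch of invoking $apply on it through a DSTU3 client follows; the "patient" input parameter and CarePlan return type follow the FHIR clinical-reasoning definition of $apply and are assumptions here, as is the placeholder base URL.

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.dstu3.model.CarePlan;
import org.hl7.fhir.dstu3.model.IdType;
import org.hl7.fhir.dstu3.model.Parameters;
import org.hl7.fhir.dstu3.model.StringType;

public class ApplyPlanDefinitionExample {
    public static void main(String[] args) {
        FhirContext ctx = FhirContext.forDstu3();
        IGenericClient client = ctx.newRestfulGenericClient("http://example.com/baseDstu3");

        Parameters inParams = new Parameters();
        inParams.addParameter().setName("patient").setValue(new StringType("Patient/example"));

        CarePlan carePlan = client
            .operation()
            .onInstance(new IdType("PlanDefinition/apply-example"))
            .named("$apply")
            .withParameters(inParams)
            .returnResourceType(CarePlan.class)
            .execute();

        // The applicability condition (plandefinitionApplyTest.Results = true) passes, and the
        // dynamicValue above sets the "title" path on the applied result.
        System.out.println(carePlan.getTitle());
    }
}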

View File

@ -133,7 +133,7 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>${argLine} -Dfile.encoding=UTF-8 -Xmx712m</argLine>
<argLine>@{argLine} -Dfile.encoding=UTF-8 -Xmx712m</argLine>
</configuration>
</plugin>
<!--

View File

@ -149,8 +149,12 @@ public class FhirContext {
myVersion = FhirVersionEnum.DSTU2.getVersionImplementation();
} else if (FhirVersionEnum.DSTU2_HL7ORG.isPresentOnClasspath()) {
myVersion = FhirVersionEnum.DSTU2_HL7ORG.getVersionImplementation();
} else if (FhirVersionEnum.DSTU2_1.isPresentOnClasspath()) {
myVersion = FhirVersionEnum.DSTU2_1.getVersionImplementation();
} else if (FhirVersionEnum.DSTU3.isPresentOnClasspath()) {
myVersion = FhirVersionEnum.DSTU3.getVersionImplementation();
} else if (FhirVersionEnum.R4.isPresentOnClasspath()) {
myVersion = FhirVersionEnum.R4.getVersionImplementation();
} else {
throw new IllegalStateException(getLocalizer().getMessage(FhirContext.class, "noStructures"));
}
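
Note: this change extends the classpath auto-detection chain, which now tries DSTU2, DSTU2_HL7ORG, DSTU2_1, DSTU3 and finally R4. When several structures JARs are on the classpath the first match wins, so explicitly selecting the version remains the safer pattern; a minimal sketch using the standard factory methods (not part of this diff):

import ca.uhn.fhir.context.FhirContext;

public class FhirContextVersionExample {
    public static void main(String[] args) {
        // Explicit selection does not depend on what happens to be on the classpath
        FhirContext dstu3 = FhirContext.forDstu3();
        FhirContext r4 = FhirContext.forR4();

        // Auto-detection: picks the first structures module found, in the order shown above
        FhirContext detected = new FhirContext();
        System.out.println(detected.getVersion().getVersion());
    }
}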

View File

@ -102,40 +102,130 @@ public interface IContextValidationSupport<EVS_IN, EVS_OUT, SDT, CST, CDCT, IST>
*/
CodeValidationResult<CDCT, IST> validateCode(FhirContext theContext, String theCodeSystem, String theCode, String theDisplay);
abstract class BaseConceptProperty {
private final String myPropertyName;
/**
* Constructor
*/
protected BaseConceptProperty(String thePropertyName) {
myPropertyName = thePropertyName;
}
public String getPropertyName() {
return myPropertyName;
}
}
class StringConceptProperty extends BaseConceptProperty {
private final String myValue;
/**
* Constructor
*
* @param theName The name
*/
public StringConceptProperty(String theName, String theValue) {
super(theName);
myValue = theValue;
}
public String getValue() {
return myValue;
}
}
class CodingConceptProperty extends BaseConceptProperty {
private final String myCode;
private final String myCodeSystem;
private final String myDisplay;
/**
* Constructor
*
* @param theName The name
*/
public CodingConceptProperty(String theName, String theCodeSystem, String theCode, String theDisplay) {
super(theName);
myCodeSystem = theCodeSystem;
myCode = theCode;
myDisplay = theDisplay;
}
public String getCode() {
return myCode;
}
public String getCodeSystem() {
return myCodeSystem;
}
public String getDisplay() {
return myDisplay;
}
}
class CodeValidationResult<CDCT, IST> {
private CDCT definition;
private String message;
private IST severity;
private CDCT myDefinition;
private String myMessage;
private IST mySeverity;
private String myCodeSystemName;
private String myCodeSystemVersion;
private List<BaseConceptProperty> myProperties;
public CodeValidationResult(CDCT theNext) {
this.definition = theNext;
this.myDefinition = theNext;
}
public CodeValidationResult(IST severity, String message) {
this.severity = severity;
this.message = message;
this.mySeverity = severity;
this.myMessage = message;
}
public CodeValidationResult(IST severity, String message, CDCT definition) {
this.severity = severity;
this.message = message;
this.definition = definition;
this.mySeverity = severity;
this.myMessage = message;
this.myDefinition = definition;
}
public CDCT asConceptDefinition() {
return definition;
return myDefinition;
}
public String getCodeSystemName() {
return myCodeSystemName;
}
public void setCodeSystemName(String theCodeSystemName) {
myCodeSystemName = theCodeSystemName;
}
public String getCodeSystemVersion() {
return myCodeSystemVersion;
}
public void setCodeSystemVersion(String theCodeSystemVersion) {
myCodeSystemVersion = theCodeSystemVersion;
}
public String getMessage() {
return message;
return myMessage;
}
public List<BaseConceptProperty> getProperties() {
return myProperties;
}
public void setProperties(List<BaseConceptProperty> theProperties) {
myProperties = theProperties;
}
public IST getSeverity() {
return severity;
return mySeverity;
}
public boolean isOk() {
return definition != null;
return myDefinition != null;
}
}
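
Note: the CodeValidationResult fields move to the my-prefixed convention and the class gains a code system name/version plus a list of BaseConceptProperty values (string or Coding), so a validation or lookup result can now carry arbitrary concept properties back to the caller. A hedged sketch of populating one follows; the DSTU3 ConceptDefinitionComponent/IssueSeverity type arguments, the property names, and the SNOMED CT values (taken from the ValueSets above) are purely illustrative.

import java.util.ArrayList;
import java.util.List;

import org.hl7.fhir.dstu3.model.CodeSystem;
import org.hl7.fhir.dstu3.model.OperationOutcome.IssueSeverity;

import ca.uhn.fhir.context.support.IContextValidationSupport.BaseConceptProperty;
import ca.uhn.fhir.context.support.IContextValidationSupport.CodeValidationResult;
import ca.uhn.fhir.context.support.IContextValidationSupport.CodingConceptProperty;
import ca.uhn.fhir.context.support.IContextValidationSupport.StringConceptProperty;

public class CodeValidationResultExample {
    public static void main(String[] args) {
        CodeSystem.ConceptDefinitionComponent def = new CodeSystem.ConceptDefinitionComponent()
            .setCode("26390003")
            .setDisplay("Total colectomy");

        CodeValidationResult<CodeSystem.ConceptDefinitionComponent, IssueSeverity> result =
            new CodeValidationResult<>(def);
        result.setCodeSystemName("SNOMED CT");
        result.setCodeSystemVersion("20180131");

        List<BaseConceptProperty> properties = new ArrayList<>();
        properties.add(new StringConceptProperty("inactive", "false"));
        properties.add(new CodingConceptProperty(
            "child", "http://snomed.info/sct", "307666008", "Total colectomy and ileostomy"));
        result.setProperties(properties);

        System.out.println(result.isOk() + " " + result.getCodeSystemName());
    }
}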

View File

@ -605,4 +605,29 @@ public abstract class ResourceMetadataKeyEnum<T> implements Serializable {
}
public static final class ExtensionResourceMetadataKey extends ResourceMetadataKeyEnum<ExtensionDt> {
public ExtensionResourceMetadataKey(String url) {
super(url);
}
@Override
public ExtensionDt get(IResource theResource) {
Object retValObj = theResource.getResourceMetadata().get(this);
if (retValObj == null) {
return null;
} else if (retValObj instanceof ExtensionDt) {
return (ExtensionDt) retValObj;
}
throw new InternalErrorException("Found an object of type '" + retValObj.getClass().getCanonicalName()
+ "' in resource metadata for key " + this.name() + " - Expected "
+ ExtensionDt.class.getCanonicalName());
}
@Override
public void put(IResource theResource, ExtensionDt theObject) {
theResource.getResourceMetadata().put(this, theObject);
}
}
}
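
Note: ExtensionResourceMetadataKey lets DSTU2-style (IResource) resources carry Resource.meta extensions in the resource metadata map, keyed by the extension URL; the parser changes further down read these during parsing and write them back out under "meta". A minimal usage sketch follows (the extension URL and value are made-up placeholders, not from this commit):

import ca.uhn.fhir.model.api.ExtensionDt;
import ca.uhn.fhir.model.api.ResourceMetadataKeyEnum;
import ca.uhn.fhir.model.dstu2.resource.Patient;
import ca.uhn.fhir.model.primitive.StringDt;

public class ExtensionMetadataKeyExample {
    public static void main(String[] args) {
        Patient patient = new Patient();

        // The key is identified by the extension URL (placeholder URL)
        ResourceMetadataKeyEnum.ExtensionResourceMetadataKey key =
            new ResourceMetadataKeyEnum.ExtensionResourceMetadataKey("http://example.com/fhir/ext/source-system");

        ExtensionDt extension = new ExtensionDt(false);
        extension.setUrl("http://example.com/fhir/ext/source-system");
        extension.setValue(new StringDt("legacy-system-a"));

        // Stored in the resource metadata map; the JsonParser change below serializes it inside "meta"
        key.put(patient, extension);
        ExtensionDt readBack = key.get(patient);
        System.out.println(readBack != null);
    }
}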

View File

@ -918,6 +918,18 @@ public abstract class BaseParser implements IParser {
throw new DataFormatException(nextChild + " has no child of type " + theType);
}
protected List<Map.Entry<ResourceMetadataKeyEnum<?>, Object>> getExtensionMetadataKeys(IResource resource) {
List<Map.Entry<ResourceMetadataKeyEnum<?>, Object>> extensionMetadataKeys = new ArrayList<Map.Entry<ResourceMetadataKeyEnum<?>, Object>>();
for (Map.Entry<ResourceMetadataKeyEnum<?>, Object> entry : resource.getResourceMetadata().entrySet()) {
if (entry.getKey() instanceof ResourceMetadataKeyEnum.ExtensionResourceMetadataKey) {
extensionMetadataKeys.add(entry);
}
}
return extensionMetadataKeys;
}
protected static <T> List<T> extractMetadataListNotNull(IResource resource, ResourceMetadataKeyEnum<List<T>> key) {
List<? extends T> securityLabels = key.get(resource);
if (securityLabels == null) {

View File

@ -45,10 +45,7 @@ import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Set;
import java.util.*;
import static ca.uhn.fhir.context.BaseRuntimeElementDefinition.ChildTypeEnum.ID_DATATYPE;
import static ca.uhn.fhir.context.BaseRuntimeElementDefinition.ChildTypeEnum.PRIMITIVE_DATATYPE;
@ -654,8 +651,9 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
if (isBlank(versionIdPart)) {
versionIdPart = ResourceMetadataKeyEnum.VERSION.get(resource);
}
List<Map.Entry<ResourceMetadataKeyEnum<?>, Object>> extensionMetadataKeys = getExtensionMetadataKeys(resource);
if (super.shouldEncodeResourceMeta(resource) && ElementUtil.isEmpty(versionIdPart, updated, securityLabels, tags, profiles) == false) {
if (super.shouldEncodeResourceMeta(resource) && (ElementUtil.isEmpty(versionIdPart, updated, securityLabels, tags, profiles) == false) || !extensionMetadataKeys.isEmpty()) {
beginObject(theEventWriter, "meta");
writeOptionalTagWithTextNode(theEventWriter, "versionId", versionIdPart);
writeOptionalTagWithTextNode(theEventWriter, "lastUpdated", updated);
@ -695,6 +693,8 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
theEventWriter.endArray();
}
addExtensionMetadata(theResDef, theResource, theContainedResource, theSubResource, extensionMetadataKeys, resDef, theEventWriter);
theEventWriter.endObject(); // end meta
}
}
@ -704,6 +704,23 @@ public class JsonParser extends BaseParser implements IJsonLikeParser {
theEventWriter.endObject();
}
private void addExtensionMetadata(RuntimeResourceDefinition theResDef, IBaseResource theResource,
boolean theContainedResource, boolean theSubResource,
List<Map.Entry<ResourceMetadataKeyEnum<?>, Object>> extensionMetadataKeys,
RuntimeResourceDefinition resDef,
JsonLikeWriter theEventWriter) throws IOException {
if (extensionMetadataKeys.isEmpty()) {
return;
}
ExtensionDt metaResource = new ExtensionDt();
for (Map.Entry<ResourceMetadataKeyEnum<?>, Object> entry : extensionMetadataKeys) {
metaResource.addUndeclaredExtension((ExtensionDt) entry.getValue());
}
encodeCompositeElementToStreamWriter(theResDef, theResource, metaResource, theEventWriter, theContainedResource, theSubResource, new CompositeChildElement(resDef, theSubResource));
}
/**
* This is useful only for the two cases where extensions are encoded as direct children (e.g. not in some object
* called _name): resource extensions, and extension extensions

View File

@ -841,6 +841,25 @@ class ParserState<T> {
}
}
@Override
public void enteringNewElementExtension(StartElement theElem, String theUrlAttr, boolean theIsModifier, final String baseServerUrl) {
ResourceMetadataKeyEnum.ExtensionResourceMetadataKey resourceMetadataKeyEnum = new ResourceMetadataKeyEnum.ExtensionResourceMetadataKey(theUrlAttr);
Object metadataValue = myMap.get(resourceMetadataKeyEnum);
ExtensionDt newExtension;
if (metadataValue == null) {
newExtension = new ExtensionDt(theIsModifier);
} else if (metadataValue instanceof ExtensionDt) {
newExtension = (ExtensionDt) metadataValue;
} else {
throw new IllegalStateException("Expected ExtensionDt as custom resource metadata type, got: " + metadataValue.getClass().getSimpleName());
}
newExtension.setUrl(theUrlAttr);
myMap.put(resourceMetadataKeyEnum, newExtension);
ExtensionState newState = new ExtensionState(getPreResourceState(), newExtension);
push(newState);
}
}
private class MetaVersionElementState extends BaseState {

View File

@ -104,26 +104,35 @@ public class ReferenceParam extends BaseParam /*implements IQueryParameterType*/
void doSetValueAsQueryToken(FhirContext theContext, String theParamName, String theQualifier, String theValue) {
String q = theQualifier;
String resourceType = null;
boolean skipSetValue = false;
if (isNotBlank(q)) {
if (q.startsWith(":")) {
int nextIdx = q.indexOf('.');
if (nextIdx != -1) {
resourceType = q.substring(1, nextIdx);
myChain = q.substring(nextIdx + 1);
// type is explicitly defined so use it
myId.setParts(null, resourceType, theValue, null);
skipSetValue = true;
} else {
resourceType = q.substring(1);
}
} else if (q.startsWith(".")) {
myChain = q.substring(1);
// type not defined but this is a chain, so treat value as opaque
myId.setParts(null, null, theValue, null);
skipSetValue = true;
}
}
if (!skipSetValue) {
setValue(theValue);
if (isNotBlank(resourceType) && isBlank(getResourceType())) {
setValue(resourceType + '/' + theValue);
}
}
}
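
Note: with this change a qualifier that carries a chain (":Patient.name" or just ".name") leaves the supplied value opaque instead of re-parsing it, and only records an explicit resource type when one is given. On the receiving end, a chained search such as GET [base]/Observation?subject:Patient.name=smith surfaces through a plain server method roughly as sketched below (standard HAPI plain-server API, not part of this diff; the chain whitelist and lookup logic are placeholders):

import java.util.Collections;
import java.util.List;

import org.hl7.fhir.dstu3.model.Observation;

import ca.uhn.fhir.rest.annotation.RequiredParam;
import ca.uhn.fhir.rest.annotation.Search;
import ca.uhn.fhir.rest.param.ReferenceParam;
import ca.uhn.fhir.rest.server.IResourceProvider;

public class ObservationProvider implements IResourceProvider {

    @Override
    public Class<Observation> getResourceType() {
        return Observation.class;
    }

    @Search
    public List<Observation> searchBySubject(
            @RequiredParam(name = Observation.SP_SUBJECT, chainWhitelist = {"name"}) ReferenceParam theSubject) {
        String chain = theSubject.getChain();        // "name"
        String type = theSubject.getResourceType();  // "Patient", or null for subject.name=smith
        String value = theSubject.getIdPart();       // "smith", kept opaque by the change above
        return Collections.emptyList();              // real lookup logic omitted
    }
}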

View File

@ -6,8 +6,11 @@ import org.apache.commons.lang3.time.DateUtils;
import java.text.DecimalFormat;
import java.text.NumberFormat;
import java.util.Date;
import java.util.LinkedHashMap;
import java.util.concurrent.TimeUnit;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
/*
* #%L
* HAPI FHIR - Core Library
@ -29,6 +32,14 @@ import java.util.concurrent.TimeUnit;
*/
/**
* A multipurpose stopwatch which can be used to time tasks and produce
* human readable output about task duration, throughput, estimated task completion,
* etc.
* <p>
* <b>Thread Safety Note: </b> StopWatch is not intended to be thread safe.
* </p>
*
* @since HAPI FHIR 3.3.0
*/
public class StopWatch {
@ -37,6 +48,9 @@ public class StopWatch {
private static final NumberFormat TEN_DAY_FORMAT = new DecimalFormat("0");
private static Long ourNowForUnitTest;
private long myStarted = now();
private long myCurrentTaskStarted = -1L;
private LinkedHashMap<String, Long> myTaskTotals;
private String myCurrentTaskName;
/**
* Constructor
@ -54,6 +68,66 @@ public class StopWatch {
myStarted = theStart.getTime();
}
private void ensureTaskTotalsMapExists() {
if (myTaskTotals == null) {
myTaskTotals = new LinkedHashMap<>();
}
}
/**
* Finish the counter on the current task (which was started by calling
* {@link #startTask(String)}). This method has no effect if no task
* is currently started, so it is OK to call it more than once.
*/
public void endCurrentTask() {
if (isNotBlank(myCurrentTaskName)) {
ensureTaskTotalsMapExists();
Long existingTotal = myTaskTotals.get(myCurrentTaskName);
long taskTimeElapsed = now() - myCurrentTaskStarted;
Long newTotal = existingTotal != null ? existingTotal + taskTimeElapsed : taskTimeElapsed;
myTaskTotals.put(myCurrentTaskName, newTotal);
}
myCurrentTaskName = null;
}
/**
* Returns a string providing the durations of all tasks collected by {@link #startTask(String)}
*/
public String formatTaskDurations() {
// Flush the current task if it's ongoing
String continueTask = myCurrentTaskName;
if (isNotBlank(myCurrentTaskName)) {
endCurrentTask();
startTask(continueTask);
}
ensureTaskTotalsMapExists();
StringBuilder b = new StringBuilder();
for (String nextTask : myTaskTotals.keySet()) {
if (b.length() > 0) {
b.append("\n");
}
b.append(nextTask);
b.append(": ");
b.append(formatMillis(myTaskTotals.get(nextTask)));
}
return b.toString();
}
/**
* Determine the current throughput per unit of time (specified in theUnit)
* assuming that theNumOperations operations have happened.
* <p>
* For example, if this stopwatch has 2 seconds elapsed, and this method is
* called for theNumOperations=30 and TimeUnit=SECONDS,
* this method will return 15
* </p>
*
* @see #getThroughput(int, TimeUnit)
*/
public String formatThroughput(int theNumOperations, TimeUnit theUnit) {
double throughput = getThroughput(theNumOperations, theUnit);
return new DecimalFormat("0.0").format(throughput);
@ -99,6 +173,17 @@ public class StopWatch {
return new Date(myStarted);
}
/**
* Determine the current throughput per unit of time (specified in theUnit)
* assuming that theNumOperations operations have happened.
* <p>
* For example, if this stopwatch has 2 seconds elapsed, and this method is
* called for theNumOperations=30 and TimeUnit=SECONDS,
* this method will return 15
* </p>
*
* @see #formatThroughput(int, TimeUnit)
*/
public double getThroughput(int theNumOperations, TimeUnit theUnit) {
if (theNumOperations <= 0) {
return 0.0f;
@ -117,6 +202,23 @@ public class StopWatch {
myStarted = now();
}
/**
* Starts a counter for a sub-task
* <p>
* <b>Thread Safety Note: </b> This method is not threadsafe! Do not use subtasks in a
* multithreaded environment.
* </p>
*
* @param theTaskName Note that if theTaskName is blank or empty, no task is started
*/
public void startTask(String theTaskName) {
endCurrentTask();
if (isNotBlank(theTaskName)) {
myCurrentTaskStarted = now();
}
myCurrentTaskName = theTaskName;
}
/**
* Formats value in an appropriate format. See {@link #formatMillis(long)}}
* for a description of the format
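
Note: startTask()/endCurrentTask()/formatTaskDurations() accumulate a running total per task name (restarting an already-seen name adds to its total), and formatThroughput()/getThroughput() report operations per unit of time. A short usage sketch based only on the methods added above (class and task names are placeholders):

import java.util.concurrent.TimeUnit;

import ca.uhn.fhir.util.StopWatch;

public class StopWatchTaskExample {
    public static void main(String[] args) throws InterruptedException {
        StopWatch sw = new StopWatch();

        sw.startTask("parse");
        Thread.sleep(50);           // stand-in for parsing work
        sw.startTask("persist");    // implicitly ends "parse"
        Thread.sleep(100);          // stand-in for database work
        sw.endCurrentTask();

        System.out.println(sw.formatTaskDurations());            // e.g. "parse: 50ms\npersist: 100ms"
        System.out.println(sw.formatThroughput(2, TimeUnit.SECONDS) + " tasks/sec");
    }
}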

View File

@ -103,17 +103,32 @@ public class StopWatchTest {
assertEquals("00:01:00.000", StopWatch.formatMillis(DateUtils.MILLIS_PER_MINUTE));
assertEquals("00:01:01", StopWatch.formatMillis(DateUtils.MILLIS_PER_MINUTE + DateUtils.MILLIS_PER_SECOND));
assertEquals("01:00:00", StopWatch.formatMillis(DateUtils.MILLIS_PER_HOUR));
assertEquals("1.0 day", StopWatch.formatMillis(DateUtils.MILLIS_PER_DAY));
assertEquals("2.0 days", StopWatch.formatMillis(DateUtils.MILLIS_PER_DAY * 2));
assertEquals("2.0 days", StopWatch.formatMillis((DateUtils.MILLIS_PER_DAY * 2) + 1));
assertEquals("2.4 days", StopWatch.formatMillis((DateUtils.MILLIS_PER_DAY * 2) + (10 * DateUtils.MILLIS_PER_HOUR)));
assertEquals("1.0 day", StopWatch.formatMillis(DateUtils.MILLIS_PER_DAY).replace(',', '.'));
assertEquals("2.0 days", StopWatch.formatMillis(DateUtils.MILLIS_PER_DAY * 2).replace(',', '.'));
assertEquals("2.0 days", StopWatch.formatMillis((DateUtils.MILLIS_PER_DAY * 2) + 1).replace(',', '.'));
assertEquals("2.4 days", StopWatch.formatMillis((DateUtils.MILLIS_PER_DAY * 2) + (10 * DateUtils.MILLIS_PER_HOUR)).replace(',', '.'));
assertEquals("11 days", StopWatch.formatMillis((DateUtils.MILLIS_PER_DAY * 11) + (10 * DateUtils.MILLIS_PER_HOUR)));
}
@Test
public void testFormatTaskDurations() {
StopWatch sw = new StopWatch();
StopWatch.setNowForUnitTestForUnitTest(1000L);
sw.startTask("TASK1");
StopWatch.setNowForUnitTestForUnitTest(1500L);
sw.startTask("TASK2");
StopWatch.setNowForUnitTestForUnitTest(1600L);
String taskDurations = sw.formatTaskDurations();
assertEquals("TASK1: 500ms\nTASK2: 100ms", taskDurations);
}
@Test
public void testFormatThroughput60Ops4Min() {
StopWatch sw = new StopWatch(DateUtils.addMinutes(new Date(), -4));
String throughput = sw.formatThroughput(60, TimeUnit.MINUTES);
String throughput = sw.formatThroughput(60, TimeUnit.MINUTES).replace(',', '.');
ourLog.info("{} operations in {}ms = {} ops / second", 60, sw.getMillis(), throughput);
assertThat(throughput, oneOf("14.9", "15.0", "15.1", "14,9", "15,0", "15,1"));
}

View File

@ -41,7 +41,7 @@ public class UploadTerminologyCommand extends BaseCommand {
opt.setRequired(true);
options.addOption(opt);
opt = new Option("u", "url", true, "The code system URL associated with this upload (e.g. " + IHapiTerminologyLoaderSvc.SCT_URL + ")");
opt = new Option("u", "url", true, "The code system URL associated with this upload (e.g. " + IHapiTerminologyLoaderSvc.SCT_URI + ")");
opt.setRequired(false);
options.addOption(opt);

View File

@ -8,10 +8,9 @@ import ca.uhn.fhir.jpa.provider.JpaConformanceProviderDstu2;
import ca.uhn.fhir.jpa.provider.JpaSystemProviderDstu2;
import ca.uhn.fhir.jpa.provider.dstu3.JpaConformanceProviderDstu3;
import ca.uhn.fhir.jpa.provider.dstu3.JpaSystemProviderDstu3;
import ca.uhn.fhir.jpa.provider.dstu3.TerminologyUploaderProviderDstu3;
import ca.uhn.fhir.jpa.provider.r4.JpaConformanceProviderR4;
import ca.uhn.fhir.jpa.provider.r4.JpaSystemProviderR4;
import ca.uhn.fhir.jpa.provider.r4.TerminologyUploaderProviderR4;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.model.dstu2.composite.MetaDt;
import ca.uhn.fhir.model.dstu2.resource.Bundle;
import ca.uhn.fhir.narrative.DefaultThymeleafNarrativeGenerator;
@ -24,11 +23,9 @@ import ca.uhn.fhir.rest.server.interceptor.CorsInterceptor;
import ca.uhn.fhir.rest.server.interceptor.IServerInterceptor;
import org.springframework.web.context.ContextLoaderListener;
import org.springframework.web.context.WebApplicationContext;
import org.springframework.web.cors.CorsConfiguration;
import javax.servlet.ServletException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
@ -81,10 +78,10 @@ public class JpaServerDemo extends RestfulServer {
systemProvider.add(myAppCtx.getBean("mySystemProviderDstu2", JpaSystemProviderDstu2.class));
} else if (fhirVersion == FhirVersionEnum.DSTU3) {
systemProvider.add(myAppCtx.getBean("mySystemProviderDstu3", JpaSystemProviderDstu3.class));
systemProvider.add(myAppCtx.getBean(TerminologyUploaderProviderDstu3.class));
systemProvider.add(myAppCtx.getBean(TerminologyUploaderProvider.class));
} else if (fhirVersion == FhirVersionEnum.R4) {
systemProvider.add(myAppCtx.getBean("mySystemProviderR4", JpaSystemProviderR4.class));
systemProvider.add(myAppCtx.getBean(TerminologyUploaderProviderR4.class));
systemProvider.add(myAppCtx.getBean(TerminologyUploaderProvider.class));
} else {
throw new IllegalStateException();
}

View File

@ -66,7 +66,7 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>${argLine} -Dfile.encoding=UTF-8 -Xmx712m</argLine>
<argLine>@{argLine} -Dfile.encoding=UTF-8 -Xmx712m</argLine>
</configuration>
</plugin>
</plugins>

View File

@ -2105,14 +2105,14 @@ public class GenericClient extends BaseClient implements IGenericClient {
@Override
public IUpdateTyped resource(IBaseResource theResource) {
Validate.notNull(theResource, "Resource can not be null");
//Validate.notNull(theResource, "Resource can not be null");
myResource = theResource;
return this;
}
@Override
public IUpdateTyped resource(String theResourceBody) {
Validate.notBlank(theResourceBody, "Body can not be null or blank");
//Validate.notBlank(theResourceBody, "Body can not be null or blank");
myResourceBody = theResourceBody;
return this;
}

View File

@ -34,8 +34,6 @@ import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import org.apache.commons.lang3.StringUtils;
import org.hl7.fhir.dstu3.hapi.rest.server.ServerCapabilityStatementProvider;
import org.hl7.fhir.dstu3.model.CapabilityStatement;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.slf4j.LoggerFactory;
@ -67,10 +65,10 @@ public abstract class AbstractJaxRsConformanceProvider extends AbstractJaxRsProv
/** the conformance. It is created once during startup */
private org.hl7.fhir.r4.model.CapabilityStatement myR4CapabilityStatement;
private CapabilityStatement myDstu3CapabilityStatement;
private org.hl7.fhir.dstu3.model.CapabilityStatement myDstu3CapabilityStatement;
private org.hl7.fhir.dstu2016may.model.Conformance myDstu2_1Conformance;
private ca.uhn.fhir.model.dstu2.resource.Conformance myDstu2Conformance;
private org.hl7.fhir.instance.model.Conformance myDstu2Hl7OrgConformance;
private ca.uhn.fhir.model.dstu2.resource.Conformance myDstu2Conformance;
/**
* Constructor allowing the description, servername and server to be set
@ -127,26 +125,35 @@ public abstract class AbstractJaxRsConformanceProvider extends AbstractJaxRsProv
HardcodedServerAddressStrategy hardcodedServerAddressStrategy = new HardcodedServerAddressStrategy();
hardcodedServerAddressStrategy.setValue(getBaseForServer());
serverConfiguration.setServerAddressStrategy(hardcodedServerAddressStrategy);
if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.R4)) {
org.hl7.fhir.r4.hapi.rest.server.ServerCapabilityStatementProvider serverCapabilityStatementProvider = new org.hl7.fhir.r4.hapi.rest.server.ServerCapabilityStatementProvider(serverConfiguration);
serverCapabilityStatementProvider.initializeOperations();
myR4CapabilityStatement = serverCapabilityStatementProvider.getServerConformance(null);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU3)) {
ServerCapabilityStatementProvider serverCapabilityStatementProvider = new ServerCapabilityStatementProvider(serverConfiguration);
serverCapabilityStatementProvider.initializeOperations();
myDstu3CapabilityStatement = serverCapabilityStatementProvider.getServerConformance(null);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_1)) {
org.hl7.fhir.dstu2016may.hapi.rest.server.ServerConformanceProvider serverCapabilityStatementProvider = new org.hl7.fhir.dstu2016may.hapi.rest.server.ServerConformanceProvider(serverConfiguration);
serverCapabilityStatementProvider.initializeOperations();
myDstu2_1Conformance = serverCapabilityStatementProvider.getServerConformance(null);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2)) {
ca.uhn.fhir.rest.server.provider.dstu2.ServerConformanceProvider serverCapabilityStatementProvider = new ca.uhn.fhir.rest.server.provider.dstu2.ServerConformanceProvider(serverConfiguration);
serverCapabilityStatementProvider.initializeOperations();
myDstu2Conformance = serverCapabilityStatementProvider.getServerConformance(null);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_HL7ORG)) {
org.hl7.fhir.instance.conf.ServerConformanceProvider serverCapabilityStatementProvider = new org.hl7.fhir.instance.conf.ServerConformanceProvider(serverConfiguration);
serverCapabilityStatementProvider.initializeOperations();
myDstu2Hl7OrgConformance = serverCapabilityStatementProvider.getServerConformance(null);
FhirVersionEnum fhirContextVersion = super.getFhirContext().getVersion().getVersion();
switch (fhirContextVersion) {
case R4:
org.hl7.fhir.r4.hapi.rest.server.ServerCapabilityStatementProvider r4ServerCapabilityStatementProvider = new org.hl7.fhir.r4.hapi.rest.server.ServerCapabilityStatementProvider(serverConfiguration);
r4ServerCapabilityStatementProvider.initializeOperations();
myR4CapabilityStatement = r4ServerCapabilityStatementProvider.getServerConformance(null);
break;
case DSTU3:
org.hl7.fhir.dstu3.hapi.rest.server.ServerCapabilityStatementProvider dstu3ServerCapabilityStatementProvider = new org.hl7.fhir.dstu3.hapi.rest.server.ServerCapabilityStatementProvider(serverConfiguration);
dstu3ServerCapabilityStatementProvider.initializeOperations();
myDstu3CapabilityStatement = dstu3ServerCapabilityStatementProvider.getServerConformance(null);
break;
case DSTU2_1:
org.hl7.fhir.dstu2016may.hapi.rest.server.ServerConformanceProvider dstu2_1ServerConformanceProvider = new org.hl7.fhir.dstu2016may.hapi.rest.server.ServerConformanceProvider(serverConfiguration);
dstu2_1ServerConformanceProvider.initializeOperations();
myDstu2_1Conformance = dstu2_1ServerConformanceProvider.getServerConformance(null);
break;
case DSTU2_HL7ORG:
org.hl7.fhir.instance.conf.ServerConformanceProvider dstu2Hl7OrgServerConformanceProvider = new org.hl7.fhir.instance.conf.ServerConformanceProvider(serverConfiguration);
dstu2Hl7OrgServerConformanceProvider.initializeOperations();
myDstu2Hl7OrgConformance = dstu2Hl7OrgServerConformanceProvider.getServerConformance(null);
break;
case DSTU2:
ca.uhn.fhir.rest.server.provider.dstu2.ServerConformanceProvider dstu2ServerConformanceProvider = new ca.uhn.fhir.rest.server.provider.dstu2.ServerConformanceProvider(serverConfiguration);
dstu2ServerConformanceProvider.initializeOperations();
myDstu2Conformance = dstu2ServerConformanceProvider.getServerConformance(null);
break;
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
}
@ -181,20 +188,26 @@ public abstract class AbstractJaxRsConformanceProvider extends AbstractJaxRsProv
IRestfulResponse response = request.build().getResponse();
response.addHeader(Constants.HEADER_CORS_ALLOW_ORIGIN, "*");
IBaseResource conformance = null;
if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.R4)) {
IBaseResource conformance;
FhirVersionEnum fhirContextVersion = super.getFhirContext().getVersion().getVersion();
switch (fhirContextVersion) {
case R4:
conformance = myR4CapabilityStatement;
// return (Response) response.returnResponse(ParseAction.create(myDstu3CapabilityStatement), Constants.STATUS_HTTP_200_OK, true, null, getResourceType().getSimpleName());
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU3)) {
break;
case DSTU3:
conformance = myDstu3CapabilityStatement;
// return (Response) response.returnResponse(ParseAction.create(myDstu3CapabilityStatement), Constants.STATUS_HTTP_200_OK, true, null, getResourceType().getSimpleName());
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_1)) {
break;
case DSTU2_1:
conformance = myDstu2_1Conformance;
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2)) {
conformance = myDstu2Conformance;
// return (Response) response.returnResponse(ParseAction.create(myDstu2CapabilityStatement), Constants.STATUS_HTTP_200_OK, true, null, getResourceType().getSimpleName());
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_HL7ORG)) {
break;
case DSTU2_HL7ORG:
conformance = myDstu2Hl7OrgConformance;
break;
case DSTU2:
conformance = myDstu2Conformance;
break;
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
if (conformance != null) {
@ -279,18 +292,21 @@ public abstract class AbstractJaxRsConformanceProvider extends AbstractJaxRsProv
@SuppressWarnings("unchecked")
@Override
public Class<IBaseResource> getResourceType() {
if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.R4)) {
FhirVersionEnum fhirContextVersion = super.getFhirContext().getVersion().getVersion();
switch (fhirContextVersion) {
case R4:
return Class.class.cast(org.hl7.fhir.r4.model.CapabilityStatement.class);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU3)) {
return Class.class.cast(CapabilityStatement.class);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_1)) {
case DSTU3:
return Class.class.cast(org.hl7.fhir.dstu3.model.CapabilityStatement.class);
case DSTU2_1:
return Class.class.cast(org.hl7.fhir.dstu2016may.model.Conformance.class);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2)) {
return Class.class.cast(ca.uhn.fhir.model.dstu2.resource.Conformance.class);
} else if (super.getFhirContext().getVersion().getVersion().equals(FhirVersionEnum.DSTU2_HL7ORG)) {
case DSTU2_HL7ORG:
return Class.class.cast(org.hl7.fhir.instance.model.Conformance.class);
case DSTU2:
return Class.class.cast(ca.uhn.fhir.model.dstu2.resource.Conformance.class);
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
return null;
}
}

View File

@ -28,12 +28,11 @@ import javax.ws.rs.core.HttpHeaders;
import javax.ws.rs.core.MediaType;
import org.apache.commons.lang3.StringUtils;
import org.hl7.fhir.dstu3.model.IdType;
import ca.uhn.fhir.context.ConfigurationException;
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.context.FhirVersionEnum;
import ca.uhn.fhir.jaxrs.server.AbstractJaxRsProvider;
import ca.uhn.fhir.model.primitive.IdDt;
import ca.uhn.fhir.rest.api.*;
import ca.uhn.fhir.rest.api.server.IRestfulResponse;
import ca.uhn.fhir.rest.api.server.RequestDetails;
@ -93,28 +92,68 @@ public class JaxRsRequest extends RequestDetails {
FhirVersionEnum fhirContextVersion = myServer.getFhirContext().getVersion().getVersion();
if (StringUtils.isNotBlank(myVersion)) {
if (FhirVersionEnum.DSTU3.equals(fhirContextVersion) || FhirVersionEnum.DSTU2_HL7ORG.equals(fhirContextVersion)) {
result.setId(
new IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
} else if (FhirVersionEnum.DSTU2.equals(fhirContextVersion)) {
result.setId(
new IdDt(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
switch (fhirContextVersion) {
case R4:
result.setId(new org.hl7.fhir.r4.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
break;
case DSTU3:
result.setId(new org.hl7.fhir.dstu3.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
break;
case DSTU2_1:
result.setId(new org.hl7.fhir.dstu2016may.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
break;
case DSTU2_HL7ORG:
result.setId(new org.hl7.fhir.instance.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
break;
case DSTU2:
result.setId(new ca.uhn.fhir.model.primitive.IdDt(myServer.getBaseForRequest(), UrlUtil.unescape(myId), UrlUtil.unescape(myVersion)));
break;
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
} else if (StringUtils.isNotBlank(myId)) {
if (FhirVersionEnum.DSTU3.equals(fhirContextVersion) || FhirVersionEnum.DSTU2_HL7ORG.equals(fhirContextVersion)) {
result.setId(new IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
} else if (FhirVersionEnum.DSTU2.equals(fhirContextVersion)) {
result.setId(new IdDt(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
switch (fhirContextVersion) {
case R4:
result.setId(new org.hl7.fhir.r4.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
break;
case DSTU3:
result.setId(new org.hl7.fhir.dstu3.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
break;
case DSTU2_1:
result.setId(new org.hl7.fhir.dstu2016may.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
break;
case DSTU2_HL7ORG:
result.setId(new org.hl7.fhir.instance.model.IdType(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
break;
case DSTU2:
result.setId(new ca.uhn.fhir.model.primitive.IdDt(myServer.getBaseForRequest(), UrlUtil.unescape(myId)));
break;
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
}
if (myRestOperation == RestOperationTypeEnum.UPDATE) {
String contentLocation = result.getHeader(Constants.HEADER_CONTENT_LOCATION);
if (contentLocation != null) {
if (FhirVersionEnum.DSTU3.equals(fhirContextVersion) || FhirVersionEnum.DSTU2_HL7ORG.equals(fhirContextVersion)) {
result.setId(new IdType(contentLocation));
} else if (FhirVersionEnum.DSTU2.equals(fhirContextVersion)) {
result.setId(new IdDt(contentLocation));
switch (fhirContextVersion) {
case R4:
result.setId(new org.hl7.fhir.r4.model.IdType(contentLocation));
break;
case DSTU3:
result.setId(new org.hl7.fhir.dstu3.model.IdType(contentLocation));
break;
case DSTU2_1:
result.setId(new org.hl7.fhir.dstu2016may.model.IdType(contentLocation));
break;
case DSTU2_HL7ORG:
result.setId(new org.hl7.fhir.instance.model.IdType(contentLocation));
break;
case DSTU2:
result.setId(new ca.uhn.fhir.model.primitive.IdDt(contentLocation));
break;
default:
throw new ConfigurationException("Unsupported Fhir version: " + fhirContextVersion);
}
}
}

View File

@ -600,7 +600,7 @@
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<runOrder>alphabetical</runOrder>
<argLine>${argLine} -Dfile.encoding=UTF-8 -Xmx1024m</argLine>
<argLine>@{argLine} -Dfile.encoding=UTF-8 -Xmx1024m</argLine>
<forkCount>0.6C</forkCount>
</configuration>
</plugin>

View File

@ -9,7 +9,7 @@ import ca.uhn.fhir.jpa.dao.IFulltextSearchSvc;
import ca.uhn.fhir.jpa.dao.ISearchParamRegistry;
import ca.uhn.fhir.jpa.dao.dstu3.SearchParamExtractorDstu3;
import ca.uhn.fhir.jpa.dao.dstu3.SearchParamRegistryDstu3;
import ca.uhn.fhir.jpa.provider.dstu3.TerminologyUploaderProviderDstu3;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.jpa.term.*;
import ca.uhn.fhir.jpa.validation.JpaValidationSupportChainDstu3;
import ca.uhn.fhir.validation.IValidatorModule;
@ -115,8 +115,8 @@ public class BaseDstu3Config extends BaseConfig {
}
@Bean(autowire = Autowire.BY_TYPE)
public TerminologyUploaderProviderDstu3 terminologyUploaderProvider() {
TerminologyUploaderProviderDstu3 retVal = new TerminologyUploaderProviderDstu3();
public TerminologyUploaderProvider terminologyUploaderProvider() {
TerminologyUploaderProvider retVal = new TerminologyUploaderProvider();
retVal.setContext(fhirContextDstu3());
return retVal;
}

View File

@ -10,7 +10,7 @@ import ca.uhn.fhir.jpa.dao.ISearchParamRegistry;
import ca.uhn.fhir.jpa.dao.r4.SearchParamExtractorR4;
import ca.uhn.fhir.jpa.dao.r4.SearchParamRegistryR4;
import ca.uhn.fhir.jpa.graphql.JpaStorageServices;
import ca.uhn.fhir.jpa.provider.r4.TerminologyUploaderProviderR4;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.jpa.term.HapiTerminologySvcR4;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IHapiTerminologySvcR4;
@ -133,8 +133,8 @@ public class BaseR4Config extends BaseConfig {
}
@Bean(autowire = Autowire.BY_TYPE)
public TerminologyUploaderProviderR4 terminologyUploaderProvider() {
TerminologyUploaderProviderR4 retVal = new TerminologyUploaderProviderR4();
public TerminologyUploaderProvider terminologyUploaderProvider() {
TerminologyUploaderProvider retVal = new TerminologyUploaderProvider();
retVal.setContext(fhirContextR4());
return retVal;
}

View File

@ -194,7 +194,7 @@ public class FhirResourceDaoValueSetDstu2 extends FhirResourceDaoDstu2<ValueSet>
retVal.setSearchedForSystem(system);
retVal.setFound(true);
if (nextCode.getAbstract() != null) {
retVal.setCodeIsAbstract(nextCode.getAbstract().booleanValue());
retVal.setCodeIsAbstract(nextCode.getAbstract());
}
retVal.setCodeDisplay(nextCode.getDisplay());
retVal.setCodeSystemVersion(nextCode.getVersion());

View File

@ -1,13 +1,12 @@
package ca.uhn.fhir.jpa.dao;
import ca.uhn.fhir.context.support.IContextValidationSupport;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.instance.model.api.IPrimitiveType;
import org.hl7.fhir.r4.model.BooleanType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;
import org.hl7.fhir.r4.model.*;
import java.util.List;
@ -48,6 +47,8 @@ public interface IFhirResourceDaoCodeSystem<T extends IBaseResource, CD, CC> ext
private boolean myFound;
private String mySearchedForCode;
private String mySearchedForSystem;
private List<IContextValidationSupport.BaseConceptProperty> myProperties;
/**
* Constructor
*/
@ -111,6 +112,10 @@ public interface IFhirResourceDaoCodeSystem<T extends IBaseResource, CD, CC> ext
myFound = theFound;
}
public void setProperties(List<IContextValidationSupport.BaseConceptProperty> theProperties) {
myProperties = theProperties;
}
public void throwNotFoundIfAppropriate() {
if (isFound() == false) {
throw new ResourceNotFoundException("Unable to find code[" + getSearchedForCode() + "] in system[" + getSearchedForSystem() + "]");
@ -127,6 +132,35 @@ public interface IFhirResourceDaoCodeSystem<T extends IBaseResource, CD, CC> ext
retVal.addParameter().setName("display").setValue(new StringType(getCodeDisplay()));
retVal.addParameter().setName("abstract").setValue(new BooleanType(isCodeIsAbstract()));
if (myProperties != null) {
for (IContextValidationSupport.BaseConceptProperty next : myProperties) {
Parameters.ParametersParameterComponent property = retVal.addParameter().setName("property");
property
.addPart()
.setName("code")
.setValue(new CodeType(next.getPropertyName()));
if (next instanceof IContextValidationSupport.StringConceptProperty) {
IContextValidationSupport.StringConceptProperty prop = (IContextValidationSupport.StringConceptProperty) next;
property
.addPart()
.setName("value")
.setValue(new StringType(prop.getValue()));
} else if (next instanceof IContextValidationSupport.CodingConceptProperty) {
IContextValidationSupport.CodingConceptProperty prop = (IContextValidationSupport.CodingConceptProperty) next;
property
.addPart()
.setName("value")
.setValue(new Coding()
.setSystem(prop.getCodeSystem())
.setCode(prop.getCode())
.setDisplay(prop.getDisplay()));
} else {
throw new IllegalStateException("Don't know how to handle " + next.getClass());
}
}
}
return retVal;
}
}
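Context for the change above: because LookupCodeResult now carries concept properties, a CodeSystem/$lookup response can include repeating "property" parameters, each with a "code" part and a string or Coding "value" part. A minimal client-side sketch (not part of this commit; the server base URL and the example LOINC code are assumptions) of reading those parts with the HAPI generic client:

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.*;

public class LookupPropertiesExample {
   public static void main(String[] args) {
      FhirContext ctx = FhirContext.forR4();
      IGenericClient client = ctx.newRestfulGenericClient("http://localhost:8080/baseR4"); // assumed endpoint

      Parameters inParams = new Parameters();
      inParams.addParameter().setName("system").setValue(new UriType("http://loinc.org"));
      inParams.addParameter().setName("code").setValue(new CodeType("3141-9")); // example code only

      Parameters outParams = client
         .operation()
         .onType(CodeSystem.class)
         .named("$lookup")
         .withParameters(inParams)
         .execute();

      // Each "property" parameter is assembled by LookupCodeResult.toParameters() above
      for (Parameters.ParametersParameterComponent param : outParams.getParameter()) {
         if ("property".equals(param.getName())) {
            for (Parameters.ParametersParameterComponent part : param.getPart()) {
               System.out.println(part.getName() + " = " + part.getValue());
            }
         }
      }
   }
}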

View File

@ -0,0 +1,37 @@
package ca.uhn.fhir.jpa.dao.data;
import ca.uhn.fhir.jpa.entity.TermCodeSystemVersion;
import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.entity.TermConceptProperty;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import java.util.List;
/*
* #%L
* HAPI FHIR JPA Server
* %%
* Copyright (C) 2014 - 2018 University Health Network
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
public interface ITermConceptPropertyDao extends JpaRepository<TermConceptProperty, Long> {
// nothing
}

View File

@ -52,6 +52,7 @@ import java.util.Date;
import java.util.List;
import java.util.Set;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSystem> implements IFhirResourceDaoCodeSystem<CodeSystem, Coding, CodeableConcept> {
@ -98,7 +99,7 @@ public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSys
public List<IIdType> findCodeSystemIdsContainingSystemAndCode(String theCode, String theSystem) {
List<IIdType> valueSetIds;
Set<Long> ids = searchForIds(new SearchParameterMap(CodeSystem.SP_CODE, new TokenParam(theSystem, theCode)));
valueSetIds = new ArrayList<IIdType>();
valueSetIds = new ArrayList<>();
for (Long next : ids) {
valueSetIds.add(new IdType("CodeSystem", next));
}
@ -128,11 +129,11 @@ public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSys
system = theSystem.getValue();
}
ourLog.info("Looking up {} / {}", system, code);
ourLog.debug("Looking up {} / {}", system, code);
if (myValidationSupport.isCodeSystemSupported(getContext(), system)) {
ourLog.info("Code system {} is supported", system);
ourLog.debug("Code system {} is supported", system);
CodeValidationResult result = myValidationSupport.validateCode(getContext(), system, code, null);
if (result != null) {
@ -142,32 +143,20 @@ public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSys
retVal.setSearchedForCode(code);
retVal.setSearchedForSystem(system);
retVal.setCodeDisplay(result.asConceptDefinition().getDisplay());
retVal.setCodeSystemDisplayName("Unknown");
retVal.setCodeSystemVersion("");
String codeSystemDisplayName = result.getCodeSystemName();
if (isBlank(codeSystemDisplayName)) {
codeSystemDisplayName = "Unknown";
}
retVal.setCodeSystemDisplayName(codeSystemDisplayName);
retVal.setCodeSystemVersion(result.getCodeSystemVersion());
retVal.setProperties(result.getProperties());
return retVal;
}
}
// HapiWorkerContext ctx = new HapiWorkerContext(getContext(), myValidationSupport);
// ValueSetExpander expander = ctx.getExpander();
// ValueSet source = new ValueSet();
// source.getCompose().addInclude().setSystem(system).addConcept().setCode(code);
//
// ValueSetExpansionOutcome expansion;
// try {
// expansion = expander.expand(source);
// } catch (Exception e) {
// throw new InternalErrorException(e);
// }
//
// if (expansion.getValueset() != null) {
// List<ValueSetExpansionContainsComponent> contains = expansion.getValueset().getExpansion().getContains();
// LookupCodeResult result = lookup(contains, system, code);
// if (result != null) {
// return result;
// }
// }
}
// We didn't find it..
@ -199,7 +188,7 @@ public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSys
if (isNotBlank(next.getCode())) {
TermConcept termConcept = new TermConcept();
termConcept.setCode(next.getCode());
termConcept.setCodeSystem(theCodeSystemVersion);
termConcept.setCodeSystemVersion(theCodeSystemVersion);
termConcept.setDisplay(next.getDisplay());
termConcept.addChildren(toPersistedConcepts(next.getConcept(), theCodeSystemVersion), RelationshipTypeEnum.ISA);
retVal.add(termConcept);
@ -229,7 +218,7 @@ public class FhirResourceDaoCodeSystemDstu3 extends FhirResourceDaoDstu3<CodeSys
persCs.setResource(retVal);
persCs.getConcepts().addAll(toPersistedConcepts(cs.getConcept(), persCs));
ourLog.info("Code system has {} concepts", persCs.getConcepts().size());
myTerminologySvc.storeNewCodeSystemVersion(codeSystemResourcePid, codeSystemUrl, persCs);
myTerminologySvc.storeNewCodeSystemVersion(codeSystemResourcePid, codeSystemUrl, cs.getName(), persCs);
}
}

View File

@ -99,11 +99,7 @@ public class FhirResourceDaoSubscriptionDstu3 extends FhirResourceDaoDstu3<Subsc
}
protected void validateChannelPayload(Subscription theResource) {
if (isBlank(theResource.getChannel().getPayload())) {
throw new UnprocessableEntityException("Subscription.channel.payload must be populated for rest-hook subscriptions");
}
if (EncodingEnum.forContentType(theResource.getChannel().getPayload()) == null) {
if (!isBlank(theResource.getChannel().getPayload()) && EncodingEnum.forContentType(theResource.getChannel().getPayload()) == null) {
throw new UnprocessableEntityException("Invalid value for Subscription.channel.payload: " + theResource.getChannel().getPayload());
}
}

View File

@ -32,8 +32,6 @@ import ca.uhn.fhir.util.StopWatch;
import ca.uhn.fhir.rest.param.ParameterUtil;
import org.apache.commons.lang3.Validate;
import org.apache.http.NameValuePair;
import org.hibernate.Session;
import org.hibernate.internal.SessionImpl;
import org.hl7.fhir.dstu3.model.*;
import org.hl7.fhir.dstu3.model.Bundle.*;
import org.hl7.fhir.dstu3.model.OperationOutcome.IssueSeverity;
@ -150,7 +148,7 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
}
if (transactionType == null) {
String message = "Transactiion Bundle did not specify valid Bundle.type, assuming " + BundleType.TRANSACTION.toCode();
String message = "Transaction Bundle did not specify valid Bundle.type, assuming " + BundleType.TRANSACTION.toCode();
ourLog.warn(message);
transactionType = BundleType.TRANSACTION;
}
@ -158,9 +156,10 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
throw new InvalidRequestException("Unable to process transaction where incoming Bundle.type = " + transactionType.toCode());
}
ourLog.info("Beginning {} with {} resources", theActionName, theRequest.getEntry().size());
ourLog.debug("Beginning {} with {} resources", theActionName, theRequest.getEntry().size());
long start = System.currentTimeMillis();
// long start = System.currentTimeMillis();
final StopWatch transactionSw = new StopWatch();
final Date updateTime = new Date();
final Set<IdType> allIds = new LinkedHashSet<IdType>();
@ -202,7 +201,7 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
* Basically if the resource has a match URL that references a placeholder,
* we try to handle the resource with the placeholder first.
*/
Set<String> placeholderIds = new HashSet<String>();
Set<String> placeholderIds = new HashSet<>();
final List<BundleEntryComponent> entries = theRequest.getEntry();
for (BundleEntryComponent nextEntry : entries) {
if (isNotBlank(nextEntry.getFullUrl()) && nextEntry.getFullUrl().startsWith(IdType.URN_PREFIX)) {
@ -224,7 +223,7 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
Map<BundleEntryComponent, ResourceTable> entriesToProcess = txManager.execute(new TransactionCallback<Map<BundleEntryComponent, ResourceTable>>() {
@Override
public Map<BundleEntryComponent, ResourceTable> doInTransaction(TransactionStatus status) {
return doTransactionWriteOperations(theRequestDetails, theRequest, theActionName, updateTime, allIds, idSubstitutions, idToPersistedOutcome, response, originalRequestOrder, entries);
return doTransactionWriteOperations(theRequestDetails, theRequest, theActionName, updateTime, allIds, idSubstitutions, idToPersistedOutcome, response, originalRequestOrder, entries, transactionSw);
}
});
for (Entry<BundleEntryComponent, ResourceTable> nextEntry : entriesToProcess.entrySet()) {
@ -237,8 +236,7 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
/*
* Loop through the request and process any entries of type GET
*/
for (int i = 0; i < getEntries.size(); i++) {
BundleEntryComponent nextReqEntry = getEntries.get(i);
for (BundleEntryComponent nextReqEntry : getEntries) {
Integer originalOrder = originalRequestOrder.get(nextReqEntry);
BundleEntryComponent nextRespEntry = response.getEntry().get(originalOrder);
@ -258,7 +256,7 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
for (NameValuePair next : parameters) {
paramValues.put(next.getName(), next.getValue());
}
for (java.util.Map.Entry<String, Collection<String>> nextParamEntry : paramValues.asMap().entrySet()) {
for (Entry<String, Collection<String>> nextParamEntry : paramValues.asMap().entrySet()) {
String[] nextValue = nextParamEntry.getValue().toArray(new String[nextParamEntry.getValue().size()]);
requestDetails.addParameter(nextParamEntry.getKey(), nextValue);
}
@ -302,8 +300,8 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
}
long delay = System.currentTimeMillis() - start;
ourLog.info(theActionName + " completed in {}ms", new Object[] { delay });
ourLog.info(theActionName + " completed in {}", transactionSw.toString());
ourLog.info(theActionName + " details:\n{}", transactionSw.formatTaskDurations());
response.setType(BundleType.TRANSACTIONRESPONSE);
return response;
@ -311,12 +309,12 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
@SuppressWarnings("unchecked")
private Map<BundleEntryComponent, ResourceTable> doTransactionWriteOperations(ServletRequestDetails theRequestDetails, Bundle theRequest, String theActionName, Date updateTime, Set<IdType> allIds,
Map<IdType, IdType> theIdSubstitutions, Map<IdType, DaoMethodOutcome> idToPersistedOutcome, Bundle response, IdentityHashMap<BundleEntryComponent, Integer> originalRequestOrder, List<BundleEntryComponent> theEntries) {
Set<String> deletedResources = new HashSet<String>();
List<DeleteConflict> deleteConflicts = new ArrayList<DeleteConflict>();
Map<BundleEntryComponent, ResourceTable> entriesToProcess = new IdentityHashMap<BundleEntryComponent, ResourceTable>();
Set<ResourceTable> nonUpdatedEntities = new HashSet<ResourceTable>();
Map<String, Class<? extends IBaseResource>> conditionalRequestUrls = new HashMap<String, Class<? extends IBaseResource>>();
Map<IdType, IdType> theIdSubstitutions, Map<IdType, DaoMethodOutcome> idToPersistedOutcome, Bundle response, IdentityHashMap<BundleEntryComponent, Integer> originalRequestOrder, List<BundleEntryComponent> theEntries, StopWatch theStopWatch) {
Set<String> deletedResources = new HashSet<>();
List<DeleteConflict> deleteConflicts = new ArrayList<>();
Map<BundleEntryComponent, ResourceTable> entriesToProcess = new IdentityHashMap<>();
Set<ResourceTable> nonUpdatedEntities = new HashSet<>();
Map<String, Class<? extends IBaseResource>> conditionalRequestUrls = new HashMap<>();
/*
* Loop through the request and process any entries of type
@ -371,6 +369,8 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
String resourceType = res != null ? getContext().getResourceDefinition(res).getName() : null;
BundleEntryComponent nextRespEntry = response.getEntry().get(originalRequestOrder.get(nextReqEntry));
theStopWatch.startTask("Process entry " + i + ": " + verb + " " + defaultString(resourceType));
switch (verb) {
case POST: {
// CREATE
@ -470,6 +470,8 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
}
}
theStopWatch.endCurrentTask();
/*
* Make sure that there are no conflicts from deletions. E.g. we can't delete something
* if something else has a reference to it.. Unless the thing that has a reference to it
@ -538,8 +540,12 @@ public class FhirSystemDaoDstu3 extends BaseHapiFhirSystemDao<Bundle, Meta> {
}
}
theStopWatch.startTask("Flush Session");
flushJpaSession();
theStopWatch.endCurrentTask();
/*
* Double check we didn't allow any duplicates we shouldn't have
*/
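For reference, the transaction timing added above relies on the task tracking in ca.uhn.fhir.util.StopWatch. A standalone sketch, using only the methods exercised in this diff (the sleeps stand in for real work):

import ca.uhn.fhir.util.StopWatch;

public class StopWatchTaskExample {
   public static void main(String[] args) throws InterruptedException {
      StopWatch sw = new StopWatch();

      sw.startTask("Process entry 0: POST Patient"); // mirrors the per-entry tasks above
      Thread.sleep(50);                              // stand-in for processing the entry
      sw.endCurrentTask();

      sw.startTask("Flush Session");                 // mirrors the flushJpaSession() task
      Thread.sleep(20);
      sw.endCurrentTask();

      System.out.println("transaction completed in " + sw); // overall elapsed time
      System.out.println(sw.formatTaskDurations());         // per-task breakdown
   }
}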

View File

@ -52,6 +52,7 @@ import java.util.Date;
import java.util.List;
import java.util.Set;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
public class FhirResourceDaoCodeSystemR4 extends FhirResourceDaoR4<CodeSystem> implements IFhirResourceDaoCodeSystem<CodeSystem, Coding, CodeableConcept> {
@ -67,30 +68,6 @@ public class FhirResourceDaoCodeSystemR4 extends FhirResourceDaoR4<CodeSystem> i
@Autowired
private ValidationSupportChain myValidationSupport;
// private LookupCodeResult lookup(List<ValueSetExpansionContainsComponent> theContains, String theSystem, String theCode) {
// for (ValueSetExpansionContainsComponent nextCode : theContains) {
//
// String system = nextCode.getSystem();
// String code = nextCode.getCode();
// if (theSystem.equals(system) && theCode.equals(code)) {
// LookupCodeResult retVal = new LookupCodeResult();
// retVal.setSearchedForCode(code);
// retVal.setSearchedForSystem(system);
// retVal.setFound(true);
// if (nextCode.getAbstractElement().getValue() != null) {
// retVal.setCodeIsAbstract(nextCode.getAbstractElement().booleanValue());
// }
// retVal.setCodeDisplay(nextCode.getDisplay());
// retVal.setCodeSystemVersion(nextCode.getVersion());
// retVal.setCodeSystemDisplayName("Unknown"); // TODO: implement
// return retVal;
// }
//
// }
//
// return null;
// }
@Override
public List<IIdType> findCodeSystemIdsContainingSystemAndCode(String theCode, String theSystem) {
List<IIdType> valueSetIds;
@ -139,32 +116,20 @@ public class FhirResourceDaoCodeSystemR4 extends FhirResourceDaoR4<CodeSystem> i
retVal.setSearchedForCode(code);
retVal.setSearchedForSystem(system);
retVal.setCodeDisplay(result.asConceptDefinition().getDisplay());
retVal.setCodeSystemDisplayName("Unknown");
retVal.setCodeSystemVersion("");
String codeSystemDisplayName = result.getCodeSystemName();
if (isBlank(codeSystemDisplayName)) {
codeSystemDisplayName = "Unknown";
}
retVal.setCodeSystemDisplayName(codeSystemDisplayName);
retVal.setCodeSystemVersion(result.getCodeSystemVersion());
retVal.setProperties(result.getProperties());
return retVal;
}
}
// HapiWorkerContext ctx = new HapiWorkerContext(getContext(), myValidationSupport);
// ValueSetExpander expander = ctx.getExpander();
// ValueSet source = new ValueSet();
// source.getCompose().addInclude().setSystem(system).addConcept().setCode(code);
//
// ValueSetExpansionOutcome expansion;
// try {
// expansion = expander.expand(source);
// } catch (Exception e) {
// throw new InternalErrorException(e);
// }
//
// if (expansion.getValueset() != null) {
// List<ValueSetExpansionContainsComponent> contains = expansion.getValueset().getExpansion().getContains();
// LookupCodeResult result = lookup(contains, system, code);
// if (result != null) {
// return result;
// }
// }
}
// We didn't find it..
@ -196,7 +161,7 @@ public class FhirResourceDaoCodeSystemR4 extends FhirResourceDaoR4<CodeSystem> i
if (isNotBlank(next.getCode())) {
TermConcept termConcept = new TermConcept();
termConcept.setCode(next.getCode());
termConcept.setCodeSystem(theCodeSystemVersion);
termConcept.setCodeSystemVersion(theCodeSystemVersion);
termConcept.setDisplay(next.getDisplay());
termConcept.addChildren(toPersistedConcepts(next.getConcept(), theCodeSystemVersion), RelationshipTypeEnum.ISA);
retVal.add(termConcept);
@ -228,7 +193,7 @@ public class FhirResourceDaoCodeSystemR4 extends FhirResourceDaoR4<CodeSystem> i
persCs.setResource(retVal);
persCs.getConcepts().addAll(toPersistedConcepts(cs.getConcept(), persCs));
ourLog.info("Code system has {} concepts", persCs.getConcepts().size());
myTerminologySvc.storeNewCodeSystemVersion(codeSystemResourcePid, codeSystemUrl, persCs);
myTerminologySvc.storeNewCodeSystemVersion(codeSystemResourcePid, codeSystemUrl, cs.getName(), persCs);
}

View File

@ -96,11 +96,7 @@ public class FhirResourceDaoSubscriptionR4 extends FhirResourceDaoR4<Subscriptio
}
protected void validateChannelPayload(Subscription theResource) {
if (isBlank(theResource.getChannel().getPayload())) {
throw new UnprocessableEntityException("Subscription.channel.payload must be populated for rest-hook subscriptions");
}
if (EncodingEnum.forContentType(theResource.getChannel().getPayload()) == null) {
if (!isBlank(theResource.getChannel().getPayload()) && EncodingEnum.forContentType(theResource.getChannel().getPayload()) == null) {
throw new UnprocessableEntityException("Invalid value for Subscription.channel.payload: " + theResource.getChannel().getPayload());
}
}

View File

@ -157,7 +157,7 @@ public class FhirSystemDaoR4 extends BaseHapiFhirSystemDao<Bundle, Meta> {
}
if (transactionType == null) {
String message = "Transactiion Bundle did not specify valid Bundle.type, assuming " + BundleType.TRANSACTION.toCode();
String message = "Transaction Bundle did not specify valid Bundle.type, assuming " + BundleType.TRANSACTION.toCode();
ourLog.warn(message);
transactionType = BundleType.TRANSACTION;
}
@ -165,7 +165,7 @@ public class FhirSystemDaoR4 extends BaseHapiFhirSystemDao<Bundle, Meta> {
throw new InvalidRequestException("Unable to process transaction where incoming Bundle.type = " + transactionType.toCode());
}
ourLog.info("Beginning {} with {} resources", theActionName, theRequest.getEntry().size());
ourLog.debug("Beginning {} with {} resources", theActionName, theRequest.getEntry().size());
long start = System.currentTimeMillis();
final Date updateTime = new Date();

View File

@ -23,6 +23,8 @@ package ca.uhn.fhir.jpa.entity;
import javax.persistence.*;
import java.io.Serializable;
import static org.apache.commons.lang3.StringUtils.left;
//@formatter:off
@Table(name = "TRM_CODESYSTEM", uniqueConstraints = {
@UniqueConstraint(name = "IDX_CS_CODESYSTEM", columnNames = {"CODE_SYSTEM_URI"})
@ -31,6 +33,7 @@ import java.io.Serializable;
//@formatter:on
public class TermCodeSystem implements Serializable {
private static final long serialVersionUID = 1L;
public static final int CS_NAME_LENGTH = 200;
@Column(name = "CODE_SYSTEM_URI", nullable = false)
private String myCodeSystemUri;
@ -48,11 +51,17 @@ public class TermCodeSystem implements Serializable {
private ResourceTable myResource;
@Column(name = "RES_ID", insertable = false, updatable = false)
private Long myResourcePid;
@Column(name = "CS_NAME", nullable = true)
private String myName;
public String getCodeSystemUri() {
return myCodeSystemUri;
}
public String getName() {
return myName;
}
public void setCodeSystemUri(String theCodeSystemUri) {
myCodeSystemUri = theCodeSystemUri;
}
@ -73,6 +82,10 @@ public class TermCodeSystem implements Serializable {
return myResource;
}
public void setName(String theName) {
myName = left(theName, CS_NAME_LENGTH);
}
public void setResource(ResourceTable theResource) {
myResource = theResource;
}

View File

@ -20,26 +20,13 @@ package ca.uhn.fhir.jpa.entity;
* #L%
*/
import ca.uhn.fhir.util.CoverageIgnore;
import javax.persistence.*;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collection;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.ForeignKey;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.OneToMany;
import javax.persistence.OneToOne;
import javax.persistence.SequenceGenerator;
import javax.persistence.Table;
import javax.persistence.UniqueConstraint;
import ca.uhn.fhir.util.CoverageIgnore;
//@formatter:off
@Table(name = "TRM_CODESYSTEM_VER"
// Note, we used to have a constraint named IDX_CSV_RESOURCEPID_AND_VER (don't reuse this)
@ -64,7 +51,13 @@ public class TermCodeSystemVersion implements Serializable {
@Column(name = "CS_VERSION_ID", nullable = true, updatable = false)
private String myCodeSystemVersionId;
/**
* This was added in HAPI FHIR 3.3.0 and is nullable just to avoid migration
* issues. It should be made non-nullable at some point.
*/
@ManyToOne
@JoinColumn(name = "CODESYSTEM_PID", referencedColumnName = "PID", nullable = true)
private TermCodeSystem myCodeSystem;
@SuppressWarnings("unused")
@OneToOne(mappedBy = "myCurrentVersion", optional = true)
private TermCodeSystem myCodeSystemHavingThisVersionAsCurrentVersionIfAny;
@ -76,26 +69,6 @@ public class TermCodeSystemVersion implements Serializable {
super();
}
public Collection<TermConcept> getConcepts() {
if (myConcepts == null) {
myConcepts = new ArrayList<>();
}
return myConcepts;
}
public Long getPid() {
return myId;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((myResource.getId() == null) ? 0 : myResource.getId().hashCode());
result = prime * result + ((myCodeSystemVersionId == null) ? 0 : myCodeSystemVersionId.hashCode());
return result;
}
@CoverageIgnore
@Override
public boolean equals(Object obj) {
@ -125,20 +98,48 @@ public class TermCodeSystemVersion implements Serializable {
return true;
}
public ResourceTable getResource() {
return myResource;
public TermCodeSystem getCodeSystem() {
return myCodeSystem;
}
public void setCodeSystem(TermCodeSystem theCodeSystem) {
myCodeSystem = theCodeSystem;
}
public String getCodeSystemVersionId() {
return myCodeSystemVersionId;
}
public void setResource(ResourceTable theResource) {
myResource = theResource;
}
public void setCodeSystemVersionId(String theCodeSystemVersionId) {
myCodeSystemVersionId = theCodeSystemVersionId;
}
public Collection<TermConcept> getConcepts() {
if (myConcepts == null) {
myConcepts = new ArrayList<>();
}
return myConcepts;
}
public Long getPid() {
return myId;
}
public ResourceTable getResource() {
return myResource;
}
public void setResource(ResourceTable theResource) {
myResource = theResource;
}
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + ((myResource.getId() == null) ? 0 : myResource.getId().hashCode());
result = prime * result + ((myCodeSystemVersionId == null) ? 0 : myCodeSystemVersionId.hashCode());
return result;
}
}

View File

@ -1,5 +1,6 @@
package ca.uhn.fhir.jpa.entity;
import ca.uhn.fhir.context.support.IContextValidationSupport;
import ca.uhn.fhir.jpa.entity.TermConceptParentChildLink.RelationshipTypeEnum;
import ca.uhn.fhir.jpa.search.DeferConceptIndexingInterceptor;
import org.apache.commons.lang3.Validate;
@ -75,7 +76,7 @@ public class TermConcept implements Serializable {
})
private String myDisplay;
@OneToMany(mappedBy = "myConcept")
@OneToMany(mappedBy = "myConcept", orphanRemoval = true)
@Field
@FieldBridge(impl = TermConceptPropertyFieldBridge.class)
private Collection<TermConceptProperty> myProperties;
@ -100,7 +101,7 @@ public class TermConcept implements Serializable {
}
public TermConcept(TermCodeSystemVersion theCs, String theCode) {
setCodeSystem(theCs);
setCodeSystemVersion(theCs);
setCode(theCode);
}
@ -130,7 +131,7 @@ public class TermConcept implements Serializable {
property.setType(thePropertyType);
property.setKey(thePropertyName);
property.setValue(thePropertyValue);
getStringProperties().add(property);
getProperties().add(property);
return property;
}
@ -177,14 +178,14 @@ public class TermConcept implements Serializable {
myCode = theCode;
}
public TermCodeSystemVersion getCodeSystem() {
public TermCodeSystemVersion getCodeSystemVersion() {
return myCodeSystem;
}
public void setCodeSystem(TermCodeSystemVersion theCodeSystem) {
myCodeSystem = theCodeSystem;
if (theCodeSystem.getPid() != null) {
myCodeSystemVersionPid = theCodeSystem.getPid();
public void setCodeSystemVersion(TermCodeSystemVersion theCodeSystemVersion) {
myCodeSystem = theCodeSystemVersion;
if (theCodeSystemVersion.getPid() != null) {
myCodeSystemVersionPid = theCodeSystemVersion.getPid();
}
}
@ -223,7 +224,7 @@ public class TermConcept implements Serializable {
return myParents;
}
public Collection<TermConceptProperty> getStringProperties() {
public Collection<TermConceptProperty> getProperties() {
if (myProperties == null) {
myProperties = new ArrayList<>();
}
@ -232,7 +233,7 @@ public class TermConcept implements Serializable {
public List<String> getStringProperties(String thePropertyName) {
List<String> retVal = new ArrayList<>();
for (TermConceptProperty next : getStringProperties()) {
for (TermConceptProperty next : getProperties()) {
if (thePropertyName.equals(next.getKey())) {
if (next.getType() == TermConceptPropertyTypeEnum.STRING) {
retVal.add(next.getValue());
@ -244,7 +245,7 @@ public class TermConcept implements Serializable {
public List<Coding> getCodingProperties(String thePropertyName) {
List<Coding> retVal = new ArrayList<>();
for (TermConceptProperty next : getStringProperties()) {
for (TermConceptProperty next : getProperties()) {
if (thePropertyName.equals(next.getKey())) {
if (next.getType() == TermConceptPropertyTypeEnum.CODING) {
Coding coding = new Coding();
@ -334,4 +335,20 @@ public class TermConcept implements Serializable {
return new ToStringBuilder(this, ToStringStyle.SHORT_PREFIX_STYLE).append("code", myCode).append("display", myDisplay).build();
}
public List<IContextValidationSupport.BaseConceptProperty> toValidationProperties() {
List<IContextValidationSupport.BaseConceptProperty> retVal = new ArrayList<>();
for (TermConceptProperty next : getProperties()) {
switch (next.getType()) {
case STRING:
retVal.add(new IContextValidationSupport.StringConceptProperty(next.getKey(), next.getValue()));
break;
case CODING:
retVal.add(new IContextValidationSupport.CodingConceptProperty(next.getKey(), next.getCodeSystem(), next.getValue(), next.getDisplay()));
break;
default:
throw new IllegalStateException("Don't know how to handle " + next.getType());
}
}
return retVal;
}
}

View File

@ -1,4 +1,4 @@
package ca.uhn.fhir.jpa.provider.dstu3;
package ca.uhn.fhir.jpa.provider;
/*
* #%L
@ -20,81 +20,86 @@ package ca.uhn.fhir.jpa.provider.dstu3;
* #L%
*/
import static org.apache.commons.lang3.StringUtils.defaultString;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.io.IOUtils;
import org.hl7.fhir.dstu3.model.*;
import org.springframework.beans.factory.annotation.Autowired;
import ca.uhn.fhir.jpa.provider.BaseJpaProvider;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc.UploadStatistics;
import ca.uhn.fhir.rest.annotation.Operation;
import ca.uhn.fhir.rest.annotation.OperationParam;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.param.StringParam;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import org.hl7.fhir.r4.model.IntegerType;
import org.hl7.fhir.r4.model.Parameters;
import org.hl7.fhir.r4.model.StringType;
import org.springframework.beans.factory.annotation.Autowired;
public class TerminologyUploaderProviderDstu3 extends BaseJpaProvider {
import javax.servlet.http.HttpServletRequest;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import static org.apache.commons.lang3.StringUtils.defaultString;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
public class TerminologyUploaderProvider extends BaseJpaProvider {
public static final String UPLOAD_EXTERNAL_CODE_SYSTEM = "$upload-external-code-system";
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(TerminologyUploaderProviderDstu3.class);
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(TerminologyUploaderProvider.class);
@Autowired
private IHapiTerminologyLoaderSvc myTerminologyLoaderSvc;
//@formatter:off
@Operation(name = UPLOAD_EXTERNAL_CODE_SYSTEM, idempotent = false, returnParameters= {
@OperationParam(name="conceptCount", type=IntegerType.class, min=1)
})
public Parameters uploadExternalCodeSystem(
HttpServletRequest theServletRequest,
@OperationParam(name="url", min=1) UriType theUrl,
@OperationParam(name="package", min=0) Attachment thePackage,
@OperationParam(name="localfile", min=0, max=OperationParam.MAX_UNLIMITED) List<StringType> theLocalFile,
@OperationParam(name="url", min=1) StringParam theCodeSystemUrl,
@OperationParam(name="localfile", min=1, max=OperationParam.MAX_UNLIMITED) List<StringType> theLocalFile,
RequestDetails theRequestDetails
) {
//@formatter:on
startRequest(theServletRequest);
try {
List<byte[]> data = new ArrayList<byte[]>();
List<IHapiTerminologyLoaderSvc.FileDescriptor> localFiles = new ArrayList<>();
if (theLocalFile != null && theLocalFile.size() > 0) {
for (StringType nextLocalFile : theLocalFile) {
if (isNotBlank(nextLocalFile.getValue())) {
ourLog.info("Reading in local file: {}", nextLocalFile.getValue());
try {
byte[] nextData = IOUtils.toByteArray(new FileInputStream(nextLocalFile.getValue()));
data.add(nextData);
} catch (IOException e) {
throw new InternalErrorException(e);
File nextFile = new File(nextLocalFile.getValue());
if (!nextFile.exists() || !nextFile.isFile()) {
throw new InvalidRequestException("Unknown file: " + nextFile.getName());
}
}
}
} else if (thePackage == null || thePackage.getData() == null || thePackage.getData().length == 0) {
throw new InvalidRequestException("No 'localfile' or 'package' parameter, or package had no data");
} else {
data = new ArrayList<byte[]>();
data.add(thePackage.getData());
thePackage.setData(null);
localFiles.add(new IHapiTerminologyLoaderSvc.FileDescriptor() {
@Override
public String getFilename() {
return nextFile.getAbsolutePath();
}
String url = theUrl != null ? theUrl.getValueAsString() : null;
@Override
public InputStream getInputStream() {
try {
return new FileInputStream(nextFile);
} catch (FileNotFoundException theE) {
throw new InternalErrorException(theE);
}
}
});
}
}
}
String url = theCodeSystemUrl != null ? theCodeSystemUrl.getValue() : null;
url = defaultString(url);
UploadStatistics stats;
if (IHapiTerminologyLoaderSvc.SCT_URL.equals(url)) {
stats = myTerminologyLoaderSvc.loadSnomedCt((data), theRequestDetails);
} else if (IHapiTerminologyLoaderSvc.LOINC_URL.equals(url)) {
stats = myTerminologyLoaderSvc.loadLoinc((data), theRequestDetails);
if (IHapiTerminologyLoaderSvc.SCT_URI.equals(url)) {
stats = myTerminologyLoaderSvc.loadSnomedCt(localFiles, theRequestDetails);
} else if (IHapiTerminologyLoaderSvc.LOINC_URI.equals(url)) {
stats = myTerminologyLoaderSvc.loadLoinc(localFiles, theRequestDetails);
} else {
throw new InvalidRequestException("Unknown URL: " + url);
}
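For orientation, a hedged sketch (not part of this commit) of invoking the reworked $upload-external-code-system operation with the new localfile parameter; the server base URL and the path to the LOINC distribution zip are placeholders:

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import org.hl7.fhir.r4.model.*;

public class UploadLoincExample {
   public static void main(String[] args) {
      FhirContext ctx = FhirContext.forR4();
      IGenericClient client = ctx.newRestfulGenericClient("http://localhost:8080/baseR4"); // assumed endpoint

      Parameters inParams = new Parameters();
      inParams.addParameter().setName("url").setValue(new StringType(IHapiTerminologyLoaderSvc.LOINC_URI));
      inParams.addParameter().setName("localfile").setValue(new StringType("/data/loinc/Loinc.zip")); // assumed path on the server

      Parameters outParams = client
         .operation()
         .onServer()
         .named(TerminologyUploaderProvider.UPLOAD_EXTERNAL_CODE_SYSTEM)
         .withParameters(inParams)
         .execute();

      // The provider returns a single "conceptCount" parameter (see UploadStatistics below)
      for (Parameters.ParametersParameterComponent next : outParams.getParameter()) {
         if ("conceptCount".equals(next.getName())) {
            System.out.println("Stored " + next.getValue() + " concepts");
         }
      }
   }
}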

View File

@ -1,111 +0,0 @@
package ca.uhn.fhir.jpa.provider.r4;
/*
* #%L
* HAPI FHIR JPA Server
* %%
* Copyright (C) 2014 - 2018 University Health Network
* %%
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* #L%
*/
import static org.apache.commons.lang3.StringUtils.defaultString;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.io.IOUtils;
import org.hl7.fhir.r4.model.*;
import org.springframework.beans.factory.annotation.Autowired;
import ca.uhn.fhir.jpa.provider.BaseJpaProvider;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc.UploadStatistics;
import ca.uhn.fhir.rest.annotation.Operation;
import ca.uhn.fhir.rest.annotation.OperationParam;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
public class TerminologyUploaderProviderR4 extends BaseJpaProvider {
public static final String UPLOAD_EXTERNAL_CODE_SYSTEM = "$upload-external-code-system";
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(TerminologyUploaderProviderR4.class);
@Autowired
private IHapiTerminologyLoaderSvc myTerminologyLoaderSvc;
//@formatter:off
@Operation(name = UPLOAD_EXTERNAL_CODE_SYSTEM, idempotent = false, returnParameters= {
@OperationParam(name="conceptCount", type=IntegerType.class, min=1)
})
public Parameters uploadExternalCodeSystem(
HttpServletRequest theServletRequest,
@OperationParam(name="url", min=1) UriType theUrl,
@OperationParam(name="package", min=0) Attachment thePackage,
@OperationParam(name="localfile", min=0, max=OperationParam.MAX_UNLIMITED) List<StringType> theLocalFile,
RequestDetails theRequestDetails
) {
//@formatter:on
startRequest(theServletRequest);
try {
List<byte[]> data = new ArrayList<byte[]>();
if (theLocalFile != null && theLocalFile.size() > 0) {
for (StringType nextLocalFile : theLocalFile) {
if (isNotBlank(nextLocalFile.getValue())) {
ourLog.info("Reading in local file: {}", nextLocalFile.getValue());
try {
byte[] nextData = IOUtils.toByteArray(new FileInputStream(nextLocalFile.getValue()));
data.add(nextData);
} catch (IOException e) {
throw new InternalErrorException(e);
}
}
}
} else if (thePackage == null || thePackage.getData() == null || thePackage.getData().length == 0) {
throw new InvalidRequestException("No 'localfile' or 'package' parameter, or package had no data");
} else {
data = new ArrayList<byte[]>();
data.add(thePackage.getData());
thePackage.setData(null);
}
String url = theUrl != null ? theUrl.getValueAsString() : null;
url = defaultString(url);
UploadStatistics stats;
if (IHapiTerminologyLoaderSvc.SCT_URL.equals(url)) {
stats = myTerminologyLoaderSvc.loadSnomedCt((data), theRequestDetails);
} else if (IHapiTerminologyLoaderSvc.LOINC_URL.equals(url)) {
stats = myTerminologyLoaderSvc.loadLoinc((data), theRequestDetails);
} else {
throw new InvalidRequestException("Unknown URL: " + url);
}
Parameters retVal = new Parameters();
retVal.addParameter().setName("conceptCount").setValue(new IntegerType(stats.getConceptCount()));
return retVal;
} finally {
endRequest(theServletRequest);
}
}
}

View File

@ -49,7 +49,7 @@ public abstract class BaseSubscriptionDeliverySubscriber extends BaseSubscriptio
try {
ResourceDeliveryMessage msg = (ResourceDeliveryMessage) theMessage.getPayload();
subscriptionId = msg.getPayload(getContext()).getIdElement().getValue();
subscriptionId = msg.getSubscription().getIdElement(getContext()).getValue();
if (!subscriptionTypeApplies(getContext(), msg.getSubscription().getBackingSubscription(getContext()))) {
return;

View File

@ -20,24 +20,26 @@ package ca.uhn.fhir.jpa.subscription.resthook;
* #L%
*/
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.subscription.*;
import ca.uhn.fhir.rest.api.Constants;
import ca.uhn.fhir.rest.api.EncodingEnum;
import ca.uhn.fhir.rest.client.api.IGenericClient;
import ca.uhn.fhir.rest.client.api.ServerValidationModeEnum;
import ca.uhn.fhir.rest.api.RequestTypeEnum;
import ca.uhn.fhir.rest.client.api.*;
import ca.uhn.fhir.rest.client.interceptor.SimpleRequestHeaderInterceptor;
import ca.uhn.fhir.rest.gclient.IClientExecutable;
import org.apache.commons.lang3.ObjectUtils;
import org.apache.commons.lang3.StringUtils;
import ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.r4.model.Subscription;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessagingException;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
@ -54,10 +56,38 @@ public class SubscriptionDeliveringRestHookSubscriber extends BaseSubscriptionDe
IClientExecutable<?, ?> operation;
switch (theMsg.getOperationType()) {
case CREATE:
if (payloadResource == null || payloadResource.isEmpty()) {
if (thePayloadType != null ) {
operation = theClient.create().resource(payloadResource);
} else {
sendNotification(theMsg);
return;
}
} else {
if (thePayloadType != null ) {
operation = theClient.update().resource(payloadResource);
} else {
sendNotification(theMsg);
return;
}
}
break;
case UPDATE:
if (payloadResource == null || payloadResource.isEmpty()) {
if (thePayloadType != null ) {
operation = theClient.create().resource(payloadResource);
} else {
sendNotification(theMsg);
return;
}
} else {
if (thePayloadType != null ) {
operation = theClient.update().resource(payloadResource);
} else {
sendNotification(theMsg);
return;
}
}
break;
case DELETE:
operation = theClient.delete().resourceById(theMsg.getPayloadId(getContext()));
@ -67,11 +97,19 @@ public class SubscriptionDeliveringRestHookSubscriber extends BaseSubscriptionDe
return;
}
if (thePayloadType != null) {
operation.encoded(thePayloadType);
}
ourLog.info("Delivering {} rest-hook payload {} for {}", theMsg.getOperationType(), payloadResource.getIdElement().toUnqualified().getValue(), theSubscription.getIdElement(getContext()).toUnqualifiedVersionless().getValue());
try {
operation.execute();
} catch (ResourceNotFoundException e) {
ourLog.error("Cannot reach "+ theMsg.getSubscription().getEndpointUrl());
e.printStackTrace();
throw e;
}
}
@Override
@ -83,13 +121,14 @@ public class SubscriptionDeliveringRestHookSubscriber extends BaseSubscriptionDe
// Grab the payload type (encoding mimetype) from the subscription
String payloadString = subscription.getPayloadString();
payloadString = StringUtils.defaultString(payloadString, Constants.CT_FHIR_XML_NEW);
EncodingEnum payloadType = null;
if(payloadString != null) {
if (payloadString.contains(";")) {
payloadString = payloadString.substring(0, payloadString.indexOf(';'));
}
payloadString = payloadString.trim();
EncodingEnum payloadType = EncodingEnum.forContentType(payloadString);
payloadType = ObjectUtils.defaultIfNull(payloadType, EncodingEnum.XML);
payloadType = EncodingEnum.forContentType(payloadString);
}
// Create the client request
getContext().getRestfulClientFactory().setServerValidationMode(ServerValidationModeEnum.NEVER);
@ -109,4 +148,23 @@ public class SubscriptionDeliveringRestHookSubscriber extends BaseSubscriptionDe
deliverPayload(theMessage, subscription, payloadType, client);
}
/**
* Sends a POST notification without a payload
* @param theMsg the delivery message whose subscription endpoint URL receives the notification
*/
protected void sendNotification(ResourceDeliveryMessage theMsg) {
FhirContext context= getContext();
Map<String, List<String>> params = new HashMap<>();
List<Header> headers = new ArrayList<>();
StringBuilder url = new StringBuilder(theMsg.getSubscription().getEndpointUrl());
IHttpClient client = context.getRestfulClientFactory().getHttpClient(url, params, "", RequestTypeEnum.POST, headers);
IHttpRequest request = client.createParamRequest(context, params, null);
try {
IHttpResponse response = request.execute();
} catch (IOException e) {
ourLog.error("Error trying to reach "+ theMsg.getSubscription().getEndpointUrl());
e.printStackTrace();
throw new ResourceNotFoundException(e.getMessage());
}
}
}

View File

@ -24,10 +24,7 @@ import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.dao.BaseHapiFhirDao;
import ca.uhn.fhir.jpa.dao.DaoConfig;
import ca.uhn.fhir.jpa.dao.IFhirResourceDaoCodeSystem;
import ca.uhn.fhir.jpa.dao.data.ITermCodeSystemDao;
import ca.uhn.fhir.jpa.dao.data.ITermCodeSystemVersionDao;
import ca.uhn.fhir.jpa.dao.data.ITermConceptDao;
import ca.uhn.fhir.jpa.dao.data.ITermConceptParentChildLinkDao;
import ca.uhn.fhir.jpa.dao.data.*;
import ca.uhn.fhir.jpa.entity.*;
import ca.uhn.fhir.jpa.entity.TermConceptParentChildLink.RelationshipTypeEnum;
import ca.uhn.fhir.rest.api.server.RequestDetails;
@ -79,6 +76,8 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
@Autowired
protected ITermConceptDao myConceptDao;
@Autowired
protected ITermConceptPropertyDao myConceptPropertyDao;
@Autowired
protected FhirContext myContext;
@PersistenceContext(type = PersistenceContextType.TRANSACTION)
protected EntityManager myEntityManager;
@ -409,7 +408,7 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
ourLog.info("Have processed {}/{} concepts ({}%)", theConceptsStack.size(), theTotalConcepts, (int) (pct * 100.0f));
}
theConcept.setCodeSystem(theCodeSystem);
theConcept.setCodeSystemVersion(theCodeSystem);
theConcept.setIndexStatus(BaseHapiFhirDao.INDEX_STATUS_INDEXED);
if (theConceptsStack.size() <= myDaoConfig.getDeferIndexingForCodesystemsOfSize()) {
@ -430,13 +429,17 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
}
}
for (TermConceptProperty next : theConcept.getProperties()){
myConceptPropertyDao.save(next);
}
}
private void populateVersion(TermConcept theNext, TermCodeSystemVersion theCodeSystemVersion) {
if (theNext.getCodeSystem() != null) {
if (theNext.getCodeSystemVersion() != null) {
return;
}
theNext.setCodeSystem(theCodeSystemVersion);
theNext.setCodeSystemVersion(theCodeSystemVersion);
for (TermConceptParentChildLink next : theNext.getChildren()) {
populateVersion(next.getChild(), theCodeSystemVersion);
}
@ -616,7 +619,7 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
@Override
@Transactional(propagation = Propagation.REQUIRED)
public void storeNewCodeSystemVersion(Long theCodeSystemResourcePid, String theSystemUri, TermCodeSystemVersion theCodeSystemVersion) {
public void storeNewCodeSystemVersion(Long theCodeSystemResourcePid, String theSystemUri, String theSystemName, TermCodeSystemVersion theCodeSystemVersion) {
ourLog.info("Storing code system");
ValidateUtil.isTrueOrThrowInvalidRequest(theCodeSystemVersion.getResource() != null, "No resource supplied");
@ -655,6 +658,7 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
}
codeSystem.setResource(theCodeSystemVersion.getResource());
codeSystem.setCodeSystemUri(theSystemUri);
codeSystem.setName(theSystemName);
myCodeSystemDao.save(codeSystem);
} else {
if (!ObjectUtil.equals(codeSystem.getResource().getId(), theCodeSystemVersion.getResource().getId())) {
@ -663,6 +667,7 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
throw new UnprocessableEntityException(msg);
}
}
theCodeSystemVersion.setCodeSystem(codeSystem);
ourLog.info("Validating all codes in CodeSystem for storage (this can take some time for large sets)");
@ -721,7 +726,7 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
ourLog.info("CodeSystem resource has ID: {}", csId.getValue());
theCodeSystemVersion.setResource(resource);
storeNewCodeSystemVersion(codeSystemResourcePid, theCodeSystemResource.getUrl(), theCodeSystemVersion);
storeNewCodeSystemVersion(codeSystemResourcePid, theCodeSystemResource.getUrl(), theCodeSystemResource.getName(), theCodeSystemVersion);
for (ValueSet nextValueSet : theValueSets) {
createOrUpdateValueSet(nextValueSet, theRequestDetails);
@ -749,8 +754,8 @@ public abstract class BaseHapiTerminologySvcImpl implements IHapiTerminologySvc
private int validateConceptForStorage(TermConcept theConcept, TermCodeSystemVersion theCodeSystem, ArrayList<String> theConceptsStack,
IdentityHashMap<TermConcept, Object> theAllConcepts) {
ValidateUtil.isTrueOrThrowInvalidRequest(theConcept.getCodeSystem() != null, "CodesystemValue is null");
ValidateUtil.isTrueOrThrowInvalidRequest(theConcept.getCodeSystem() == theCodeSystem, "CodeSystems are not equal");
ValidateUtil.isTrueOrThrowInvalidRequest(theConcept.getCodeSystemVersion() != null, "CodesystemValue is null");
ValidateUtil.isTrueOrThrowInvalidRequest(theConcept.getCodeSystemVersion() == theCodeSystem, "CodeSystems are not equal");
ValidateUtil.isNotBlankOrThrowInvalidRequest(theConcept.getCode(), "Codesystem contains a code with no code value");
if (theConceptsStack.contains(theConcept.getCode())) {

View File

@ -235,7 +235,10 @@ public class HapiTerminologySvcDstu3 extends BaseHapiTerminologySvcImpl implemen
ConceptDefinitionComponent def = new ConceptDefinitionComponent();
def.setCode(code.getCode());
def.setDisplay(code.getDisplay());
return new CodeValidationResult(def);
CodeValidationResult retVal = new CodeValidationResult(def);
retVal.setProperties(code.toValidationProperties());
retVal.setCodeSystemName(code.getCodeSystemVersion().getCodeSystem().getName());
return retVal;
}
return new CodeValidationResult(IssueSeverity.ERROR, "Unknown code {" + theCodeSystem + "}" + theCode);

View File

@ -103,7 +103,7 @@ public class HapiTerminologySvcR4 extends BaseHapiTerminologySvcImpl implements
@Override
protected void createOrUpdateConceptMap(org.hl7.fhir.r4.model.ConceptMap theConceptMap, RequestDetails theRequestDetails) {
String matchUrl = "ConceptMap?url=" + UrlUtil.escapeUrlParam(theConceptMap.getUrl());
myConceptMapResourceDao.update(theConceptMap, matchUrl, theRequestDetails).getId();
myConceptMapResourceDao.update(theConceptMap, matchUrl, theRequestDetails);
}
@Override
@ -201,10 +201,12 @@ public class HapiTerminologySvcR4 extends BaseHapiTerminologySvcImpl implements
ConceptDefinitionComponent def = new ConceptDefinitionComponent();
def.setCode(code.getCode());
def.setDisplay(code.getDisplay());
return new CodeValidationResult(def);
CodeValidationResult retVal = new CodeValidationResult(def);
retVal.setProperties(code.toValidationProperties());
return retVal;
}
return new CodeValidationResult(IssueSeverity.ERROR, "Unkonwn code {" + theCodeSystem + "}" + theCode);
return new CodeValidationResult(IssueSeverity.ERROR, "Unknown code {" + theCodeSystem + "}" + theCode);
}
}

View File

@ -20,20 +20,30 @@ package ca.uhn.fhir.jpa.term;
* #L%
*/
import java.util.List;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import java.io.InputStream;
import java.util.List;
public interface IHapiTerminologyLoaderSvc {
String LOINC_URL = "http://loinc.org";
String SCT_URL = "http://snomed.info/sct";
String LOINC_URI = "http://loinc.org";
String SCT_URI = "http://snomed.info/sct";
String IEEE_11073_10101_URI = "urn:iso:std:iso:11073:10101";
UploadStatistics loadLoinc(List<byte[]> theZipBytes, RequestDetails theRequestDetails);
UploadStatistics loadLoinc(List<FileDescriptor> theFiles, RequestDetails theRequestDetails);
UploadStatistics loadSnomedCt(List<byte[]> theZipBytes, RequestDetails theRequestDetails);
UploadStatistics loadSnomedCt(List<FileDescriptor> theFiles, RequestDetails theRequestDetails);
public static class UploadStatistics {
interface FileDescriptor {
String getFilename();
InputStream getInputStream();
}
class UploadStatistics {
private final int myConceptCount;
public UploadStatistics(int theConceptCount) {

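The loader service now consumes FileDescriptor instances instead of raw byte arrays. A minimal sketch (not part of this commit; the class name is illustrative, e.g. for unit tests) of an in-memory implementation:

import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class ByteArrayFileDescriptor implements IHapiTerminologyLoaderSvc.FileDescriptor {
   private final String myFilename;
   private final byte[] myBytes;

   public ByteArrayFileDescriptor(String theFilename, byte[] theBytes) {
      myFilename = theFilename;
      myBytes = theBytes;
   }

   @Override
   public String getFilename() {
      return myFilename; // matched against the expected LOINC/SNOMED CT file name fragments
   }

   @Override
   public InputStream getInputStream() {
      return new ByteArrayInputStream(myBytes); // fresh stream over the uploaded content on each call
   }
}

A list of such descriptors can then be passed to loadLoinc(...) or loadSnomedCt(...).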
View File

@ -57,7 +57,7 @@ public interface IHapiTerminologySvc {
*/
void setProcessDeferred(boolean theProcessDeferred);
void storeNewCodeSystemVersion(Long theCodeSystemResourcePid, String theSystemUri, TermCodeSystemVersion theCodeSytemVersion);
void storeNewCodeSystemVersion(Long theCodeSystemResourcePid, String theSystemUri, String theSystemName, TermCodeSystemVersion theCodeSytemVersion);
boolean supportsSystem(String theCodeSystem);

View File

@ -11,13 +11,14 @@ import ca.uhn.fhir.jpa.term.snomedct.SctHandlerRelationship;
import ca.uhn.fhir.jpa.util.Counter;
import ca.uhn.fhir.rest.api.server.RequestDetails;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import ca.uhn.fhir.rest.server.exceptions.InvalidRequestException;
import ca.uhn.fhir.rest.server.exceptions.UnprocessableEntityException;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Charsets;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;
import org.apache.commons.csv.QuoteMode;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.io.input.BOMInputStream;
import org.apache.commons.lang3.ObjectUtils;
@ -69,8 +70,11 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
public static final String LOINC_PART_LINK_FILE = "LoincPartLink_Beta_1.csv";
public static final String LOINC_PART_RELATED_CODE_MAPPING_FILE = "PartRelatedCodeMapping_Beta_1.csv";
public static final String LOINC_RSNA_PLAYBOOK_FILE = "LoincRsnaRadiologyPlaybook.csv";
public static final String TOP2000_COMMON_LAB_RESULTS_US_FILE = "Top2000CommonLabResultsUS.csv";
public static final String TOP2000_COMMON_LAB_RESULTS_SI_FILE = "Top2000CommonLabResultsSI.csv";
public static final String LOINC_TOP2000_COMMON_LAB_RESULTS_US_FILE = "Top2000CommonLabResultsUS.csv";
public static final String LOINC_TOP2000_COMMON_LAB_RESULTS_SI_FILE = "Top2000CommonLabResultsSI.csv";
public static final String LOINC_UNIVERSAL_LAB_ORDER_VALUESET_FILE = "LoincUniversalLabOrdersValueSet.csv";
public static final String LOINC_IEEE_MEDICAL_DEVICE_CODE_MAPPING_TABLE_CSV = "LoincIeeeMedicalDeviceCodeMappingTable.csv";
public static final String LOINC_IMAGING_DOCUMENT_CODES_FILE = "ImagingDocumentCodes.csv";
private static final int LOG_INCREMENT = 100000;
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(TerminologyLoaderSvcImpl.class);
@Autowired
@ -113,23 +117,17 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
}
private void iterateOverZipFile(List<byte[]> theZipBytes, String fileNamePart, IRecordHandler handler, char theDelimiter, QuoteMode theQuoteMode) {
boolean found = false;
private void iterateOverZipFile(LoadedFileDescriptors theDescriptors, String theFileNamePart, IRecordHandler theHandler, char theDelimiter, QuoteMode theQuoteMode) {
for (byte[] nextZipBytes : theZipBytes) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new ByteArrayInputStream(nextZipBytes)));
try {
for (ZipEntry nextEntry; (nextEntry = zis.getNextEntry()) != null; ) {
String nextFilename = nextEntry.getName();
if (nextFilename.contains(fileNamePart)) {
for (FileDescriptor nextZipBytes : theDescriptors.getUncompressedFileDescriptors()) {
String nextFilename = nextZipBytes.getFilename();
if (nextFilename.contains(theFileNamePart)) {
ourLog.info("Processing file {}", nextFilename);
found = true;
Reader reader;
CSVParser parsed;
try {
reader = new InputStreamReader(new BOMInputStream(zis), Charsets.UTF_8);
reader = new InputStreamReader(nextZipBytes.getInputStream(), Charsets.UTF_8);
CSVFormat format = CSVFormat.newFormat(theDelimiter).withFirstRecordAsHeader();
if (theQuoteMode != null) {
format = format.withQuote('"').withQuoteMode(theQuoteMode);
@ -139,15 +137,14 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
ourLog.debug("Header map: {}", parsed.getHeaderMap());
int count = 0;
int logIncrement = LOG_INCREMENT;
int nextLoggedCount = 0;
while (iter.hasNext()) {
CSVRecord nextRecord = iter.next();
handler.accept(nextRecord);
theHandler.accept(nextRecord);
count++;
if (count >= nextLoggedCount) {
ourLog.info(" * Processed {} records in {}", count, nextFilename);
nextLoggedCount += logIncrement;
nextLoggedCount += LOG_INCREMENT;
}
}
@ -155,43 +152,56 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
throw new InternalErrorException(e);
}
}
}
} catch (IOException e) {
throw new InternalErrorException(e);
} finally {
IOUtils.closeQuietly(zis);
}
}
// This should always be true, but just in case we've introduced a bug...
Validate.isTrue(found);
}
@Override
public UploadStatistics loadLoinc(List<byte[]> theZipBytes, RequestDetails theRequestDetails) {
List<String> expectedFilenameFragments = Arrays.asList(
public UploadStatistics loadLoinc(List<FileDescriptor> theFiles, RequestDetails theRequestDetails) {
LoadedFileDescriptors descriptors = new LoadedFileDescriptors(theFiles);
List<String> mandatoryFilenameFragments = Arrays.asList(
LOINC_FILE,
LOINC_HIERARCHY_FILE);
descriptors.verifyMandatoryFilesExist(mandatoryFilenameFragments);
verifyMandatoryFilesExist(theZipBytes, expectedFilenameFragments);
List<String> optionalFilenameFragments = Arrays.asList(
LOINC_ANSWERLIST_FILE,
LOINC_ANSWERLIST_LINK_FILE,
LOINC_PART_FILE,
LOINC_PART_LINK_FILE,
LOINC_PART_RELATED_CODE_MAPPING_FILE,
LOINC_DOCUMENT_ONTOLOGY_FILE,
LOINC_RSNA_PLAYBOOK_FILE,
LOINC_TOP2000_COMMON_LAB_RESULTS_US_FILE,
LOINC_TOP2000_COMMON_LAB_RESULTS_SI_FILE,
LOINC_UNIVERSAL_LAB_ORDER_VALUESET_FILE,
LOINC_IEEE_MEDICAL_DEVICE_CODE_MAPPING_TABLE_CSV,
LOINC_IMAGING_DOCUMENT_CODES_FILE
);
descriptors.verifyOptionalFilesExist(optionalFilenameFragments);
ourLog.info("Beginning LOINC processing");
return processLoincFiles(theZipBytes, theRequestDetails);
return processLoincFiles(descriptors, theRequestDetails);
}
@Override
public UploadStatistics loadSnomedCt(List<byte[]> theZipBytes, RequestDetails theRequestDetails) {
List<String> expectedFilenameFragments = Arrays.asList(SCT_FILE_DESCRIPTION, SCT_FILE_RELATIONSHIP, SCT_FILE_CONCEPT);
public UploadStatistics loadSnomedCt(List<FileDescriptor> theFiles, RequestDetails theRequestDetails) {
LoadedFileDescriptors descriptors = new LoadedFileDescriptors(theFiles);
verifyMandatoryFilesExist(theZipBytes, expectedFilenameFragments);
List<String> expectedFilenameFragments = Arrays.asList(
SCT_FILE_DESCRIPTION,
SCT_FILE_RELATIONSHIP,
SCT_FILE_CONCEPT);
descriptors.verifyMandatoryFilesExist(expectedFilenameFragments);
ourLog.info("Beginning SNOMED CT processing");
return processSnomedCtFiles(theZipBytes, theRequestDetails);
return processSnomedCtFiles(descriptors, theRequestDetails);
}
UploadStatistics processLoincFiles(List<byte[]> theZipBytes, RequestDetails theRequestDetails) {
UploadStatistics processLoincFiles(LoadedFileDescriptors theDescriptors, RequestDetails theRequestDetails) {
final TermCodeSystemVersion codeSystemVersion = new TermCodeSystemVersion();
final Map<String, TermConcept> code2concept = new HashMap<>();
final List<ValueSet> valueSets = new ArrayList<>();
@ -216,49 +226,61 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
// Loinc Codes
handler = new LoincHandler(codeSystemVersion, code2concept, propertyNames);
iterateOverZipFile(theZipBytes, LOINC_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Loinc Hierarchy
handler = new LoincHierarchyHandler(codeSystemVersion, code2concept);
iterateOverZipFile(theZipBytes, LOINC_HIERARCHY_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_HIERARCHY_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Answer lists (ValueSets of potential answers/values for loinc "questions")
handler = new LoincAnswerListHandler(codeSystemVersion, code2concept, propertyNames, valueSets);
iterateOverZipFile(theZipBytes, LOINC_ANSWERLIST_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_ANSWERLIST_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Answer list links (connects loinc observation codes to answerlist codes)
handler = new LoincAnswerListLinkHandler(code2concept, valueSets);
iterateOverZipFile(theZipBytes, LOINC_ANSWERLIST_LINK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_ANSWERLIST_LINK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Part file
handler = new LoincPartHandler(codeSystemVersion, code2concept);
iterateOverZipFile(theZipBytes, LOINC_PART_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_PART_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Part link file
handler = new LoincPartLinkHandler(codeSystemVersion, code2concept);
iterateOverZipFile(theZipBytes, LOINC_PART_LINK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_PART_LINK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Part related code mapping
handler = new LoincPartRelatedCodeMappingHandler(codeSystemVersion, code2concept, conceptMaps);
iterateOverZipFile(theZipBytes, LOINC_PART_RELATED_CODE_MAPPING_FILE, handler, ',', QuoteMode.NON_NUMERIC);
handler = new LoincPartRelatedCodeMappingHandler(codeSystemVersion, code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_PART_RELATED_CODE_MAPPING_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Document Ontology File
handler = new LoincDocumentOntologyHandler(codeSystemVersion, code2concept, propertyNames, valueSets);
iterateOverZipFile(theZipBytes, LOINC_DOCUMENT_ONTOLOGY_FILE, handler, ',', QuoteMode.NON_NUMERIC);
handler = new LoincDocumentOntologyHandler(codeSystemVersion, code2concept, propertyNames, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_DOCUMENT_ONTOLOGY_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// RSNA Playbook file
handler = new LoincRsnaPlaybookHandler(codeSystemVersion, code2concept, propertyNames, valueSets, conceptMaps);
iterateOverZipFile(theZipBytes, LOINC_RSNA_PLAYBOOK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
iterateOverZipFile(theDescriptors, LOINC_RSNA_PLAYBOOK_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Top 2000 Codes - US
handler = new LoincTop2000LabResultsUsHandler(code2concept, valueSets);
iterateOverZipFile(theZipBytes, TOP2000_COMMON_LAB_RESULTS_US_FILE, handler, ',', QuoteMode.NON_NUMERIC);
handler = new LoincTop2000LabResultsUsHandler(code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_TOP2000_COMMON_LAB_RESULTS_US_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// Top 2000 Codes - SI
handler = new LoincTop2000LabResultsSiHandler(code2concept, valueSets);
iterateOverZipFile(theZipBytes, TOP2000_COMMON_LAB_RESULTS_SI_FILE, handler, ',', QuoteMode.NON_NUMERIC);
handler = new LoincTop2000LabResultsSiHandler(code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_TOP2000_COMMON_LAB_RESULTS_SI_FILE, handler, ',', QuoteMode.NON_NUMERIC);
theZipBytes.clear();
// Universal Lab Order ValueSet
handler = new LoincUniversalOrderSetHandler(code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_UNIVERSAL_LAB_ORDER_VALUESET_FILE, handler, ',', QuoteMode.NON_NUMERIC);
// IEEE Medical Device Codes
handler = new LoincIeeeMedicalDeviceCodeHandler(code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_IEEE_MEDICAL_DEVICE_CODE_MAPPING_TABLE_CSV, handler, ',', QuoteMode.NON_NUMERIC);
// Imaging Document Codes
handler = new LoincImagingDocumentCodeHandler(code2concept, valueSets, conceptMaps);
iterateOverZipFile(theDescriptors, LOINC_IMAGING_DOCUMENT_CODES_FILE, handler, ',', QuoteMode.NON_NUMERIC);
IOUtils.closeQuietly(theDescriptors);
for (Entry<String, TermConcept> next : code2concept.entrySet()) {
TermConcept nextConcept = next.getValue();
@ -277,34 +299,32 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
return new UploadStatistics(conceptCount);
}
UploadStatistics processSnomedCtFiles(List<byte[]> theZipBytes, RequestDetails theRequestDetails) {
private UploadStatistics processSnomedCtFiles(LoadedFileDescriptors theDescriptors, RequestDetails theRequestDetails) {
final TermCodeSystemVersion codeSystemVersion = new TermCodeSystemVersion();
final Map<String, TermConcept> id2concept = new HashMap<>();
final Map<String, TermConcept> code2concept = new HashMap<>();
final Set<String> validConceptIds = new HashSet<>();
IRecordHandler handler = new SctHandlerConcept(validConceptIds);
iterateOverZipFile(theZipBytes, SCT_FILE_CONCEPT, handler, '\t', null);
iterateOverZipFile(theDescriptors, SCT_FILE_CONCEPT, handler, '\t', null);
ourLog.info("Have {} valid concept IDs", validConceptIds.size());
handler = new SctHandlerDescription(validConceptIds, code2concept, id2concept, codeSystemVersion);
iterateOverZipFile(theZipBytes, SCT_FILE_DESCRIPTION, handler, '\t', null);
iterateOverZipFile(theDescriptors, SCT_FILE_DESCRIPTION, handler, '\t', null);
ourLog.info("Got {} concepts, cloning map", code2concept.size());
final HashMap<String, TermConcept> rootConcepts = new HashMap<>(code2concept);
handler = new SctHandlerRelationship(codeSystemVersion, rootConcepts, code2concept);
iterateOverZipFile(theZipBytes, SCT_FILE_RELATIONSHIP, handler, '\t', null);
iterateOverZipFile(theDescriptors, SCT_FILE_RELATIONSHIP, handler, '\t', null);
theZipBytes.clear();
IOUtils.closeQuietly(theDescriptors);
ourLog.info("Looking for root codes");
for (Iterator<Entry<String, TermConcept>> iter = rootConcepts.entrySet().iterator(); iter.hasNext(); ) {
if (iter.next().getValue().getParents().isEmpty() == false) {
iter.remove();
}
}
rootConcepts
.entrySet()
.removeIf(theStringTermConceptEntry -> theStringTermConceptEntry.getValue().getParents().isEmpty() == false);
ourLog.info("Done loading SNOMED CT files - {} root codes, {} total codes", rootConcepts.size(), code2concept.size());
@ -313,13 +333,14 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
long count = circularCounter.getThenAdd();
float pct = ((float) count / rootConcepts.size()) * 100.0f;
ourLog.info(" * Scanning for circular refs - have scanned {} / {} codes ({}%)", count, rootConcepts.size(), pct);
dropCircularRefs(next, new ArrayList<String>(), code2concept, circularCounter);
dropCircularRefs(next, new ArrayList<>(), code2concept, circularCounter);
}
codeSystemVersion.getConcepts().addAll(rootConcepts.values());
CodeSystem cs = new org.hl7.fhir.r4.model.CodeSystem();
cs.setUrl(SCT_URL);
cs.setUrl(SCT_URI);
cs.setName("SNOMED CT");
cs.setContent(CodeSystem.CodeSystemContentMode.NOTPRESENT);
storeCodeSystem(theRequestDetails, codeSystemVersion, cs, null, null);
@ -351,33 +372,6 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
myTermSvc.setProcessDeferred(true);
}
private void verifyMandatoryFilesExist(List<byte[]> theZipBytes, List<String> theExpectedFilenameFragments) {
Set<String> foundFragments = new HashSet<>();
for (byte[] nextZipBytes : theZipBytes) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new ByteArrayInputStream(nextZipBytes)));
try {
for (ZipEntry nextEntry; (nextEntry = zis.getNextEntry()) != null; ) {
for (String next : theExpectedFilenameFragments) {
if (nextEntry.getName().contains(next)) {
foundFragments.add(next);
}
}
}
} catch (IOException e) {
throw new InternalErrorException(e);
} finally {
IOUtils.closeQuietly(zis);
}
}
for (String next : theExpectedFilenameFragments) {
if (!foundFragments.contains(next)) {
throw new InvalidRequestException("Invalid input zip file, expected zip to contain the following name fragments: " + theExpectedFilenameFragments + " but found: " + foundFragments);
}
}
}
public static String firstNonBlank(String... theStrings) {
String retVal = "";
@ -395,10 +389,101 @@ public class TerminologyLoaderSvcImpl implements IHapiTerminologyLoaderSvc {
if (concept == null) {
concept = new TermConcept();
id2concept.put(id, concept);
concept.setCodeSystem(codeSystemVersion);
concept.setCodeSystemVersion(codeSystemVersion);
}
return concept;
}
static class LoadedFileDescriptors implements Closeable {
private List<File> myTemporaryFiles = new ArrayList<>();
private List<IHapiTerminologyLoaderSvc.FileDescriptor> myUncompressedFileDescriptors = new ArrayList<>();
LoadedFileDescriptors(List<IHapiTerminologyLoaderSvc.FileDescriptor> theFileDescriptors) {
try {
for (FileDescriptor next : theFileDescriptors) {
if (next.getFilename().toLowerCase().endsWith(".zip")) {
ourLog.info("Uncompressing {} into temporary files", next.getFilename());
try (InputStream inputStream = next.getInputStream()) {
ZipInputStream zis = new ZipInputStream(new BufferedInputStream(inputStream));
for (ZipEntry nextEntry; (nextEntry = zis.getNextEntry()) != null; ) {
// Create one temporary file per zip entry so that entries do not overwrite each other
File nextTemporaryFile = File.createTempFile("hapifhir", ".tmp");
nextTemporaryFile.deleteOnExit();
BOMInputStream fis = new BOMInputStream(zis);
try (FileOutputStream fos = new FileOutputStream(nextTemporaryFile)) {
IOUtils.copy(fis, fos);
}
String nextEntryFileName = nextEntry.getName();
myUncompressedFileDescriptors.add(new FileDescriptor() {
@Override
public String getFilename() {
return nextEntryFileName;
}
@Override
public InputStream getInputStream() {
try {
return new FileInputStream(nextTemporaryFile);
} catch (FileNotFoundException e) {
throw new InternalErrorException(e);
}
}
});
myTemporaryFiles.add(nextTemporaryFile);
}
}
} else {
myUncompressedFileDescriptors.add(next);
}
}
} catch (Exception e) {
close();
throw new InternalErrorException(e);
}
}
@Override
public void close() {
for (File next : myTemporaryFiles) {
FileUtils.deleteQuietly(next);
}
}
List<IHapiTerminologyLoaderSvc.FileDescriptor> getUncompressedFileDescriptors() {
return myUncompressedFileDescriptors;
}
private List<String> notFound(List<String> theExpectedFilenameFragments) {
Set<String> foundFragments = new HashSet<>();
for (String nextExpected : theExpectedFilenameFragments) {
for (FileDescriptor next : myUncompressedFileDescriptors) {
if (next.getFilename().contains(nextExpected)) {
foundFragments.add(nextExpected);
break;
}
}
}
ArrayList<String> notFoundFileNameFragments = new ArrayList<>(theExpectedFilenameFragments);
notFoundFileNameFragments.removeAll(foundFragments);
return notFoundFileNameFragments;
}
private void verifyMandatoryFilesExist(List<String> theExpectedFilenameFragments) {
List<String> notFound = notFound(theExpectedFilenameFragments);
if (!notFound.isEmpty()) {
throw new UnprocessableEntityException("Could not find the following mandatory files in input: " + notFound);
}
}
private void verifyOptionalFilesExist(List<String> theExpectedFilenameFragments) {
List<String> notFound = notFound(theExpectedFilenameFragments);
if (!notFound.isEmpty()) {
ourLog.warn("Could not find the following optional files: " + notFound);
}
}
}
}
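Side note on the class above: LoadedFileDescriptors implements Closeable and deletes its temporary files in close(), which the loader methods release via IOUtils.closeQuietly(). Inside TerminologyLoaderSvcImpl the same lifecycle could also be expressed with try-with-resources so the temporary files are removed even if verification throws; a sketch under that assumption (fragment names as in the diff):
try (LoadedFileDescriptors descriptors = new LoadedFileDescriptors(theFiles)) {
   descriptors.verifyMandatoryFilesExist(Arrays.asList(LOINC_FILE, LOINC_HIERARCHY_FILE));
   // ... hand the uncompressed descriptors to iterateOverZipFile() and the record handlers ...
}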
View File
@ -2,6 +2,7 @@ package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.ValueSet;
@ -9,17 +10,20 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import static org.apache.commons.lang3.StringUtils.isBlank;
import static org.apache.commons.lang3.StringUtils.*;
abstract class BaseHandler implements IRecordHandler {
private final List<ConceptMap> myConceptMaps;
private final Map<String, ConceptMap> myIdToConceptMaps = new HashMap<>();
private final List<ValueSet> myValueSets;
private final Map<String, ValueSet> myIdToValueSet = new HashMap<>();
private final Map<String, TermConcept> myCode2Concept;
BaseHandler(Map<String, TermConcept> theCode2Concept, List<ValueSet> theValueSets) {
BaseHandler(Map<String, TermConcept> theCode2Concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
myValueSets = theValueSets;
myCode2Concept = theCode2Concept;
myConceptMaps = theConceptMaps;
}
void addCodeAsIncludeToValueSet(ValueSet theVs, String theCodeSystemUrl, String theCode, String theDisplayName) {
@ -60,6 +64,78 @@ abstract class BaseHandler implements IRecordHandler {
}
}
void addConceptMapEntry(ConceptMapping theMapping) {
if (isBlank(theMapping.getSourceCode())) {
return;
}
if (isBlank(theMapping.getTargetCode())) {
return;
}
ConceptMap conceptMap;
if (!myIdToConceptMaps.containsKey(theMapping.getConceptMapId())) {
conceptMap = new ConceptMap();
conceptMap.setId(theMapping.getConceptMapId());
conceptMap.setUrl(theMapping.getConceptMapUri());
conceptMap.setName(theMapping.getConceptMapName());
myIdToConceptMaps.put(theMapping.getConceptMapId(), conceptMap);
myConceptMaps.add(conceptMap);
} else {
conceptMap = myIdToConceptMaps.get(theMapping.getConceptMapId());
}
if (isNotBlank(theMapping.getCopyright())) {
conceptMap.setCopyright(theMapping.getCopyright());
}
ConceptMap.SourceElementComponent source = null;
ConceptMap.ConceptMapGroupComponent group = null;
for (ConceptMap.ConceptMapGroupComponent next : conceptMap.getGroup()) {
if (next.getSource().equals(theMapping.getSourceCodeSystem())) {
if (next.getTarget().equals(theMapping.getTargetCodeSystem())) {
if (!defaultString(theMapping.getTargetCodeSystemVersion()).equals(defaultString(next.getTargetVersion()))) {
continue;
}
group = next;
break;
}
}
}
if (group == null) {
group = conceptMap.addGroup();
group.setSource(theMapping.getSourceCodeSystem());
group.setTarget(theMapping.getTargetCodeSystem());
group.setTargetVersion(defaultIfBlank(theMapping.getTargetCodeSystemVersion(), null));
}
for (ConceptMap.SourceElementComponent next : group.getElement()) {
if (next.getCode().equals(theMapping.getSourceCode())) {
source = next;
}
}
if (source == null) {
source = group.addElement();
source.setCode(theMapping.getSourceCode());
source.setDisplay(theMapping.getSourceDisplay());
}
boolean found = false;
for (ConceptMap.TargetElementComponent next : source.getTarget()) {
if (next.getCode().equals(theMapping.getTargetCode())) {
found = true;
}
}
if (!found) {
source
.addTarget()
.setCode(theMapping.getTargetCode())
.setDisplay(theMapping.getTargetDisplay())
.setEquivalence(theMapping.getEquivalence());
}
}
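A worked illustration (not from this diff) of the de-duplication behaviour above, as it might be called from a handler in the ca.uhn.fhir.jpa.term.loinc package that extends BaseHandler. Every id, URI, and code below is a placeholder, not real LOINC content:
// Two entries sharing the same concept map id, source system, and target system
// land in a single ConceptMap group.
addConceptMapEntry(new ConceptMapping()
   .setConceptMapId("example-cm")
   .setConceptMapUri("http://example.org/fhir/example-cm")
   .setConceptMapName("Example Map")
   .setSourceCodeSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
   .setSourceCode("LP00000-0")
   .setSourceDisplay("Example source part")
   .setTargetCodeSystem(IHapiTerminologyLoaderSvc.SCT_URI)
   .setTargetCode("111111111")
   .setTargetDisplay("Example target A")
   .setEquivalence(Enumerations.ConceptMapEquivalence.EQUAL));
// Same map and same source code but a different target: the existing group and
// source element are reused and only the new TargetElementComponent is appended.
addConceptMapEntry(new ConceptMapping()
   .setConceptMapId("example-cm")
   .setConceptMapUri("http://example.org/fhir/example-cm")
   .setConceptMapName("Example Map")
   .setSourceCodeSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
   .setSourceCode("LP00000-0")
   .setSourceDisplay("Example source part")
   .setTargetCodeSystem(IHapiTerminologyLoaderSvc.SCT_URI)
   .setTargetCode("222222222")
   .setTargetDisplay("Example target B")
   .setEquivalence(Enumerations.ConceptMapEquivalence.NARROWER));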
ValueSet getValueSet(String theValueSetId, String theValueSetUri, String theValueSetName) {
ValueSet vs;
if (!myIdToValueSet.containsKey(theValueSetId)) {
@ -77,4 +153,128 @@ abstract class BaseHandler implements IRecordHandler {
}
static class ConceptMapping {
private String myCopyright;
private String myConceptMapId;
private String myConceptMapUri;
private String myConceptMapName;
private String mySourceCodeSystem;
private String mySourceCode;
private String mySourceDisplay;
private String myTargetCodeSystem;
private String myTargetCode;
private String myTargetDisplay;
private Enumerations.ConceptMapEquivalence myEquivalence;
private String myTargetCodeSystemVersion;
String getConceptMapId() {
return myConceptMapId;
}
ConceptMapping setConceptMapId(String theConceptMapId) {
myConceptMapId = theConceptMapId;
return this;
}
String getConceptMapName() {
return myConceptMapName;
}
ConceptMapping setConceptMapName(String theConceptMapName) {
myConceptMapName = theConceptMapName;
return this;
}
String getConceptMapUri() {
return myConceptMapUri;
}
ConceptMapping setConceptMapUri(String theConceptMapUri) {
myConceptMapUri = theConceptMapUri;
return this;
}
String getCopyright() {
return myCopyright;
}
ConceptMapping setCopyright(String theCopyright) {
myCopyright = theCopyright;
return this;
}
Enumerations.ConceptMapEquivalence getEquivalence() {
return myEquivalence;
}
ConceptMapping setEquivalence(Enumerations.ConceptMapEquivalence theEquivalence) {
myEquivalence = theEquivalence;
return this;
}
String getSourceCode() {
return mySourceCode;
}
ConceptMapping setSourceCode(String theSourceCode) {
mySourceCode = theSourceCode;
return this;
}
String getSourceCodeSystem() {
return mySourceCodeSystem;
}
ConceptMapping setSourceCodeSystem(String theSourceCodeSystem) {
mySourceCodeSystem = theSourceCodeSystem;
return this;
}
String getSourceDisplay() {
return mySourceDisplay;
}
ConceptMapping setSourceDisplay(String theSourceDisplay) {
mySourceDisplay = theSourceDisplay;
return this;
}
String getTargetCode() {
return myTargetCode;
}
ConceptMapping setTargetCode(String theTargetCode) {
myTargetCode = theTargetCode;
return this;
}
String getTargetCodeSystem() {
return myTargetCodeSystem;
}
ConceptMapping setTargetCodeSystem(String theTargetCodeSystem) {
myTargetCodeSystem = theTargetCodeSystem;
return this;
}
String getTargetCodeSystemVersion() {
return myTargetCodeSystemVersion;
}
ConceptMapping setTargetCodeSystemVersion(String theTargetCodeSystemVersion) {
myTargetCodeSystemVersion = theTargetCodeSystemVersion;
return this;
}
String getTargetDisplay() {
return myTargetDisplay;
}
ConceptMapping setTargetDisplay(String theTargetDisplay) {
myTargetDisplay = theTargetDisplay;
return this;
}
}
}
View File
@ -4,6 +4,7 @@ import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
@ -17,8 +18,8 @@ public class BaseLoincTop2000LabResultsHandler extends BaseHandler implements IR
private String myValueSetUri;
private String myValueSetName;
public BaseLoincTop2000LabResultsHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, String theValueSetId, String theValueSetUri, String theValueSetName) {
super(theCode2concept, theValueSets);
public BaseLoincTop2000LabResultsHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, String theValueSetId, String theValueSetUri, String theValueSetName, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
myValueSetId = theValueSetId;
myValueSetUri = theValueSetUri;
myValueSetName = theValueSetName;
@ -30,7 +31,7 @@ public class BaseLoincTop2000LabResultsHandler extends BaseHandler implements IR
String displayName = trim(theRecord.get("Long Common Name"));
ValueSet valueSet = getValueSet(myValueSetId, myValueSetUri, myValueSetName);
addCodeAsIncludeToValueSet(valueSet, IHapiTerminologyLoaderSvc.LOINC_URL, loincNumber, displayName);
addCodeAsIncludeToValueSet(valueSet, IHapiTerminologyLoaderSvc.LOINC_URI, loincNumber, displayName);
}
}
View File
@ -73,7 +73,7 @@ public class LoincAnswerListHandler implements IRecordHandler {
vs = new ValueSet();
vs.setUrl("urn:oid:" + answerListOid);
vs.addIdentifier()
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URL)
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.setValue(answerListId);
vs.setId(answerListId);
vs.setName(answerListName);
@ -86,7 +86,7 @@ public class LoincAnswerListHandler implements IRecordHandler {
vs
.getCompose()
.getIncludeFirstRep()
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URL)
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.addConcept()
.setCode(answerString)
.setDisplay(displayText);
View File
@ -6,7 +6,7 @@ import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.*;
@ -22,8 +22,8 @@ public class LoincDocumentOntologyHandler extends BaseHandler implements IRecord
private final TermCodeSystemVersion myCodeSystemVersion;
private final Set<String> myPropertyNames;
public LoincDocumentOntologyHandler(TermCodeSystemVersion theCodeSystemVersion, Map<String, TermConcept> theCode2concept, Set<String> thePropertyNames, List<ValueSet> theValueSets) {
super(theCode2concept, theValueSets);
public LoincDocumentOntologyHandler(TermCodeSystemVersion theCodeSystemVersion, Map<String, TermConcept> theCode2concept, Set<String> thePropertyNames, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
myCodeSystemVersion = theCodeSystemVersion;
myCode2Concept = theCode2concept;
myPropertyNames = thePropertyNames;
@ -40,7 +40,7 @@ public class LoincDocumentOntologyHandler extends BaseHandler implements IRecord
// Document Ontology codes ValueSet
ValueSet vs = getValueSet(DOCUMENT_ONTOLOGY_CODES_VS_ID, DOCUMENT_ONTOLOGY_CODES_VS_URI, DOCUMENT_ONTOLOGY_CODES_VS_NAME);
addCodeAsIncludeToValueSet(vs, IHapiTerminologyLoaderSvc.LOINC_URL, loincNumber, null);
addCodeAsIncludeToValueSet(vs, IHapiTerminologyLoaderSvc.LOINC_URI, loincNumber, null);
// Part Properties
String loincCodePropName;
@ -66,7 +66,7 @@ public class LoincDocumentOntologyHandler extends BaseHandler implements IRecord
TermConcept code = myCode2Concept.get(loincNumber);
if (code != null) {
code.addPropertyCoding(loincCodePropName, IHapiTerminologyLoaderSvc.LOINC_URL, partNumber, partName);
code.addPropertyCoding(loincCodePropName, IHapiTerminologyLoaderSvc.LOINC_URI, partNumber, partName);
}
}
View File
@ -39,7 +39,7 @@ public class LoincHierarchyHandler implements IRecordHandler {
TermConcept retVal = myCode2Concept.get(theCode);
if (retVal == null) {
retVal = new TermConcept();
retVal.setCodeSystem(myCodeSystemVersion);
retVal.setCodeSystemVersion(myCodeSystemVersion);
retVal.setCode(theCode);
retVal.setDisplay(theDisplay);
myCode2Concept.put(theCode, retVal);
View File
@ -0,0 +1,56 @@
package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
import java.util.Map;
import static org.apache.commons.lang3.StringUtils.trim;
public class LoincIeeeMedicalDeviceCodeHandler extends BaseHandler implements IRecordHandler {
public static final String LOINC_IEEE_CM_ID = "LOINC-IEEE-MEDICAL-DEVICE-CM";
public static final String LOINC_IEEE_CM_URI = "http://loinc.org/fhir/loinc-ieee-device-code-mappings";
public static final String LOINC_IEEE_CM_NAME = "LOINC/IEEE Device Code Mappings";
/**
* Constructor
*/
public LoincIeeeMedicalDeviceCodeHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
}
@Override
public void accept(CSVRecord theRecord) {
String loincNumber = trim(theRecord.get("LOINC_NUM"));
String longCommonName = trim(theRecord.get("LOINC_LONG_COMMON_NAME"));
String ieeeCode = trim(theRecord.get("IEEE_CF_CODE10"));
String ieeeDisplayName = trim(theRecord.get("IEEE_REFID"));
// LOINC term -> IEEE 11073:10101 mappings
String sourceCodeSystemUri = IHapiTerminologyLoaderSvc.LOINC_URI;
String targetCodeSystemUri = IHapiTerminologyLoaderSvc.IEEE_11073_10101_URI;
addConceptMapEntry(
new ConceptMapping()
.setConceptMapId(LOINC_IEEE_CM_ID)
.setConceptMapUri(LOINC_IEEE_CM_URI)
.setConceptMapName(LOINC_IEEE_CM_NAME)
.setSourceCodeSystem(sourceCodeSystemUri)
.setSourceCode(loincNumber)
.setSourceDisplay(longCommonName)
.setTargetCodeSystem(targetCodeSystemUri)
.setTargetCode(ieeeCode)
.setTargetDisplay(ieeeDisplayName)
.setEquivalence(Enumerations.ConceptMapEquivalence.EQUAL));
}
}
View File
@ -0,0 +1,35 @@
package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
import java.util.Map;
import static org.apache.commons.lang3.StringUtils.trim;
public class LoincImagingDocumentCodeHandler extends BaseHandler implements IRecordHandler {
public static final String VS_ID = "loinc-imaging-document-codes";
public static final String VS_URI = "http://loinc.org/fhir/loinc-imaging-document-codes";
public static final String VS_NAME = "LOINC Imaging Document Codes";
public LoincImagingDocumentCodeHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
}
@Override
public void accept(CSVRecord theRecord) {
String loincNumber = trim(theRecord.get("LOINC_NUM"));
String displayName = trim(theRecord.get("LONG_COMMON_NAME"));
ValueSet valueSet = getValueSet(VS_ID, VS_URI, VS_NAME);
addCodeAsIncludeToValueSet(valueSet, IHapiTerminologyLoaderSvc.LOINC_URI, loincNumber, displayName);
}
}
View File
@ -6,27 +6,26 @@ import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.CanonicalType;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.Enumerations;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
import java.util.Map;
import static org.apache.commons.lang3.StringUtils.defaultIfBlank;
import static org.apache.commons.lang3.StringUtils.trim;
public class LoincPartRelatedCodeMappingHandler implements IRecordHandler {
public class LoincPartRelatedCodeMappingHandler extends BaseHandler implements IRecordHandler {
public static final String LOINC_TO_SNOMED_CM_ID = "LOINC-TO-SNOMED-CM";
private static final Logger ourLog = LoggerFactory.getLogger(LoincPartRelatedCodeMappingHandler.class);
public static final String LOINC_PART_MAP_ID = "LOINC-PART-MAP";
public static final String LOINC_PART_MAP_URI = "http://loinc.org/fhir/loinc-part-map";
public static final String LOINC_PART_MAP_NAME = "LOINC Part Map";
private final Map<String, TermConcept> myCode2Concept;
private final TermCodeSystemVersion myCodeSystemVersion;
private final List<ConceptMap> myConceptMaps;
public LoincPartRelatedCodeMappingHandler(TermCodeSystemVersion theCodeSystemVersion, Map<String, TermConcept> theCode2concept, List<ConceptMap> theConceptMaps) {
public LoincPartRelatedCodeMappingHandler(TermCodeSystemVersion theCodeSystemVersion, Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
myCodeSystemVersion = theCodeSystemVersion;
myCode2Concept = theCode2concept;
myConceptMaps = theConceptMaps;
@ -48,83 +47,37 @@ public class LoincPartRelatedCodeMappingHandler implements IRecordHandler {
String extCodeSystemVersion = trim(theRecord.get("ExtCodeSystemVersion"));
String extCodeSystemCopyrightNotice = trim(theRecord.get("ExtCodeSystemCopyrightNotice"));
ConceptMap conceptMap;
if (extCodeSystem.equals(IHapiTerminologyLoaderSvc.SCT_URL)) {
conceptMap = findOrAddCodeSystem(LOINC_TO_SNOMED_CM_ID, "http://loinc.org/loinc-to-snomed", extCodeSystem, extCodeSystemCopyrightNotice);
} else {
throw new InternalErrorException("Unknown external code system ID: " + extCodeSystem);
}
ConceptMap.ConceptMapGroupComponent group = null;
for (ConceptMap.ConceptMapGroupComponent next : conceptMap.getGroup()) {
if (next.getTarget().equals(extCodeSystem)) {
if (defaultIfBlank(next.getTargetVersion(), "").equals(defaultIfBlank(extCodeSystemVersion, ""))) {
group = next;
break;
}
}
}
if (group == null) {
group = conceptMap.addGroup();
group.setSource(IHapiTerminologyLoaderSvc.LOINC_URL);
group.setTarget(extCodeSystem);
group.setTargetVersion(defaultIfBlank(extCodeSystemVersion, null));
}
ConceptMap.SourceElementComponent element = null;
for (ConceptMap.SourceElementComponent next : group.getElement()) {
if (next.getCode().equals(partNumber)) {
element = next;
break;
}
}
if (element == null) {
element = group
.addElement()
.setCode(partNumber)
.setDisplay(partName);
}
ConceptMap.TargetElementComponent target = element
.addTarget()
.setCode(extCodeId)
.setDisplay(extCodeDisplayName);
Enumerations.ConceptMapEquivalence equivalence;
switch (mapType) {
case "Exact":
// 'equal' is more exact than 'equivalent' in the equivalence codes
target.setEquivalence(Enumerations.ConceptMapEquivalence.EQUAL);
equivalence = Enumerations.ConceptMapEquivalence.EQUAL;
break;
case "LOINC broader":
target.setEquivalence(Enumerations.ConceptMapEquivalence.NARROWER);
equivalence = Enumerations.ConceptMapEquivalence.NARROWER;
break;
case "LOINC narrower":
target.setEquivalence(Enumerations.ConceptMapEquivalence.WIDER);
equivalence = Enumerations.ConceptMapEquivalence.WIDER;
break;
default:
throw new InternalErrorException("Unknown MapType: " + mapType);
}
addConceptMapEntry(
new ConceptMapping()
.setConceptMapId(LOINC_PART_MAP_ID)
.setConceptMapUri(LOINC_PART_MAP_URI)
.setConceptMapName(LOINC_PART_MAP_NAME)
.setSourceCodeSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.setSourceCode(partNumber)
.setSourceDisplay(partName)
.setTargetCodeSystem(extCodeSystem)
.setTargetCode(extCodeId)
.setTargetDisplay(extCodeDisplayName)
.setTargetCodeSystemVersion(extCodeSystemVersion)
.setEquivalence(equivalence)
.setCopyright(extCodeSystemCopyrightNotice));
}
private ConceptMap findOrAddCodeSystem(String theId, String theUri, String theTargetCodeSystem, String theTargetCopyright) {
for (ConceptMap next : myConceptMaps) {
if (next.getId().equals(theId)) {
return next;
}
}
ConceptMap cm = new ConceptMap();
cm.setId(theId);
cm.setUrl(theUri);
cm.setSource(new CanonicalType(IHapiTerminologyLoaderSvc.LOINC_URL));
cm.setTarget(new CanonicalType(theTargetCodeSystem));
cm.setCopyright(theTargetCopyright);
myConceptMaps.add(cm);
return cm;
}
}
View File
@ -6,7 +6,6 @@ import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import ca.uhn.fhir.rest.server.exceptions.InternalErrorException;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.CanonicalType;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.Enumerations;
import org.hl7.fhir.r4.model.ValueSet;
@ -16,7 +15,7 @@ import java.util.*;
import static org.apache.commons.lang3.StringUtils.isNotBlank;
import static org.apache.commons.lang3.StringUtils.trim;
public class LoincRsnaPlaybookHandler implements IRecordHandler {
public class LoincRsnaPlaybookHandler extends BaseHandler implements IRecordHandler {
public static final String RSNA_CODES_VS_ID = "RSNA-LOINC-CODES-VS";
public static final String RSNA_CODES_VS_URI = "http://loinc.org/rsna-codes";
@ -34,19 +33,17 @@ public class LoincRsnaPlaybookHandler implements IRecordHandler {
private final Set<String> myPropertyNames;
private final List<ValueSet> myValueSets;
private final Map<String, ValueSet> myIdToValueSet = new HashMap<>();
private final List<ConceptMap> myConceptMaps;
private final Set<String> myCodesInRsnaPlaybookValueSet = new HashSet<>();
private final Map<String, ConceptMap> myIdToConceptMaps = new HashMap<>();
/**
* Constructor
*/
public LoincRsnaPlaybookHandler(TermCodeSystemVersion theCodeSystemVersion, Map<String, TermConcept> theCode2concept, Set<String> thePropertyNames, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
myCodeSystemVersion = theCodeSystemVersion;
myCode2Concept = theCode2concept;
myPropertyNames = thePropertyNames;
myValueSets = theValueSets;
myConceptMaps = theConceptMaps;
}
@Override
@ -81,7 +78,7 @@ public class LoincRsnaPlaybookHandler implements IRecordHandler {
vs
.getCompose()
.getIncludeFirstRep()
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URL)
.setSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.addConcept()
.setCode(loincNumber)
.setDisplay(longCommonName);
@ -105,61 +102,42 @@ public class LoincRsnaPlaybookHandler implements IRecordHandler {
TermConcept code = myCode2Concept.get(loincNumber);
if (code != null) {
code.addPropertyCoding(loincCodePropName, IHapiTerminologyLoaderSvc.LOINC_URL, partNumber, partName);
code.addPropertyCoding(loincCodePropName, IHapiTerminologyLoaderSvc.LOINC_URI, partNumber, partName);
}
// LOINC Part -> Radlex RID code mappings
addMapping(partNumber, partName, RID_MAPPING_CM_ID, RID_MAPPING_CM_URI, RID_MAPPING_CM_NAME, RID_CS_URI, rid, preferredName, Enumerations.ConceptMapEquivalence.EQUAL);
if (isNotBlank(rid)) {
addConceptMapEntry(
new ConceptMapping()
.setConceptMapId(RID_MAPPING_CM_ID)
.setConceptMapUri(RID_MAPPING_CM_URI)
.setConceptMapName(RID_MAPPING_CM_NAME)
.setSourceCodeSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.setSourceCode(partNumber)
.setSourceDisplay(partName)
.setTargetCodeSystem(RID_CS_URI)
.setTargetCode(rid)
.setTargetDisplay(preferredName)
.setEquivalence(Enumerations.ConceptMapEquivalence.EQUAL));
}
// LOINC Term -> Radlex RPID code mappings
addMapping(loincNumber, longCommonName, RPID_MAPPING_CM_ID, RPID_MAPPING_CM_URI, RPID_MAPPING_CM_NAME, RPID_CS_URI, rpid, longName, Enumerations.ConceptMapEquivalence.EQUAL);
}
private void addMapping(String theLoincNumber, String theLongCommonName, String theConceptMapId, String theConceptMapUri, String theConceptMapName, String theTargetCodeSystemUri, String theTargetCode, String theTargetDisplay, Enumerations.ConceptMapEquivalence theEquivalence) {
if (isNotBlank(theTargetCode)) {
ConceptMap conceptMap;
if (!myIdToConceptMaps.containsKey(theConceptMapId)) {
conceptMap = new ConceptMap();
conceptMap.setId(theConceptMapId);
conceptMap.setUrl(theConceptMapUri);
conceptMap.setName(theConceptMapName);
conceptMap.setSource(new CanonicalType(IHapiTerminologyLoaderSvc.LOINC_URL));
conceptMap.setTarget(new CanonicalType(theTargetCodeSystemUri));
myIdToConceptMaps.put(theConceptMapId, conceptMap);
myConceptMaps.add(conceptMap);
} else {
conceptMap = myIdToConceptMaps.get(theConceptMapId);
}
ConceptMap.SourceElementComponent source = null;
ConceptMap.ConceptMapGroupComponent group = conceptMap.getGroupFirstRep();
for (ConceptMap.SourceElementComponent next : group.getElement()) {
if (next.getCode().equals(theLoincNumber)) {
source = next;
}
}
if (source == null) {
source = group.addElement();
source.setCode(theLoincNumber);
source.setDisplay(theLongCommonName);
}
boolean found = false;
for (ConceptMap.TargetElementComponent next : source.getTarget()) {
if (next.getCode().equals(theTargetCode)) {
found = true;
}
}
if (!found) {
source
.addTarget()
.setCode(theTargetCode)
.setDisplay(theTargetDisplay)
.setEquivalence(theEquivalence);
}
}
if (isNotBlank(rpid)) {
addConceptMapEntry(
new ConceptMapping()
.setConceptMapId(RPID_MAPPING_CM_ID)
.setConceptMapUri(RPID_MAPPING_CM_URI)
.setConceptMapName(RPID_MAPPING_CM_NAME)
.setSourceCodeSystem(IHapiTerminologyLoaderSvc.LOINC_URI)
.setSourceCode(loincNumber)
.setSourceDisplay(longCommonName)
.setTargetCodeSystem(RPID_CS_URI)
.setTargetCode(rpid)
.setTargetDisplay(longName)
.setEquivalence(Enumerations.ConceptMapEquivalence.EQUAL));
}
}
}
View File
@ -1,6 +1,7 @@
package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
@ -12,8 +13,8 @@ public class LoincTop2000LabResultsSiHandler extends BaseLoincTop2000LabResultsH
public static final String TOP_2000_SI_VS_URI = "http://loinc.org/top-2000-lab-results-si";
public static final String TOP_2000_SI_VS_NAME = "Top 2000 Lab Results SI";
public LoincTop2000LabResultsSiHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets) {
super(theCode2concept, theValueSets, TOP_2000_SI_VS_ID, TOP_2000_SI_VS_URI, TOP_2000_SI_VS_NAME);
public LoincTop2000LabResultsSiHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, TOP_2000_SI_VS_ID, TOP_2000_SI_VS_URI, TOP_2000_SI_VS_NAME, theConceptMaps);
}
View File
@ -1,6 +1,7 @@
package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.List;
@ -12,8 +13,8 @@ public class LoincTop2000LabResultsUsHandler extends BaseLoincTop2000LabResultsH
public static final String TOP_2000_US_VS_URI = "http://loinc.org/top-2000-lab-results-us";
public static final String TOP_2000_US_VS_NAME = "Top 2000 Lab Results US";
public LoincTop2000LabResultsUsHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets) {
super(theCode2concept, theValueSets, TOP_2000_US_VS_ID, TOP_2000_US_VS_URI, TOP_2000_US_VS_NAME);
public LoincTop2000LabResultsUsHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, TOP_2000_US_VS_ID, TOP_2000_US_VS_URI, TOP_2000_US_VS_NAME, theConceptMaps);
}
View File
@ -0,0 +1,35 @@
package ca.uhn.fhir.jpa.term.loinc;
import ca.uhn.fhir.jpa.entity.TermConcept;
import ca.uhn.fhir.jpa.term.IHapiTerminologyLoaderSvc;
import ca.uhn.fhir.jpa.term.IRecordHandler;
import org.apache.commons.csv.CSVRecord;
import org.hl7.fhir.r4.model.ConceptMap;
import org.hl7.fhir.r4.model.ValueSet;
import java.util.*;
import static org.apache.commons.lang3.StringUtils.trim;
public class LoincUniversalOrderSetHandler extends BaseHandler implements IRecordHandler {
public static final String VS_ID = "loinc-universal-order-set-vs";
public static final String VS_URI = "http://loinc.org/fhir/loinc-universal-order-set";
public static final String VS_NAME = "LOINC Universal Order Set";
public LoincUniversalOrderSetHandler(Map<String, TermConcept> theCode2concept, List<ValueSet> theValueSets, List<ConceptMap> theConceptMaps) {
super(theCode2concept, theValueSets, theConceptMaps);
}
@Override
public void accept(CSVRecord theRecord) {
String loincNumber = trim(theRecord.get("LOINC_NUM"));
String displayName = trim(theRecord.get("LONG_COMMON_NAME"));
String orderObs = trim(theRecord.get("ORDER_OBS"));
ValueSet valueSet = getValueSet(VS_ID, VS_URI, VS_NAME);
addCodeAsIncludeToValueSet(valueSet, IHapiTerminologyLoaderSvc.LOINC_URI, loincNumber, displayName);
}
}
View File
@ -96,7 +96,7 @@ public class FhirResourceDaoDstu3TerminologyTest extends BaseJpaDstu3Test {
TermConcept childCA = new TermConcept(cs, "childCA").setDisplay("Child CA");
parentC.addChild(childCA, RelationshipTypeEnum.ISA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -127,7 +127,7 @@ public class FhirResourceDaoDstu3TerminologyTest extends BaseJpaDstu3Test {
parentB.addChild(childI, RelationshipTypeEnum.ISA);
}
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -163,7 +163,7 @@ public class FhirResourceDaoDstu3TerminologyTest extends BaseJpaDstu3Test {
TermConcept beagle = new TermConcept(cs, "beagle").setDisplay("Beagle");
dogs.addChild(beagle, RelationshipTypeEnum.ISA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -713,7 +713,7 @@ public class FhirResourceDaoDstu3TerminologyTest extends BaseJpaDstu3Test {
cs.setResource(table);
TermConcept parentA = new TermConcept(cs, "ParentA").setDisplay("Parent A");
cs.getConcepts().add(parentA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), "http://snomed.info/sct", cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), "http://snomed.info/sct", "Snomed CT", cs);
StringType code = new StringType("ParentA");
StringType system = new StringType("http://snomed.info/sct");
View File
@ -96,7 +96,7 @@ public class FhirResourceDaoR4TerminologyTest extends BaseJpaR4Test {
TermConcept childCA = new TermConcept(cs, "childCA").setDisplay("Child CA");
parentC.addChild(childCA, RelationshipTypeEnum.ISA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -127,7 +127,7 @@ public class FhirResourceDaoR4TerminologyTest extends BaseJpaR4Test {
parentB.addChild(childI, RelationshipTypeEnum.ISA);
}
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -163,7 +163,7 @@ public class FhirResourceDaoR4TerminologyTest extends BaseJpaR4Test {
TermConcept beagle = new TermConcept(cs, "beagle").setDisplay("Beagle");
dogs.addChild(beagle, RelationshipTypeEnum.ISA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
@ -713,7 +713,7 @@ public class FhirResourceDaoR4TerminologyTest extends BaseJpaR4Test {
cs.setResource(table);
TermConcept parentA = new TermConcept(cs, "ParentA").setDisplay("Parent A");
cs.getConcepts().add(parentA);
myTermSvc.storeNewCodeSystemVersion(table.getId(), "http://snomed.info/sct", cs);
myTermSvc.storeNewCodeSystemVersion(table.getId(), "http://snomed.info/sct", "Snomed CT", cs);
StringType code = new StringType("ParentA");
StringType system = new StringType("http://snomed.info/sct");
View File
@ -19,6 +19,7 @@ import org.junit.AfterClass;
import org.junit.Before;
import org.junit.Test;
import org.springframework.orm.jpa.JpaSystemException;
import org.springframework.test.context.TestPropertySource;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;
@ -33,6 +34,11 @@ import static org.hamcrest.Matchers.empty;
import static org.junit.Assert.*;
@SuppressWarnings({"unchecked", "deprecation"})
@TestPropertySource(properties = {
// Scheduled tasks can trigger searches, which interferes with the value
// returned by SearchBuilder.getLastHandlerMechanismForUnitTest(), so scheduling is disabled here
"scheduling_disabled=true"
})
public class FhirResourceDaoR4UniqueSearchParamTest extends BaseJpaR4Test {
private static final org.slf4j.Logger ourLog = org.slf4j.LoggerFactory.getLogger(FhirResourceDaoR4UniqueSearchParamTest.class);
View File
@ -20,6 +20,7 @@ import org.hl7.fhir.dstu3.model.IdType;
import org.hl7.fhir.dstu3.model.Observation;
import org.hl7.fhir.dstu3.model.Observation.ObservationStatus;
import org.hl7.fhir.dstu3.model.Patient;
import org.hl7.fhir.dstu3.model.Practitioner;
import org.hl7.fhir.dstu3.model.Reference;
import org.hl7.fhir.instance.model.api.IIdType;
import org.junit.AfterClass;
@ -27,6 +28,7 @@ import org.junit.Test;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import static org.hamcrest.Matchers.startsWith;
@ -41,6 +43,64 @@ public class AuthorizationInterceptorResourceProviderDstu3Test extends BaseResou
unregisterInterceptors();
}
/**
* See #778
*/
@Test
public void testReadingObservationAccessRight() throws IOException {
Practitioner practitioner1 = new Practitioner();
final IIdType practitionerId1 = ourClient.create().resource(practitioner1).execute().getId().toUnqualifiedVersionless();
Practitioner practitioner2 = new Practitioner();
final IIdType practitionerId2 = ourClient.create().resource(practitioner2).execute().getId().toUnqualifiedVersionless();
Patient patient = new Patient();
patient.setActive(true);
final IIdType patientId = ourClient.create().resource(patient).execute().getId().toUnqualifiedVersionless();
ourRestServer.registerInterceptor(new AuthorizationInterceptor(PolicyEnum.DENY) {
@Override
public List<IAuthRule> buildRuleList(RequestDetails theRequestDetails) {
// Allow writing any Observation resource
// Allow reading an Observation resource only when it is in the practitioner1 or practitioner2 compartment
return new RuleBuilder().allow()
.write()
.resourcesOfType(Observation.class)
.withAnyId()
.andThen()
.allow()
.read()
.resourcesOfType(Observation.class)
.inCompartment("Practitioner", Arrays.asList(practitionerId1, practitionerId2))
.andThen()
.denyAll()
.build();
}
});
Observation obs1 = new Observation();
obs1.setStatus(ObservationStatus.FINAL);
obs1.setPerformer(
Arrays.asList(new Reference(practitionerId1), new Reference(practitionerId2)));
IIdType oid1 = ourClient.create().resource(obs1).execute().getId().toUnqualified();
// Observation with practitioner1 and practitioner2 as the performers -> read access should be allowed
ourClient.read().resource(Observation.class).withId(oid1).execute();
Observation obs2 = new Observation();
obs2.setStatus(ObservationStatus.FINAL);
obs2.setSubject(new Reference(patientId));
IIdType oid2 = ourClient.create().resource(obs2).execute().getId().toUnqualified();
// Observation with patient as the subject -> read access should be blocked
try {
ourClient.read().resource(Observation.class).withId(oid2).execute();
fail();
} catch (ForbiddenOperationException e) {
// good
}
}
/**
* See #667
*/
View File
@ -4,6 +4,7 @@ import ca.uhn.fhir.jpa.config.WebsocketDispatcherConfig;
import ca.uhn.fhir.jpa.dao.data.ISearchDao;
import ca.uhn.fhir.jpa.dao.dstu3.BaseJpaDstu3Test;
import ca.uhn.fhir.jpa.dao.dstu3.SearchParamRegistryDstu3;
import ca.uhn.fhir.jpa.provider.TerminologyUploaderProvider;
import ca.uhn.fhir.jpa.search.DatabaseBackedPagingProvider;
import ca.uhn.fhir.jpa.search.ISearchCoordinatorSvc;
import ca.uhn.fhir.jpa.subscription.email.SubscriptionEmailInterceptor;
@ -56,7 +57,7 @@ public abstract class BaseResourceProviderDstu3Test extends BaseJpaDstu3Test {
private static Server ourServer;
protected static String ourServerBase;
protected static GenericWebApplicationContext ourWebApplicationContext;
private TerminologyUploaderProviderDstu3 myTerminologyUploaderProvider;
private TerminologyUploaderProvider myTerminologyUploaderProvider;
protected static SearchParamRegistryDstu3 ourSearchParamRegistry;
protected static DatabaseBackedPagingProvider ourPagingProvider;
protected static SubscriptionRestHookInterceptor ourRestHookSubscriptionInterceptor;
@ -92,7 +93,7 @@ public abstract class BaseResourceProviderDstu3Test extends BaseJpaDstu3Test {
ourRestServer.getFhirContext().setNarrativeGenerator(new DefaultThymeleafNarrativeGenerator());
myTerminologyUploaderProvider = myAppCtx.getBean(TerminologyUploaderProviderDstu3.class);
myTerminologyUploaderProvider = myAppCtx.getBean(TerminologyUploaderProvider.class);
ourRestServer.setPlainProviders(mySystemProvider, myTerminologyUploaderProvider);
View File
@ -570,7 +570,7 @@ public class ResourceProviderDstu3ValueSetTest extends BaseResourceProviderDstu3
TermConcept parentB = new TermConcept(cs, "ParentB").setDisplay("Parent B");
cs.getConcepts().add(parentB);
theTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, cs);
theTermSvc.storeNewCodeSystemVersion(table.getId(), URL_MY_CODE_SYSTEM, "SYSTEM NAME", cs);
return codeSystem;
}
View File
@ -47,7 +47,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URL))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URI))
.andParameter("package", new Attachment().setData(packageBytes))
.execute();
//@formatter:on
@ -67,7 +67,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.LOINC_URL))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.LOINC_URI))
.andParameter("package", new Attachment().setData(packageBytes))
.execute();
//@formatter:on
@ -86,7 +86,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.LOINC_URL))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.LOINC_URI))
.andParameter("package", new Attachment().setData(packageBytes))
.execute();
//@formatter:on
@ -111,7 +111,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URL))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URI))
.andParameter("localfile", new StringType(tempFile.getAbsolutePath()))
.execute();
//@formatter:on
@ -132,7 +132,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URL + "FOO"))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URI + "FOO"))
.andParameter("package", new Attachment().setData(packageBytes))
.execute();
fail();
@ -169,7 +169,7 @@ public class TerminologyUploaderProviderDstu3Test extends BaseResourceProviderDs
.operation()
.onServer()
.named("upload-external-code-system")
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URL))
.withParameter(Parameters.class, "url", new UriType(IHapiTerminologyLoaderSvc.SCT_URI))
.execute();
fail();
} catch (InvalidRequestException e) {
Some files were not shown because too many files have changed in this diff.