Compare commits

...

16 Commits

Author SHA1 Message Date
Joshua Darnell c8d207c5a3 Issue #76: preliminary sql generator 2021-05-27 15:03:23 -07:00
Joshua Darnell 313c57e9f7 Issue #76: initial working data generator. TODO: lookups need their own generator 2021-05-27 01:44:57 -07:00
Joshua Darnell 20dc65f77c Issue #76: added data generator and deserializer 2021-05-26 16:48:58 -07:00
Joshua Darnell 85924aafa5 Issue 56: cleaning up logging and build.gradle 2021-05-21 14:26:08 -07:00
Joshua Darnell 0dc848243b Issue #56: Disabled logging, changed payload sampling logic, encoded files for payload fields only, and added standard cookie spec headers to fix warnings 2021-05-20 11:26:42 -07:00
Joshua Darnell 8c0ae882f2 Issue #56: added better value extraction and turned off logging by default. 2021-05-05 04:26:34 -07:00
Joshua Darnell 790911f65f Issue #56: Added availability-report.json in the /build directory for payloads testing 2021-05-01 00:06:09 -07:00
Joshua Darnell 02b3855c20 Issue #56: Adding scoring functionality 2021-04-30 07:29:03 -07:00
Joshua Darnell 03eff74438 Issue #56: added PayloadSample and refactored basic availability report 2021-04-29 18:56:34 -07:00
Joshua Darnell d6d0799fb5 Issue #56 - threaded sampling with auto-terminate upon reaching end of records 2021-04-29 03:18:40 -07:00
Joshua Darnell b6ed5308ea Issue #56 - Multi-threaded fetching and use of timezone offsets 2021-04-28 23:14:05 -07:00
Joshua Darnell 554af35f17 Merge branch 'issue-73-create-model-generators' into issue-56-payloads-sampling-tool 2021-04-27 16:25:24 -07:00
Joshua Darnell 0ac742e7d1 Issue #73 - Initial ResourceInfo Model Generation 2021-04-27 16:17:46 -07:00
Joshua Darnell dc93fbb9b6 Issue #56 - Intermediate Commit 2021-04-27 10:34:45 -07:00
Joshua Darnell 094fc33d19 Issue #56 - Adding properties file and other items 2021-04-14 21:27:37 -07:00
Joshua Darnell cd9791f33a Issue #56 - Commit of basic sampling with SHA 256 scoring 2021-04-14 21:25:43 -07:00
31 changed files with 8696 additions and 121 deletions

View File

@ -58,48 +58,51 @@ $ java -jar path/to/web-api-commander.jar
Doing so displays the following information:
```
usage: java -jar web-api-commander
    --bearerToken <b>             Bearer token to be used with the
                                  request.
    --clientId <d>                Client Id to be used with the request.
    --clientSecret <s>
    --contentType <t>             Results format: JSON (default),
                                  JSON_NO_METADATA, JSON_FULL_METADATA,
                                  XML.
    --entityName <n>              The name of the entity to fetch, e.g.
                                  Property.
    --generateDDAcceptanceTests   Generates acceptance tests in the
                                  current directory.
    --generateMetadataReport      Generates metadata report from given
                                  <inputFile>.
    --generateQueries             Resolves queries in a given RESOScript
                                  <inputFile> and displays them in
                                  standard out.
    --generateReferenceDDL        Generates reference DDL to create a
                                  RESO-compliant SQL database. Pass
                                  --useKeyNumeric to generate the DB
                                  using numeric keys.
    --generateReferenceEDMX       Generates reference metadata in EDMX
                                  format.
    --generateResourceInfoModels  Generates Java Models for the Web API
                                  Reference Server in the current
                                  directory.
    --getMetadata                 Fetches metadata from <serviceRoot>
                                  using <bearerToken> and saves results
                                  in <outputFile>.
    --help                        print help
    --inputFile <i>               Path to input file.
    --outputFile <o>              Path to output file.
    --runRESOScript               Runs commands in RESOScript file given
                                  as <inputFile>.
    --saveGetRequest              Performs GET from <requestURI> using
                                  the given <bearerToken> and saves
                                  output to <outputFile>.
    --serviceRoot <s>             Service root URL on the host.
    --uri <u>                     URI for raw request. Use 'single
                                  quotes' to enclose.
    --useEdmEnabledClient         present if an EdmEnabledClient should
                                  be used.
    --useKeyNumeric               present if numeric keys are to be used
                                  for database DDL generation.
    --validateMetadata            Validates previously-fetched metadata
                                  in the <inputFile> path.
```
When using commands, if required arguments aren't provided, relevant feedback will be displayed in the terminal.
@ -227,6 +230,17 @@ New Cucumber BDD acceptance tests will be generated and placed in a timestamped
To update the current tests, copy the newly generated ones into the [Data Dictionary BDD `.features` directory](src/main/java/org/reso/certification/features/data-dictionary/v1-7-0), run the `./gradlew build` task, and if everything works as expected, commit the newly generated tests.
## Generating RESO Web API Reference Server Data Models
The RESO Commander can be used to generate data models for the Web API Reference server from the currently approved [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).
The Commander project's copy of the sheet needs to be updated with a copy of the [DD Google Sheet](https://docs.google.com/spreadsheets/d/1SZ0b6T4_lz6ti6qB2Je7NSz_9iNOaV_v9dbfhPwWgXA/edit?usp=sharing) prior to generating reference metadata.
```
$ java -jar path/to/web-api-commander.jar --generateResourceInfoModels
```
New ResourceInfo Models for the Web API Reference Server will be generated and placed in a timestamped directory relative to your current path.
## Generating RESO Data Dictionary Reference Metadata
In addition to generating DD acceptance tests, the RESO Commander can generate reference metadata based on the current reference [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).

View File

@ -48,6 +48,8 @@ dependencies {
compile 'io.cucumber:cucumber-guice:6.10.2'
compile 'io.cucumber:cucumber-core:6.10.2'
compile 'com.github.javafaker:javafaker:1.0.2'
compile 'net.masterthought:cucumber-reporting:5.5.2'
//TODO: choose one schema validator between this and rest-assured
@ -175,6 +177,35 @@ task testDataDictionary_1_7() {
}
}
task testIdxPayload_1_7() {
group = 'RESO Certification'
description = 'Runs IDX Payload 1.7 Automated Acceptance Tests.' +
'\n Example: ' +
'\n $ ./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript -DshowResponses=true\n'
dependsOn jar
doLast {
javaexec {
main = "io.cucumber.core.cli.Main"
classpath = configurations.cucumberRuntime + sourceSets.main.output + sourceSets.test.output
systemProperties = System.getProperties()
args = [
'--strict',
'--plugin',
'pretty',
'--plugin',
'json:build/idx-payload.dd-1.7.json',
'--plugin',
'html:build/idx-payload.dd-1.7.html',
'--glue',
'org.reso.certification.stepdefs#IDXPayload',
'src/main/java/org/reso/certification/features/payloads/idx-payload.feature'
]
}
}
}
task generateCertificationReport_DD_1_7() {
group = 'RESO Certification'
description = 'Runs Data Dictionary 1.7 tests and creates a certification report' +

View File

@ -6,6 +6,8 @@ import org.apache.http.Header;
import org.apache.http.HttpStatus;
import org.apache.http.NameValuePair;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
@ -64,6 +66,7 @@ public class OAuth2HttpClientFactory extends AbstractHttpClientFactory {
try {
LOG.debug("Fetching access token...");
final HttpPost post = new HttpPost(tokenUri);
post.setConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build());
params.add(new BasicNameValuePair("grant_type", "client_credentials"));
params.add(new BasicNameValuePair("client_id", clientId));
@ -118,6 +121,7 @@ public class OAuth2HttpClientFactory extends AbstractHttpClientFactory {
.setUserAgent(USER_AGENT)
.setDefaultHeaders(headers)
.setConnectionManager(connectionManager)
.setDefaultRequestConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build())
.build();
}
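
The change here, repeated in `TokenHttpClientFactory` below, opts the HTTP clients into the RFC 6265-compliant `STANDARD` cookie spec, which matches the commit note about adding standard cookie spec headers to fix warnings. A minimal standalone sketch of the same pattern, assuming Apache HttpClient 4.x (the class name and usage are illustrative, not part of this changeset):

```java
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class StandardCookieSpecExample {
  public static void main(String[] args) throws Exception {
    // Use RFC 6265-compliant cookie parsing; the default spec logs
    // "Invalid cookie header" warnings for some servers' Set-Cookie values
    final RequestConfig config = RequestConfig.custom()
        .setCookieSpec(CookieSpecs.STANDARD)
        .build();

    try (CloseableHttpClient client = HttpClients.custom()
        .setDefaultRequestConfig(config)
        .build()) {
      // issue requests with the client here
    }
  }
}
```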

View File

@ -2,6 +2,8 @@ package org.reso.auth;
import org.apache.http.Header;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.conn.HttpClientConnectionManager;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
@ -44,11 +46,11 @@ public class TokenHttpClientFactory extends AbstractHttpClientFactory {
return HttpClientBuilder.create()
.setUserAgent(USER_AGENT)
.setDefaultHeaders(headers)
.setDefaultRequestConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build())
.setConnectionManager(connectionManager)
.build();
}
@Override
public void close(final HttpClient httpClient) {
try {

View File

@ -122,7 +122,7 @@ public class BDDProcessor extends WorksheetProcessor {
return tags;
}
-private static String padLeft(String s, int n) {
+public static String padLeft(String s, int n) {
String[] padding = new String[n];
Arrays.fill(padding, " ");
return String.join("", padding) + s;

View File

@ -0,0 +1,83 @@
package org.reso.certification.codegen;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.reso.models.ReferenceStandardField;
import java.util.LinkedHashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
public class DDCacheProcessor extends WorksheetProcessor {
private static final Logger LOG = LogManager.getLogger(DDCacheProcessor.class);
Map<String, List<ReferenceStandardField>> fieldCache = new LinkedHashMap<>();
private void addToFieldCache(ReferenceStandardField field) {
fieldCache.putIfAbsent(field.getParentResourceName(), new LinkedList<>());
fieldCache.get(field.getParentResourceName()).add(field);
}
public Map<String, List<ReferenceStandardField>> getFieldCache() {
return fieldCache;
}
public static Map<String, List<ReferenceStandardField>> buildCache() {
LOG.info("Creating standard field cache...");
DDCacheProcessor cacheProcessor = new DDCacheProcessor();
DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(cacheProcessor);
generator.processWorksheets();
LOG.info("Standard field cache created!");
return cacheProcessor.getFieldCache();
}
public static DataDictionaryCodeGenerator getGeneratorInstance() {
DDCacheProcessor cacheProcessor = new DDCacheProcessor();
return new DataDictionaryCodeGenerator(cacheProcessor);
}
@Override
void processNumber(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processStringListSingle(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processString(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processBoolean(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processStringListMulti(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processDate(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processTimestamp(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void processCollection(ReferenceStandardField field) {
addToFieldCache(field);
}
@Override
void generateOutput() {
//no output
}
}
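
A sketch of how this cache might be consumed, using only the methods shown above in a fragment (the resource name is illustrative):

```java
// Build the field cache by running the code generator with a DDCacheProcessor
Map<String, List<ReferenceStandardField>> fieldCache = DDCacheProcessor.buildCache();

// For example, print the standard field names for one resource
fieldCache.getOrDefault("Property", new LinkedList<>())
    .forEach(field -> System.out.println(field.getStandardName()));
```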

View File

@ -149,7 +149,7 @@ public class DDLProcessor extends WorksheetProcessor {
.append("\n\n")
.append("CREATE TABLE IF NOT EXISTS ")
//exception for ouid so it doesn't become o_u_i_d
-.append(CaseFormat.UPPER_CAMEL.to(CaseFormat.LOWER_UNDERSCORE, resourceName).replace("o_u_i_d", "ouid"))
+.append(buildDbTableName(resourceName))
.append(" ( ")
.append(templateContent).append(",\n")
.append(PADDING).append(PADDING).append(buildPrimaryKeyMarkup(resourceName)).append("\n")
@ -164,6 +164,10 @@ public class DDLProcessor extends WorksheetProcessor {
LOG.info(this::buildInsertLookupsStatement);
}
public static String buildDbTableName(String resourceName) {
return CaseFormat.UPPER_CAMEL.to(CaseFormat.LOWER_UNDERSCORE, resourceName).replace("o_u_i_d", "ouid");
}
private static String buildCreateLookupStatement(boolean useKeyNumeric) {
return
"\n\n/**\n" +

View File

@ -9,18 +9,20 @@ import org.reso.commander.common.DataDictionaryMetadata;
import static org.reso.certification.codegen.WorksheetProcessor.REFERENCE_WORKSHEET;
import static org.reso.certification.codegen.WorksheetProcessor.buildWellKnownStandardFieldHeaderMap;
-public class DataDictionaryCodeGenerator {
+public final class DataDictionaryCodeGenerator {
private static final Logger LOG = LogManager.getLogger(DataDictionaryCodeGenerator.class);
-WorksheetProcessor processor = null;
+private WorksheetProcessor processor = null;
Workbook workbook = null;
+private DataDictionaryCodeGenerator() {
+//private constructor, should not instantiate directly
+}
/**
* Instantiates a new DataDictionary generator with the given worksheet processor
* @param processor the worksheet processor to use to generate the data dictionary
-* @throws Exception an exception if the Data Dictionary processor is null
*/
-public DataDictionaryCodeGenerator(WorksheetProcessor processor) throws Exception {
-if (processor == null) throw new Exception("Data Dictionary processor cannot be null!");
+public DataDictionaryCodeGenerator(WorksheetProcessor processor) {
this.processor = processor;
processor.setReferenceResource(REFERENCE_WORKSHEET);
workbook = processor.getReferenceWorkbook();
@ -31,21 +33,25 @@ public class DataDictionaryCodeGenerator {
* Generates Data Dictionary references for local workbook instance using the configured WorksheetProcessor
*/
public void processWorksheets() {
-Sheet currentWorksheet, standardResourcesWorksheet;
+Sheet currentWorksheet, standardRelationshipsWorksheet;
int sheetIndex, rowIndex;
final int ROW_HEADER_INDEX = 0, FIRST_ROW_INDEX = 1;
final String STANDARD_RELATIONSHIPS_WORKSHEET = "Standard Relationships";
try {
-standardResourcesWorksheet = workbook.getSheet(STANDARD_RELATIONSHIPS_WORKSHEET);
-assert standardResourcesWorksheet != null;
-processor.buildStandardRelationships(standardResourcesWorksheet);
+standardRelationshipsWorksheet = workbook.getSheet(STANDARD_RELATIONSHIPS_WORKSHEET);
+assert standardRelationshipsWorksheet != null : "Standard Relationships worksheet MUST be present!";
+processor.buildStandardRelationships(standardRelationshipsWorksheet);
//workbook consists of many sheets, process only the ones that have the name of a well-known resource
+//TODO: change to stream processing logic
for (sheetIndex = ROW_HEADER_INDEX; sheetIndex < workbook.getNumberOfSheets(); sheetIndex++) {
+assert workbook != null && sheetIndex >= 0 && sheetIndex < workbook.getNumberOfSheets()
+: "Worksheet at index + " + sheetIndex + " does not exist!";
currentWorksheet = workbook.getSheetAt(sheetIndex);
+//TODO: make DD version dynamic
if (DataDictionaryMetadata.v1_7.WELL_KNOWN_RESOURCES.contains(currentWorksheet.getSheetName()) && currentWorksheet.getPhysicalNumberOfRows() > 1) {
processor.beforeResourceSheetProcessed(currentWorksheet);
@ -67,4 +73,8 @@ public class DataDictionaryCodeGenerator {
LOG.info(ex);
}
}
public WorksheetProcessor getProcessor() {
return processor;
}
}

View File

@ -0,0 +1,281 @@
package org.reso.certification.codegen;
import com.github.javafaker.Faker;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.reso.commander.common.Utils;
import org.reso.models.DataGenerator;
import org.reso.models.ReferenceStandardField;
import org.reso.models.ReferenceStandardLookup;
import java.time.OffsetDateTime;
import java.time.temporal.ChronoUnit;
import java.util.*;
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.Collectors;
import static org.reso.certification.codegen.WorksheetProcessor.WELL_KNOWN_DATA_TYPES.*;
/**
* From: https://mariadb.com/kb/en/how-to-quickly-insert-data-into-mariadb/
*
* ALTER TABLE table_name DISABLE KEYS;
* BEGIN;
* ... inserting data with INSERT or LOAD DATA ....
* COMMIT;
* ALTER TABLE table_name ENABLE KEYS;
*
* SET @@session.unique_checks = 0;
* SET @@session.foreign_key_checks = 0;
*
* SET @@global.innodb_autoinc_lock_mode = 2;
*
* Then use this to import the data:
*
* mysqlimport --use-threads=<numThreads> database text-file-name [text-file-name...]
*/
public class DataDictionarySeedDataSqlGenerator {
private static final Logger LOG = LogManager.getLogger(DataDictionarySeedDataSqlGenerator.class);
final private DDCacheProcessor processor;
/**
* Cache of fields and their data generators by resource
*/
private final static AtomicReference<Map<String, Map<String, DataGenerator.FieldDataGenerator>>> dataGeneratorResourceFieldMap
= new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
/**
* Cache of standard fields from the current Data Dictionary worksheet
*/
private final static AtomicReference<Map<String, List<ReferenceStandardField>>> referenceStandardFieldCache
= new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
/**
* Cache of keys by resource name
*/
private final static AtomicReference<Map<String, String>> keyCache
= new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
/**
* TODO: add a standard relationships cache so keys can be sampled from the keyCache for related records
*/
public DataDictionarySeedDataSqlGenerator() {
LOG.info("Welcome to the RESO Data Dictionary Database Seed Generator!");
LOG.info("Creating standard field cache...");
DDCacheProcessor processor = new DDCacheProcessor();
DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(processor);
generator.processWorksheets();
LOG.info("Standard field cache created!");
this.processor = processor;
//build a cache of the Dictionary standard fields
referenceStandardFieldCache.set(processor.getFieldCache());
//build a cache of Data Dictionary generators
DataGenerator dataGenerator = DataGenerator.deserialize();
dataGenerator.getResourceInfo().forEach(resourceInfo -> {
dataGeneratorResourceFieldMap.get().putIfAbsent(resourceInfo.getResourceName(), new LinkedHashMap<>());
dataGenerator.getFields().forEach(fieldDataGenerator ->
dataGeneratorResourceFieldMap.get().get(resourceInfo.getResourceName()).put(fieldDataGenerator.getFieldName(), fieldDataGenerator));
});
//extract counts for each resource
final Map<String, Integer> resourceCounts = dataGenerator.getResourceInfo().stream()
.collect(Collectors.toMap(DataGenerator.ResourceInfo::getResourceName, DataGenerator.ResourceInfo::getRecordCount));
//iterate over each resource in the Data Dictionary and generate n items from it, where n is the recordCount
//in the resourceInfo section of the data generator reference file
referenceStandardFieldCache.get().keySet().forEach(resourceName -> {
LOG.info("Processing " + resourceName + " resource...");
LOG.info(generateRowInsertStatements(resourceName, referenceStandardFieldCache.get().get(resourceName), resourceCounts.get(resourceName)));
});
}
/**
* INSERT INTO tbl_name (a,b,c)
* VALUES(1,2,3), (4,5,6), (7,8,9);
*
* TODO: this function needs to have the lookups split out and handled in their own insert statement generator
*
* @param resourceName
* @param referenceStandardFields
* @param numStatements
* @return
*/
final String generateRowInsertStatements(String resourceName, List<ReferenceStandardField> referenceStandardFields, Integer numStatements) {
final String tableName = DDLProcessor.buildDbTableName(resourceName);
StringBuilder stringBuilder = new StringBuilder();
stringBuilder.append("ALTER TABLE ").append(tableName).append(" DISABLE KEYS;\n");
stringBuilder.append("BEGIN;\n");
stringBuilder.append("INSERT INTO ").append(tableName);
stringBuilder.append(" (");
stringBuilder.append(referenceStandardFields.stream().map(ReferenceStandardField::getStandardName)
.collect(Collectors.joining(", ")));
stringBuilder.append(") VALUES");
for (int statementCount = 0; statementCount < numStatements; statementCount++) {
stringBuilder.append("\n\t(");
stringBuilder.append(referenceStandardFields.stream().map(this::generateValues).collect(Collectors.joining(", ")));
stringBuilder.append(")");
//add commas between values only if we're not at the last item
if (statementCount < numStatements - 1) stringBuilder.append(", ");
}
stringBuilder.append(";\n");
stringBuilder.append("COMMIT;\n");
stringBuilder.append("ALTER TABLE " + tableName + " ENABLE KEYS;\n\n");
return stringBuilder.toString();
}
final String generateValues(ReferenceStandardField referenceStandardField) {
//now that row has been processed, extract field type and assemble the template
switch (referenceStandardField.getSimpleDataType()) {
case NUMBER:
return generateNumber(referenceStandardField);
case STRING_LIST_SINGLE:
return generateStringListSingle(referenceStandardField);
case STRING:
return generateString(referenceStandardField);
case BOOLEAN:
return generateBoolean(referenceStandardField);
case STRING_LIST_MULTI:
return generateStringListMulti(referenceStandardField).toString();
case DATE:
return generateDate(referenceStandardField);
case TIMESTAMP:
return generateTimestamp(referenceStandardField);
default:
if (referenceStandardField.getSimpleDataType() != null)
LOG.debug("Data type: " + referenceStandardField.getSimpleDataType() + " is not supported!");
}
return null;
}
String generateNumber(ReferenceStandardField referenceStandardField) {
return referenceStandardField.getSuggestedMaxPrecision() != null
? generateDecimal(referenceStandardField) : generateInteger(referenceStandardField);
}
String generateInteger(ReferenceStandardField referenceStandardField) {
final int MAX_INTEGER_POWER = 5;
int maxPower = Math.min(referenceStandardField.getSuggestedMaxLength(), MAX_INTEGER_POWER);
return String.valueOf(Faker.instance().number().numberBetween(0, (int)Math.pow(10, maxPower)));
}
String generateDecimal(ReferenceStandardField referenceStandardField) {
final int MAX_INTEGER_POWER = 6;
int maxPower = Math.min(referenceStandardField.getSuggestedMaxLength(), MAX_INTEGER_POWER);
return String.valueOf(Faker.instance().number()
.randomDouble(referenceStandardField.getSuggestedMaxPrecision(), 0, (int)Math.pow(10, maxPower)));
}
String generateBoolean(ReferenceStandardField referenceStandardField) {
return String.valueOf(ThreadLocalRandom.current().nextBoolean()).toUpperCase();
}
String generateStringListSingle(ReferenceStandardField referenceStandardField) {
List<String> possibleChoices;
List<String> customExamples = dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()) != null
? dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()).getCustomExamples() : null;
if (processor.getEnumerations().containsKey(referenceStandardField.getLookupStandardName())) {
possibleChoices = processor.getEnumerations().get(referenceStandardField.getLookupStandardName()).stream()
.map(ReferenceStandardLookup::getLookupValue).collect(Collectors.toList());
} else if (customExamples != null && customExamples.size() > 0) {
possibleChoices = customExamples;
} else {
possibleChoices = new ArrayList<>();
possibleChoices.add(Faker.instance().chuckNorris().fact());
}
Collections.shuffle(possibleChoices);
return wrapInQuotes(possibleChoices.get(0));
}
List<String> generateStringListMulti(ReferenceStandardField referenceStandardField) {
List<String> possibleChoices;
List<String> customExamples = dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()) != null
? dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()).getCustomExamples() : null;
int numElements, randomSize = 0;
Set<String> enumNames = new LinkedHashSet<>();
if (processor.getEnumerations().containsKey(referenceStandardField.getLookupStandardName())) {
numElements = processor.getEnumerations().get(referenceStandardField.getLookupStandardName()).size();
randomSize = ThreadLocalRandom.current().nextInt(0, numElements);
possibleChoices = processor.getEnumerations().get(referenceStandardField.getLookupStandardName()).stream()
.map(ReferenceStandardLookup::getLookupValue).collect(Collectors.toList());
} else if (customExamples != null && customExamples.size() > 0) {
randomSize = ThreadLocalRandom.current().nextInt(customExamples.size());
possibleChoices = customExamples;
} else {
possibleChoices = new ArrayList<>();
possibleChoices.add(Faker.instance().buffy().quotes());
}
for(int numEnums = 0; numEnums < randomSize; numEnums++) {
Collections.shuffle(possibleChoices);
if (possibleChoices.size() > 0) {
enumNames.add(wrapInQuotes(possibleChoices.get(0)));
possibleChoices.remove(0);
}
}
return new ArrayList<>(enumNames);
}
static String wrapInQuotes(String item) {
return "\"" + item + "\"";
}
/**
* TODO: determine whether we need to be able to go both ways on dates on demand.
* For example, it might make sense to have open house dates in the future.
* This method currently only generates past dates.
* @param referenceStandardField
* @return
*/
String generateDate(ReferenceStandardField referenceStandardField) {
long numDays = ThreadLocalRandom.current().nextInt(5 * 365); //max 5 years back
return wrapInQuotes(Utils.getIsoDate(OffsetDateTime.now().minus(numDays, ChronoUnit.DAYS)));
}
/**
* The only time a string will be generated will be when there is a custom example
* @param referenceStandardField
* @return
*/
String generateString(ReferenceStandardField referenceStandardField) {
List<String> customExamples = dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()) != null
? dataGeneratorResourceFieldMap.get().get(referenceStandardField.getParentResourceName()).get(referenceStandardField.getStandardName()).getCustomExamples() : null;
String value;
if (customExamples != null && customExamples.size() > 0) {
value = customExamples.get(ThreadLocalRandom.current().nextInt(customExamples.size()));
} else {
value = Faker.instance().buffy().quotes();
}
if (value != null) {
value = wrapInQuotes(value);
}
return value;
}
String generateTimestamp(ReferenceStandardField referenceStandardField) {
long numDays = ThreadLocalRandom.current().nextInt(5 * 365); //max 5 years back
return wrapInQuotes(Utils.getIsoTimestamp(OffsetDateTime.now().minus(numDays, ChronoUnit.DAYS)));
}
}
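
Given `generateRowInsertStatements` above, each resource yields one batched insert block of roughly the following shape, bracketed by the MariaDB fast-import statements from the class comment. The table, columns, and values below are illustrative placeholders, not output from an actual run:

```
ALTER TABLE property DISABLE KEYS;
BEGIN;
INSERT INTO property (ListPrice, StandardStatus, ModificationTimestamp) VALUES
	(527134.0, "Active", "2020-11-02T08:15:00Z"),
	(98214.55, "Pending", "2019-06-21T17:45:30Z");
COMMIT;
ALTER TABLE property ENABLE KEYS;
```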

View File

@ -0,0 +1,249 @@
package org.reso.certification.codegen;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.poi.ss.usermodel.Sheet;
import org.reso.commander.common.Utils;
import org.reso.models.ReferenceStandardField;
import static org.reso.certification.codegen.DDLProcessor.buildDbTableName;
import static org.reso.certification.containers.WebAPITestContainer.EMPTY_STRING;
public class ResourceInfoProcessor extends WorksheetProcessor {
final static String
ANNOTATION_TERM_DISPLAY_NAME = "RESO.OData.Metadata.StandardName",
ANNOTATION_TERM_DESCRIPTION = "Core.Description",
ANNOTATION_TERM_URL = "RESO.DDWikiUrl";
private static final Logger LOG = LogManager.getLogger(ResourceInfoProcessor.class);
private static final String
FILE_EXTENSION = ".java";
public void processResourceSheet(Sheet sheet) {
super.processResourceSheet(sheet);
markup.append(ResourceInfoTemplates.buildClassInfo(sheet.getSheetName(), null));
}
@Override
void processNumber(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildNumberMarkup(row));
}
@Override
void processStringListSingle(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringListSingleMarkup(row));
}
@Override
void processString(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringMarkup(row));
}
@Override
void processBoolean(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildBooleanMarkup(row));
}
@Override
void processStringListMulti(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringListMultiMarkup(row));
}
@Override
void processDate(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildDateMarkup(row));
}
@Override
void processTimestamp(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildTimestampMarkup(row));
}
@Override
void processCollection(ReferenceStandardField row) {
LOG.debug("Collection Type is not supported!");
}
@Override
void generateOutput() {
LOG.info("Using reference worksheet: " + REFERENCE_WORKSHEET);
LOG.info("Generating ResourceInfo .java files for the following resources: " + resourceTemplates.keySet().toString());
resourceTemplates.forEach((resourceName, content) -> {
//put in local directory rather than relative to where the input file is
Utils.createFile(getDirectoryName(), resourceName + "Definition" + FILE_EXTENSION, content);
});
}
@Override
String getDirectoryName() {
return startTimestamp + "-ResourceInfoModels";
}
@Override
public void afterResourceSheetProcessed(Sheet sheet) {
assert sheet != null && sheet.getSheetName() != null;
String resourceName = sheet.getSheetName();
String templateContent =
markup.toString() + "\n" +
" return " + resourceName + "Definition.fieldList;\n" +
" }\n" +
"}";
resourceTemplates.put(resourceName, templateContent);
resetMarkupBuffer();
}
public static final class ResourceInfoTemplates {
/**
* Contains various templates used for test generation
* TODO: add a formatter rather than using inline spaces
*/
public static String buildClassInfo(String resourceName, String generatedTimestamp) {
if (resourceName == null) return null;
if (generatedTimestamp == null) generatedTimestamp = Utils.getIsoTimestamp();
final String definitionName = resourceName + "Definition";
return "package org.reso.service.data.definition;\n" + "\n" +
"import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;\n" +
"import org.reso.service.data.meta.FieldInfo;\n" +
"import org.reso.service.data.meta.ResourceInfo;\n" + "\n" +
"import java.util.ArrayList;\n" + "\n" +
"// This class was autogenerated on: " + generatedTimestamp + "\n" +
"public class " + definitionName + " extends ResourceInfo {\n" +
" private static ArrayList<FieldInfo> fieldList = null;\n" + "\n" +
" public " + definitionName + "() {" + "\n" +
" this.tableName = " + buildDbTableName(resourceName) + ";\n" +
" this.resourcesName = " + resourceName + ";\n" +
" this.resourceName = " + resourceName + ";\n" +
" }\n" + "\n" +
" public ArrayList<FieldInfo> getFieldList() {\n" +
" return " + definitionName + ".getStaticFieldList();\n" +
" }\n" + "\n" +
" public static ArrayList<FieldInfo> getStaticFieldList() {\n" +
" if (null != " + definitionName + ".fieldList) {\n" +
" return " + definitionName + ".fieldList;\n" +
" }\n" + "\n" +
" ArrayList<FieldInfo> list = new ArrayList<FieldInfo>();\n" +
" " + definitionName + ".fieldList = list;\n" +
" FieldInfo fieldInfo = null;\n";
}
public static String buildBooleanMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: refactor into one method that takes a type name and returns the appropriate content
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Boolean.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
public static String buildDateMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Date.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
/**
* Provides special routing for Data Dictionary numeric types, which may be Integer or Decimal
*
* @param field the numeric field to build type markup for
* @return a string containing specific markup for the given field
*/
public static String buildNumberMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
if (field.getSuggestedMaxPrecision() != null) return buildDecimalMarkup(field);
else return buildIntegerMarkup(field);
}
public static String buildDecimalMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Decimal.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
//TODO: Length is actually scale for Decimal fields by the DD! :/
//TODO: Add setScale property to Decimal types in FieldInfo
//TODO: Precision is actually Scale for Decimal fields by the DD! :/
//TODO: Add setPrecision property to Decimal types in FieldInfo
}
public static String buildIntegerMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Int64.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
private static String buildStandardEnumerationMarkup(String lookupName) {
//TODO: add code to build Lookups
return "\n /* TODO: buildStandardEnumerationMarkup */\n";
}
public static String buildStringListMultiMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: add multi lookup handler
return "\n /* TODO: buildStringListMultiMarkup */\n";
}
public static String buildStringListSingleMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: add single lookup handler
return "\n /* TODO: buildStringListSingleMarkup */\n";
}
public static String buildStringMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
String content = "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.String.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n";
if (field.getSuggestedMaxLength() != null) {
content +=
" fieldInfo.setMaxLength(" + field.getSuggestedMaxLength() + ");\n";
}
content +=
" list.add(fieldInfo);" + "\n";
return content;
}
public static String buildTimestampMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.DateTime.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
}
}
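
For reference, the scalar templates above (e.g. `buildBooleanMarkup`) each emit a block of this shape into the generated definition class. The field name, display name, definition, and URL here are illustrative placeholders, not values from the spreadsheet:

```java
fieldInfo = new FieldInfo("ExampleBooleanField", EdmPrimitiveTypeKind.Boolean.getFullQualifiedName());
fieldInfo.addAnnotation("Example Boolean Field", "RESO.OData.Metadata.StandardName");
fieldInfo.addAnnotation("An example field definition.", "Core.Description");
fieldInfo.addAnnotation("https://example.org/ExampleBooleanField", "RESO.DDWikiUrl");
list.add(fieldInfo);
```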

View File

@ -27,7 +27,7 @@ public abstract class WorksheetProcessor {
public static final String REFERENCE_WORKSHEET = "RESODataDictionary-1.7.xlsx";
static final Map<String, String> resourceTemplates = new LinkedHashMap<>();
-static final Map<String, Set<ReferenceStandardLookup>> standardEnumerationsMap = new LinkedHashMap<>();
+static final Map<String, List<ReferenceStandardLookup>> standardEnumerationsMap = new LinkedHashMap<>();
static final Map<String, Map<String, ReferenceStandardField>> standardFieldsMap = new LinkedHashMap<>(new LinkedHashMap<>());
private static final Logger LOG = LogManager.getLogger(WorksheetProcessor.class);
String referenceDocument = null;
@ -332,9 +332,7 @@ public abstract class WorksheetProcessor {
public void buildEnumerationMap() {
final String ENUMERATION_TAB_NAME = "Lookup Fields and Values";
-final int LOOKUP_NAME_INDEX = 0, STANDARD_NAME_INDEX = 1;
-DataFormatter formatter = new DataFormatter();
Sheet sheet = getReferenceWorkbook().getSheet(ENUMERATION_TAB_NAME);
buildWellKnownStandardEnumerationHeaderMap(sheet);
@ -345,12 +343,11 @@ public abstract class WorksheetProcessor {
standardEnumeration.set(deserializeStandardEnumerationRow(row));
if (!standardEnumerationsMap.containsKey(standardEnumeration.get().getLookupField())) {
-standardEnumerationsMap.put(standardEnumeration.get().getLookupField(), new LinkedHashSet<>());
+standardEnumerationsMap.put(standardEnumeration.get().getLookupField(), new ArrayList<>());
}
standardEnumerationsMap.get(standardEnumeration.get().getLookupField()).add(standardEnumeration.get());
}
});
-//enumerations.forEach((key, items) -> LOG.info("key: " + key + " , items: " + items.toString()));
}
public void buildStandardRelationships(Sheet worksheet) { public void buildStandardRelationships(Sheet worksheet) {
@ -366,7 +363,7 @@ public abstract class WorksheetProcessor {
}
}
-public Map<String, Set<ReferenceStandardLookup>> getEnumerations() {
+public Map<String, List<ReferenceStandardLookup>> getEnumerations() {
return standardEnumerationsMap;
}
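
The switch from `Set` to `List` for lookup values matters downstream: the new seed data generator samples lookups with `Collections.shuffle`, which operates on a `List`, not a `Set`. A small sketch of the consuming pattern as a fragment (the lookup name is illustrative):

```java
// Copy first so the cached values are not reordered, then sample randomly
List<ReferenceStandardLookup> choices =
    new ArrayList<>(processor.getEnumerations().get("StandardStatus"));
Collections.shuffle(choices);
ReferenceStandardLookup sampled = choices.get(0);
```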

View File

@ -81,6 +81,8 @@ public final class WebAPITestContainer implements TestContainer {
private final AtomicBoolean isUsingMetadataFile = new AtomicBoolean(false);
// request instance variables - these get resetMarkupBuffer with every request
//TODO: refactor underlying response properties to use a ODataTransportWrapper (or any TransportWrapper)
// and create the test container with the appropriate response of the transport wrapper
private final AtomicReference<String> selectList = new AtomicReference<>();
private final AtomicReference<ODataRawResponse> oDataRawResponse = new AtomicReference<>();
private final AtomicReference<Request> request = new AtomicReference<>();

View File

@ -0,0 +1,53 @@
Feature: IDX Payload Endorsement (Web API)
All Scenarios passing means the given Web API server passes the IDX Payloads Endorsement
# SEE: https://docs.google.com/document/d/1btCduOpWWzeadeMcSviA8M9dclIz23P-bPUGKwcD0NY/edit?usp=sharing
Background:
Given a RESOScript file was provided
And Client Settings and Parameters were read from the file
And a test container was successfully created from the given RESOScript
And the test container uses an authorization_code or client_credentials for authentication
# TODO: tie back into common metadata validation shared scenario
@metadata-validation @idx-payload-endorsement @dd-1.7 @web-api-1.0.2
Scenario: Request and Validate Server Metadata
When XML Metadata are requested from the service root in "ClientSettings_WebAPIURI"
Then the server responds with a status code of 200
And the server has an OData-Version header value of "4.0" or "4.01"
And the XML Metadata response is valid XML
And the XML Metadata returned by the server are valid
And the XML Metadata returned by the server contains Edm metadata
And the Edm metadata returned by the server are valid
And the metadata contains a valid service document
And each resource MUST have a primary key field by the OData specification
@standard-resource-sampling @dd-1.7 @idx-payload-endorsement
Scenario: Standard Resource Sampling
Given that valid metadata have been requested from the server
And the metadata contains RESO Standard Resources
And "payload-samples" has been created in the build directory
Then up to 10000 records are sampled from each resource with "IDX" payload samples stored in "payload-samples"
# data are not stored in this case, just sampled and scored
@local-resource-sampling @dd-1.7 @idx-payload-endorsement
Scenario: Non Standard Resource Sampling - Request Data from Each Server Resource
Given that valid metadata have been requested from the server
And the metadata contains local resources
Then up to 10000 records are sampled from each local resource
@idx-payload-endorsement @dd-1.7
Scenario: A Data Availability Report is Created from Sampled Records
Given standard and local resources have been processed
Then a data availability report is created in "data-availability-report.json"
@idx-user-sampling @dd-1.7 @idx-payload-endorsement
Scenario: IDX User Sampling
Given samples exist in "payload-samples" in the build directory
And a RESOScript file was provided for the IDX User
And Client Settings and Parameters were read from the file
And a test container was successfully created from the given RESOScript
And the test container uses an authorization_code or client_credentials for authentication
When samples from "payload-samples" are fetched as the representative user for each resource in the "IDX" payload
Then each result MUST contain the string version of the key and the following fields
|ModificationTimestamp|
And the "IDX" payload field values MUST match those in the samples

View File

@ -16,35 +16,39 @@ Feature: Web API Server Add/Edit Endorsement
# OData-Version: 4.01
# Content-Type: application/json;odata.metadata=minimal
# Accept: application/json
+#
+# This is without the prefer header and minimal value
+#
@create @create-succeeds @add-edit-endorsement @rcp-010 @1.0.2
Scenario: Create operation succeeds using a given payload
Given valid metadata have been retrieved
And request data has been provided in "create-succeeds.json"
And request data in "create-succeeds.json" is valid JSON
And schema in "create-succeeds.json" matches the metadata
-And the request header "OData-Version" is "4.01"
-And the request header "Content-Type" contains "application/json"
-And the request header "Accept" is "application/json"
+And the request header "OData-Version" "equals" one of the following values
+|4.0|4.01|
+And the request header "Content-Type" "contains" "application/json"
+And the request header "Accept" "contains" "application/json"
When a "POST" request is made to the "resource-endpoint" URL with data in "create-succeeds.json"
-Then the test is skipped if the server responds with a status code of 401
-# TODO: check spec for 204
-When the server responds with one of the following status codes
-|200|201|
-Then the response header "OData-Version" is "4.01"
-And the response header "EntityId" is present
-And the response header "Location" is present
-And the response header "Location" is a valid URL
-When the server responds with a 200 status code "valid JSON exists" in the JSON response
-When the server responds with a 200 status code "@odata.context" "is present" in the JSON response
-When the server responds with a 200 status code "@odata.context" "is a valid URL" in the JSON response
-When the server responds with a 200 status code "@odata.id" "is present" in the JSON response
-When the server responds with a 200 status code "@odata.id" "is a valid URL" in the JSON response
-When the server responds with a 200 status code "@odata.editLink" "is present" in the JSON response
-When the server responds with a 200 status code "@odata.editLink" "is a valid URL" in the JSON response
-When the server responds with a 200 status code "@odata.etag" "is present" in the JSON response
-When the server responds with a 200 status code "@odata.etag" "starts with" "W/" in the JSON response
-When the server responds with a 200 status code data from "create-succeeds.json" "exists" in the JSON response
-When a "GET" request is made to the response header "Location" URL
+Then the server responds with one of the following status codes
+|201|
+And the response header "OData-Version" "equals" one of the following values
+|4.0|4.01|
+And the response header "EntityId" "MUST" "be present"
+And the response header "Location" "MUST" "be present"
+And the response header "Location" "is a valid URL"
+And the response header "Location" "MUST" reference the resource being created
+And the response is valid JSON
+And the JSON response "MUST" contain "@odata.context"
+And the JSON response value "@odata.context" "is a valid URL"
+And the JSON response "MUST" contain "@odata.id"
+And the JSON response value "@odata.id" "is a valid URL"
+And the JSON response "MAY" contain "@odata.editLink"
+And the JSON response value "@odata.editLink" "is a valid URL"
+And the JSON response "MAY" contain "@odata.etag"
+And the JSON response value "@odata.etag" "starts with" "W/"
+And the JSON response "MUST" contain all JSON data in "create-succeeds.json"
+When a "GET" request is made to the URL in response header "Location"
Then the server responds with a status code of 200
And the response has header "OData-Version" with one of the following values
|4.0|4.01|
@ -58,7 +62,7 @@ Feature: Web API Server Add/Edit Endorsement
# SEE: https://reso.atlassian.net/wiki/spaces/RESOWebAPIRCP/pages/2239399511/RCP+-+WEBAPI-010+Add+Functionality+to+Web+API+Specification#Error-Message-Example
# POST serviceRoot/Property
# OData-Version: 4.01
-# Content-Type: application/json;odata.metadata=minimal
+# Content-Type: application/json
# Accept: application/json
@create @create-fails @add-edit-endorsement @rcp-010 @1.0.2
Scenario: Create operation fails using a given payload
@ -66,12 +70,16 @@ Feature: Web API Server Add/Edit Endorsement
And request data has been provided in "create-fails.json"
And request data in "create-fails.json" is valid JSON
And schema in "create-fails.json" matches the metadata
-And the request header "OData-Version" is "4.01"
-And the request header "Content-Type" is "application/json;odata.metadata=minimal"
-And the request header "Accept" is "application/json"
+And the request header "OData-Version" "equals" one of the following values
+|4.0|4.01|
+And the request header "Content-Type" "MUST" "be present"
+And the request header "Content-Type" "equals" "application/json"
+And the request header "Accept" "MUST" "be present"
+And the request header "Accept" "contains" "application/json"
When a "POST" request is made to the "resource-endpoint" URL with data in "create-fails.json"
Then the server responds with one of the following error codes
-|400|401|403|405|408|500|501|503|
-And the response header "OData-Version" is "4.01"
+|400|
+And the response has header "OData-Version" with one of the following values
+|4.0|4.01|
And the error response is in a valid format
And the values in the "target" field in the JSON payload "error.details" path are contained within the metadata

View File

@ -114,7 +114,6 @@ public class CertificationReportGenerator {
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
}

View File

@ -29,7 +29,6 @@ import org.reso.models.Settings;
import java.io.File;
import java.io.IOException;
import java.lang.reflect.Type;
-import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.*;
@ -70,8 +69,8 @@ public class DataDictionary {
//named args
private static final String SHOW_RESPONSES_ARG = "showResponses";
private static final String USE_STRICT_MODE_ARG = "strict";
-private static final String PATH_TO_METADATA_ARG = "pathToMetadata";
-private static final String PATH_TO_RESOSCRIPT_ARG = "pathToRESOScript";
+protected static final String PATH_TO_METADATA_ARG = "pathToMetadata";
+protected static final String PATH_TO_RESOSCRIPT_ARG = "pathToRESOScript";
private static final String LOOKUP_VALUE = "lookupValue";
//extract any params here
@ -150,9 +149,8 @@ public class DataDictionary {
@When("a metadata file is provided")
public void aMetadataFileIsProvided() {
-boolean result = false;
if (isUsingMetadata) {
-result = pathToMetadata != null && Files.exists(Paths.get(pathToMetadata));
+boolean result = pathToMetadata != null && Files.exists(Paths.get(pathToMetadata));
if (!result) {
failAndExitWithErrorMessage("Path to given metadata file does not exist: " + PATH_TO_METADATA_ARG + "=" + pathToMetadata, scenario);
}
@ -195,8 +193,6 @@ public class DataDictionary {
@And("valid metadata were retrieved from the server") @And("valid metadata were retrieved from the server")
public void validMetadataWereRetrievedFromTheServer() { public void validMetadataWereRetrievedFromTheServer() {
boolean result = false;
if (isUsingRESOScript && container.getShouldValidateMetadata()) { if (isUsingRESOScript && container.getShouldValidateMetadata()) {
//request metadata from server using service root in RESOScript file //request metadata from server using service root in RESOScript file
TestUtils.assertXMLMetadataAreRequestedFromTheServer(container, scenario); TestUtils.assertXMLMetadataAreRequestedFromTheServer(container, scenario);
@ -591,11 +587,9 @@ public class DataDictionary {
private XMLMetadata getReferenceMetadata() { private XMLMetadata getReferenceMetadata() {
if (referenceMetadata == null) { if (referenceMetadata == null) {
URL resource = Thread.currentThread().getContextClassLoader().getResource(REFERENCE_METADATA); final String xmlMetadata = Commander.convertInputStreamToString(Thread.currentThread().getContextClassLoader().getResourceAsStream(REFERENCE_METADATA));
assert resource != null; assert xmlMetadata != null : getDefaultErrorMessage("could not load reference metadata from: " + REFERENCE_METADATA);
referenceMetadata = Commander referenceMetadata = Commander.deserializeXMLMetadata(xmlMetadata, container.getCommander().getClient());
.deserializeXMLMetadata(Commander.convertInputStreamToString(Commander.deserializeFileFromPath(resource.getPath())),
container.getCommander().getClient());
} }
return referenceMetadata; return referenceMetadata;
} }
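The hunk above swaps URL-based file resolution for reading the resource straight off the classpath as a stream, a pattern that also works when the commander runs from a packaged JAR. A minimal self-contained sketch of that pattern, with a hypothetical resource name:

```
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

public class ClasspathResourceSketch {
  static String readResourceAsString(String name) {
    InputStream is = Thread.currentThread().getContextClassLoader().getResourceAsStream(name);
    if (is == null) throw new IllegalStateException("resource not found: " + name);
    // drain the stream line by line; fine for small metadata files
    return new BufferedReader(new InputStreamReader(is, StandardCharsets.UTF_8))
        .lines().collect(Collectors.joining(System.lineSeparator()));
  }

  public static void main(String[] args) {
    System.out.println(readResourceAsString("reference-metadata.xml")); // hypothetical name
  }
}
```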

View File

@@ -0,0 +1,586 @@
package org.reso.certification.stepdefs;
import com.google.common.collect.Sets;
import com.google.common.hash.Hashing;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.inject.Inject;
import io.cucumber.java.Before;
import io.cucumber.java.Scenario;
import io.cucumber.java.en.And;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.apache.http.HttpStatus;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.client.api.data.ResWrap;
import org.apache.olingo.commons.api.data.EntityCollection;
import org.apache.olingo.commons.api.edm.EdmEntityType;
import org.apache.olingo.commons.api.edm.EdmKeyPropertyRef;
import org.apache.olingo.commons.api.edm.EdmNamed;
import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;
import org.apache.olingo.commons.api.format.ContentType;
import org.reso.certification.codegen.DDCacheProcessor;
import org.reso.certification.containers.WebAPITestContainer;
import org.reso.commander.common.DataDictionaryMetadata;
import org.reso.commander.common.Utils;
import org.reso.models.*;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoUnit;
import java.util.*;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.Collectors;
import static io.restassured.path.json.JsonPath.from;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assume.assumeTrue;
import static org.reso.certification.containers.WebAPITestContainer.EMPTY_STRING;
import static org.reso.commander.Commander.NOT_OK;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
import static org.reso.commander.common.TestUtils.failAndExitWithErrorMessage;
public class IDXPayload {
private static final Logger LOG = LogManager.getLogger(IDXPayload.class);
private static final String MODIFICATION_TIMESTAMP_FIELD = "ModificationTimestamp";
private static final String POSTAL_CODE_FIELD = "PostalCode";
private static final int TOP_COUNT = 100;
private static final int MAX_RETRIES = 3;
private static final String SAMPLES_DIRECTORY_ROOT = "build";
private static final String SAMPLES_DIRECTORY_TEMPLATE = SAMPLES_DIRECTORY_ROOT + File.separator + "%s";
private static final String PATH_TO_RESOSCRIPT_KEY = "pathToRESOScript";
final String REQUEST_URI_TEMPLATE = "?$filter=%s" + " lt %s&$orderby=%s desc&$top=" + TOP_COUNT;
final String COUNT_REQUEST_URI_TEMPLATE = "?$count=true";
//TODO: get this from the parameters
private final static boolean DEBUG = false;
private static Scenario scenario;
private final static AtomicBoolean hasStandardResources = new AtomicBoolean(false);
private final static AtomicBoolean hasLocalResources = new AtomicBoolean(false);
private final static AtomicReference<Set<String>> standardResources = new AtomicReference<>(new LinkedHashSet<>());
private final static AtomicReference<Set<String>> localResources = new AtomicReference<>(new LinkedHashSet<>());
private final static AtomicReference<WebAPITestContainer> container = new AtomicReference<>();
private final static AtomicBoolean hasSamplesDirectoryBeenCleared = new AtomicBoolean(false);
private final static AtomicReference<Map<String, List<PayloadSample>>> resourcePayloadSampleMap =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
private final static AtomicReference<Map<String, List<ReferenceStandardField>>> standardFieldCache =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
private final static AtomicReference<Map<String, Integer>> resourceCounts =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
@Inject
public IDXPayload(WebAPITestContainer c) {
container.set(c);
}
@Before
public void beforeStep(Scenario scenario) {
final String pathToRESOScript = System.getProperty(PATH_TO_RESOSCRIPT_KEY, null);
if (pathToRESOScript == null) return;
IDXPayload.scenario = scenario;
if (!container.get().getIsInitialized()) {
container.get().setSettings(Settings.loadFromRESOScript(new File(System.getProperty(PATH_TO_RESOSCRIPT_KEY))));
container.get().initialize();
}
}
/**
* Creates a data availability report for the given samples map
* @param resourcePayloadSamplesMap the samples map to create the report from
* @param reportName the name of the report
*/
public void createDataAvailabilityReport(Map<String, List<PayloadSample>> resourcePayloadSamplesMap,
String reportName, Map<String, Integer> resourceCounts) {
PayloadSampleReport payloadSampleReport = new PayloadSampleReport(container.get().getEdm(), resourcePayloadSamplesMap, resourceCounts);
GsonBuilder gsonBuilder = new GsonBuilder().setPrettyPrinting();
gsonBuilder.registerTypeAdapter(PayloadSampleReport.class, payloadSampleReport);
Utils.createFile(SAMPLES_DIRECTORY_ROOT, reportName, gsonBuilder.create().toJson(payloadSampleReport));
}
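  /*
   * Illustrative sketch (not part of the committed change): PayloadSampleReport
   * doubles as its own Gson serializer, which is why an instance can be passed to
   * registerTypeAdapter above. The same pattern with a hypothetical Point type:
   *
   *   Gson gson = new GsonBuilder().setPrettyPrinting()
   *       .registerTypeAdapter(Point.class, (JsonSerializer<Point>) (src, t, ctx) -> {
   *         JsonObject o = new JsonObject();
   *         o.addProperty("x", src.x);
   *         return o;
   *       })
   *       .create();
   *   gson.toJson(new Point(1)); // uses the custom serializer instead of reflection
   */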
/**
* Hashes the given values
*
* @param values items to hash, will be joined together, then hashed
* @return the SHA hash of the given values
*/
private static String hashValues(String... values) {
return Hashing.sha256().hashString(String.join(EMPTY_STRING, values), StandardCharsets.UTF_8).toString();
}
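  /*
   * Illustrative sketch (not part of the committed change): the digest is
   * deterministic, so equal source values always mask to the same output, which
   * preserves field-level availability statistics without exposing raw data:
   *
   *   Hashing.sha256().hashString("abc" + "123", StandardCharsets.UTF_8).toString()
   *   // -> the 64-character hex SHA-256 of "abc123", identical on every call
   */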
/**
* Builds a request URI string, taking into account whether the sampling is being done with an optional
* filter, for instance in the shared systems case
* @param resourceName the resource name to query
* @param timestampField the timestamp field for the resource
* @param lastFetchedDate the last fetched date for filtering
* @return a string OData query used for sampling
*/
private String buildODataTimestampRequestUriString(String resourceName, String timestampField, OffsetDateTime lastFetchedDate) {
String requestUri = container.get().getCommander().getClient()
.newURIBuilder(container.get().getServiceRoot())
.appendEntitySetSegment(resourceName).build().toString();
requestUri += String.format(REQUEST_URI_TEMPLATE, timestampField,
lastFetchedDate.format(DateTimeFormatter.ISO_INSTANT), timestampField);
return requestUri;
}
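  /*
   * Illustrative sketch (not part of the committed change): with TOP_COUNT = 100
   * and the standard timestamp field, the builder above yields a URI of this shape
   * (service root and timestamp value are hypothetical):
   *
   *   https://example.api/Property?$filter=ModificationTimestamp lt 2021-05-27T00:00:00Z
   *     &$orderby=ModificationTimestamp desc&$top=100
   *
   * Ordering by the timestamp descending and filtering with "lt" lets the oldest
   * timestamp from each page seed the next request, paging backwards through the data.
   */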
/**
* Builds a request URI string for counting the number of available items on a resource, taking into account
* whether the sample is being done with an optional filter, for instance in the shared system case
* @param resourceName the resource name to query
* @return a request URI string for getting OData counts
*/
private String buildODataCountRequestUriString(String resourceName) {
String requestUri = container.get().getCommander().getClient()
.newURIBuilder(container.get().getServiceRoot())
.appendEntitySetSegment(resourceName).build().toString();
requestUri += COUNT_REQUEST_URI_TEMPLATE;
return requestUri;
}
/**
* Queries the server and fetches a resource count for the given resource name
* @param resourceName the resource name to get the count for
* @return the count found for the resource, or null if the request did not return a count
*/
private Integer getResourceCount(String resourceName) {
ODataTransportWrapper transportWrapper;
String requestUri = buildODataCountRequestUriString(resourceName);
Integer count = null;
LOG.info("\n\nMaking count request to the " + resourceName + " resource: " + requestUri);
transportWrapper = container.get().getCommander().executeODataGetRequest(requestUri);
if (transportWrapper.getHttpResponseCode() == null || transportWrapper.getHttpResponseCode() != HttpStatus.SC_OK) {
if (transportWrapper.getHttpResponseCode() == null) {
scenario.log(getDefaultErrorMessage("Count request to", requestUri,
"failed! No response code was provided. Check commander.log for any errors..."));
} else {
scenario.log(getDefaultErrorMessage("Count request to", requestUri,
"failed with response code", transportWrapper.getHttpResponseCode().toString()));
}
} else {
String oDataCountString = from(transportWrapper.getResponseData()).getString("\"@odata.count\"");
count = oDataCountString != null && oDataCountString.trim().length() > 0 ? Integer.parseInt(oDataCountString) : 0;
scenario.log("Total record count for the " + resourceName + " resource: " + oDataCountString);
}
return count;
}
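  /*
   * Illustrative sketch (not part of the committed change): rest-assured's JsonPath
   * needs the escaped key to address the literal "@odata.count" property:
   *
   *   String body = "{\"@odata.count\": 1234, \"value\": []}";
   *   String count = from(body).getString("\"@odata.count\""); // "1234"
   *   Integer.parseInt(count);                                 // 1234
   */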
/**
* Fetches and processes records in cases where encoding the results is necessary
*
* @param resourceName the resource name to sample from
* @param targetRecordCount the target record count to fetch (will stop before then if the end is reached)
* @param encodedResultsDirectoryName the directory name for encoded results
* @return a list of PayloadSample items
*/
List<PayloadSample> fetchAndProcessRecords(String resourceName, int targetRecordCount, String encodedResultsDirectoryName) {
final AtomicReference<OffsetDateTime> lastFetchedDate = new AtomicReference<>(OffsetDateTime.now());
final List<String> timestampCandidateFields = new LinkedList<>();
final AtomicReference<EdmEntityType> entityType = new AtomicReference<>();
final AtomicReference<Map<String, String>> encodedSample = new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
final AtomicReference<ODataTransportWrapper> transportWrapper = new AtomicReference<>();
final AtomicReference<ResWrap<EntityCollection>> entityCollectionResWrap = new AtomicReference<>();
final AtomicReference<String> timestampField = new AtomicReference<>();
final AtomicBoolean hasRecords = new AtomicBoolean(true);
final AtomicReference<PayloadSample> payloadSample = new AtomicReference<>();
final AtomicReference<List<PayloadSample>> payloadSamples =
new AtomicReference<>(Collections.synchronizedList(new LinkedList<>()));
boolean hasStandardTimestampField = false;
String requestUri;
int recordsProcessed = 0;
int numRetries = 0;
int lastTimestampCandidateIndex = 0;
container.get().getEdm().getSchemas().forEach(edmSchema ->
edmSchema.getEntityTypes().stream().filter(edmEntityType -> edmEntityType.getName().equals(resourceName))
.findFirst().ifPresent(entityType::set));
//return an empty list if the entity type isn't defined
if (entityType.get() == null) return new ArrayList<>();
if (entityType.get().getProperty(MODIFICATION_TIMESTAMP_FIELD) == null) {
scenario.log("Could not find " + MODIFICATION_TIMESTAMP_FIELD + " in the " + resourceName + " resource!\n");
scenario.log("Searching for suitable timestamp fields...");
entityType.get().getPropertyNames().forEach(propertyName -> {
try {
if (entityType.get().getProperty(propertyName).getType().getFullQualifiedName().getFullQualifiedNameAsString()
.contentEquals(EdmPrimitiveTypeKind.DateTimeOffset.getFullQualifiedName().getFullQualifiedNameAsString())) {
scenario.log("Found Edm.DateTimeOffset field " + propertyName + " in the " + resourceName + " resource!\n");
timestampCandidateFields.add(propertyName);
}
} catch (Exception ex) {
LOG.error(ex);
}
});
} else {
hasStandardTimestampField = true;
}
final List<EdmKeyPropertyRef> keyFields = entityType.get().getKeyPropertyRefs();
scenario.log("Sampling resource: " + resourceName);
scenario.log("Keys found: " + keyFields.stream().map(EdmKeyPropertyRef::getName).collect(Collectors.joining(", ")));
//loop and fetch records as long as items are available and we haven't reached our target count yet
while (hasRecords.get() && recordsProcessed < targetRecordCount) {
if (hasStandardTimestampField) {
timestampField.set(MODIFICATION_TIMESTAMP_FIELD);
} else if (timestampCandidateFields.size() > 0 && lastTimestampCandidateIndex < timestampCandidateFields.size()) {
timestampField.set(timestampCandidateFields.get(lastTimestampCandidateIndex++));
} else {
scenario.log(getDefaultErrorMessage("Could not find a suitable timestamp field in the "
+ resourceName + " resource to sample with..."));
//skip this resource since no suitable fields were found
break;
}
payloadSample.set(new PayloadSample(resourceName, timestampField.get(),
keyFields.stream().map(EdmKeyPropertyRef::getName).collect(Collectors.toList())));
requestUri = buildODataTimestampRequestUriString(resourceName, timestampField.get(), lastFetchedDate.get());
payloadSample.get().setRequestUri(requestUri);
LOG.info("Making request to: " + requestUri);
transportWrapper.set(container.get().getCommander().executeODataGetRequest(requestUri));
// retries. sometimes requests can time out and fail and we don't want to stop sampling
// immediately, but retry a couple of times before we bail
if (recordsProcessed == 0 && transportWrapper.get().getResponseData() == null) {
//only count retries if we're constantly making requests and not getting anything
numRetries += 1;
} else {
numRetries = 0;
}
if (numRetries >= MAX_RETRIES) {
if (timestampCandidateFields.size() > 0 && (lastTimestampCandidateIndex < timestampCandidateFields.size())) {
LOG.info("Trying next candidate timestamp field: " + timestampCandidateFields.get(lastTimestampCandidateIndex));
numRetries = 0;
} else {
LOG.info("Could not fetch records from the " + resourceName + " resource after " + MAX_RETRIES
+ " tries from the given URL: " + requestUri);
break;
}
}
if (transportWrapper.get().getHttpResponseCode() == null || transportWrapper.get().getHttpResponseCode() != HttpStatus.SC_OK) {
if (transportWrapper.get().getHttpResponseCode() == null) {
LOG.error(getDefaultErrorMessage("Request to", requestUri,
"failed! No response code was provided. Check commander.log for any errors..."));
} else {
scenario.log(getDefaultErrorMessage("Request to", requestUri,
"failed with response code", transportWrapper.get().getHttpResponseCode().toString()));
}
break;
} else {
LOG.info("Time taken: "
+ (transportWrapper.get().getElapsedTimeMillis() >= 1000 ? (transportWrapper.get().getElapsedTimeMillis() / 1000) + "s"
: transportWrapper.get().getElapsedTimeMillis() + "ms"));
try {
payloadSample.get().setResponseSizeBytes(transportWrapper.get().getResponseData().getBytes().length);
entityCollectionResWrap.set(container.get().getCommander().getClient()
.getDeserializer(ContentType.APPLICATION_JSON)
.toEntitySet(new ByteArrayInputStream(transportWrapper.get().getResponseData().getBytes())));
if (entityCollectionResWrap.get().getPayload().getEntities().size() > 0) {
LOG.info("Hashing " + resourceName + " payload values...");
entityCollectionResWrap.get().getPayload().getEntities().forEach(entity -> {
encodedSample.set(Collections.synchronizedMap(new LinkedHashMap<>()));
entity.getProperties().forEach(property -> {
//value will be considered null unless trimmed string or collection has non-zero length
String value = property.getValue() != null
&& ((property.isCollection() && property.asCollection().size() > 0)
|| (property.isComplex() && property.asComplex().getValue().size() > 0)
|| property.getValue().toString().trim().length() > 0
|| property.isGeospatial() && property.asGeospatial() != null)
? property.getValue().toString() : null;
if (DEBUG) {
if (property.isCollection() && property.asCollection().size() > 0) {
LOG.info("Found Collection for field: " + property.getName() + ", value: " + property.asCollection());
}
if (property.isComplex() && property.asComplex().getValue().size() > 0) {
LOG.info("Found Complex Type for field: " + property.getName() + ", value: " + property.asComplex());
}
if (property.isEnum() && property.asEnum() != null) {
LOG.info("Found Enum for field" + property.getName() + ", value: " + property.asEnum());
}
if (property.isGeospatial() && property.asGeospatial() != null) {
LOG.info("Found Enum for field: " + property.getName() + ", value: " + property.asGeospatial());
}
}
//turn off hashing when DEBUG is true
if (!DEBUG && value != null) {
if (!(property.getName().contentEquals(timestampField.get())
|| property.getName().equals(POSTAL_CODE_FIELD)
|| keyFields.stream().reduce(true, (acc, f) -> acc && f.getName().contentEquals(property.getName()), Boolean::logicalAnd))) {
value = hashValues(property.getValue().toString());
}
}
// TODO: clean up. If field is timestamp field or key field unmask, if field is null report null, otherwise hash value
encodedSample.get().put(property.getName(), value);
if (property.getName().contentEquals(MODIFICATION_TIMESTAMP_FIELD)) {
if (OffsetDateTime.parse(property.getValue().toString()).isBefore(lastFetchedDate.get())) {
lastFetchedDate.set(OffsetDateTime.parse(property.getValue().toString()));
}
}
});
payloadSample.get().addSample(encodedSample.get());
});
LOG.info("Values encoded!");
recordsProcessed += entityCollectionResWrap.get().getPayload().getEntities().size();
LOG.info("Records processed: " + recordsProcessed + ". Target record count: " + targetRecordCount + "\n");
payloadSample.get().setResponseTimeMillis(transportWrapper.get().getElapsedTimeMillis());
if (encodedResultsDirectoryName != null) {
payloadSample.get().setPayloadFields(standardFieldCache.get().get(resourceName).stream()
.map(ReferenceStandardField::getStandardName).collect(Collectors.toList()));
//serialize results once resource processing has finished
Utils.createFile(String.format(SAMPLES_DIRECTORY_TEMPLATE, encodedResultsDirectoryName),
resourceName + "-" + Utils.getTimestamp() + ".json",
payloadSample.get().serialize(payloadSample.get(), PayloadSample.class, null).toString());
}
payloadSamples.get().add(payloadSample.get());
} else {
scenario.log("All available records fetched! Total: " + recordsProcessed);
hasRecords.set(false);
}
} catch (Exception ex) {
scenario.log("Error in fetchAndProcessRecords: " + getDefaultErrorMessage(ex.toString()));
scenario.log("Skipping sample...");
lastFetchedDate.set(lastFetchedDate.get().minus(1, ChronoUnit.WEEKS));
}
}
}
return payloadSamples.get();
}
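  /*
   * Illustrative sketch (not part of the committed change): the masking rule
   * applied above, condensed. Timestamp, key, and PostalCode values pass through
   * unmasked; every other non-null value is replaced by its SHA-256 digest.
   * Names here are illustrative:
   *
   *   String mask(String field, String value, String tsField, Set<String> keys) {
   *     if (value == null) return null;                  // nulls are reported as nulls
   *     boolean passThrough = field.equals(tsField)
   *         || field.equals(POSTAL_CODE_FIELD)
   *         || keys.contains(field);
   *     return passThrough ? value : hashValues(value);  // mask everything else
   *   }
   */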
/**
* fetches and processes records in cases where only sampling is required and encoding is not necessary
*
* @param resourceName the resource name to sample from
* @param targetRecordCount the target record count for the resource (will stop if the end of the records is reached)
* @return a list of PayloadSample items
*/
List<PayloadSample> fetchAndProcessRecords(String resourceName, int targetRecordCount) {
return fetchAndProcessRecords(resourceName, targetRecordCount, null);
}
/*==================================== TESTS START HERE ====================================*/
@Given("that valid metadata have been requested from the server")
public void thatValidMetadataHaveBeenRequestedFromTheServer() {
try {
if (container.get().hasValidMetadata()) {
if (standardFieldCache.get().size() == 0) {
standardFieldCache.get().putAll(DDCacheProcessor.buildCache());
}
} else {
failAndExitWithErrorMessage("Valid metadata was not retrieved from the server. Exiting!", scenario);
}
} catch (Exception ex) {
failAndExitWithErrorMessage(ex.toString(), scenario);
}
}
@And("the metadata contains RESO Standard Resources")
public void theMetadataContainsRESOStandardResources() {
Set<String> resources = container.get().getEdm().getSchemas().stream().map(schema ->
schema.getEntityTypes().stream().map(EdmNamed::getName)
.collect(Collectors.toSet()))
.flatMap(Collection::stream)
.collect(Collectors.toSet());
standardResources.set(resources.stream()
.filter(DataDictionaryMetadata.v1_7.WELL_KNOWN_RESOURCES::contains).collect(Collectors.toSet()));
localResources.set(Sets.difference(resources, standardResources.get()));
hasStandardResources.set(standardResources.get().size() > 0);
hasLocalResources.set(localResources.get().size() > 0);
if (hasStandardResources.get()) {
//TODO: add pluralizer
scenario.log("Found " + standardResources.get().size() + " RESO Standard Resource"
+ (standardResources.get().size() == 1 ? "" : "s") + ": "
+ String.join(", ", standardResources.get()));
} else {
scenario.log("No RESO Standard Resources found. Skipping...");
assumeTrue(true);
}
}
@And("each resource MUST have a primary key field by the OData specification")
public void eachResourceMUSTHaveAPrimaryKeyFieldByTheODataSpecification() {
scenario.log("Each resource MUST have a primary key field by the OData specification!");
assumeTrue(true);
}
@And("the metadata contains local resources")
public void theMetadataContainsLocalResources() {
if (localResources.get() == null || localResources.get().size() == 0) {
scenario.log("No local resources found! Skipping...");
assumeTrue(true);
} else {
scenario.log("Found " + localResources.get().size() + " Local Resource"
+ (localResources.get().size() == 1 ? "" : "s") + ": "
+ String.join(", ", localResources.get()));
}
}
@Then("up to {int} records are sampled from each resource with {string} payload samples stored in {string}")
public void upToRecordsAreSampledFromEachResourceWithPayloadSamplesStoredIn(int numRecords, String payloadName, String resultsDirectoryName) {
assertNotNull(getDefaultErrorMessage("resultsDirectoryName MUST be present!"), resultsDirectoryName);
if (!hasStandardResources.get()) {
scenario.log("No RESO Standard Resources to sample!");
assumeTrue(true);
} else {
Set<String> payloadResources = new LinkedHashSet<>();
standardFieldCache.get().forEach((resourceName, fieldList) -> {
if (!payloadResources.contains(resourceName) && fieldList.stream().anyMatch(field -> field.getPayloads().contains(payloadName))) {
payloadResources.add(resourceName);
}
});
standardResources.get().forEach(resourceName -> {
resourceCounts.get().put(resourceName, getResourceCount(resourceName));
resourcePayloadSampleMap.get().putIfAbsent(resourceName, Collections.synchronizedList(new LinkedList<>()));
//only save results to the directory if the resources are part of the given payload
resourcePayloadSampleMap.get().put(resourceName,
fetchAndProcessRecords(resourceName, numRecords, payloadResources.contains(resourceName) ? resultsDirectoryName : null));
});
}
}
@Then("up to {int} records are sampled from each local resource")
public void upToRecordsAreSampledFromEachLocalResource(int numRecords) {
if (!hasLocalResources.get()) {
scenario.log("No local resources were found to sample!");
assumeTrue(true);
} else {
localResources.get().forEach(resourceName -> {
resourceCounts.get().put(resourceName, getResourceCount(resourceName));
resourcePayloadSampleMap.get().putIfAbsent(resourceName, Collections.synchronizedList(new LinkedList<>()));
resourcePayloadSampleMap.get().put(resourceName, fetchAndProcessRecords(resourceName, numRecords, null));
});
}
}
@Given("samples exist in {string} in the build directory")
public void samplesExistInInTheBuildDirectory(String resultsDirectory) {
scenario.log("Samples exist in {string} in the build directory!");
assumeTrue(true);
}
@And("a RESOScript file was provided for the IDX User")
public void aRESOScriptFileWasProvidedForTheIDXUser() {
scenario.log("!!TODO!! A RESOScript file was provided for the IDX User!");
assumeTrue(true);
}
@When("samples from {string} are fetched as the representative user for each resource in the {string} payload")
public void samplesFromAreFetchedAsTheRepresentativeUserForEachResourceInThePayload(String resultsDirectory, String payloadName) {
File f = new File(SAMPLES_DIRECTORY_ROOT + File.separator + resultsDirectory);
AtomicReference<PayloadSample> payloadSample = new AtomicReference<>();
if (f.list() == null) return;
Arrays.stream(Objects.requireNonNull(f.list((file, s) -> s.endsWith("json")))).forEach(sampleResults -> {
try {
final String jsonString = new String(Files.readAllBytes(Paths.get(f.getPath() + File.separator + sampleResults)));
payloadSample.set(new Gson().fromJson(jsonString, PayloadSample.class));
} catch (FileNotFoundException fnfEx) {
LOG.error(getDefaultErrorMessage("file", sampleResults, "could not be found! Skipping..."));
LOG.error("Exception: " + fnfEx);
} catch (IOException ioException) {
ioException.printStackTrace();
}
});
}
@Then("each result MUST contain the string version of the key and the following fields")
public void eachResultMUSTContainTheStringVersionOfTheKeyAndTheFollowingFields(List<String> requiredFields) {
scenario.log("!!TODO!! Each result MUST contain the string version of the key and the following fields!");
assumeTrue(true);
}
@And("the {string} payload field values MUST match those in the samples")
public void thePayloadFieldValuesMUSTMatchThoseInTheSamples(String arg0) {
scenario.log("!!TODO!! The {string} payload field values MUST match those in the samples!");
assumeTrue(true);
}
@Given("standard and local resources have been processed")
public void standardAndLocalResourcesHaveBeenProcessed() {
scenario.log("!!TODO!! Standard and local resources have been processed!");
assumeTrue(true);
}
@Then("a data availability report is created in {string}")
public void aDataAvailabilityReportIsCreatedIn(String reportFileName) {
if (resourcePayloadSampleMap.get() == null) {
LOG.info("No resource payload samples found! Skipping...");
assumeTrue(true);
}
LOG.info("\n\nCreating data availability report!");
createDataAvailabilityReport(resourcePayloadSampleMap.get(), reportFileName, resourceCounts.get());
}
@And("{string} has been created in the build directory")
public void hasBeenCreatedInTheBuildDirectory(String encodedResultsDirectoryName) {
if (encodedResultsDirectoryName != null && !hasSamplesDirectoryBeenCleared.get()) {
if (!Utils.removeDirectory(String.format(SAMPLES_DIRECTORY_TEMPLATE, encodedResultsDirectoryName))) {
LOG.error("Failed to create runtime directories needed for program execution. Exiting...");
System.exit(NOT_OK);
} else {
hasSamplesDirectoryBeenCleared.set(true);
}
}
}
}
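The cleanup step above hands the samples directory to Utils.removeDirectory (added later in this changeset) before new results are written. A minimal sketch of that interaction, assuming org.reso.commander.common.Utils is on the classpath and using a hypothetical path and file name:

```
import java.io.File;
import org.reso.commander.common.Utils;

public class CleanupSketch {
  public static void main(String[] args) throws Exception {
    // create a throwaway directory with one file, then remove it the same way
    // hasBeenCreatedInTheBuildDirectory clears old samples
    File dir = new File("build" + File.separator + "sample-tmp"); // hypothetical
    dir.mkdirs();
    new File(dir, "Property-sample.json").createNewFile();
    System.out.println(Utils.removeDirectory(dir.getPath())); // true once removed
  }
}
```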

View File

@@ -2,6 +2,7 @@ package org.reso.certification.stepdefs;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.fasterxml.jackson.databind.node.POJONode;
+import com.google.inject.Inject;
import io.cucumber.java8.En;
import org.apache.http.HttpStatus;
import org.apache.logging.log4j.LogManager;
@@ -77,7 +78,10 @@ class WebAPIServer implements En {
  /**
   * Entry point to the Web API Server tests
   */
- public WebAPIServer() {
+ @Inject
+ public WebAPIServer(WebAPITestContainer c) {
+   container.set(c);
    getTestContainer().setShowResponses(showResponses);
    runBackground();

View File

@@ -95,4 +95,48 @@ public class WebAPIServerAddEdit {
  public void theRequestHeaderContains(String arg0, String arg1) {
  }
+ @And("the request header {string} {string} one of the following values")
+ public void theRequestHeaderOneOfTheFollowingValues(String arg0, String arg1) {
+ }
+ @And("the request header {string} {string} {string}")
+ public void theRequestHeader(String arg0, String arg1, String arg2) {
+ }
+ @And("the response header {string} {string} one of the following values")
+ public void theResponseHeaderOneOfTheFollowingValues(String arg0, String arg1) {
+ }
+ @And("the response header {string} {string} {string}")
+ public void theResponseHeader(String arg0, String arg1, String arg2) {
+ }
+ @And("the response header {string} {string}")
+ public void theResponseHeader(String arg0, String arg1) {
+ }
+ @And("the response header {string} {string} reference the resource being created")
+ public void theResponseHeaderReferenceTheResourceBeingCreated(String arg0, String arg1) {
+ }
+ @And("the JSON response {string} contain {string}")
+ public void theJSONResponseContain(String arg0, String arg1) {
+ }
+ @And("the JSON response value {string} {string}")
+ public void theJSONResponseValue(String arg0, String arg1) {
+ }
+ @And("the JSON response value {string} {string} {string}")
+ public void theJSONResponseValue(String arg0, String arg1, String arg2) {
+ }
+ @And("the JSON response {string} contain all JSON data in {string}")
+ public void theJSONResponseContainAllJSONDataIn(String arg0, String arg1) {
+ }
+ @When("a {string} request is made to the URL in response header {string}")
+ public void aRequestIsMadeToTheURLInResponseHeader(String arg0, String arg1) {
+ }
}

View File

@@ -4,10 +4,7 @@ import org.apache.commons.cli.*;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.format.ContentType;
-import org.reso.certification.codegen.BDDProcessor;
-import org.reso.certification.codegen.DDLProcessor;
-import org.reso.certification.codegen.DataDictionaryCodeGenerator;
-import org.reso.certification.codegen.EDMXProcessor;
+import org.reso.certification.codegen.*;
import org.reso.models.ClientSettings;
import org.reso.models.ODataTransportWrapper;
import org.reso.models.Request;
@@ -222,6 +219,14 @@ public class App {
    } catch (Exception ex) {
      LOG.error(getDefaultErrorMessage(ex));
    }
+ } else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_RESOURCE_INFO_MODELS)) {
+   APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_RESOURCE_INFO_MODELS);
+   try {
+     DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(new ResourceInfoProcessor());
+     generator.processWorksheets();
+   } catch (Exception ex) {
+     LOG.error(getDefaultErrorMessage(ex));
+   }
  } else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_REFERENCE_EDMX)) {
    APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_REFERENCE_EDMX);
    try {
@@ -237,6 +242,12 @@ public class App {
    } catch (Exception ex) {
      LOG.error(getDefaultErrorMessage(ex));
    }
+ } else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_SEED_DATA_SQL)) {
+   try {
+     DataDictionarySeedDataSqlGenerator generator = new DataDictionarySeedDataSqlGenerator();
+   } catch (Exception ex) {
+     LOG.error(getDefaultErrorMessage(ex));
+   }
  } else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_QUERIES)) {
    APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_QUERIES);
@@ -373,10 +384,10 @@ public class App {
  private static class APP_OPTIONS {
    //parameter names
    static final String SERVICE_ROOT = "serviceRoot";
    static final String BEARER_TOKEN = "bearerToken";
    static final String CLIENT_ID = "clientId";
    static final String CLIENT_SECRET = "clientSecret";
    static final String INPUT_FILE = "inputFile";
    static final String OUTPUT_FILE = "outputFile";
    static final String URI = "uri";
@@ -419,6 +430,8 @@ public class App {
      }
    } else if (action.matches(ACTIONS.GENERATE_QUERIES)) {
      validationResponse = validateOptions(cmd, INPUT_FILE);
+   } else if (action.matches(ACTIONS.GENERATE_REFERENCE_DDL) || action.matches(ACTIONS.GENERATE_SEED_DATA_SQL)) {
+     validationResponse = validateOptions(cmd, INPUT_FILE);
    }
    if (validationResponse != null) {
@@ -517,10 +530,14 @@ public class App {
      .desc("Runs commands in RESOScript file given as <inputFile>.").build())
    .addOption(Option.builder().argName("t").longOpt(ACTIONS.GENERATE_DD_ACCEPTANCE_TESTS)
      .desc("Generates acceptance tests in the current directory.").build())
+   .addOption(Option.builder().argName("i").longOpt(ACTIONS.GENERATE_RESOURCE_INFO_MODELS)
+     .desc("Generates Java Models for the Web API Reference Server in the current directory.").build())
    .addOption(Option.builder().argName("r").longOpt(ACTIONS.GENERATE_REFERENCE_EDMX)
      .desc("Generates reference metadata in EDMX format.").build())
    .addOption(Option.builder().argName("k").longOpt(ACTIONS.GENERATE_REFERENCE_DDL)
      .desc("Generates reference DDL to create a RESO-compliant SQL database. Pass --useKeyNumeric to generate the DB using numeric keys.").build())
+   .addOption(Option.builder().argName("d").longOpt(ACTIONS.GENERATE_SEED_DATA_SQL)
+     .desc("Generates SQL statements to seed data (Data Dictionary 1.7). Pass --useKeyNumeric to generate the DB using numeric keys.").build())
    .addOption(Option.builder().argName("m").longOpt(ACTIONS.GET_METADATA)
      .desc("Fetches metadata from <serviceRoot> using <bearerToken> and saves results in <outputFile>.").build())
    .addOption(Option.builder().argName("g").longOpt(ACTIONS.GENERATE_METADATA_REPORT)
@@ -553,12 +570,14 @@ public class App {
    public static final String GENERATE_DD_ACCEPTANCE_TESTS = "generateDDAcceptanceTests";
    public static final String GENERATE_REFERENCE_EDMX = "generateReferenceEDMX";
    public static final String GENERATE_REFERENCE_DDL = "generateReferenceDDL";
+   public static final String GENERATE_SEED_DATA_SQL = "generateSeedDataSql";
    public static final String GENERATE_QUERIES = "generateQueries";
    public static final String RUN_RESOSCRIPT = "runRESOScript";
    public static final String GET_METADATA = "getMetadata";
    public static final String VALIDATE_METADATA = "validateMetadata";
    public static final String SAVE_GET_REQUEST = "saveGetRequest";
    public static final String GENERATE_METADATA_REPORT = "generateMetadataReport";
+   public static final String GENERATE_RESOURCE_INFO_MODELS = "generateResourceInfoModels";
  }
}
}

View File

@@ -31,7 +31,7 @@ import java.nio.charset.StandardCharsets;
import java.sql.Time;
import java.sql.Timestamp;
import java.time.LocalDate;
-import java.time.ZonedDateTime;
+import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.temporal.ChronoField;
@@ -680,7 +680,7 @@ public final class TestUtils {
  public static Integer getTimestampPart(String timestampPart, Object value) throws DateTimeParseException {
    if (timestampPart == null || value == null) return null;
-   ZonedDateTime dateTime = ZonedDateTime.parse((String) value, DateTimeFormatter.ISO_DATE_TIME);
+   OffsetDateTime dateTime = OffsetDateTime.parse((String) value, DateTimeFormatter.ISO_DATE_TIME);
    switch (timestampPart) {
      case DateParts.YEAR:

View File

@@ -8,10 +8,11 @@ import java.io.FileWriter;
import java.nio.charset.StandardCharsets;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
-import java.time.ZoneOffset;
-import java.time.ZonedDateTime;
+import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
+import java.util.Arrays;
import java.util.Date;
+import java.util.Objects;
public class Utils {
  private static final Logger LOG = LogManager.getLogger(Utils.class);
@@ -85,6 +86,32 @@ public class Utils {
    return outputFile;
  }
+ /**
+  * Removes the directory at the given pathToDirectory.
+  *
+  * If the current user has write access, the directory's files and the directory itself
+  * are deleted and true is returned. Otherwise, false is returned if the directory
+  * couldn't be removed for some reason.
+  * @param pathToDirectory the path to the directory to remove
+  * @return true if the directory no longer exists afterwards, false on failure, or null if pathToDirectory is null
+  */
+ public static Boolean removeDirectory(String pathToDirectory) {
+   if (pathToDirectory == null) return null;
+   File outputDirectory = new File(pathToDirectory);
+   if (outputDirectory.exists()) {
+     if (outputDirectory.canWrite()) {
+       if (outputDirectory.listFiles() != null) {
+         Arrays.stream(Objects.requireNonNull(outputDirectory.listFiles())).forEach(File::delete);
+       }
+       return outputDirectory.delete();
+     } else {
+       LOG.error("Tried deleting directory " + outputDirectory.getPath() + " but didn't have sufficient access.");
+       return false;
+     }
+   }
+   return true;
+ }
  public static String pluralize(int lengthAttribute) {
    return lengthAttribute != 1 ? "s" : "";
  }
@@ -109,7 +136,19 @@ public class Utils {
  }
  public static String getIsoTimestamp() {
-   return ZonedDateTime.now(ZoneOffset.UTC).format(DateTimeFormatter.ISO_INSTANT);
+   return getIsoTimestamp(OffsetDateTime.now());
  }
+ public static String getIsoTimestamp(OffsetDateTime fromDate) {
+   return OffsetDateTime.from(fromDate).format(DateTimeFormatter.ISO_INSTANT);
+ }
+ public static String getIsoDate() {
+   return getIsoDate(OffsetDateTime.now());
+ }
+ public static String getIsoDate(OffsetDateTime fromDate) {
+   return fromDate.format(DateTimeFormatter.ISO_DATE);
+ }
}
}
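The new overloads make the ISO helpers testable against a fixed OffsetDateTime instead of always reading the clock. A small usage sketch using only java.time:

```
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class IsoTimestampSketch {
  public static void main(String[] args) {
    OffsetDateTime fixed = OffsetDateTime.of(2021, 5, 27, 15, 3, 23, 0, ZoneOffset.UTC);
    // ISO_INSTANT renders the moment in UTC
    System.out.println(fixed.format(DateTimeFormatter.ISO_INSTANT)); // 2021-05-27T15:03:23Z
    // ISO_DATE keeps the date plus its offset
    System.out.println(fixed.format(DateTimeFormatter.ISO_DATE));    // 2021-05-27Z
  }
}
```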

View File

@@ -0,0 +1,134 @@
package org.reso.models;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.reso.commander.Commander;
import java.lang.reflect.Type;
import java.util.List;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
/**
* Used to deserialize the Data Dictionary reference sheet into a cache of generators
*/
public class DataGenerator {
private static final Logger LOG = LogManager.getLogger(DataGenerator.class);
private final static String DATA_GENERATOR_JSON = "RESODataDictionary-1.7.data-generator.json";
private String description;
private String version;
private String generatedOn;
private List<ResourceInfo> resourceInfo;
private List<FieldDataGenerator> fields;
/**
* Creates a nested map of Data Dictionary reference generators where
* * the outer map is keyed by resource name
* * inner map is keyed by standard field name and returns a generator for that field
*
* @return nested hashes of standard field generators
*/
public static DataGenerator deserialize() {
final String generatorJson = Commander.convertInputStreamToString(Thread.currentThread().getContextClassLoader().getResourceAsStream(DATA_GENERATOR_JSON));
assert generatorJson != null : getDefaultErrorMessage("could not load resource " + DATA_GENERATOR_JSON);
//final String generatorJson = Commander.convertInputStreamToString(Commander.deserializeFileFromPath(resource.getPath()));
//note the open braces before getType()
Type targetClassType = new TypeToken<DataGenerator>() {}.getType();
return new Gson().fromJson(generatorJson, targetClassType);
}
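  /*
   * Illustrative sketch (not part of the committed change): the anonymous
   * TypeToken subclass (note the braces before getType()) is what lets Gson
   * recover the full generic type at runtime. The same pattern with a generic
   * collection:
   *
   *   Type t = new TypeToken<List<Map<String, String>>>() {}.getType();
   *   List<Map<String, String>> rows = new Gson().fromJson("[{\"a\":\"1\"}]", t);
   *   rows.get(0).get("a"); // "1"
   */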
public String getDescription() {
return description;
}
public String getVersion() {
return version;
}
public String getGeneratedOn() {
return generatedOn;
}
public List<ResourceInfo> getResourceInfo() {
return resourceInfo;
}
public List<FieldDataGenerator> getFields() {
return fields;
}
public static final class FieldDataGenerator {
private String fieldName;
private String resourceName;
private String fakerGeneratorName;
private List<String> customExamples;
public FieldDataGenerator(String fieldName, String resourceName, String fakerGeneratorName, List<String> customExamples) {
this.fieldName = fieldName;
this.resourceName = resourceName;
this.fakerGeneratorName = fakerGeneratorName;
this.customExamples = customExamples;
}
public String getFieldName() {
return fieldName;
}
public String getResourceName() {
return resourceName;
}
public void setResourceName(String resourceName) {
this.resourceName = resourceName;
}
public String getFakerGeneratorName() {
return fakerGeneratorName;
}
public List<String> getCustomExamples() {
return customExamples;
}
public boolean hasFakerGenerator() {
return fakerGeneratorName != null && fakerGeneratorName.length() > 0;
}
public boolean hasCustomExamples() {
return customExamples != null && customExamples.size() > 0;
}
@Override
public String toString() {
return "FieldDataGenerator{" +
"fieldName='" + fieldName + '\'' +
", resourceName=" + (resourceName == null ? "null" : "'" + resourceName + "'") +
", fakerGeneratorName=" + (fakerGeneratorName == null ? "null" : "'" + fakerGeneratorName + "'") +
", customExamples=" + customExamples +
'}';
}
}
public static final class ResourceInfo {
private String resourceName;
private Integer recordCount;
public ResourceInfo(String resourceName, Integer recordCount) {
this.resourceName = resourceName;
this.recordCount = recordCount;
}
public String getResourceName() {
return resourceName;
}
public Integer getRecordCount() {
return recordCount;
}
}
}

View File

@@ -165,7 +165,7 @@ public class MetadataReport implements JsonSerializer<MetadataReport> {
  }
  static class SneakyAnnotationReader {
-   Class object;
+   Class<? extends EdmAnnotationImpl> object;
    Field field;
    EdmAnnotationImpl edmAnnotationImpl;
    ClientCsdlAnnotation clientCsdlAnnotation;

View File

@@ -0,0 +1,131 @@
package org.reso.models;
import com.google.gson.*;
import org.reso.commander.common.Utils;
import java.lang.reflect.Type;
import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
public class PayloadSample implements JsonSerializer<PayloadSample> {
String resourceName;
String dateField;
Long responseTimeMillis = null;
Integer responseSizeBytes = null;
String requestUri = null;
//format is a list of key/value pairs where all fields besides
//keys and timestamps are encoded with SHA
final List<Map<String, String>> encodedSamples = Collections.synchronizedList(new LinkedList<>());
//keeps track of the list of key fields found on the server
final List<String> keyFields = new LinkedList<>();
final List<String> payloadFields = new LinkedList<>();
public PayloadSample(String resourceName, String dateField, List<String> keyFields) {
assert resourceName != null : "resourceName MUST be present";
this.resourceName = resourceName;
this.dateField = dateField;
this.keyFields.addAll(keyFields);
}
public void setPayloadFields(List<String> payloadFields) {
this.payloadFields.addAll(payloadFields);
}
public void addSample(Map<String, String> sample) {
encodedSamples.add(sample);
}
public List<Map<String, String>> getSamples() {
return encodedSamples;
}
public Long getResponseTimeMillis() {
return responseTimeMillis;
}
public void setResponseTimeMillis(Long value) {
responseTimeMillis = value;
}
public String getRequestUri() {
return requestUri;
}
public void setRequestUri(String value) {
requestUri = value;
}
public Integer getResponseSizeBytes() {
return responseSizeBytes;
}
public void setResponseSizeBytes(Integer responseSizeBytes) {
this.responseSizeBytes = responseSizeBytes;
}
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
* specified type.
*
* <p>In the implementation of this call-back method, you should consider invoking
* {@link JsonSerializationContext#serialize(Object, Type)} method to create JsonElements for any
* non-trivial field of the {@code src} object. However, you should never invoke it on the
* {@code src} object itself since that will cause an infinite loop (Gson will call your
* call-back method again).</p>
*
* @param src the object that needs to be converted to Json.
* @param typeOfSrc the actual type (fully genericized version) of the source object.
* @param context context of the serialization request
* @return a JsonElement corresponding to the specified object.
*/
@Override
public JsonElement serialize(PayloadSample src, Type typeOfSrc, JsonSerializationContext context) {
final String
DESCRIPTION = "RESO Payload Sample",
DESCRIPTION_KEY = "description",
GENERATED_ON_KEY = "generatedOn",
NUM_SAMPLES_KEY = "numSamples",
REQUEST_URI_KEY = "requestUri",
RESOURCE_NAME_KEY = "resourceName",
DATE_FIELD_KEY = "dateField",
KEY_FIELDS_KEY = "keyFields",
ENCODED_VALUES_KEY = "encodedValues",
PAYLOAD_FIELDS_KEY = "payloadFields";
JsonObject serialized = new JsonObject();
serialized.addProperty(DESCRIPTION_KEY, DESCRIPTION);
serialized.addProperty(GENERATED_ON_KEY, Utils.getIsoTimestamp());
serialized.addProperty(NUM_SAMPLES_KEY, src.encodedSamples.size());
serialized.addProperty(REQUEST_URI_KEY, src.requestUri);
serialized.addProperty(RESOURCE_NAME_KEY, src.resourceName);
JsonArray keyFields = new JsonArray();
src.keyFields.forEach(keyFields::add);
serialized.add(KEY_FIELDS_KEY, keyFields);
serialized.addProperty(DATE_FIELD_KEY, src.dateField);
JsonArray payloadFieldsJson = new JsonArray();
src.payloadFields.forEach(payloadFieldsJson::add);
serialized.add(PAYLOAD_FIELDS_KEY, payloadFieldsJson);
JsonArray encodedSamplesJson = new JsonArray();
src.encodedSamples.forEach(sample -> {
JsonObject sampleJson = new JsonObject();
sample.forEach(sampleJson::addProperty);
encodedSamplesJson.add(sampleJson);
});
serialized.add(ENCODED_VALUES_KEY, encodedSamplesJson);
return serialized;
}
}

View File

@@ -0,0 +1,310 @@
package org.reso.models;
import com.google.gson.*;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.edm.Edm;
import org.apache.olingo.commons.api.edm.EdmElement;
import org.reso.commander.common.Utils;
import java.lang.reflect.Type;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.util.*;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReference;
public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport> {
private static final Logger LOG = LogManager.getLogger(PayloadSampleReport.class);
private static final String POSTAL_CODE_KEY = "PostalCode";
private final Map<String, List<PayloadSample>> resourcePayloadSamplesMap = Collections.synchronizedMap(new LinkedHashMap<>());
private final Map<String, Map<String, Integer>> resourceFieldTallies = Collections.synchronizedMap(new LinkedHashMap<>(new LinkedHashMap<>()));
private final Map<String, Integer> resourceCounts = Collections.synchronizedMap(new LinkedHashMap<>());
private Edm metadata;
private PayloadSampleReport() {
//private default constructor
}
public PayloadSampleReport(final Edm metadata, final Map<String, List<PayloadSample>> resourcePayloadSamplesMap, final Map<String, Integer> resourceCounts) {
this.metadata = metadata;
this.resourcePayloadSamplesMap.putAll(resourcePayloadSamplesMap);
resourceFieldTallies.putAll(createResourceFieldTallies(resourcePayloadSamplesMap));
this.resourceCounts.putAll(resourceCounts);
}
@Override
public String toString() {
return String.valueOf(serialize(this, FieldAvailabilityJson.class, null));
}
/**
* FieldAvailabilityJson uses a JSON payload with the following structure:
*
* {
* "resourceName": "Property",
* "fieldName": "AboveGradeFinishedArea",
* "availability": 0.1
* }
*/
private final class FieldAvailabilityJson implements JsonSerializer<FieldAvailabilityJson> {
static final String
RESOURCE_NAME_KEY = "resourceName",
FIELD_NAME_KEY = "fieldName",
FIELDS_KEY = "fields",
AVAILABILITY_KEY = "availability";
String resourceName;
EdmElement edmElement;
public FieldAvailabilityJson(String resourceName, EdmElement edmElement) {
this.resourceName = resourceName;
this.edmElement = edmElement;
}
public String buildReportString(JsonElement dataAvailabilityReport) {
StringBuilder reportBuilder = new StringBuilder();
dataAvailabilityReport.getAsJsonObject().get(FIELDS_KEY).getAsJsonArray().forEach(field -> {
reportBuilder.append("\nResource: ");
reportBuilder.append(field.getAsJsonObject().get(RESOURCE_NAME_KEY));
reportBuilder.append("\nField: ");
reportBuilder.append(field.getAsJsonObject().get(FIELD_NAME_KEY));
reportBuilder.append("\nAvailability: ");
reportBuilder.append(field.getAsJsonObject().get(AVAILABILITY_KEY));
reportBuilder.append("\n");
});
return reportBuilder.toString();
}
@Override
public JsonElement serialize(FieldAvailabilityJson src, Type typeOfSrc, JsonSerializationContext context) {
JsonObject field = new JsonObject();
int numTimesPresent = resourceFieldTallies.get(src.resourceName) != null
&& resourceFieldTallies.get(src.resourceName).get(src.edmElement.getName()) != null
? resourceFieldTallies.get(src.resourceName).get(src.edmElement.getName()) : 0;
int numSamples = resourcePayloadSamplesMap.get(src.resourceName) != null
? resourcePayloadSamplesMap.get(src.resourceName).stream().reduce(0, (a, f) -> a + f.encodedSamples.size(), Integer::sum) : 0;
field.addProperty(RESOURCE_NAME_KEY, src.resourceName);
field.addProperty(FIELD_NAME_KEY, src.edmElement.getName());
field.addProperty(AVAILABILITY_KEY, numSamples > 0 ? (1.0 * numTimesPresent) / numSamples : 0);
return field;
}
}
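  /*
   * Illustrative sketch (not part of the committed change): availability is the
   * share of sampled records in which the field carried a non-null value. A worked
   * example matching the payload shown above:
   *
   *   int numTimesPresent = 10, numSamples = 100;
   *   double availability = numSamples > 0 ? (1.0 * numTimesPresent) / numSamples : 0;
   *   // availability == 0.1
   */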
private static Map<String, Map<String, Integer>> createResourceFieldTallies(Map<String, List<PayloadSample>> resourcePayloadSamplesMap) {
AtomicReference<Map<String, Map<String, Integer>>> resourceTallies = new AtomicReference<>(new LinkedHashMap<>());
AtomicInteger numSamples = new AtomicInteger(0);
resourcePayloadSamplesMap.keySet().forEach(resourceName -> {
LOG.info("Processing resource: " + resourceName);
numSamples.set(resourcePayloadSamplesMap.get(resourceName) != null
? resourcePayloadSamplesMap.get(resourceName).stream().reduce(0, (a, f) -> a + f.getSamples().size(), Integer::sum) : 0);
LOG.info("Sample size: " + numSamples.get());
//for each resource, go through the keys and tally the data presence counts for each field
//as well as the number of samples in each case
resourceTallies.get().putIfAbsent(resourceName, new LinkedHashMap<>());
if (numSamples.get() > 0) {
resourcePayloadSamplesMap.get(resourceName).forEach(payloadSample -> {
payloadSample.getSamples().forEach(sample -> {
sample.forEach((fieldName, encodedValue) -> {
if (encodedValue != null) {
resourceTallies.get().get(resourceName).putIfAbsent(fieldName, 0);
resourceTallies.get().get(resourceName).put(fieldName, resourceTallies.get().get(resourceName).get(fieldName) + 1);
}
});
});
});
}
});
return resourceTallies.get();
}
@Override
public JsonElement serialize(PayloadSampleReport src, Type typeOfSrc, JsonSerializationContext context) {
final String
DESCRIPTION_KEY = "description", DESCRIPTION = "RESO Data Availability Report",
VERSION_KEY = "version", VERSION = "1.7",
GENERATED_ON_KEY = "generatedOn",
RESOURCE_INFO_KEY = "resourceInfo",
FIELDS_KEY = "fields";
JsonArray fields = new JsonArray();
src.metadata.getSchemas().forEach(edmSchema -> {
//serialize entities (resources) and members (fields)
edmSchema.getEntityTypes().forEach(edmEntityType -> {
edmEntityType.getPropertyNames().forEach(propertyName -> {
FieldAvailabilityJson fieldJson = new FieldAvailabilityJson(edmEntityType.getName(), edmEntityType.getProperty(propertyName));
fields.add(fieldJson.serialize(fieldJson, FieldAvailabilityJson.class, null));
});
});
});
JsonObject availabilityReport = new JsonObject();
availabilityReport.addProperty(DESCRIPTION_KEY, DESCRIPTION);
availabilityReport.addProperty(VERSION_KEY, VERSION);
availabilityReport.addProperty(GENERATED_ON_KEY, Utils.getIsoTimestamp());
final JsonArray resourceTotalsByResource = new JsonArray();
src.resourcePayloadSamplesMap.keySet().forEach(resourceName -> {
Set<String> postalCodes = new LinkedHashSet<>();
ResourceInfo resourceInfo = new ResourceInfo(resourceName);
int resourceRecordCount = 0;
if (src.resourceCounts.get(resourceName) != null) {
resourceRecordCount = src.resourceCounts.get(resourceName);
}
resourceInfo.numRecordsTotal.set(resourceRecordCount);
PayloadSample zerothSample = resourcePayloadSamplesMap.get(resourceName) != null
&& resourcePayloadSamplesMap.get(resourceName).size() > 0
? resourcePayloadSamplesMap.get(resourceName).get(0) : null;
if (zerothSample != null) {
resourceInfo.keyFields.set(zerothSample.keyFields);
resourceInfo.dateField.set(zerothSample.dateField);
}
if (src.resourcePayloadSamplesMap.get(resourceName) != null) {
AtomicReference<OffsetDateTime> offsetDateTime = new AtomicReference<>();
src.resourcePayloadSamplesMap.get(resourceName).forEach(payloadSample -> {
resourceInfo.totalBytesReceived.getAndAdd(payloadSample.getResponseSizeBytes());
resourceInfo.totalResponseTimeMillis.getAndAdd(payloadSample.getResponseTimeMillis());
resourceInfo.numSamplesProcessed.getAndIncrement();
resourceInfo.numRecordsFetched.getAndAdd(payloadSample.encodedSamples.size());
payloadSample.encodedSamples.forEach(encodedSample -> {
offsetDateTime.set(OffsetDateTime.parse(encodedSample.get(payloadSample.dateField)));
if (offsetDateTime.get() != null) {
if (resourceInfo.dateLow.get() == null) {
resourceInfo.dateLow.set(offsetDateTime.get());
} else if (offsetDateTime.get().isBefore(resourceInfo.dateLow.get())) {
resourceInfo.dateLow.set(offsetDateTime.get());
}
if (resourceInfo.dateHigh.get() == null) {
resourceInfo.dateHigh.set(offsetDateTime.get());
} else if (offsetDateTime.get().isAfter(resourceInfo.dateHigh.get())) {
resourceInfo.dateHigh.set(offsetDateTime.get());
}
}
if (encodedSample.containsKey(POSTAL_CODE_KEY)) {
postalCodes.add(encodedSample.get(POSTAL_CODE_KEY));
}
});
//capture the page size from the first processed sample
if (resourceInfo.pageSize.get() == 0) resourceInfo.pageSize.set(payloadSample.getSamples().size());
});
}
if (!postalCodes.isEmpty()) {
resourceInfo.postalCodes.set(postalCodes);
}
resourceTotalsByResource.add(resourceInfo.serialize(resourceInfo, ResourceInfo.class, null));
});
availabilityReport.add(RESOURCE_INFO_KEY, resourceTotalsByResource);
availabilityReport.add(FIELDS_KEY, fields);
return availabilityReport;
}
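/*
 * Sketch of the JSON document this serializer produces (values are
 * illustrative only):
 * {
 *   "description": "RESO Data Availability Report",
 *   "version": "1.7",
 *   "generatedOn": "...",
 *   "resourceInfo": [ { "resourceName": "Property", ... } ],
 *   "fields": [ ... ]
 * }
 */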
static final class ResourceInfo implements JsonSerializer<ResourceInfo> {
final String
RESOURCE_NAME_KEY = "resourceName",
RECORD_COUNT_KEY = "recordCount",
TOTAL_NUM_RECORDS_FETCHED = "numRecordsFetched",
TOTAL_NUM_SAMPLES_KEY = "numSamples",
PAGE_SIZE_KEY = "pageSize",
AVERAGE_RESPONSE_TIME_MILLIS_KEY = "averageResponseTimeMillis",
AVERAGE_RESPONSE_BYTES_KEY = "averageResponseBytes",
KEY_FIELDS_KEY = "keyFields",
DATE_FIELD_KEY = "dateField",
DATE_LOW_KEY = "dateLow",
DATE_HIGH_KEY = "dateHigh",
POSTAL_CODES_KEY = "postalCodes";
final AtomicInteger numSamplesProcessed = new AtomicInteger(0);
final AtomicInteger numRecordsTotal = new AtomicInteger(0);
final AtomicInteger numRecordsFetched = new AtomicInteger(0);
final AtomicReference<String> resourceName = new AtomicReference<>();
final AtomicLong totalResponseTimeMillis = new AtomicLong(0);
final AtomicLong totalBytesReceived = new AtomicLong(0);
final AtomicInteger pageSize = new AtomicInteger(0);
final AtomicReference<List<String>> keyFields = new AtomicReference<>(new LinkedList<>());
final AtomicReference<String> dateField = new AtomicReference<>();
final AtomicReference<OffsetDateTime> dateLow = new AtomicReference<>(null);
final AtomicReference<OffsetDateTime> dateHigh = new AtomicReference<>(null);
final AtomicReference<Set<String>> postalCodes = new AtomicReference<>(new LinkedHashSet<>());
public ResourceInfo(String resourceName) {
this.resourceName.set(resourceName);
}
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
* specified type.
*
* <p>In the implementation of this call-back method, you should consider invoking
* {@link JsonSerializationContext#serialize(Object, Type)} method to create JsonElements for any
* non-trivial field of the {@code src} object. However, you should never invoke it on the
* {@code src} object itself since that will cause an infinite loop (Gson will call your
* call-back method again).</p>
*
* @param src the object that needs to be converted to Json.
* @param typeOfSrc the actual type (fully genericized version) of the source object.
* @param context the serialization context, which may be used to serialize nested objects.
* @return a JsonElement corresponding to the specified object.
*/
@Override
public JsonElement serialize(ResourceInfo src, Type typeOfSrc, JsonSerializationContext context) {
JsonObject totals = new JsonObject();
totals.addProperty(RESOURCE_NAME_KEY, src.resourceName.get());
totals.addProperty(RECORD_COUNT_KEY, src.numRecordsTotal.get());
totals.addProperty(TOTAL_NUM_RECORDS_FETCHED, src.numRecordsFetched.get());
totals.addProperty(TOTAL_NUM_SAMPLES_KEY, src.numSamplesProcessed.get());
totals.addProperty(PAGE_SIZE_KEY, src.pageSize.get());
//note: integer division, so averages are truncated toward zero
totals.addProperty(AVERAGE_RESPONSE_BYTES_KEY, src.numSamplesProcessed.get() > 0
? src.totalBytesReceived.get() / src.numSamplesProcessed.get() : 0);
totals.addProperty(AVERAGE_RESPONSE_TIME_MILLIS_KEY, src.numSamplesProcessed.get() > 0
? src.totalResponseTimeMillis.get() / src.numSamplesProcessed.get() : 0);
totals.addProperty(DATE_FIELD_KEY, src.dateField.get());
totals.addProperty(DATE_LOW_KEY, src.dateLow.get() != null
? src.dateLow.get().format(DateTimeFormatter.ISO_INSTANT) : null);
totals.addProperty(DATE_HIGH_KEY, src.dateHigh.get() != null
? src.dateHigh.get().format(DateTimeFormatter.ISO_INSTANT) : null);
JsonArray keyFields = new JsonArray();
src.keyFields.get().forEach(keyFields::add);
totals.add(KEY_FIELDS_KEY, keyFields);
if (!src.postalCodes.get().isEmpty()) {
JsonArray postalCodes = new JsonArray();
src.postalCodes.get().forEach(postalCodes::add);
totals.add(POSTAL_CODES_KEY, postalCodes);
}
return totals;
}
}
}
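
Since PayloadSampleReport implements Gson's JsonSerializer interface, the report instance can be registered as its own type adapter. A minimal sketch of how that wiring might look (the `report` variable and its construction are assumed for illustration; they are not part of this diff):

```
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

// 'report' is a hypothetical, already-populated PayloadSampleReport instance
Gson gson = new GsonBuilder()
    .setPrettyPrinting()
    .registerTypeAdapter(PayloadSampleReport.class, report)
    .create();
String json = gson.toJson(report); // dispatches to serialize(...) above
```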

File diff suppressed because it is too large.


@@ -1,7 +1,7 @@
 {
   "description": "RESO Data Dictionary Metadata Report",
   "version": "1.7",
-  "generatedOn": "2021-04-13T23:23:58.588Z",
+  "generatedOn": "2021-04-30T12:16:46.822Z",
   "fields": [
     {
       "resourceName": "Property",


@@ -0,0 +1 @@
+cucumber.publish.quiet=true


@@ -77,7 +77,6 @@
     }
   },
   "required": [
-    "annotations",
     "fieldName",
     "isCollection",
     "nullable",
@@ -89,8 +88,11 @@
   },
   "Annotation": {
     "type": "object",
-    "additionalProperties": false,
+    "additionalProperties": true,
     "properties": {
+      "term": {
+        "type": "string"
+      },
       "value": {
         "type": "string",
         "qt-uri-protocols": [


@@ -14,12 +14,12 @@
     </File>
   </Appenders>
   <Loggers>
-    <Logger name="org.apache.olingo.client.core" level="all" additivity="false">
+    <Logger name="org.apache.olingo.client.core" level="error" additivity="false">
       <AppenderRef ref="Log"/>
     </Logger>
     <Root level="all">
       <AppenderRef ref="Console" level="info"/>
-      <AppenderRef ref="Log"/>
+      <AppenderRef ref="Log" level="error"/>
     </Root>
   </Loggers>
 </Configuration>