Payloads Sampling Tool (#75)

* Sampling with SHA-256 scoring
* Optional multi-threaded fetching and use of timezone offsets
* Data availability reports (data-availability-report.json in the build directory)
* Disabled logging, changed payload sampling logic, encoded files for payload fields only, and added standard cookie spec headers to fix warnings
* Cleaned up logging and build.gradle
* Added line to quiet velocity logs
Joshua Darnell 2021-07-13 16:25:03 -07:00 committed by GitHub
parent d6a2ce3649
commit 8c68e140c9
27 changed files with 1660 additions and 103 deletions

View File

@@ -76,30 +76,33 @@ usage: java -jar web-api-commander
                                  standard out.
 --generateReferenceDDL           Generates reference DDL to create a
                                  RESO-compliant SQL database. Pass
                                  --useKeyNumeric to generate the DB
                                  using numeric keys.
 --generateReferenceEDMX          Generates reference metadata in EDMX
                                  format.
 --generateResourceInfoModels     Generates Java Models for the Web API
                                  Reference Server in the current
                                  directory.
 --getMetadata                    Fetches metadata from <serviceRoot>
                                  using <bearerToken> and saves results
                                  in <outputFile>.
 --help                           print help
 --inputFile <i>                  Path to input file.
 --outputFile <o>                 Path to output file.
 --runRESOScript                  Runs commands in RESOScript file given
                                  as <inputFile>.
 --saveGetRequest                 Performs GET from <requestURI> using
                                  the given <bearerToken> and saves
                                  output to <outputFile>.
 --serviceRoot <s>                Service root URL on the host.
 --uri <u>                        URI for raw request. Use 'single
                                  quotes' to enclose.
 --useEdmEnabledClient            present if an EdmEnabledClient should
                                  be used.
 --useKeyNumeric                  present if numeric keys are to be used
                                  for database DDL generation.
 --validateMetadata               Validates previously-fetched metadata
                                  in the <inputFile> path.
```
When using commands, if required arguments aren't provided, relevant feedback will be displayed in the terminal.
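For example, a minimal metadata fetch needs only a service root, a token, and an output path (a `--bearerToken` option is assumed here, based on the `<bearerToken>` references in the help text above):
```
$ java -jar path/to/web-api-commander.jar --getMetadata --serviceRoot https://api.example.com/odata --bearerToken <your-token> --outputFile metadata.xml
```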
@@ -227,6 +230,17 @@ New Cucumber BDD acceptance tests will be generated and placed in a timestamped
To update the current tests, copy the newly generated ones into the [Data Dictionary BDD `.features` directory](src/main/java/org/reso/certification/features/data-dictionary/v1-7-0), run the `./gradlew build` task, and if everything works as expected, commit the newly generated tests.
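A minimal sketch of that update flow, with the timestamped output directory left as a placeholder:
```
$ cp <timestamped-output-directory>/*.feature src/main/java/org/reso/certification/features/data-dictionary/v1-7-0/
$ ./gradlew build
```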
## Generating RESO Web API Reference Server Data Models
The RESO Commander can be used to generate data models for the Web API Reference server from the currently approved [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).
The Commander project's copy of the sheet needs to be updated with a copy of the [DD Google Sheet](https://docs.google.com/spreadsheets/d/1SZ0b6T4_lz6ti6qB2Je7NSz_9iNOaV_v9dbfhPwWgXA/edit?usp=sharing) prior to generating reference metadata.
```
$ java -jar path/to/web-api-commander.jar --generateResourceInfoModels
```
New ResourceInfo Models for the Web API Reference Server will be generated and placed in a timestamped directory relative to your current path.
## Generating RESO Data Dictionary Reference Metadata
In addition to generating DD acceptance tests, the RESO Commander can generate reference metadata based on the current reference [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).

View File

@@ -175,6 +175,35 @@ task testDataDictionary_1_7() {
  }
}
task testIdxPayload_1_7() {
  group = 'RESO Certification'
  description = 'Runs IDX Payload 1.7 Automated Acceptance Tests.' +
      '\n  Example: ' +
      '\n    $ ./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript -DshowResponses=true\n'
  dependsOn jar
  doLast {
    javaexec {
      main = "io.cucumber.core.cli.Main"
      classpath = configurations.cucumberRuntime + sourceSets.main.output + sourceSets.test.output
      systemProperties = System.getProperties()
      args = [
          '--strict',
          '--plugin',
          'pretty',
          '--plugin',
          'json:build/idx-payload.dd-1.7.json',
          '--plugin',
          'html:build/idx-payload.dd-1.7.html',
          '--glue',
          'org.reso.certification.stepdefs#IDXPayload',
          'src/main/java/org/reso/certification/features/payloads/idx-payload.feature'
      ]
    }
  }
}
task generateCertificationReport_DD_1_7() {
  group = 'RESO Certification'
  description = 'Runs Data Dictionary 1.7 tests and creates a certification report' +
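The `testIdxPayload_1_7` task above follows the shape of the existing certification tasks; per its `--plugin` arguments, Cucumber writes reports to `build/idx-payload.dd-1.7.json` and `build/idx-payload.dd-1.7.html`. Typical invocation, taken from the task's own description:
```
$ ./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript -DshowResponses=true
```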

View File

@@ -6,6 +6,8 @@ import org.apache.http.Header;
import org.apache.http.HttpStatus;
import org.apache.http.NameValuePair;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
@@ -64,6 +66,7 @@ public class OAuth2HttpClientFactory extends AbstractHttpClientFactory {
    try {
      LOG.debug("Fetching access token...");
      final HttpPost post = new HttpPost(tokenUri);
      post.setConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build());
      params.add(new BasicNameValuePair("grant_type", "client_credentials"));
      params.add(new BasicNameValuePair("client_id", clientId));
@ -118,6 +121,7 @@ public class OAuth2HttpClientFactory extends AbstractHttpClientFactory {
.setUserAgent(USER_AGENT) .setUserAgent(USER_AGENT)
.setDefaultHeaders(headers) .setDefaultHeaders(headers)
.setConnectionManager(connectionManager) .setConnectionManager(connectionManager)
.setDefaultRequestConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build())
.build(); .build();
} }
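Both client factories now opt into the RFC 6265 `STANDARD` cookie spec; HttpClient's default spec typically logs "Invalid cookie header" warnings when servers send legacy `Expires` attribute formats, which is the warning noise the commit message refers to. A possible follow-up (not part of this commit) would be sharing one immutable config between the token request and the client builders:
```java
// Hypothetical shared constant; this commit builds the config inline at each call site.
private static final RequestConfig STANDARD_COOKIE_CONFIG =
    RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build();
```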

View File

@@ -2,6 +2,8 @@ package org.reso.auth;
import org.apache.http.Header;
import org.apache.http.client.HttpClient;
import org.apache.http.client.config.CookieSpecs;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.conn.HttpClientConnectionManager;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
@@ -44,11 +46,11 @@ public class TokenHttpClientFactory extends AbstractHttpClientFactory {
    return HttpClientBuilder.create()
        .setUserAgent(USER_AGENT)
        .setDefaultHeaders(headers)
        .setDefaultRequestConfig(RequestConfig.custom().setCookieSpec(CookieSpecs.STANDARD).build())
        .setConnectionManager(connectionManager)
        .build();
  }

  @Override
  public void close(final HttpClient httpClient) {
    try {

View File

@@ -122,7 +122,7 @@ public class BDDProcessor extends WorksheetProcessor {
    return tags;
  }

  public static String padLeft(String s, int n) {
    String[] padding = new String[n];
    Arrays.fill(padding, " ");
    return String.join("", padding) + s;

View File

@@ -0,0 +1,66 @@
package org.reso.certification.codegen;

import org.reso.models.ReferenceStandardField;

import java.util.LinkedHashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
public class DDCacheProcessor extends WorksheetProcessor {
  Map<String, List<ReferenceStandardField>> standardFieldCache = new LinkedHashMap<>();

  private void addToFieldCache(ReferenceStandardField field) {
    standardFieldCache.putIfAbsent(field.getParentResourceName(), new LinkedList<>());
    standardFieldCache.get(field.getParentResourceName()).add(field);
  }

  public Map<String, List<ReferenceStandardField>> getStandardFieldCache() {
    return standardFieldCache;
  }

  @Override
  void processNumber(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processStringListSingle(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processString(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processBoolean(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processStringListMulti(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processDate(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processTimestamp(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void processCollection(ReferenceStandardField field) {
    addToFieldCache(field);
  }

  @Override
  void generateOutput() {
    //no output
  }
}
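`DDCacheProcessor` is a deliberately output-free `WorksheetProcessor`: it only collects each standard field under its parent resource. The IDXPayload step definitions later in this commit drive it through `DataDictionaryCodeGenerator` to build their payload-field cache, roughly as follows:
```java
// Build a resource -> standard fields map from the reference worksheet
DDCacheProcessor cacheProcessor = new DDCacheProcessor();
DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(cacheProcessor);
generator.processWorksheets();
Map<String, List<ReferenceStandardField>> fieldCache = cacheProcessor.getStandardFieldCache();
```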

View File

@@ -149,7 +149,7 @@ public class DDLProcessor extends WorksheetProcessor {
        .append("\n\n")
        .append("CREATE TABLE IF NOT EXISTS ")
        //exception for ouid so it doesn't become o_u_i_d
        .append(buildDbTableName(resourceName))
        .append(" ( ")
        .append(templateContent).append(",\n")
        .append(PADDING).append(PADDING).append(buildPrimaryKeyMarkup(resourceName)).append("\n")
@@ -164,6 +164,12 @@ public class DDLProcessor extends WorksheetProcessor {
    LOG.info(this::buildInsertLookupsStatement);
  }

  public static String buildDbTableName(String resourceName) {
    return CaseFormat.UPPER_CAMEL.to(CaseFormat.LOWER_UNDERSCORE, resourceName).replace("o_u_i_d", "ouid");
  }

  private static String buildCreateLookupStatement(boolean useKeyNumeric) {
    return
        "\n\n/**\n" +

View File

@@ -14,13 +14,15 @@ public class DataDictionaryCodeGenerator {
  WorksheetProcessor processor = null;
  Workbook workbook = null;

  private DataDictionaryCodeGenerator() {
    //private constructor, should not instantiate directly
  }

  /**
   * Instantiates a new DataDictionary generator with the given worksheet processor
   * @param processor the worksheet processor to use to generate the data dictionary
   */
  public DataDictionaryCodeGenerator(WorksheetProcessor processor) {
    this.processor = processor;
    processor.setReferenceResource(REFERENCE_WORKSHEET);
    workbook = processor.getReferenceWorkbook();

View File

@@ -0,0 +1,249 @@
package org.reso.certification.codegen;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.poi.ss.usermodel.Sheet;
import org.reso.commander.common.Utils;
import org.reso.models.ReferenceStandardField;
import static org.reso.certification.codegen.DDLProcessor.buildDbTableName;
import static org.reso.certification.containers.WebAPITestContainer.EMPTY_STRING;
public class ResourceInfoProcessor extends WorksheetProcessor {
final static String
ANNOTATION_TERM_DISPLAY_NAME = "RESO.OData.Metadata.StandardName",
ANNOTATION_TERM_DESCRIPTION = "Core.Description",
ANNOTATION_TERM_URL = "RESO.DDWikiUrl";
private static final Logger LOG = LogManager.getLogger(ResourceInfoProcessor.class);
private static final String
FILE_EXTENSION = ".java";
public void processResourceSheet(Sheet sheet) {
super.processResourceSheet(sheet);
markup.append(ResourceInfoTemplates.buildClassInfo(sheet.getSheetName(), null));
}
@Override
void processNumber(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildNumberMarkup(row));
}
@Override
void processStringListSingle(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringListSingleMarkup(row));
}
@Override
void processString(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringMarkup(row));
}
@Override
void processBoolean(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildBooleanMarkup(row));
}
@Override
void processStringListMulti(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildStringListMultiMarkup(row));
}
@Override
void processDate(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildDateMarkup(row));
}
@Override
void processTimestamp(ReferenceStandardField row) {
markup.append(ResourceInfoTemplates.buildTimestampMarkup(row));
}
@Override
void processCollection(ReferenceStandardField row) {
LOG.debug("Collection Type is not supported!");
}
@Override
void generateOutput() {
LOG.info("Using reference worksheet: " + REFERENCE_WORKSHEET);
LOG.info("Generating ResourceInfo .java files for the following resources: " + resourceTemplates.keySet().toString());
resourceTemplates.forEach((resourceName, content) -> {
//put in local directory rather than relative to where the input file is
Utils.createFile(getDirectoryName(), resourceName + "Definition" + FILE_EXTENSION, content);
});
}
@Override
String getDirectoryName() {
return startTimestamp + "-ResourceInfoModels";
}
@Override
public void afterResourceSheetProcessed(Sheet sheet) {
assert sheet != null && sheet.getSheetName() != null;
String resourceName = sheet.getSheetName();
String templateContent =
markup.toString() + "\n" +
" return " + resourceName + "Definition.fieldList;\n" +
" }\n" +
"}";
resourceTemplates.put(resourceName, templateContent);
resetMarkupBuffer();
}
public static final class ResourceInfoTemplates {
/**
* Contains various templates used for test generation
* TODO: add a formatter rather than using inline spaces
*/
public static String buildClassInfo(String resourceName, String generatedTimestamp) {
if (resourceName == null) return null;
if (generatedTimestamp == null) generatedTimestamp = Utils.getIsoTimestamp();
final String definitionName = resourceName + "Definition";
return "package org.reso.service.data.definition;\n" + "\n" +
"import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;\n" +
"import org.reso.service.data.meta.FieldInfo;\n" +
"import org.reso.service.data.meta.ResourceInfo;\n" + "\n" +
"import java.util.ArrayList;\n" + "\n" +
"// This class was autogenerated on: " + generatedTimestamp + "\n" +
"public class " + definitionName + " extends ResourceInfo {\n" +
" private static ArrayList<FieldInfo> fieldList = null;\n" + "\n" +
" public " + definitionName + "() {" + "\n" +
" this.tableName = " + buildDbTableName(resourceName) + ";\n" +
" this.resourcesName = " + resourceName + ";\n" +
" this.resourceName = " + resourceName + ";\n" +
" }\n" + "\n" +
" public ArrayList<FieldInfo> getFieldList() {\n" +
" return " + definitionName + ".getStaticFieldList();\n" +
" }\n" + "\n" +
" public static ArrayList<FieldInfo> getStaticFieldList() {\n" +
" if (null != " + definitionName + ".fieldList) {\n" +
" return " + definitionName + ".fieldList;\n" +
" }\n" + "\n" +
" ArrayList<FieldInfo> list = new ArrayList<FieldInfo>();\n" +
" " + definitionName + ".fieldList = list;\n" +
" FieldInfo fieldInfo = null;\n";
}
public static String buildBooleanMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: refactor into one method that takes a type name and returns the appropriate content
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Boolean.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
public static String buildDateMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Date.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
/**
* Provides special routing for Data Dictionary numeric types, which may be Integer or Decimal
*
* @param field the numeric field to build type markup for
* @return a string containing specific markup for the given field
*/
public static String buildNumberMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
if (field.getSuggestedMaxPrecision() != null) return buildDecimalMarkup(field);
else return buildIntegerMarkup(field);
}
public static String buildDecimalMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Decimal.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
//TODO: Length is actually scale for Decimal fields by the DD! :/
//TODO: Add setScale property to Decimal types in FieldInfo
//TODO: Precision is actually Scale for Decimal fields by the DD! :/
//TODO: Add setPrecision property to Decimal types in FieldInfo
}
public static String buildIntegerMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.Int64.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
private static String buildStandardEnumerationMarkup(String lookupName) {
//TODO: add code to build Lookups
return "\n /* TODO: buildStandardEnumerationMarkup */\n";
}
public static String buildStringListMultiMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: add multi lookup handler
return "\n /* TODO: buildStringListMultiMarkup */\n";
}
public static String buildStringListSingleMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
//TODO: add single lookup handler
return "\n /* TODO: buildStringListSingleMarkup */\n";
}
public static String buildStringMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
String content = "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.String.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n";
if (field.getSuggestedMaxLength() != null) {
content +=
" fieldInfo.setMaxLength(" + field.getSuggestedMaxLength() + ");\n";
}
content +=
" list.add(fieldInfo);" + "\n";
return content;
}
public static String buildTimestampMarkup(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;
return "\n" +
" fieldInfo = new FieldInfo(\"" + field.getStandardName() + "\", EdmPrimitiveTypeKind.DateTime.getFullQualifiedName());\n" +
" fieldInfo.addAnnotation(\"" + field.getDisplayName() + "\", \"" + ANNOTATION_TERM_DISPLAY_NAME + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getDefinition() + "\", \"" + ANNOTATION_TERM_DESCRIPTION + "\");\n" +
" fieldInfo.addAnnotation(\"" + field.getWikiPageUrl() + "\", \"" + ANNOTATION_TERM_URL + "\");\n" +
" list.add(fieldInfo);" +
"\n";
}
}
}
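For a sense of what these templates emit, here is an abridged, hypothetical `PropertyDefinition` (the `Property` resource and `ListingKey` field are illustrative; the annotation lines are omitted):
```java
package org.reso.service.data.definition;

import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;
import org.reso.service.data.meta.FieldInfo;
import org.reso.service.data.meta.ResourceInfo;

import java.util.ArrayList;

// This class was autogenerated on: <timestamp>
public class PropertyDefinition extends ResourceInfo {
  private static ArrayList<FieldInfo> fieldList = null;

  public PropertyDefinition() {
    this.tableName = "property";
    this.resourcesName = "Property";
    this.resourceName = "Property";
  }

  public ArrayList<FieldInfo> getFieldList() {
    return PropertyDefinition.getStaticFieldList();
  }

  public static ArrayList<FieldInfo> getStaticFieldList() {
    if (null != PropertyDefinition.fieldList) {
      return PropertyDefinition.fieldList;
    }

    ArrayList<FieldInfo> list = new ArrayList<FieldInfo>();
    PropertyDefinition.fieldList = list;
    FieldInfo fieldInfo = null;

    fieldInfo = new FieldInfo("ListingKey", EdmPrimitiveTypeKind.String.getFullQualifiedName());
    fieldInfo.setMaxLength(255);
    list.add(fieldInfo);

    return PropertyDefinition.fieldList;
  }
}
```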

View File

@@ -81,6 +81,8 @@ public final class WebAPITestContainer implements TestContainer {
  private final AtomicBoolean isUsingMetadataFile = new AtomicBoolean(false);

  // request instance variables - these get reset with every request
  //TODO: refactor underlying response properties to use an ODataTransportWrapper (or any TransportWrapper)
  // and create the test container with the appropriate response of the transport wrapper
  private final AtomicReference<String> selectList = new AtomicReference<>();
  private final AtomicReference<ODataRawResponse> oDataRawResponse = new AtomicReference<>();
  private final AtomicReference<Request> request = new AtomicReference<>();

View File

@@ -0,0 +1,53 @@
Feature: IDX Payload Endorsement (Web API)
  All Scenarios passing means the given Web API server passes the IDX Payloads Endorsement
  # SEE: https://docs.google.com/document/d/1btCduOpWWzeadeMcSviA8M9dclIz23P-bPUGKwcD0NY/edit?usp=sharing

  Background:
    Given a RESOScript file was provided
    And Client Settings and Parameters were read from the file
    And a test container was successfully created from the given RESOScript
    And the test container uses an authorization_code or client_credentials for authentication

  # TODO: tie back into common metadata validation shared scenario
  @metadata-validation @idx-payload-endorsement @dd-1.7 @web-api-1.0.2
  Scenario: Request and Validate Server Metadata
    When XML Metadata are requested from the service root in "ClientSettings_WebAPIURI"
    Then the server responds with a status code of 200
    And the server has an OData-Version header value of "4.0" or "4.01"
    And the XML Metadata response is valid XML
    And the XML Metadata returned by the server are valid
    And the XML Metadata returned by the server contains Edm metadata
    And the Edm metadata returned by the server are valid
    And the metadata contains a valid service document
    And each resource MUST have a primary key field by the OData specification

  @standard-resource-sampling @dd-1.7 @idx-payload-endorsement
  Scenario: Standard Resource Sampling
    Given that valid metadata have been requested from the server
    And the metadata contains RESO Standard Resources
    And "payload-samples" has been created in the build directory
    Then up to 10000 records are sampled from each resource with "IDX" payload samples stored in "payload-samples"

  # data are not stored in this case, just sampled and scored
  @local-resource-sampling @dd-1.7 @idx-payload-endorsement
  Scenario: Non-Standard Resource Sampling - Request Data from Each Server Resource
    Given that valid metadata have been requested from the server
    And the metadata contains local resources
    Then up to 10000 records are sampled from each local resource

  @idx-payload-endorsement @dd-1.7
  Scenario: A Data Availability Report is Created from Sampled Records
    Given standard and local resources have been processed
    Then a data availability report is created in "data-availability-report.json"

  @idx-user-sampling @dd-1.7 @idx-payload-endorsement
  Scenario: IDX User Sampling
    Given samples exist in "payload-samples" in the build directory
    And a RESOScript file was provided for the IDX User
    And Client Settings and Parameters were read from the file
    And a test container was successfully created from the given RESOScript
    And the test container uses an authorization_code or client_credentials for authentication
    When samples from "payload-samples" are fetched as the representative user for each resource in the "IDX" payload
    Then each result MUST contain the string version of the key and the following fields
      |ModificationTimestamp|
    And the "IDX" payload field values MUST match those in the samples

View File

@@ -16,35 +16,39 @@ Feature: Web API Server Add/Edit Endorsement
  # OData-Version: 4.01
  # Content-Type: application/json;odata.metadata=minimal
  # Accept: application/json
  #
  # This is without the prefer header and minimal value
  #
  @create @create-succeeds @add-edit-endorsement @rcp-010 @1.0.2
  Scenario: Create operation succeeds using a given payload
    Given valid metadata have been retrieved
    And request data has been provided in "create-succeeds.json"
    And request data in "create-succeeds.json" is valid JSON
    And schema in "create-succeeds.json" matches the metadata
    And the request header "OData-Version" "equals" one of the following values
      |4.0|4.01|
    And the request header "Content-Type" "contains" "application/json"
    And the request header "Accept" "contains" "application/json"
    When a "POST" request is made to the "resource-endpoint" URL with data in "create-succeeds.json"
    Then the server responds with one of the following status codes
      |201|
    And the response header "OData-Version" "equals" one of the following values
      |4.0|4.01|
    And the response header "EntityId" "MUST" "be present"
    And the response header "Location" "MUST" "be present"
    And the response header "Location" "is a valid URL"
    And the response header "Location" "MUST" reference the resource being created
    And the response is valid JSON
    And the JSON response "MUST" contain "@odata.context"
    And the JSON response value "@odata.context" "is a valid URL"
    And the JSON response "MUST" contain "@odata.id"
    And the JSON response value "@odata.id" "is a valid URL"
    And the JSON response "MAY" contain "@odata.editLink"
    And the JSON response value "@odata.editLink" "is a valid URL"
    And the JSON response "MAY" contain "@odata.etag"
    And the JSON response value "@odata.etag" "starts with" "W/"
    And the JSON response "MUST" contain all JSON data in "create-succeeds.json"
    When a "GET" request is made to the URL in response header "Location"
    Then the server responds with a status code of 200
    And the response has header "OData-Version" with one of the following values
      |4.0|4.01|
@@ -58,7 +62,7 @@ Feature: Web API Server Add/Edit Endorsement
  # SEE: https://reso.atlassian.net/wiki/spaces/RESOWebAPIRCP/pages/2239399511/RCP+-+WEBAPI-010+Add+Functionality+to+Web+API+Specification#Error-Message-Example
  # POST serviceRoot/Property
  # OData-Version: 4.01
  # Content-Type: application/json
  # Accept: application/json
  @create @create-fails @add-edit-endorsement @rcp-010 @1.0.2
  Scenario: Create operation fails using a given payload
@@ -66,12 +70,16 @@ Feature: Web API Server Add/Edit Endorsement
    And request data has been provided in "create-fails.json"
    And request data in "create-fails.json" is valid JSON
    And schema in "create-fails.json" matches the metadata
    And the request header "OData-Version" "equals" one of the following values
      |4.0|4.01|
    And the request header "Content-Type" "MUST" "be present"
    And the request header "Content-Type" "equals" "application/json"
    And the request header "Accept" "MUST" "be present"
    And the request header "Accept" "contains" "application/json"
    When a "POST" request is made to the "resource-endpoint" URL with data in "create-fails.json"
    Then the server responds with one of the following error codes
      |400|
    And the response has header "OData-Version" with one of the following values
      |4.0|4.01|
    And the error response is in a valid format
    And the values in the "target" field in the JSON payload "error.details" path are contained within the metadata

View File

@@ -114,7 +114,6 @@ public class CertificationReportGenerator {
    } catch (IOException e) {
      e.printStackTrace();
    }
    return null;
  }
}

View File

@@ -70,8 +70,8 @@ public class DataDictionary {
  //named args
  private static final String SHOW_RESPONSES_ARG = "showResponses";
  private static final String USE_STRICT_MODE_ARG = "strict";
  protected static final String PATH_TO_METADATA_ARG = "pathToMetadata";
  protected static final String PATH_TO_RESOSCRIPT_ARG = "pathToRESOScript";
  private static final String LOOKUP_VALUE = "lookupValue";

  //extract any params here
@@ -150,9 +150,8 @@ public class DataDictionary {
  @When("a metadata file is provided")
  public void aMetadataFileIsProvided() {
    if (isUsingMetadata) {
      boolean result = pathToMetadata != null && Files.exists(Paths.get(pathToMetadata));
      if (!result) {
        failAndExitWithErrorMessage("Path to given metadata file does not exist: " + PATH_TO_METADATA_ARG + "=" + pathToMetadata, scenario);
      }
@@ -195,8 +194,6 @@ public class DataDictionary {
  @And("valid metadata were retrieved from the server")
  public void validMetadataWereRetrievedFromTheServer() {
    if (isUsingRESOScript && container.getShouldValidateMetadata()) {
      //request metadata from server using service root in RESOScript file
      TestUtils.assertXMLMetadataAreRequestedFromTheServer(container, scenario);

View File

@@ -0,0 +1,592 @@
package org.reso.certification.stepdefs;
import com.google.common.collect.Sets;
import com.google.common.hash.Hashing;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.inject.Inject;
import io.cucumber.java.Before;
import io.cucumber.java.Scenario;
import io.cucumber.java.en.And;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.apache.http.HttpStatus;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.client.api.data.ResWrap;
import org.apache.olingo.commons.api.data.EntityCollection;
import org.apache.olingo.commons.api.edm.EdmEntityType;
import org.apache.olingo.commons.api.edm.EdmKeyPropertyRef;
import org.apache.olingo.commons.api.edm.EdmNamed;
import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;
import org.apache.olingo.commons.api.format.ContentType;
import org.reso.certification.codegen.DDCacheProcessor;
import org.reso.certification.codegen.DataDictionaryCodeGenerator;
import org.reso.certification.containers.WebAPITestContainer;
import org.reso.commander.common.DataDictionaryMetadata;
import org.reso.commander.common.Utils;
import org.reso.models.*;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoUnit;
import java.util.*;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.Collectors;
import static io.restassured.path.json.JsonPath.from;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assume.assumeTrue;
import static org.reso.certification.containers.WebAPITestContainer.EMPTY_STRING;
import static org.reso.commander.Commander.NOT_OK;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
import static org.reso.commander.common.TestUtils.failAndExitWithErrorMessage;
public class IDXPayload {
private static final Logger LOG = LogManager.getLogger(IDXPayload.class);
private static final String MODIFICATION_TIMESTAMP_FIELD = "ModificationTimestamp";
private static final String POSTAL_CODE_FIELD = "PostalCode";
private static final int TOP_COUNT = 100;
private static final int MAX_RETRIES = 3;
private static final String SAMPLES_DIRECTORY_ROOT = "build";
private static final String SAMPLES_DIRECTORY_TEMPLATE = SAMPLES_DIRECTORY_ROOT + File.separator + "%s";
private static final String PATH_TO_RESOSCRIPT_KEY = "pathToRESOScript";
final String REQUEST_URI_TEMPLATE = "?$filter=%s" + " lt %s&$orderby=%s desc&$top=" + TOP_COUNT;
final String COUNT_REQUEST_URI_TEMPLATE = "?$count=true";
//TODO: get this from the parameters
private final static boolean DEBUG = false;
private static Scenario scenario;
private final static AtomicBoolean hasStandardResources = new AtomicBoolean(false);
private final static AtomicBoolean hasLocalResources = new AtomicBoolean(false);
private final static AtomicReference<Set<String>> standardResources = new AtomicReference<>(new LinkedHashSet<>());
private final static AtomicReference<Set<String>> localResources = new AtomicReference<>(new LinkedHashSet<>());
private final static AtomicReference<WebAPITestContainer> container = new AtomicReference<>();
private final static AtomicBoolean hasSamplesDirectoryBeenCleared = new AtomicBoolean(false);
private final static AtomicReference<Map<String, List<PayloadSample>>> resourcePayloadSampleMap =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
private final static AtomicReference<Map<String, List<ReferenceStandardField>>> standardFieldCache =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
private final static AtomicReference<Map<String, Integer>> resourceCounts =
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
@Inject
public IDXPayload(WebAPITestContainer c) {
container.set(c);
}
@Before
public void beforeStep(Scenario scenario) {
final String pathToRESOScript = System.getProperty(PATH_TO_RESOSCRIPT_KEY, null);
if (pathToRESOScript == null) return;
IDXPayload.scenario = scenario;
if (!container.get().getIsInitialized()) {
container.get().setSettings(Settings.loadFromRESOScript(new File(System.getProperty(PATH_TO_RESOSCRIPT_KEY))));
container.get().initialize();
}
}
/**
* Creates a data availability report for the given samples map
*
* @param resourcePayloadSamplesMap the samples map to create the report from
* @param reportName the name of the report
* @param resourceCounts the total record count found for each resource
*/
public void createDataAvailabilityReport(Map<String, List<PayloadSample>> resourcePayloadSamplesMap,
String reportName, Map<String, Integer> resourceCounts) {
PayloadSampleReport payloadSampleReport = new PayloadSampleReport(container.get().getEdm(), resourcePayloadSamplesMap, resourceCounts);
GsonBuilder gsonBuilder = new GsonBuilder().setPrettyPrinting();
gsonBuilder.registerTypeAdapter(PayloadSampleReport.class, payloadSampleReport);
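//note: PayloadSampleReport is registered as its own type adapter here, implying it implements Gson's JsonSerializer (assumption; the class itself isn't shown in this diff)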
Utils.createFile(SAMPLES_DIRECTORY_ROOT, reportName, gsonBuilder.create().toJson(payloadSampleReport));
}
/**
* Hashes the given values
*
* @param values items to hash, will be joined together, then hashed
* @return the SHA hash of the given values
*/
private static String hashValues(String... values) {
return Hashing.sha256().hashString(String.join(EMPTY_STRING, values), StandardCharsets.UTF_8).toString();
}
/**
* Builds a request URI string, taking into account whether the sampling is being done with an optional
* filter, for instance in the shared systems case
* @param resourceName the resource name to query
* @param timestampField the timestamp field for the resource
* @param lastFetchedDate the last fetched date for filtering
* @return a string OData query used for sampling
*/
private String buildODataTimestampRequestUriString(String resourceName, String timestampField, OffsetDateTime lastFetchedDate) {
String requestUri = container.get().getCommander().getClient()
.newURIBuilder(container.get().getServiceRoot())
.appendEntitySetSegment(resourceName).build().toString();
requestUri += String.format(REQUEST_URI_TEMPLATE, timestampField,
lastFetchedDate.format(DateTimeFormatter.ISO_INSTANT), timestampField);
return requestUri;
}
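//Illustrative result (service root and date are hypothetical): for the Property resource this builds
//  https://api.example.com/odata/Property?$filter=ModificationTimestamp lt 2021-07-13T00:00:00Z&$orderby=ModificationTimestamp desc&$top=100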
/**
* Builds a request URI string for counting the number of available items on a resource, taking into account
* whether the sample is being done with an optional filter, for instance in the shared system case
* @param resourceName the resource name to query
* @return a request URI string for getting OData counts
*/
private String buildODataCountRequestUriString(String resourceName) {
String requestUri = container.get().getCommander().getClient()
.newURIBuilder(container.get().getServiceRoot())
.appendEntitySetSegment(resourceName).build().toString();
requestUri += COUNT_REQUEST_URI_TEMPLATE;
return requestUri;
}
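//Illustrative result (hypothetical service root): https://api.example.com/odata/Property?$count=true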
/**
* Queries the server and fetches a resource count for the given resource name
* @param resourceName the resource name to get the count for
* @return the count found for the resource, or null if the request did not return a count
*/
private Integer getResourceCount(String resourceName) {
ODataTransportWrapper transportWrapper;
String requestUri = buildODataCountRequestUriString(resourceName);
Integer count = null;
LOG.info("\n\nMaking count request to the " + resourceName + " resource: " + requestUri);
transportWrapper = container.get().getCommander().executeODataGetRequest(requestUri);
if (transportWrapper.getHttpResponseCode() == null || transportWrapper.getHttpResponseCode() != HttpStatus.SC_OK) {
if (transportWrapper.getHttpResponseCode() == null) {
scenario.log(getDefaultErrorMessage("Count request to", requestUri,
"failed! No response code was provided. Check commander.log for any errors..."));
} else {
scenario.log(getDefaultErrorMessage("Count request to", requestUri,
"failed with response code", transportWrapper.getHttpResponseCode().toString()));
}
} else {
String oDataCountString = from(transportWrapper.getResponseData()).getString("\"@odata.count\"");
count = oDataCountString != null && oDataCountString.trim().length() > 0 ? Integer.parseInt(oDataCountString) : 0;
scenario.log("Total record count for the " + resourceName + " resource: " + oDataCountString);
}
return count;
}
/**
* Fetches and processes records in cases where encoding the results is necessary
*
* @param resourceName the resource name to sample from
* @param targetRecordCount the target record count to fetch (will stop before then if the end is reached)
* @param encodedResultsDirectoryName the directory name for encoded results
* @return a list of PayloadSample items
*/
List<PayloadSample> fetchAndProcessRecords(String resourceName, int targetRecordCount, String encodedResultsDirectoryName) {
final AtomicReference<OffsetDateTime> lastFetchedDate = new AtomicReference<>(OffsetDateTime.now());
final List<String> timestampCandidateFields = new LinkedList<>();
final AtomicReference<EdmEntityType> entityType = new AtomicReference<>();
final AtomicReference<Map<String, String>> encodedSample = new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
final AtomicReference<ODataTransportWrapper> transportWrapper = new AtomicReference<>();
final AtomicReference<ResWrap<EntityCollection>> entityCollectionResWrap = new AtomicReference<>();
final AtomicReference<String> timestampField = new AtomicReference<>();
final AtomicBoolean hasRecords = new AtomicBoolean(true);
final AtomicReference<PayloadSample> payloadSample = new AtomicReference<>();
final AtomicReference<List<PayloadSample>> payloadSamples =
new AtomicReference<>(Collections.synchronizedList(new LinkedList<>()));
boolean hasStandardTimestampField = false;
String requestUri;
int recordsProcessed = 0;
int numRetries = 0;
int lastTimestampCandidateIndex = 0;
container.get().getEdm().getSchemas().forEach(edmSchema ->
edmSchema.getEntityTypes().stream().filter(edmEntityType -> edmEntityType.getName().equals(resourceName))
.findFirst().ifPresent(entityType::set));
//return null if the entity type isn't defined
if (entityType.get() == null) return new ArrayList<>();
if (entityType.get().getProperty(MODIFICATION_TIMESTAMP_FIELD) == null) {
scenario.log("Could not find " + MODIFICATION_TIMESTAMP_FIELD + " in the " + resourceName + " resource!\n");
scenario.log("Searching for suitable timestamp fields...");
entityType.get().getPropertyNames().forEach(propertyName -> {
try {
if (entityType.get().getProperty(propertyName).getType().getFullQualifiedName().getFullQualifiedNameAsString()
.contentEquals(EdmPrimitiveTypeKind.DateTimeOffset.getFullQualifiedName().getFullQualifiedNameAsString())) {
scenario.log("Found Edm.DateTimeOffset field " + propertyName + " in the " + resourceName + " resource!\n");
timestampCandidateFields.add(propertyName);
}
} catch (Exception ex) {
LOG.error(ex);
}
});
} else {
hasStandardTimestampField = true;
}
final List<EdmKeyPropertyRef> keyFields = entityType.get().getKeyPropertyRefs();
scenario.log("Sampling resource: " + resourceName);
scenario.log("Keys found: " + keyFields.stream().map(EdmKeyPropertyRef::getName).collect(Collectors.joining(", ")));
//loop and fetch records as long as items are available and we haven't reached our target count yet
while (hasRecords.get() && recordsProcessed < targetRecordCount) {
if (hasStandardTimestampField) {
timestampField.set(MODIFICATION_TIMESTAMP_FIELD);
} else if (timestampCandidateFields.size() > 0 && lastTimestampCandidateIndex < timestampCandidateFields.size()) {
timestampField.set(timestampCandidateFields.get(lastTimestampCandidateIndex++));
} else {
scenario.log(getDefaultErrorMessage("Could not find a suitable timestamp field in the "
+ resourceName + " resource to sample with..."));
//skip this resource since no suitable fields were found
break;
}
payloadSample.set(new PayloadSample(resourceName, timestampField.get(),
keyFields.stream().map(EdmKeyPropertyRef::getName).collect(Collectors.toList())));
requestUri = buildODataTimestampRequestUriString(resourceName, timestampField.get(), lastFetchedDate.get());
payloadSample.get().setRequestUri(requestUri);
LOG.info("Making request to: " + requestUri);
transportWrapper.set(container.get().getCommander().executeODataGetRequest(requestUri));
// retries. sometimes requests can time out and fail and we don't want to stop sampling
// immediately, but retry a couple of times before we bail
if (recordsProcessed == 0 && transportWrapper.get().getResponseData() == null) {
//only count retries if we're constantly making requests and not getting anything
numRetries += 1;
} else {
numRetries = 0;
}
if (numRetries >= MAX_RETRIES) {
if (timestampCandidateFields.size() > 0 && (lastTimestampCandidateIndex < timestampCandidateFields.size())) {
LOG.info("Trying next candidate timestamp field: " + timestampCandidateFields.get(lastTimestampCandidateIndex));
numRetries = 0;
} else {
LOG.info("Could not fetch records from the " + resourceName + " resource after " + MAX_RETRIES
+ " tries from the given URL: " + requestUri);
break;
}
}
if (transportWrapper.get().getHttpResponseCode() == null || transportWrapper.get().getHttpResponseCode() != HttpStatus.SC_OK) {
if (transportWrapper.get().getHttpResponseCode() == null) {
LOG.error(getDefaultErrorMessage("Request to", requestUri,
"failed! No response code was provided. Check commander.log for any errors..."));
} else {
scenario.log(getDefaultErrorMessage("Request to", requestUri,
"failed with response code", transportWrapper.get().getHttpResponseCode().toString()));
}
break;
} else {
LOG.info("Time taken: "
+ (transportWrapper.get().getElapsedTimeMillis() >= 1000 ? (transportWrapper.get().getElapsedTimeMillis() / 1000) + "s"
: transportWrapper.get().getElapsedTimeMillis() + "ms"));
try {
payloadSample.get().setResponseSizeBytes(transportWrapper.get().getResponseData().getBytes().length);
entityCollectionResWrap.set(container.get().getCommander().getClient()
.getDeserializer(ContentType.APPLICATION_JSON)
.toEntitySet(new ByteArrayInputStream(transportWrapper.get().getResponseData().getBytes())));
if (entityCollectionResWrap.get().getPayload().getEntities().size() > 0) {
LOG.info("Hashing " + resourceName + " payload values...");
entityCollectionResWrap.get().getPayload().getEntities().forEach(entity -> {
encodedSample.set(Collections.synchronizedMap(new LinkedHashMap<>()));
entity.getProperties().forEach(property -> {
//value will be considered null unless trimmed string or collection has non-zero length
String value = property.getValue() != null
&& ((property.isCollection() && property.asCollection().size() > 0)
|| (property.isComplex() && property.asComplex().getValue().size() > 0)
|| property.getValue().toString().trim().length() > 0
|| property.isGeospatial() && property.asGeospatial() != null)
? property.getValue().toString() : null;
if (DEBUG) {
if (property.isCollection() && property.asCollection().size() > 0) {
LOG.info("Found Collection for field: " + property.getName() + ", value: " + property.asCollection());
}
if (property.isComplex() && property.asComplex().getValue().size() > 0) {
LOG.info("Found Complex Type for field: " + property.getName() + ", value: " + property.asComplex());
}
if (property.isEnum() && property.asEnum() != null) {
LOG.info("Found Enum for field: " + property.getName() + ", value: " + property.asEnum());
}
if (property.isGeospatial() && property.asGeospatial() != null) {
LOG.info("Found Geospatial for field: " + property.getName() + ", value: " + property.asGeospatial());
}
}
//turn off hashing when DEBUG is true
if (!DEBUG && value != null) {
//keep the timestamp field, postal code, and key fields unmasked; hash everything else
if (!(property.getName().contentEquals(timestampField.get())
|| property.getName().equals(POSTAL_CODE_FIELD)
|| keyFields.stream().anyMatch(keyField -> keyField.getName().contentEquals(property.getName())))) {
value = hashValues(property.getValue().toString());
}
}
// TODO: clean up. If field is timestamp field or key field unmask, if field is null report null, otherwise hash value
encodedSample.get().put(property.getName(), value);
//advance the paging cursor using the active timestamp field (which may be a candidate field)
if (property.getName().contentEquals(timestampField.get())) {
if (OffsetDateTime.parse(property.getValue().toString()).isBefore(lastFetchedDate.get())) {
lastFetchedDate.set(OffsetDateTime.parse(property.getValue().toString()));
}
}
});
payloadSample.get().addSample(encodedSample.get());
});
LOG.info("Values encoded!");
recordsProcessed += entityCollectionResWrap.get().getPayload().getEntities().size();
LOG.info("Records processed: " + recordsProcessed + ". Target record count: " + targetRecordCount + "\n");
payloadSample.get().setResponseTimeMillis(transportWrapper.get().getElapsedTimeMillis());
if (encodedResultsDirectoryName != null) {
payloadSample.get().setPayloadFields(standardFieldCache.get().get(resourceName).stream()
.map(ReferenceStandardField::getStandardName).collect(Collectors.toList()));
//serialize results once resource processing has finished
Utils.createFile(String.format(SAMPLES_DIRECTORY_TEMPLATE, encodedResultsDirectoryName),
resourceName + "-" + Utils.getTimestamp() + ".json",
payloadSample.get().serialize(payloadSample.get(), PayloadSample.class, null).toString());
}
payloadSamples.get().add(payloadSample.get());
} else {
scenario.log("All available records fetched! Total: " + recordsProcessed);
hasRecords.set(false);
}
} catch (Exception ex) {
scenario.log("Error in fetchAndProcessRecords: " + getDefaultErrorMessage(ex.toString()));
scenario.log("Skipping sample...");
lastFetchedDate.set(lastFetchedDate.get().minus(1, ChronoUnit.WEEKS));
}
}
}
return payloadSamples.get();
}
/**
* Fetches and processes records in cases where only sampling is required and encoding is not necessary
*
* @param resourceName the resource name to sample from
* @param targetRecordCount the target record count for the resource (will stop if the end of the records is reached)
* @return a list of PayloadSample items
*/
List<PayloadSample> fetchAndProcessRecords(String resourceName, int targetRecordCount) {
return fetchAndProcessRecords(resourceName, targetRecordCount, null);
}
/*==================================== TESTS START HERE ====================================*/
@Given("that valid metadata have been requested from the server")
public void thatValidMetadataHaveBeenRequestedFromTheServer() {
try {
if (container.get().hasValidMetadata()) {
if (standardFieldCache.get().size() == 0) {
LOG.info("Creating standard field cache...");
DDCacheProcessor cacheProcessor = new DDCacheProcessor();
DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(cacheProcessor);
generator.processWorksheets();
standardFieldCache.get().putAll(cacheProcessor.getStandardFieldCache());
LOG.info("Standard field cache created!");
}
} else {
failAndExitWithErrorMessage("Valid metadata was not retrieved from the server. Exiting!", scenario);
}
} catch (Exception ex) {
failAndExitWithErrorMessage(ex.toString(), scenario);
}
}
@And("the metadata contains RESO Standard Resources")
public void theMetadataContainsRESOStandardResources() {
Set<String> resources = container.get().getEdm().getSchemas().stream().map(schema ->
schema.getEntityTypes().stream().map(EdmNamed::getName)
.collect(Collectors.toSet()))
.flatMap(Collection::stream)
.collect(Collectors.toSet());
standardResources.set(resources.stream()
.filter(DataDictionaryMetadata.v1_7.WELL_KNOWN_RESOURCES::contains).collect(Collectors.toSet()));
localResources.set(Sets.difference(resources, standardResources.get()));
hasStandardResources.set(standardResources.get().size() > 0);
hasLocalResources.set(localResources.get().size() > 0);
if (hasStandardResources.get()) {
//TODO: add pluralizer
scenario.log("Found " + standardResources.get().size() + " RESO Standard Resource"
+ (standardResources.get().size() == 1 ? "" : "s") + ": "
+ String.join(", ", standardResources.get()));
} else {
scenario.log("No RESO Standard Resources found. Skipping...");
assumeTrue(true);
}
}
@And("each resource MUST have a primary key field by the OData specification")
public void eachResourceMUSTHaveAPrimaryKeyFieldByTheODataSpecification() {
scenario.log("Each resource MUST have a primary key field by the OData specification!");
assumeTrue(true);
}
@And("the metadata contains local resources")
public void theMetadataContainsLocalResources() {
if (localResources.get() == null || localResources.get().size() == 0) {
scenario.log("No local resources found! Skipping...");
assumeTrue(true);
} else {
scenario.log("Found " + localResources.get().size() + " Local Resource"
+ (localResources.get().size() == 1 ? "" : "s") + ": "
+ String.join(", ", localResources.get()));
}
}
@Then("up to {int} records are sampled from each resource with {string} payload samples stored in {string}")
public void upToRecordsAreSampledFromEachResourceWithPayloadSamplesStoredIn(int numRecords, String payloadName, String resultsDirectoryName) {
assertNotNull(getDefaultErrorMessage("resultsDirectoryName MUST be present!"), resultsDirectoryName);
if (!hasStandardResources.get()) {
scenario.log("No RESO Standard Resources to sample!");
assumeTrue(true);
} else {
Set<String> payloadResources = new LinkedHashSet<>();
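//gather the resources that advertise at least one field in the given payload (e.g., "IDX")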
standardFieldCache.get().forEach((resourceName, fieldList) -> {
if (!payloadResources.contains(resourceName) && fieldList.stream().anyMatch(field -> field.getPayloads().contains(payloadName))) {
payloadResources.add(resourceName);
}
});
standardResources.get().forEach(resourceName -> {
resourceCounts.get().put(resourceName, getResourceCount(resourceName));
resourcePayloadSampleMap.get().putIfAbsent(resourceName, Collections.synchronizedList(new LinkedList<>()));
//only save results to the directory if the resource is part of the given payload
resourcePayloadSampleMap.get().put(resourceName,
fetchAndProcessRecords(resourceName, numRecords, payloadResources.contains(resourceName) ? resultsDirectoryName : null));
});
}
}
@Then("up to {int} records are sampled from each local resource")
public void upToRecordsAreSampledFromEachLocalResource(int numRecords) {
if (!hasLocalResources.get()) {
scenario.log("No local resources were found to sample!");
assumeTrue(true);
} else {
localResources.get().forEach(resourceName -> {
resourceCounts.get().put(resourceName, getResourceCount(resourceName));
resourcePayloadSampleMap.get().putIfAbsent(resourceName, Collections.synchronizedList(new LinkedList<>()));
resourcePayloadSampleMap.get().put(resourceName, fetchAndProcessRecords(resourceName, numRecords, null));
});
}
}
@Given("samples exist in {string} in the build directory")
public void samplesExistInInTheBuildDirectory(String resultsDirectory) {
scenario.log("Samples exist in {string} in the build directory!");
assumeTrue(true);
}
@And("a RESOScript file was provided for the IDX User")
public void aRESOScriptFileWasProvidedForTheIDXUser() {
scenario.log("!!TODO!! A RESOScript file was provided for the IDX User!");
assumeTrue(true);
}
@When("samples from {string} are fetched as the representative user for each resource in the {string} payload")
public void samplesFromAreFetchedAsTheRepresentativeUserForEachResourceInThePayload(String resultsDirectory, String payloadName) {
File f = new File(SAMPLES_DIRECTORY_ROOT + File.separator + resultsDirectory);
AtomicReference<PayloadSample> payloadSample = new AtomicReference<>();
if (f.list() == null) return;
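//read each JSON sample file in the results directory back into a PayloadSample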
Arrays.stream(Objects.requireNonNull(f.list((file, s) -> s.endsWith("json")))).forEach(sampleResults -> {
try {
final String jsonString = new String(Files.readAllBytes(Paths.get(f.getPath() + File.separator + sampleResults)));
payloadSample.set(new Gson().fromJson(jsonString, PayloadSample.class));
} catch (FileNotFoundException fnfEx) {
LOG.error(getDefaultErrorMessage("file", sampleResults, "could not be found! Skipping..."));
LOG.error("Exception: " + fnfEx);
} catch (IOException ioException) {
ioException.printStackTrace();
}
});
}
@Then("each result MUST contain the string version of the key and the following fields")
public void eachResultMUSTContainTheStringVersionOfTheKeyAndTheFollowingFields(List<String> requiredFields) {
scenario.log("!!TODO!! Each result MUST contain the string version of the key and the following fields!");
assumeTrue(true);
}
@And("the {string} payload field values MUST match those in the samples")
public void thePayloadFieldValuesMUSTMatchThoseInTheSamples(String arg0) {
scenario.log("!!TODO!! The {string} payload field values MUST match those in the samples!");
assumeTrue(true);
}
@Given("standard and local resources have been processed")
public void standardAndLocalResourcesHaveBeenProcessed() {
scenario.log("!!TODO!! Standard and local resources have been processed!");
assumeTrue(true);
}
@Then("a data availability report is created in {string}")
public void aDataAvailabilityReportIsCreatedIn(String reportFileName) {
if (resourcePayloadSampleMap.get() == null) {
LOG.info("No resource payload samples found! Skipping...");
assumeTrue(true);
return;
}
LOG.info("\n\nCreating data availability report!");
createDataAvailabilityReport(resourcePayloadSampleMap.get(), reportFileName, resourceCounts.get());
}
@And("{string} has been created in the build directory")
public void hasBeenCreatedInTheBuildDirectory(String encodedResultsDirectoryName) {
if (encodedResultsDirectoryName != null && !hasSamplesDirectoryBeenCleared.get()) {
if (!Utils.removeDirectory(String.format(SAMPLES_DIRECTORY_TEMPLATE, encodedResultsDirectoryName))) {
LOG.error("Failed to create runtime directories needed for program execution. Exiting...");
System.exit(NOT_OK);
} else {
hasSamplesDirectoryBeenCleared.set(true);
}
}
}
}


@@ -2,6 +2,7 @@ package org.reso.certification.stepdefs;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.fasterxml.jackson.databind.node.POJONode;
+import com.google.inject.Inject;
import io.cucumber.java8.En;
import org.apache.http.HttpStatus;
import org.apache.logging.log4j.LogManager;
@@ -77,7 +78,10 @@ class WebAPIServer implements En {
/**
 * Entry point to the Web API Server tests
 */
-public WebAPIServer() {
+@Inject
+public WebAPIServer(WebAPITestContainer c) {
+container.set(c);
getTestContainer().setShowResponses(showResponses);
runBackground();


@@ -95,4 +95,48 @@ public class WebAPIServerAddEdit {
public void theRequestHeaderContains(String arg0, String arg1) {
}
+@And("the request header {string} {string} one of the following values")
+public void theRequestHeaderOneOfTheFollowingValues(String arg0, String arg1) {
+}
+@And("the request header {string} {string} {string}")
+public void theRequestHeader(String arg0, String arg1, String arg2) {
+}
+@And("the response header {string} {string} one of the following values")
+public void theResponseHeaderOneOfTheFollowingValues(String arg0, String arg1) {
+}
+@And("the response header {string} {string} {string}")
+public void theResponseHeader(String arg0, String arg1, String arg2) {
+}
+@And("the response header {string} {string}")
+public void theResponseHeader(String arg0, String arg1) {
+}
+@And("the response header {string} {string} reference the resource being created")
+public void theResponseHeaderReferenceTheResourceBeingCreated(String arg0, String arg1) {
+}
+@And("the JSON response {string} contain {string}")
+public void theJSONResponseContain(String arg0, String arg1) {
+}
+@And("the JSON response value {string} {string}")
+public void theJSONResponseValue(String arg0, String arg1) {
+}
+@And("the JSON response value {string} {string} {string}")
+public void theJSONResponseValue(String arg0, String arg1, String arg2) {
+}
+@And("the JSON response {string} contain all JSON data in {string}")
+public void theJSONResponseContainAllJSONDataIn(String arg0, String arg1) {
+}
+@When("a {string} request is made to the URL in response header {string}")
+public void aRequestIsMadeToTheURLInResponseHeader(String arg0, String arg1) {
+}
}


@@ -4,10 +4,7 @@ import org.apache.commons.cli.*;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.format.ContentType;
-import org.reso.certification.codegen.BDDProcessor;
-import org.reso.certification.codegen.DDLProcessor;
-import org.reso.certification.codegen.DataDictionaryCodeGenerator;
-import org.reso.certification.codegen.EDMXProcessor;
+import org.reso.certification.codegen.*;
import org.reso.models.ClientSettings;
import org.reso.models.ODataTransportWrapper;
import org.reso.models.Request;
@@ -222,6 +219,14 @@ public class App {
} catch (Exception ex) {
LOG.error(getDefaultErrorMessage(ex));
}
+} else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_RESOURCE_INFO_MODELS)) {
+APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_RESOURCE_INFO_MODELS);
+try {
+DataDictionaryCodeGenerator generator = new DataDictionaryCodeGenerator(new ResourceInfoProcessor());
+generator.processWorksheets();
+} catch (Exception ex) {
+LOG.error(getDefaultErrorMessage(ex));
+}
} else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_REFERENCE_EDMX)) {
APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_REFERENCE_EDMX);
try {
@@ -517,6 +522,8 @@ public class App {
.desc("Runs commands in RESOScript file given as <inputFile>.").build())
.addOption(Option.builder().argName("t").longOpt(ACTIONS.GENERATE_DD_ACCEPTANCE_TESTS)
.desc("Generates acceptance tests in the current directory.").build())
+.addOption(Option.builder().argName("i").longOpt(ACTIONS.GENERATE_RESOURCE_INFO_MODELS)
+.desc("Generates Java Models for the Web API Reference Server in the current directory.").build())
.addOption(Option.builder().argName("r").longOpt(ACTIONS.GENERATE_REFERENCE_EDMX)
.desc("Generates reference metadata in EDMX format.").build())
.addOption(Option.builder().argName("k").longOpt(ACTIONS.GENERATE_REFERENCE_DDL)
@@ -559,6 +566,7 @@ public class App {
public static final String VALIDATE_METADATA = "validateMetadata";
public static final String SAVE_GET_REQUEST = "saveGetRequest";
public static final String GENERATE_METADATA_REPORT = "generateMetadataReport";
+public static final String GENERATE_RESOURCE_INFO_MODELS = "generateResourceInfoModels";
}
}
}
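The new generator action can be invoked like the other generators; a hypothetical run, following the usage text above:

```
java -jar web-api-commander --generateResourceInfoModels
```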


@@ -31,7 +31,7 @@ import java.nio.charset.StandardCharsets;
import java.sql.Time;
import java.sql.Timestamp;
import java.time.LocalDate;
-import java.time.ZonedDateTime;
+import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.temporal.ChronoField;
@@ -680,7 +680,7 @@ public final class TestUtils {
public static Integer getTimestampPart(String timestampPart, Object value) throws DateTimeParseException {
if (timestampPart == null || value == null) return null;
-ZonedDateTime dateTime = ZonedDateTime.parse((String) value, DateTimeFormatter.ISO_DATE_TIME);
+OffsetDateTime dateTime = OffsetDateTime.parse((String) value, DateTimeFormatter.ISO_DATE_TIME);
switch (timestampPart) {
case DateParts.YEAR:
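For reference, a minimal sketch of what the OffsetDateTime-based parsing accepts; the timestamp value is illustrative:

```
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;

public class OffsetParseExample {
  public static void main(String[] args) {
    //OData timestamps commonly carry numeric UTC offsets, which OffsetDateTime models directly
    OffsetDateTime dateTime = OffsetDateTime.parse("2021-07-13T16:25:03-07:00", DateTimeFormatter.ISO_DATE_TIME);
    System.out.println(dateTime.getYear());   //prints 2021
    System.out.println(dateTime.toInstant()); //prints 2021-07-13T23:25:03Z
  }
}
```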


@@ -8,10 +8,11 @@ import java.io.FileWriter;
import java.nio.charset.StandardCharsets;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
-import java.time.ZoneOffset;
-import java.time.ZonedDateTime;
+import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
+import java.util.Arrays;
import java.util.Date;
+import java.util.Objects;
public class Utils {
private static final Logger LOG = LogManager.getLogger(Utils.class);
@@ -85,6 +86,32 @@ public class Utils {
return outputFile;
}
+/**
+ * Removes the directory at the given pathToDirectory.
+ *
+ * If the current user has write access, the directory's files and then the directory itself
+ * are deleted and true is returned. Returns false when the directory exists but couldn't be removed.
+ * @param pathToDirectory the path of the directory to remove
+ * @return true on success (or when the directory doesn't exist), false on failure, or null when pathToDirectory is null
+ */
+public static Boolean removeDirectory(String pathToDirectory) {
+if (pathToDirectory == null) return null;
+File outputDirectory = new File(pathToDirectory);
+if (outputDirectory.exists()) {
+if (outputDirectory.canWrite()) {
+if (outputDirectory.listFiles() != null) {
+//note: only top-level files are deleted; subdirectories are not traversed
+Arrays.stream(Objects.requireNonNull(outputDirectory.listFiles())).forEach(File::delete);
+}
+return outputDirectory.delete();
+} else {
+LOG.error("Tried deleting directory " + outputDirectory.getPath() + " but didn't have sufficient access.");
+return false;
+}
+}
+return true;
+}
public static String pluralize(int lengthAttribute) {
return lengthAttribute != 1 ? "s" : "";
}
@@ -109,7 +136,11 @@ public class Utils {
}
public static String getIsoTimestamp() {
-return ZonedDateTime.now(ZoneOffset.UTC).format(DateTimeFormatter.ISO_INSTANT);
+return OffsetDateTime.now().format(DateTimeFormatter.ISO_INSTANT);
}
+public static String getIsoTimestamp(OffsetDateTime fromDate) {
+//format the OffsetDateTime directly; OffsetDateTime.from(fromDate.toInstant()) would throw a DateTimeException because an Instant carries no offset
+return fromDate.format(DateTimeFormatter.ISO_INSTANT);
+}
}
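A quick sketch of the two timestamp helpers, assuming org.reso.commander.common.Utils is on the classpath; output values are illustrative:

```
import java.time.OffsetDateTime;
import org.reso.commander.common.Utils;

public class TimestampExample {
  public static void main(String[] args) {
    //the current time, rendered as a UTC instant, e.g. "2021-07-13T23:25:03.123Z"
    System.out.println(Utils.getIsoTimestamp());
    //a time with a numeric offset is normalized to UTC; prints "2021-07-13T23:25:03Z"
    System.out.println(Utils.getIsoTimestamp(OffsetDateTime.parse("2021-07-13T16:25:03-07:00")));
  }
}
```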


@@ -165,7 +165,7 @@ public class MetadataReport implements JsonSerializer<MetadataReport> {
}
static class SneakyAnnotationReader {
-Class object;
+Class<? extends EdmAnnotationImpl> object;
Field field;
EdmAnnotationImpl edmAnnotationImpl;
ClientCsdlAnnotation clientCsdlAnnotation;


@@ -0,0 +1,131 @@
package org.reso.models;
import com.google.gson.*;
import org.reso.commander.common.Utils;
import java.lang.reflect.Type;
import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
public class PayloadSample implements JsonSerializer<PayloadSample> {
String resourceName;
String dateField;
Long responseTimeMillis = null;
Integer responseSizeBytes = null;
String requestUri = null;
//format is a list of key/value pairs where all fields besides
//keys and timestamps are encoded with SHA
final List<Map<String, String>> encodedSamples = Collections.synchronizedList(new LinkedList<>());
//keeps track of the list of key fields found on the server
final List<String> keyFields = new LinkedList<>();
final List<String> payloadFields = new LinkedList<>();
public PayloadSample(String resourceName, String dateField, List<String> keyFields) {
assert resourceName != null : "resourceName MUST be present";
this.resourceName = resourceName;
this.dateField = dateField;
this.keyFields.addAll(keyFields);
}
public void setPayloadFields(List<String> payloadFields) {
this.payloadFields.addAll(payloadFields);
}
public void addSample(Map<String, String> sample) {
encodedSamples.add(sample);
}
public List<Map<String, String>> getSamples() {
return encodedSamples;
}
public Long getResponseTimeMillis() {
return responseTimeMillis;
}
public void setResponseTimeMillis(Long value) {
responseTimeMillis = value;
}
public String getRequestUri() {
return requestUri;
}
public void setRequestUri(String value) {
requestUri = value;
}
public Integer getResponseSizeBytes() {
return responseSizeBytes;
}
public void setResponseSizeBytes(Integer responseSizeBytes) {
this.responseSizeBytes = responseSizeBytes;
}
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
* specified type.
*
* <p>In the implementation of this call-back method, you should consider invoking
* {@link JsonSerializationContext#serialize(Object, Type)} method to create JsonElements for any
* non-trivial field of the {@code src} object. However, you should never invoke it on the
* {@code src} object itself since that will cause an infinite loop (Gson will call your
* call-back method again).</p>
*
* @param src the object that needs to be converted to Json.
* @param typeOfSrc the actual type (fully genericized version) of the source object.
* @param context context of the serialization request
* @return a JsonElement corresponding to the specified object.
*/
@Override
public JsonElement serialize(PayloadSample src, Type typeOfSrc, JsonSerializationContext context) {
final String
DESCRIPTION = "RESO Payload Sample",
DESCRIPTION_KEY = "description",
GENERATED_ON_KEY = "generatedOn",
NUM_SAMPLES_KEY = "numSamples",
REQUEST_URI_KEY = "requestUri",
RESOURCE_NAME_KEY = "resourceName",
DATE_FIELD_KEY = "dateField",
KEY_FIELDS_KEY = "keyFields",
ENCODED_VALUES_KEY = "encodedValues",
PAYLOAD_FIELDS_KEY = "payloadFields";
JsonObject serialized = new JsonObject();
serialized.addProperty(DESCRIPTION_KEY, DESCRIPTION);
serialized.addProperty(GENERATED_ON_KEY, Utils.getIsoTimestamp());
serialized.addProperty(NUM_SAMPLES_KEY, src.encodedSamples.size());
serialized.addProperty(REQUEST_URI_KEY, src.requestUri);
serialized.addProperty(RESOURCE_NAME_KEY, src.resourceName);
JsonArray keyFields = new JsonArray();
src.keyFields.forEach(keyFields::add);
serialized.add(KEY_FIELDS_KEY, keyFields);
serialized.addProperty(DATE_FIELD_KEY, src.dateField);
JsonArray payloadFieldsJson = new JsonArray();
src.payloadFields.forEach(payloadFieldsJson::add);
serialized.add(PAYLOAD_FIELDS_KEY, payloadFieldsJson);
JsonArray encodedSamplesJson = new JsonArray();
src.encodedSamples.forEach(sample -> {
JsonObject sampleJson = new JsonObject();
sample.forEach(sampleJson::addProperty);
encodedSamplesJson.add(sampleJson);
});
serialized.add(ENCODED_VALUES_KEY, encodedSamplesJson);
return serialized;
}
}
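For orientation, each encoded sample is a flat field-name-to-value map in which everything except keys and timestamps is hashed (SHA-256, per the commit notes). A minimal sketch, assuming commons-codec's DigestUtils for the hashing; the resource, field names, and values are hypothetical:

```
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.commons.codec.digest.DigestUtils;
import org.reso.models.PayloadSample;

public class PayloadSampleExample {
  public static void main(String[] args) {
    PayloadSample payloadSample =
        new PayloadSample("Property", "ModificationTimestamp", Arrays.asList("ListingKey"));
    Map<String, String> sample = new LinkedHashMap<>();
    sample.put("ListingKey", "ABC123");                          //key fields stay in plain text
    sample.put("ModificationTimestamp", "2021-07-13T16:25:03Z"); //timestamps stay in plain text
    sample.put("ListPrice", DigestUtils.sha256Hex("350000"));    //all other values are encoded
    payloadSample.addSample(sample);
  }
}
```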


@@ -0,0 +1,310 @@
package org.reso.models;
import com.google.gson.*;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.edm.Edm;
import org.apache.olingo.commons.api.edm.EdmElement;
import org.reso.commander.common.Utils;
import java.lang.reflect.Type;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.util.*;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.AtomicReference;
public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport> {
private static final Logger LOG = LogManager.getLogger(PayloadSampleReport.class);
private static final String POSTAL_CODE_KEY = "PostalCode";
private final Map<String, List<PayloadSample>> resourcePayloadSamplesMap = Collections.synchronizedMap(new LinkedHashMap<>());
private final Map<String, Map<String, Integer>> resourceFieldTallies = Collections.synchronizedMap(new LinkedHashMap<>());
private final Map<String, Integer> resourceCounts = Collections.synchronizedMap(new LinkedHashMap<>());
private Edm metadata;
private PayloadSampleReport() {
//private default constructor
}
public PayloadSampleReport(final Edm metadata, final Map<String, List<PayloadSample>> resourcePayloadSamplesMap, final Map<String, Integer> resourceCounts) {
this.metadata = metadata;
this.resourcePayloadSamplesMap.putAll(resourcePayloadSamplesMap);
resourceFieldTallies.putAll(createResourceFieldTallies(resourcePayloadSamplesMap));
this.resourceCounts.putAll(resourceCounts);
}
@Override
public String toString() {
return String.valueOf(serialize(this, PayloadSampleReport.class, null));
}
/**
* FieldAvailabilityJson uses a JSON payload with the following structure:
*
* {
* "resourceName": "Property",
* "fieldName": "AboveGradeFinishedArea",
* "availability": 0.1
* }
*/
private final class FieldAvailabilityJson implements JsonSerializer<FieldAvailabilityJson> {
static final String
RESOURCE_NAME_KEY = "resourceName",
FIELD_NAME_KEY = "fieldName",
FIELDS_KEY = "fields",
AVAILABILITY_KEY = "availability";
String resourceName;
EdmElement edmElement;
public FieldAvailabilityJson(String resourceName, EdmElement edmElement) {
this.resourceName = resourceName;
this.edmElement = edmElement;
}
public String buildReportString(JsonElement dataAvailabilityReport) {
StringBuilder reportBuilder = new StringBuilder();
dataAvailabilityReport.getAsJsonObject().get(FIELDS_KEY).getAsJsonArray().forEach(field -> {
reportBuilder.append("\nResource: ");
reportBuilder.append(field.getAsJsonObject().get(RESOURCE_NAME_KEY));
reportBuilder.append("\nField: ");
reportBuilder.append(field.getAsJsonObject().get(FIELD_NAME_KEY));
reportBuilder.append("\nAvailability: ");
reportBuilder.append(field.getAsJsonObject().get(AVAILABILITY_KEY));
reportBuilder.append("\n");
});
return reportBuilder.toString();
}
@Override
public JsonElement serialize(FieldAvailabilityJson src, Type typeOfSrc, JsonSerializationContext context) {
JsonObject field = new JsonObject();
int numTimesPresent = resourceFieldTallies.get(src.resourceName) != null
&& resourceFieldTallies.get(src.resourceName).get(src.edmElement.getName()) != null
? resourceFieldTallies.get(src.resourceName).get(src.edmElement.getName()) : 0;
int numSamples = resourcePayloadSamplesMap.get(src.resourceName) != null
? resourcePayloadSamplesMap.get(src.resourceName).stream().reduce(0, (a, f) -> a + f.encodedSamples.size(), Integer::sum) : 0;
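//availability = number of samples where the field was non-null / total records sampled for the resource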
field.addProperty(RESOURCE_NAME_KEY, src.resourceName);
field.addProperty(FIELD_NAME_KEY, src.edmElement.getName());
field.addProperty(AVAILABILITY_KEY, numSamples > 0 ? (1.0 * numTimesPresent) / numSamples : 0);
return field;
}
}
private static Map<String, Map<String, Integer>> createResourceFieldTallies(Map<String, List<PayloadSample>> resourcePayloadSamplesMap) {
AtomicReference<Map<String, Map<String, Integer>>> resourceTallies = new AtomicReference<>(new LinkedHashMap<>());
AtomicInteger numSamples = new AtomicInteger(0);
resourcePayloadSamplesMap.keySet().forEach(resourceName -> {
LOG.info("Processing resource: " + resourceName);
numSamples.set(resourcePayloadSamplesMap.get(resourceName) != null
? resourcePayloadSamplesMap.get(resourceName).stream().reduce(0, (a, f) -> a + f.getSamples().size(), Integer::sum) : 0);
LOG.info("Sample size: " + numSamples.get());
//for each resource, go through the keys and tally the data presence counts for each field
//as well as the number of samples in each case
resourceTallies.get().putIfAbsent(resourceName, new LinkedHashMap<>());
if (numSamples.get() > 0) {
resourcePayloadSamplesMap.get(resourceName).forEach(payloadSample -> {
payloadSample.getSamples().forEach(sample -> {
sample.forEach((fieldName, encodedValue) -> {
if (encodedValue != null) {
resourceTallies.get().get(resourceName).putIfAbsent(fieldName, 0);
resourceTallies.get().get(resourceName).put(fieldName, resourceTallies.get().get(resourceName).get(fieldName) + 1);
}
});
});
});
}
});
return resourceTallies.get();
}
@Override
public JsonElement serialize(PayloadSampleReport src, Type typeOfSrc, JsonSerializationContext context) {
final String
DESCRIPTION_KEY = "description", DESCRIPTION = "RESO Data Availability Report",
VERSION_KEY = "version", VERSION = "1.7",
GENERATED_ON_KEY = "generatedOn",
RESOURCE_INFO_KEY = "resourceInfo",
FIELDS_KEY = "fields";
JsonArray fields = new JsonArray();
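//an availability entry is emitted for every field declared in the server metadata, even if it was never sampled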
src.metadata.getSchemas().forEach(edmSchema -> {
//serialize entities (resources) and members (fields)
edmSchema.getEntityTypes().forEach(edmEntityType -> {
edmEntityType.getPropertyNames().forEach(propertyName -> {
FieldAvailabilityJson fieldJson = new FieldAvailabilityJson(edmEntityType.getName(), edmEntityType.getProperty(propertyName));
fields.add(fieldJson.serialize(fieldJson, FieldAvailabilityJson.class, null));
});
});
});
JsonObject availabilityReport = new JsonObject();
availabilityReport.addProperty(DESCRIPTION_KEY, DESCRIPTION);
availabilityReport.addProperty(VERSION_KEY, VERSION);
availabilityReport.addProperty(GENERATED_ON_KEY, Utils.getIsoTimestamp());
final JsonArray resourceTotalsByResource = new JsonArray();
src.resourcePayloadSamplesMap.keySet().forEach(resourceName -> {
Set<String> postalCodes = new LinkedHashSet<>();
ResourceInfo resourceInfo = new ResourceInfo(resourceName);
int resourceRecordCount = 0;
if (src.resourceCounts.get(resourceName) != null) {
resourceRecordCount = src.resourceCounts.get(resourceName);
}
resourceInfo.numRecordsTotal.set(resourceRecordCount);
PayloadSample zerothSample = resourcePayloadSamplesMap.get(resourceName) != null
&& resourcePayloadSamplesMap.get(resourceName).size() > 0
? resourcePayloadSamplesMap.get(resourceName).get(0) : null;
if (zerothSample != null) {
resourceInfo.keyFields.set(zerothSample.keyFields);
resourceInfo.dateField.set(zerothSample.dateField);
}
if (src.resourcePayloadSamplesMap.get(resourceName) != null) {
AtomicReference<OffsetDateTime> offsetDateTime = new AtomicReference<>();
src.resourcePayloadSamplesMap.get(resourceName).forEach(payloadSample -> {
resourceInfo.totalBytesReceived.getAndAdd(payloadSample.getResponseSizeBytes());
resourceInfo.totalResponseTimeMillis.getAndAdd(payloadSample.getResponseTimeMillis());
resourceInfo.numSamplesProcessed.getAndIncrement();
resourceInfo.numRecordsFetched.getAndAdd(payloadSample.encodedSamples.size());
payloadSample.encodedSamples.forEach(encodedSample -> {
offsetDateTime.set(OffsetDateTime.parse(encodedSample.get(payloadSample.dateField)));
if (offsetDateTime.get() != null) {
if (resourceInfo.dateLow.get() == null) {
resourceInfo.dateLow.set(offsetDateTime.get());
} else if (offsetDateTime.get().isBefore(resourceInfo.dateLow.get())) {
resourceInfo.dateLow.set(offsetDateTime.get());
}
if (resourceInfo.dateHigh.get() == null) {
resourceInfo.dateHigh.set(offsetDateTime.get());
} else if (offsetDateTime.get().isAfter(resourceInfo.dateHigh.get())) {
resourceInfo.dateHigh.set(offsetDateTime.get());
}
}
if (encodedSample.containsKey(POSTAL_CODE_KEY)) {
postalCodes.add(encodedSample.get(POSTAL_CODE_KEY));
}
});
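//capture the page size from the first sample batch that returned records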
if (resourceInfo.pageSize.get() == 0) resourceInfo.pageSize.set(payloadSample.getSamples().size());
});
}
if (postalCodes.size() > 0) {
resourceInfo.postalCodes.set(postalCodes);
}
resourceTotalsByResource.add(resourceInfo.serialize(resourceInfo, ResourceInfo.class, null));
});
availabilityReport.add(RESOURCE_INFO_KEY, resourceTotalsByResource);
availabilityReport.add(FIELDS_KEY, fields);
return availabilityReport;
}
static final class ResourceInfo implements JsonSerializer<ResourceInfo> {
final String
RESOURCE_NAME_KEY = "resourceName",
RECORD_COUNT_KEY = "recordCount",
TOTAL_NUM_RECORDS_FETCHED = "numRecordsFetched",
TOTAL_NUM_SAMPLES_KEY = "numSamples",
PAGE_SIZE_KEY = "pageSize",
AVERAGE_RESPONSE_TIME_MILLIS_KEY = "averageResponseTimeMillis",
AVERAGE_RESPONSE_BYTES_KEY = "averageResponseBytes",
KEY_FIELDS_KEY = "keyFields",
DATE_FIELD_KEY = "dateField",
DATE_LOW_KEY = "dateLow",
DATE_HIGH_KEY = "dateHigh",
POSTAL_CODES_KEY = "postalCodes";
final AtomicInteger numSamplesProcessed = new AtomicInteger(0);
final AtomicInteger numRecordsTotal = new AtomicInteger(0);
final AtomicInteger numRecordsFetched = new AtomicInteger(0);
final AtomicReference<String> resourceName = new AtomicReference<>();
final AtomicLong totalResponseTimeMillis = new AtomicLong(0);
final AtomicLong totalBytesReceived = new AtomicLong(0);
final AtomicInteger pageSize = new AtomicInteger(0);
final AtomicReference<List<String>> keyFields = new AtomicReference<>(new LinkedList<>());
final AtomicReference<String> dateField = new AtomicReference<>();
final AtomicReference<OffsetDateTime> dateLow = new AtomicReference<>(null);
final AtomicReference<OffsetDateTime> dateHigh = new AtomicReference<>(null);
final AtomicReference<Set<String>> postalCodes = new AtomicReference<>(new LinkedHashSet<>());
public ResourceInfo(String resourceName) {
this.resourceName.set(resourceName);
}
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
* specified type.
*
* <p>In the implementation of this call-back method, you should consider invoking
* {@link JsonSerializationContext#serialize(Object, Type)} method to create JsonElements for any
* non-trivial field of the {@code src} object. However, you should never invoke it on the
* {@code src} object itself since that will cause an infinite loop (Gson will call your
* call-back method again).</p>
*
* @param src the object that needs to be converted to Json.
* @param typeOfSrc the actual type (fully genericized version) of the source object.
* @param context context of the serialization request
* @return a JsonElement corresponding to the specified object.
*/
@Override
public JsonElement serialize(ResourceInfo src, Type typeOfSrc, JsonSerializationContext context) {
JsonObject totals = new JsonObject();
totals.addProperty(RESOURCE_NAME_KEY, src.resourceName.get());
totals.addProperty(RECORD_COUNT_KEY, src.numRecordsTotal);
totals.addProperty(TOTAL_NUM_RECORDS_FETCHED, src.numRecordsFetched.get());
totals.addProperty(TOTAL_NUM_SAMPLES_KEY, src.numSamplesProcessed.get());
totals.addProperty(PAGE_SIZE_KEY, src.pageSize.get());
totals.addProperty(AVERAGE_RESPONSE_BYTES_KEY, src.numSamplesProcessed.get() > 0
? src.totalBytesReceived.get() / src.numSamplesProcessed.get() : 0);
totals.addProperty(AVERAGE_RESPONSE_TIME_MILLIS_KEY, src.numSamplesProcessed.get() > 0
? src.totalResponseTimeMillis.get() / src.numSamplesProcessed.get() : 0);
totals.addProperty(DATE_FIELD_KEY, src.dateField.get());
totals.addProperty(DATE_LOW_KEY, src.dateLow.get() != null
? src.dateLow.get().format(DateTimeFormatter.ISO_INSTANT) : null);
totals.addProperty(DATE_HIGH_KEY, src.dateHigh.get() != null
? src.dateHigh.get().format(DateTimeFormatter.ISO_INSTANT) : null);
JsonArray keyFields = new JsonArray();
src.keyFields.get().forEach(keyFields::add);
totals.add(KEY_FIELDS_KEY, keyFields);
if (src.postalCodes.get().size() > 0) {
JsonArray postalCodes = new JsonArray();
src.postalCodes.get().forEach(postalCodes::add);
totals.add(POSTAL_CODES_KEY, postalCodes);
}
return totals;
}
}
}


@@ -1,7 +1,7 @@
{
"description": "RESO Data Dictionary Metadata Report",
"version": "1.7",
-"generatedOn": "2021-04-13T23:23:58.588Z",
+"generatedOn": "2021-04-30T12:16:46.822Z",
"fields": [
{
"resourceName": "Property",


@@ -0,0 +1 @@
cucumber.publish.quiet=true


@@ -77,7 +77,6 @@
}
},
"required": [
-"annotations",
"fieldName",
"isCollection",
"nullable",
@@ -89,8 +88,11 @@
},
"Annotation": {
"type": "object",
-"additionalProperties": false,
+"additionalProperties": true,
"properties": {
+"term": {
+"type": "string"
+},
"value": {
"type": "string",
"qt-uri-protocols": [


@@ -14,12 +14,15 @@
</File>
</Appenders>
<Loggers>
-<Logger name="org.apache.olingo.client.core" level="all" additivity="false">
+<Logger name="org.apache.olingo.client.core" level="error" additivity="false">
+<AppenderRef ref="Log"/>
+</Logger>
+<Logger name="org.apache.velocity.runtime" level="error" additivity="false">
<AppenderRef ref="Log"/>
</Logger>
<Root level="all">
<AppenderRef ref="Console" level="info"/>
-<AppenderRef ref="Log"/>
+<AppenderRef ref="Log" level="error"/>
</Root>
</Loggers>
</Configuration>