Support for Field and Lookup Resources (#95)

* Issue #85: bump Web API Core version to 2.0.0
* Added strict flag=true by default for metadata and availability checks. JSON Schema WIP.
* Issue #88: added batch test runner as well as improved Web API Messages
* Issue #91: added field availability metrics
* Issue #91: availability and field count rollups
* Issue #91: calculate lookup value availabilities
* Issue #91: lookup rollups
* Issue #91: added lookup availability
* Issue #91: resource availability calculations
* Issue #91: adding comments and addressing null values
* Issue #91: cleanup lookup iterations
* Issue #88: added XMLMetadataToJSONSchemaSerializer
* Issue #93: Added Field and Lookup resource
* Issue #93: Updated README
Joshua Darnell 2021-12-12 21:22:42 -08:00 committed by GitHub
parent 6ff3562792
commit 9181ef192d
60 changed files with 78415 additions and 96653 deletions


@@ -16,7 +16,7 @@ explains how to run the following tests:
* Data Dictionary 1.7
* Data Dictionary Availability Report
* IDX Payload 1.7
* Web API Core 1.0.2
* Web API Core 2.0.0
## [Command-Line OData Web API Tools](/doc/CLI.md)
The RESO Commander contains command line tools for working with OData Web APIs.


@@ -87,17 +87,17 @@ final String certOutputDir = 'build/certification',
certReportsDir = certOutputDir + '/reports'
task testWebApiCore_1_0_2() {
task testWebApiCore_2_0_0() {
group = 'RESO Certification'
description = 'Web API Core 1.0.2 Acceptance Tests' +
description = 'Web API Core 2.0.0 Acceptance Tests' +
'\nExample: ' +
'\n ./gradlew testWebApiServer_1_0_2_Core -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript -DshowResponses=true' +
'\n ./gradlew testWebApiCore_2_0_0 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript -DshowResponses=true' +
'\n\nNote: by default the Web API tests assume Collection(Edm.EnumType).' +
'\nPass -DuseCollections=false if using OData IsFlags.' +
'\n\n[Report location: ' + certReportsDir + ']' +
'\n\n'
String reportName = 'web-api-server.core.1.0.2'
String reportName = 'web-api-server.core.2.0.0'
dependsOn jar
doLast {
@@ -115,7 +115,7 @@ task testWebApiCore_1_0_2() {
'--plugin',
'html:' + certReportsDir + '/' + reportName + '.html',
'--glue',
'org.reso.certification.stepdefs#WebAPIServer_1_0_2',
'org.reso.certification.stepdefs#WebAPIServerCore',
'src/main/java/org/reso/certification/features/web-api',
'--tags',
'@core-endorsement'
@@ -177,7 +177,7 @@ task testDataAvailability_1_7() {
group = 'RESO Certification'
description = 'Data Dictionary 1.7 Data Availability Tests' +
'\nExample:' +
'\n ./gradlew testDataAvailability_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript' +
'\n ./gradlew testDataAvailability_1_7 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript' +
'\n\n[Report location: ' + certReportsDir + ']' +
'\n\n'
@@ -209,7 +209,7 @@ task testIdxPayload_1_7() {
group = 'RESO Certification'
description = 'Data Dictionary 1.7 Payloads Sampling Tests' +
'\nExample:' +
'\n ./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript' +
'\n ./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript' +
'\n\n[Report location: ' + certReportsDir + ']' +
'\n\n'


@@ -160,7 +160,7 @@ This should display something similar to the following:
Web API Commander Starting... Press <ctrl+c> at any time to exit.
==============================================================
Displaying 44 Request(s)
RESOScript: src/test/resources/mock.web-api-server.core.1.0.2.resoscript
RESOScript: src/test/resources/mock.web-api-server.core.2.0.0.resoscript
==============================================================
@@ -196,5 +196,5 @@ Results will be saved to the filenames specified in the given RESOScript, and er
**RESOScript File Format**
For examples of files using the RESOScript format, see:
* [Data Dictionary 1.7 RESOScript Template](sample-data-dictionary.1.7.0.resoscript)
* [Web API Core 1.0.2 RESOScript Template](sample-web-api-server.core.1.0.2.resoscript)
* [Web API Core 2.0.0 RESOScript Template](sample-web-api-server.core.2.0.0.resoscript)
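For orientation, a minimal sketch of the `ClientSettings` block such a RESOScript carries (element names match those generated by the batch runner added in this commit; the URI and token values are placeholders):

```xml
<?xml version="1.0" encoding="utf-8" ?>
<OutputScript>
  <ClientSettings>
    <!-- root URI of the Web API server under test (placeholder) -->
    <WebAPIURI>https://api.example.com/odata</WebAPIURI>
    <AuthenticationType>authorization_code</AuthenticationType>
    <!-- placeholder bearer token; supply your own -->
    <BearerToken>your-token-here</BearerToken>
  </ClientSettings>
</OutputScript>
```

See the linked templates for the full set of parameters and requests.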


@@ -86,7 +86,7 @@ RESO Certification tasks
------------------------
testDataAvailability_1_7 - Data Dictionary 1.7 Data Availability Tests
Example:
./gradlew testDataAvailability_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript
./gradlew testDataAvailability_1_7 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript
[Report location: build/certification/reports]
@@ -104,14 +104,14 @@ To disable strict mode, remove the -Dstrict=true parameter. All applicants MUST
testIdxPayload_1_7 - Data Dictionary 1.7 Payloads Sampling Tests
Example:
./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript
./gradlew testIdxPayload_1_7 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript
[Report location: build/certification/reports]
testWebApiCore_1_0_2 - Web API Core 1.0.2 Acceptance Tests
testWebApiCore_2_0_0 - Web API Core 2.0.0 Acceptance Tests
Example:
./gradlew testWebApiCore_1_0_2 -DpathToRESOScript=/path/to/web-api-core-1.0.2.resoscript -DshowResponses=true
./gradlew testWebApiCore_2_0_0 -DpathToRESOScript=/path/to/web-api-core-2.0.0.resoscript -DshowResponses=true
Note: by default the Web API tests assume Collection(Edm.EnumType).
Pass -DuseCollections=false if using OData IsFlags.
@@ -128,14 +128,14 @@ To use the automated RESO testing tools, you must have a [JDK installed](#java-a
### Web API Core RESOScript Template
To use the Commander for automated Web API Core testing, you need a RESOScript.
For Web API 1.0.2 Server Core Certification, use [this resoscript](sample-web-api-server.core.1.0.2.resoscript) as a template.
For Web API 2.0.0 Server Core Certification, use [this resoscript](sample-web-api-server.core.2.0.0.resoscript) as a template.
For more information regarding Parameters and Client Settings, see the [Web API Walkthrough](https://github.com/RESOStandards/web-api-commander/wiki/Walkthrough:-Automated-Web-API-Certification-Using-the-RESO-Commander#configuring-the-resoscript-file) (in-progress).
### Web API Cucumber Acceptance Tests
The Cucumber BDD acceptance tests for Web API 1.0.2 Core certification are [here](https://github.com/RESOStandards/web-api-commander/blob/issue-37-data-dictionary-testing/src/main/java/org/reso/certification/features/web-api/web-api-server.core.1.0.2.feature). If you have any questions, please [send us an email](mailto:dev@reso.org).
The Cucumber BDD acceptance tests for Web API 2.0.0 Core certification are [here](https://github.com/RESOStandards/web-api-commander/blob/issue-37-data-dictionary-testing/src/main/java/org/reso/certification/features/web-api/web-api-server.core.2.0.0.feature). If you have any questions, please [send us an email](mailto:dev@reso.org).
### Gradle Tasks for Web API 1.0.2 Server Certification
### Gradle Tasks for Web API 2.0.0 Server Certification
While you may use tags to filter tests as you choose, as explained in the next section, it's convenient
to be able to run a predefined set of tests for Web API Core Certification.
@@ -143,19 +143,19 @@ These tasks will also produce reports in the local `build` directory, named acco
#### Core Certification
This will run the Core tests against the Web API 1.0.2 Server provided as `WebAPIURI` in your `web-api-server.core.1.0.2.resoscript` file.
This will run the Core tests against the Web API 2.0.0 Server provided as `WebAPIURI` in your `web-api-server.core.2.0.0.resoscript` file.
**Note**: by default, the Commander uses `Collection(Edm.EnumType)` for multiple enumerations testing.
Pass `-DuseCollections=false` if you are using `IsFlags="true"` instead.
##### MacOS or Linux
```
$ ./gradlew testWebApiCore_1_0_2 -DpathToRESOScript=/path/to/your.web-api-server.core.1.0.2.resoscript -DshowResponses=true
$ ./gradlew testWebApiCore_2_0_0 -DpathToRESOScript=/path/to/your.web-api-server.core.2.0.0.resoscript -DshowResponses=true
```
##### Windows
```
C:\path\to\web-api-commander> gradlew testWebApiCore_1_0_2 -DpathToRESOScript=C:\path\to\your.web-api-server.core.1.0.2.resoscript -DshowResponses=true
C:\path\to\web-api-commander> gradlew testWebApiCore_2_0_0 -DpathToRESOScript=C:\path\to\your.web-api-server.core.2.0.0.resoscript -DshowResponses=true
```
*Note: the first time you run these tasks, they will take some time, as the environment must be configured and code compiled from the contents of the source directory downloaded in the previous step.*
@@ -165,12 +165,12 @@ C:\path\to\web-api-commander> gradlew testWebApiCore_1_0_2 -DpathToRESOScript=C:
A sample of the runtime terminal output follows:
```gherkin
> Task :testWebApiCore_1_0_2
> Task :testWebApiCore_2_0_0
@metadata-request @2.4.1
Scenario: REQ-WA103-END3 - Request and Validate Server Metadata
Using RESOScript: ./web-api-server.core.1.0.2.resoscript
Using RESOScript: ./web-api-server.core.2.0.0.resoscript
Given a RESOScript file was provided
RESOScript loaded successfully!
@@ -311,7 +311,7 @@ You may filter by tags in any of the Web API or Data Dictionary tests. These are
**Run Web API Core Metadata Tests Only**
```
$ gradle testWebApiCore_1_0_2 -DpathToRESOScript=/path/to/your.web-api-server.core.1.0.2.resoscript -Dcucumber.filter.tags="@metadata"
$ gradle testWebApiCore_2_0_0 -DpathToRESOScript=/path/to/your.web-api-server.core.2.0.0.resoscript -Dcucumber.filter.tags="@metadata"
```
**Run Data Dictionary Tests on IDX Fields Only**


@@ -1,10 +1,10 @@
# Codegen
The RESO Commander CLI contains code generation for the following items:
* [Generating RESO Data Dictionary Acceptance Tests](#Generating RESO Data Dictionary Acceptance Tests)
* [Generating RESO Web API Reference Server Data Models](#Generating RESO Web API Reference Server Data Models)
* [Generating RESO Data Dictionary Reference Metadata](#Generating RESO Data Dictionary Reference Metadata)
* [Generating RESO Data Dictionary 1.7 Reference DDL](#Generating RESO Data Dictionary 1.7 Reference DDL)
* [Converting OData XML Metadata to Open API 3 Format](#Converting OData XML Metadata to Open API 3 Format)
* [Generating RESO Data Dictionary Acceptance Tests](#generating-reso-data-dictionary-acceptance-tests)
* [Generating RESO Web API Reference Server Data Models](#generating-reso-web-api-reference-server-data-models)
* [Generating RESO Data Dictionary Reference Metadata](#generating-reso-data-dictionary-reference-metadata)
* [Generating RESO Data Dictionary 1.7 Reference DDL](#generating-reso-data-dictionary-17-reference-ddl)
* [Converting OData XML Metadata to Open API 3 Format](#converting-odata-xml-metadata-to-open-api-3-format)
## Generating RESO Data Dictionary Acceptance Tests
The RESO Commander can be used to generate Data Dictionary acceptance tests from the currently approved [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).
@@ -74,6 +74,13 @@ The following items need to be added to the DDL generator still:
## Converting OData XML Metadata to Open API 3 Format
See documentation regarding running the [nodejs-based tools in odata-openapi/lib/README.md](odata-openapi/lib/README.md).
In order to generate an Open API 3 Spec from the reference metadata, run the following command from
the root of the odata-openapi3 directory:
```
$ odata-openapi3 --host 'api.reso.org' --scheme 'https' --basePath '' ../src/main/resources/RESODataDictionary-1.7.xml
```
You will need to run `npm install` from the odata-openapi3 directory so that the required packages are available.
See documentation regarding running the nodejs-based tools [in the odata-openapi README.md](../odata-openapi/README.md).
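Putting the two steps above together (paths assumed relative to the repository root; see the linked READMEs for the authoritative options):

```
$ cd odata-openapi3

# one-time setup: make the converter's packages available
$ npm install

# generate an Open API 3 spec from the reference metadata
$ odata-openapi3 --host 'api.reso.org' --scheme 'https' --basePath '' ../src/main/resources/RESODataDictionary-1.7.xml
```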


@@ -35,7 +35,7 @@ mkdir commander-tmp; \
cd commander-tmp; \
git clone https://github.com/RESOStandards/web-api-commander.git; \
cd web-api-commander; \
docker run --rm -u gradle -v "$PWD":/home/gradle/project -v /path/to/your/resoscripts:/home/gradle/project/resoscripts -w /home/gradle/project gradle gradle testWebAPIServer_1_0_2_Core -DpathToRESOScript=/home/gradle/project/resoscripts/your.web-api-server.core.1.0.2.resoscript -DshowResponses=true
docker run --rm -u gradle -v "$PWD":/home/gradle/project -v /path/to/your/resoscripts:/home/gradle/project/resoscripts -w /home/gradle/project gradle gradle testWebAPIServer_2_0_0_Core -DpathToRESOScript=/home/gradle/project/resoscripts/your.web-api-server.core.2.0.0.resoscript -DshowResponses=true
```
Note that this will create a directory in your home directory for the project, and build artifacts and the log will be placed in that directory,
@@ -44,5 +44,5 @@ which is also where you will end up after runtime.
#### Windows All-In-One WIP
```
cd C:\;mkdir commander-tmp;cd commander-tmp;git clone https://github.com/RESOStandards/web-api-commander.git;cd web-api-commander; docker run --rm -u gradle -v C:\current\path\web-api-commander:/home/gradle/project -v C:\path\to\your\resoscripts:/home/gradle/project/resoscripts -w /home/gradle/project gradle gradle testWebAPIServer_1_0_2_Core -DpathToRESOScript=/home/gradle/project/resoscripts/your.web-api-server.core.1.0.2.resoscript -DshowResponses=true
cd C:\;mkdir commander-tmp;cd commander-tmp;git clone https://github.com/RESOStandards/web-api-commander.git;cd web-api-commander; docker run --rm -u gradle -v C:\current\path\web-api-commander:/home/gradle/project -v C:\path\to\your\resoscripts:/home/gradle/project/resoscripts -w /home/gradle/project gradle gradle testWebAPIServer_2_0_0_Core -DpathToRESOScript=/home/gradle/project/resoscripts/your.web-api-server.core.2.0.0.resoscript -DshowResponses=true
```

runResoscripts/.gitignore vendored Normal file

@@ -0,0 +1,118 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Snowpack dependency directory (https://snowpack.dev/)
web_modules/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
.env.production
# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache
# Next.js build output
.next
out
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# Stores VSCode versions used for testing VSCode extensions
.vscode-test
# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*

runResoscripts/app.js Normal file

@@ -0,0 +1,193 @@
const fs = require('fs');
const fse = require('fs-extra');
const { execSync } = require('child_process');
//parse command line args
const yargs = require('yargs/yargs');
const { hideBin } = require('yargs/helpers');
const argv = yargs(hideBin(process.argv)).argv;
const { processDataDictionaryResults } = require('./services/postResultsToApi.js');
const { processDataAvailabilityReport } = require('./services/processDataAvailabilityReport.js');
const { COMMANDER_PATH } = require('./batch-config.json');
const CERTIFICATION_RESULTS_PATH = `${COMMANDER_PATH}/build/certification`;
const buildRecipientPath = (providerUoi, recipientUoi) => {
if (!providerUoi) throw Error('providerUoi is required!');
if (!recipientUoi) throw Error('recipientUoi is required!');
return `${providerUoi}/${recipientUoi}`;
};
const createResoscriptBearerTokenConfig = ({ uri, token }) => '<?xml version="1.0" encoding="utf-8" ?>' +
'<OutputScript>' +
' <ClientSettings>' +
` <WebAPIURI>${uri}</WebAPIURI>` +
' <AuthenticationType>authorization_code</AuthenticationType>' +
` <BearerToken>${token}</BearerToken>` +
' </ClientSettings>' +
'</OutputScript>';
const createResoscriptClientCredentialsConfig = ({ uri, clientCredentials }) => '<?xml version="1.0" encoding="utf-8" ?>' +
'<OutputScript>' +
' <ClientSettings>' +
` <WebAPIURI>${uri}</WebAPIURI>` +
' <AuthenticationType>client_credentials</AuthenticationType>' +
` <ClientIdentification>${clientCredentials.clientId}</ClientIdentification>` +
` <ClientSecret>${clientCredentials.clientSecret}</ClientSecret>` +
` <TokenURI>${clientCredentials.tokenUri}</TokenURI>` +
` ${clientCredentials.scope ? '<ClientScope>' + clientCredentials.scope + '</ClientScope>': ''}` +
' </ClientSettings>' +
'</OutputScript>';
const isClientCredentialsConfig = (config = { clientCredentials: {} }) => config.clientCredentials
&& config.clientCredentials.clientId
&& config.clientCredentials.clientSecret
&& config.clientCredentials.tokenUri;
const isBearerTokenConfig = (config = { token: '' }) => !!config.token;
const buildResoscript = (config = {}) => {
if (isClientCredentialsConfig(config)) {
return createResoscriptClientCredentialsConfig(config);
} else if (isBearerTokenConfig(config)) {
return createResoscriptBearerTokenConfig(config);
}
return null;
};
const runTests = async jsonConfigPath => {
if (!jsonConfigPath) throw Error("Missing jsonConfigPath.");
let providerInfo;
try {
providerInfo = JSON.parse(fs.readFileSync(jsonConfigPath));
} catch (err) {
throw new Error('Could not read provider info!');
}
const { providerUoi, configs } = providerInfo;
if (!providerUoi) throw new Error('providerUoi is required!');
if (!configs || !configs.length) throw new Error('configs must contain valid configurations');
try {
if (fs.existsSync(providerUoi)) {
try {
fs.renameSync(providerUoi, `${providerUoi}-old-${Date.now()}`);
} catch (err) {
console.error(err);
throw new Error('Could not rename directory! Exiting!');
}
}
//create root directory
fs.mkdirSync(providerUoi);
const totalTestCount = configs.length;
let failedTestCount = 0;
configs.forEach(config => {
const
RECIPIENT_PATH = buildRecipientPath(providerUoi, config.recipientUoi),
RESOSCRIPT_CONFIG = buildResoscript(config),
CONFIG_PATH = `${COMMANDER_PATH}/${RECIPIENT_PATH}/config.xml`;
if (!RESOSCRIPT_CONFIG) throw new Error('There was a problem creating a RESOScript config for recipientUoi: ' + config.recipientUoi);
//create recipient directory
fs.mkdirSync(RECIPIENT_PATH);
fs.writeFileSync(CONFIG_PATH, RESOSCRIPT_CONFIG);
//run dd tests: execSync throws when the child process exits nonzero, so catch and record failures
try {
execSync(`${COMMANDER_PATH}/gradlew testDataDictionary_1_7 -DpathToRESOScript='${CONFIG_PATH}'`,
{ stdio: ['inherit', 'inherit', 'pipe'] });
} catch (err) {
console.error('Data Dictionary testing failed for recipientUoi: ' + config.recipientUoi);
if (err.stderr) console.error(err.stderr.toString());
//TODO, create error directory with each corresponding log
failedTestCount++;
process.exitCode = 1;
}
//run data availability tests
try {
execSync(`${COMMANDER_PATH}/gradlew testDataAvailability_1_7 -DpathToRESOScript='${CONFIG_PATH}'`,
{ stdio: ['inherit', 'inherit', 'pipe'] });
} catch (err) {
console.error('Data availability testing failed for recipientUoi: ' + config.recipientUoi);
if (err.stderr) console.error(err.stderr.toString());
failedTestCount++;
process.exitCode = 1;
}
const paths = ['results', 'reports', 'cucumberJson'];
paths.forEach(path => {
try {
//copySync is synchronous and throws on error; it does not accept a callback
fse.copySync(`${CERTIFICATION_RESULTS_PATH}/${path}`, RECIPIENT_PATH, { overwrite: true });
console.log(`Copied ${path} to ${RECIPIENT_PATH}`);
} catch (err) {
console.error(err);
}
});
});
console.log(`Testing complete! Tests passed: ${totalTestCount - failedTestCount} of ${totalTestCount}`);
} catch (err) {
console.error(err)
}
};
const processDDResult = async (providerUoi, recipientUoi) =>
await processDataDictionaryResults(providerUoi, recipientUoi, buildRecipientPath(providerUoi, recipientUoi));
// const cliHandler = argv => {
// argv.command({
// command: "action",
// description: "top level command",
// builder: {
// command: "bar",
// description: "child command of foo",
// builder: function() {
// console.log("builder barr!");
// },
// handler: a => {
// console.log("handler barr!");
// }
// },
// handler: args => {
// console.log("handler foo!");
// }
// })
// .demand(1, "must provide a valid command")
// .help("h")
// .alias("h", "help")
// .argv
// if (runTests) {
// const { configFilePath } = argv;
// if (!configFilePath) console.log('configFilePath is required!\nUsage: $ node . --runTests');
// } else if (processDDResult) {
// } else if (dataAvailabilityEtl) {
// } else {
// }
// };
module.exports = {
runTests,
processDDResult,
processDataAvailabilityReport
};


@@ -0,0 +1,4 @@
{
"API_KEY": "",
"COMMANDER_PATH": ""
}

runResoscripts/package-lock.json generated Normal file

@@ -0,0 +1,616 @@
{
"name": "runResoscripts",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"dependencies": {
"aws-sdk": "^2.1026.0",
"axios": "^0.24.0",
"fs-extra": "^10.0.0",
"yargs": "^17.3.0"
}
},
"node_modules/ansi-regex": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
"engines": {
"node": ">=8"
}
},
"node_modules/ansi-styles": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
"dependencies": {
"color-convert": "^2.0.1"
},
"engines": {
"node": ">=8"
},
"funding": {
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
}
},
"node_modules/aws-sdk": {
"version": "2.1027.0",
"resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1027.0.tgz",
"integrity": "sha512-j3UjPV9hzyCvkmfcbhRscMggdmrPqlhvo8QzkXCGFfPXjZMh1OJd4HkCEH2NaunzLOyF2Y3QzxKrGOLMT7sNzg==",
"dependencies": {
"buffer": "4.9.2",
"events": "1.1.1",
"ieee754": "1.1.13",
"jmespath": "0.15.0",
"querystring": "0.2.0",
"sax": "1.2.1",
"url": "0.10.3",
"uuid": "3.3.2",
"xml2js": "0.4.19"
},
"engines": {
"node": ">= 10.0.0"
}
},
"node_modules/axios": {
"version": "0.24.0",
"resolved": "https://registry.npmjs.org/axios/-/axios-0.24.0.tgz",
"integrity": "sha512-Q6cWsys88HoPgAaFAVUb0WpPk0O8iTeisR9IMqy9G8AbO4NlpVknrnQS03zzF9PGAWgO3cgletO3VjV/P7VztA==",
"dependencies": {
"follow-redirects": "^1.14.4"
}
},
"node_modules/base64-js": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
},
"node_modules/buffer": {
"version": "4.9.2",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-4.9.2.tgz",
"integrity": "sha512-xq+q3SRMOxGivLhBNaUdC64hDTQwejJ+H0T/NB1XMtTVEwNTrfFF3gAxiyW0Bu/xWEGhjVKgUcMhCrUy2+uCWg==",
"dependencies": {
"base64-js": "^1.0.2",
"ieee754": "^1.1.4",
"isarray": "^1.0.0"
}
},
"node_modules/cliui": {
"version": "7.0.4",
"resolved": "https://registry.npmjs.org/cliui/-/cliui-7.0.4.tgz",
"integrity": "sha512-OcRE68cOsVMXp1Yvonl/fzkQOyjLSu/8bhPDfQt0e0/Eb283TKP20Fs2MqoPsr9SwA595rRCA+QMzYc9nBP+JQ==",
"dependencies": {
"string-width": "^4.2.0",
"strip-ansi": "^6.0.0",
"wrap-ansi": "^7.0.0"
}
},
"node_modules/color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"dependencies": {
"color-name": "~1.1.4"
},
"engines": {
"node": ">=7.0.0"
}
},
"node_modules/color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
},
"node_modules/emoji-regex": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
},
"node_modules/escalade": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.1.tgz",
"integrity": "sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw==",
"engines": {
"node": ">=6"
}
},
"node_modules/events": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/events/-/events-1.1.1.tgz",
"integrity": "sha1-nr23Y1rQmccNzEwqH1AEKI6L2SQ=",
"engines": {
"node": ">=0.4.x"
}
},
"node_modules/follow-redirects": {
"version": "1.14.5",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.14.5.tgz",
"integrity": "sha512-wtphSXy7d4/OR+MvIFbCVBDzZ5520qV8XfPklSN5QtxuMUJZ+b0Wnst1e1lCDocfzuCkHqj8k0FpZqO+UIaKNA==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/RubenVerborgh"
}
],
"engines": {
"node": ">=4.0"
},
"peerDependenciesMeta": {
"debug": {
"optional": true
}
}
},
"node_modules/fs-extra": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.0.0.tgz",
"integrity": "sha512-C5owb14u9eJwizKGdchcDUQeFtlSHHthBk8pbX9Vc1PFZrLombudjDnNns88aYslCyF6IY5SUw3Roz6xShcEIQ==",
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
},
"engines": {
"node": ">=12"
}
},
"node_modules/get-caller-file": {
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
"integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
"engines": {
"node": "6.* || 8.* || >= 10.*"
}
},
"node_modules/graceful-fs": {
"version": "4.2.8",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.8.tgz",
"integrity": "sha512-qkIilPUYcNhJpd33n0GBXTB1MMPp14TxEsEs0pTrsSVucApsYzW5V+Q8Qxhik6KU3evy+qkAAowTByymK0avdg=="
},
"node_modules/ieee754": {
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.1.13.tgz",
"integrity": "sha512-4vf7I2LYV/HaWerSo3XmlMkp5eZ83i+/CDluXi/IGTs/O1sejBNhTtnxzmRZfvOUqj7lZjqHkeTvpgSFDlWZTg=="
},
"node_modules/is-fullwidth-code-point": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
"engines": {
"node": ">=8"
}
},
"node_modules/isarray": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
"integrity": "sha1-u5NdSFgsuhaMBoNJV6VKPgcSTxE="
},
"node_modules/jmespath": {
"version": "0.15.0",
"resolved": "https://registry.npmjs.org/jmespath/-/jmespath-0.15.0.tgz",
"integrity": "sha1-o/Iiqarp+Wb10nx5ZRDigJF2Qhc=",
"engines": {
"node": ">= 0.6.0"
}
},
"node_modules/jsonfile": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.1.0.tgz",
"integrity": "sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ==",
"dependencies": {
"universalify": "^2.0.0"
},
"optionalDependencies": {
"graceful-fs": "^4.1.6"
}
},
"node_modules/punycode": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/punycode/-/punycode-1.3.2.tgz",
"integrity": "sha1-llOgNvt8HuQjQvIyXM7v6jkmxI0="
},
"node_modules/querystring": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/querystring/-/querystring-0.2.0.tgz",
"integrity": "sha1-sgmEkgO7Jd+CDadW50cAWHhSFiA=",
"deprecated": "The querystring API is considered Legacy. new code should use the URLSearchParams API instead.",
"engines": {
"node": ">=0.4.x"
}
},
"node_modules/require-directory": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
"integrity": "sha1-jGStX9MNqxyXbiNE/+f3kqam30I=",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/sax": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/sax/-/sax-1.2.1.tgz",
"integrity": "sha1-e45lYZCyKOgaZq6nSEgNgozS03o="
},
"node_modules/string-width": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
"dependencies": {
"emoji-regex": "^8.0.0",
"is-fullwidth-code-point": "^3.0.0",
"strip-ansi": "^6.0.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/strip-ansi": {
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
"dependencies": {
"ansi-regex": "^5.0.1"
},
"engines": {
"node": ">=8"
}
},
"node_modules/universalify": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.0.tgz",
"integrity": "sha512-hAZsKq7Yy11Zu1DE0OzWjw7nnLZmJZYTDZZyEFHZdUhV8FkH5MCfoU1XMaxXovpyW5nq5scPqq0ZDP9Zyl04oQ==",
"engines": {
"node": ">= 10.0.0"
}
},
"node_modules/url": {
"version": "0.10.3",
"resolved": "https://registry.npmjs.org/url/-/url-0.10.3.tgz",
"integrity": "sha1-Ah5NnHcF8hu/N9A861h2dAJ3TGQ=",
"dependencies": {
"punycode": "1.3.2",
"querystring": "0.2.0"
}
},
"node_modules/uuid": {
"version": "3.3.2",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-3.3.2.tgz",
"integrity": "sha512-yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA==",
"deprecated": "Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.",
"bin": {
"uuid": "bin/uuid"
}
},
"node_modules/wrap-ansi": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
"integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
"dependencies": {
"ansi-styles": "^4.0.0",
"string-width": "^4.1.0",
"strip-ansi": "^6.0.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/chalk/wrap-ansi?sponsor=1"
}
},
"node_modules/xml2js": {
"version": "0.4.19",
"resolved": "https://registry.npmjs.org/xml2js/-/xml2js-0.4.19.tgz",
"integrity": "sha512-esZnJZJOiJR9wWKMyuvSE1y6Dq5LCuJanqhxslH2bxM6duahNZ+HMpCLhBQGZkbX6xRf8x1Y2eJlgt2q3qo49Q==",
"dependencies": {
"sax": ">=0.6.0",
"xmlbuilder": "~9.0.1"
}
},
"node_modules/xmlbuilder": {
"version": "9.0.7",
"resolved": "https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-9.0.7.tgz",
"integrity": "sha1-Ey7mPS7FVlxVfiD0wi35rKaGsQ0=",
"engines": {
"node": ">=4.0"
}
},
"node_modules/y18n": {
"version": "5.0.8",
"resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
"integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==",
"engines": {
"node": ">=10"
}
},
"node_modules/yargs": {
"version": "17.3.0",
"resolved": "https://registry.npmjs.org/yargs/-/yargs-17.3.0.tgz",
"integrity": "sha512-GQl1pWyDoGptFPJx9b9L6kmR33TGusZvXIZUT+BOz9f7X2L94oeAskFYLEg/FkhV06zZPBYLvLZRWeYId29lew==",
"dependencies": {
"cliui": "^7.0.2",
"escalade": "^3.1.1",
"get-caller-file": "^2.0.5",
"require-directory": "^2.1.1",
"string-width": "^4.2.3",
"y18n": "^5.0.5",
"yargs-parser": "^21.0.0"
},
"engines": {
"node": ">=12"
}
},
"node_modules/yargs-parser": {
"version": "21.0.0",
"resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.0.0.tgz",
"integrity": "sha512-z9kApYUOCwoeZ78rfRYYWdiU/iNL6mwwYlkkZfJoyMR1xps+NEBX5X7XmRpxkZHhXJ6+Ey00IwKxBBSW9FIjyA==",
"engines": {
"node": ">=12"
}
}
},
"dependencies": {
"ansi-regex": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ=="
},
"ansi-styles": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
"requires": {
"color-convert": "^2.0.1"
}
},
"aws-sdk": {
"version": "2.1027.0",
"resolved": "https://registry.npmjs.org/aws-sdk/-/aws-sdk-2.1027.0.tgz",
"integrity": "sha512-j3UjPV9hzyCvkmfcbhRscMggdmrPqlhvo8QzkXCGFfPXjZMh1OJd4HkCEH2NaunzLOyF2Y3QzxKrGOLMT7sNzg==",
"requires": {
"buffer": "4.9.2",
"events": "1.1.1",
"ieee754": "1.1.13",
"jmespath": "0.15.0",
"querystring": "0.2.0",
"sax": "1.2.1",
"url": "0.10.3",
"uuid": "3.3.2",
"xml2js": "0.4.19"
}
},
"axios": {
"version": "0.24.0",
"resolved": "https://registry.npmjs.org/axios/-/axios-0.24.0.tgz",
"integrity": "sha512-Q6cWsys88HoPgAaFAVUb0WpPk0O8iTeisR9IMqy9G8AbO4NlpVknrnQS03zzF9PGAWgO3cgletO3VjV/P7VztA==",
"requires": {
"follow-redirects": "^1.14.4"
}
},
"base64-js": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA=="
},
"buffer": {
"version": "4.9.2",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-4.9.2.tgz",
"integrity": "sha512-xq+q3SRMOxGivLhBNaUdC64hDTQwejJ+H0T/NB1XMtTVEwNTrfFF3gAxiyW0Bu/xWEGhjVKgUcMhCrUy2+uCWg==",
"requires": {
"base64-js": "^1.0.2",
"ieee754": "^1.1.4",
"isarray": "^1.0.0"
}
},
"cliui": {
"version": "7.0.4",
"resolved": "https://registry.npmjs.org/cliui/-/cliui-7.0.4.tgz",
"integrity": "sha512-OcRE68cOsVMXp1Yvonl/fzkQOyjLSu/8bhPDfQt0e0/Eb283TKP20Fs2MqoPsr9SwA595rRCA+QMzYc9nBP+JQ==",
"requires": {
"string-width": "^4.2.0",
"strip-ansi": "^6.0.0",
"wrap-ansi": "^7.0.0"
}
},
"color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"requires": {
"color-name": "~1.1.4"
}
},
"color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
},
"emoji-regex": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
},
"escalade": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.1.tgz",
"integrity": "sha512-k0er2gUkLf8O0zKJiAhmkTnJlTvINGv7ygDNPbeIsX/TJjGJZHuh9B2UxbsaEkmlEo9MfhrSzmhIlhRlI2GXnw=="
},
"events": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/events/-/events-1.1.1.tgz",
"integrity": "sha1-nr23Y1rQmccNzEwqH1AEKI6L2SQ="
},
"follow-redirects": {
"version": "1.14.5",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.14.5.tgz",
"integrity": "sha512-wtphSXy7d4/OR+MvIFbCVBDzZ5520qV8XfPklSN5QtxuMUJZ+b0Wnst1e1lCDocfzuCkHqj8k0FpZqO+UIaKNA=="
},
"fs-extra": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.0.0.tgz",
"integrity": "sha512-C5owb14u9eJwizKGdchcDUQeFtlSHHthBk8pbX9Vc1PFZrLombudjDnNns88aYslCyF6IY5SUw3Roz6xShcEIQ==",
"requires": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
}
},
"get-caller-file": {
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
"integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg=="
},
"graceful-fs": {
"version": "4.2.8",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.8.tgz",
"integrity": "sha512-qkIilPUYcNhJpd33n0GBXTB1MMPp14TxEsEs0pTrsSVucApsYzW5V+Q8Qxhik6KU3evy+qkAAowTByymK0avdg=="
},
"ieee754": {
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.1.13.tgz",
"integrity": "sha512-4vf7I2LYV/HaWerSo3XmlMkp5eZ83i+/CDluXi/IGTs/O1sejBNhTtnxzmRZfvOUqj7lZjqHkeTvpgSFDlWZTg=="
},
"is-fullwidth-code-point": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
"integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg=="
},
"isarray": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
"integrity": "sha1-u5NdSFgsuhaMBoNJV6VKPgcSTxE="
},
"jmespath": {
"version": "0.15.0",
"resolved": "https://registry.npmjs.org/jmespath/-/jmespath-0.15.0.tgz",
"integrity": "sha1-o/Iiqarp+Wb10nx5ZRDigJF2Qhc="
},
"jsonfile": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.1.0.tgz",
"integrity": "sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ==",
"requires": {
"graceful-fs": "^4.1.6",
"universalify": "^2.0.0"
}
},
"punycode": {
"version": "1.3.2",
"resolved": "https://registry.npmjs.org/punycode/-/punycode-1.3.2.tgz",
"integrity": "sha1-llOgNvt8HuQjQvIyXM7v6jkmxI0="
},
"querystring": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/querystring/-/querystring-0.2.0.tgz",
"integrity": "sha1-sgmEkgO7Jd+CDadW50cAWHhSFiA="
},
"require-directory": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
"integrity": "sha1-jGStX9MNqxyXbiNE/+f3kqam30I="
},
"sax": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/sax/-/sax-1.2.1.tgz",
"integrity": "sha1-e45lYZCyKOgaZq6nSEgNgozS03o="
},
"string-width": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
"requires": {
"emoji-regex": "^8.0.0",
"is-fullwidth-code-point": "^3.0.0",
"strip-ansi": "^6.0.1"
}
},
"strip-ansi": {
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
"requires": {
"ansi-regex": "^5.0.1"
}
},
"universalify": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.0.tgz",
"integrity": "sha512-hAZsKq7Yy11Zu1DE0OzWjw7nnLZmJZYTDZZyEFHZdUhV8FkH5MCfoU1XMaxXovpyW5nq5scPqq0ZDP9Zyl04oQ=="
},
"url": {
"version": "0.10.3",
"resolved": "https://registry.npmjs.org/url/-/url-0.10.3.tgz",
"integrity": "sha1-Ah5NnHcF8hu/N9A861h2dAJ3TGQ=",
"requires": {
"punycode": "1.3.2",
"querystring": "0.2.0"
}
},
"uuid": {
"version": "3.3.2",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-3.3.2.tgz",
"integrity": "sha512-yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA=="
},
"wrap-ansi": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
"integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
"requires": {
"ansi-styles": "^4.0.0",
"string-width": "^4.1.0",
"strip-ansi": "^6.0.0"
}
},
"xml2js": {
"version": "0.4.19",
"resolved": "https://registry.npmjs.org/xml2js/-/xml2js-0.4.19.tgz",
"integrity": "sha512-esZnJZJOiJR9wWKMyuvSE1y6Dq5LCuJanqhxslH2bxM6duahNZ+HMpCLhBQGZkbX6xRf8x1Y2eJlgt2q3qo49Q==",
"requires": {
"sax": ">=0.6.0",
"xmlbuilder": "~9.0.1"
}
},
"xmlbuilder": {
"version": "9.0.7",
"resolved": "https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-9.0.7.tgz",
"integrity": "sha1-Ey7mPS7FVlxVfiD0wi35rKaGsQ0="
},
"y18n": {
"version": "5.0.8",
"resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
"integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA=="
},
"yargs": {
"version": "17.3.0",
"resolved": "https://registry.npmjs.org/yargs/-/yargs-17.3.0.tgz",
"integrity": "sha512-GQl1pWyDoGptFPJx9b9L6kmR33TGusZvXIZUT+BOz9f7X2L94oeAskFYLEg/FkhV06zZPBYLvLZRWeYId29lew==",
"requires": {
"cliui": "^7.0.2",
"escalade": "^3.1.1",
"get-caller-file": "^2.0.5",
"require-directory": "^2.1.1",
"string-width": "^4.2.3",
"y18n": "^5.0.5",
"yargs-parser": "^21.0.0"
}
},
"yargs-parser": {
"version": "21.0.0",
"resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.0.0.tgz",
"integrity": "sha512-z9kApYUOCwoeZ78rfRYYWdiU/iNL6mwwYlkkZfJoyMR1xps+NEBX5X7XmRpxkZHhXJ6+Ey00IwKxBBSW9FIjyA=="
}
}
}

View File

@@ -0,0 +1,8 @@
{
"dependencies": {
"aws-sdk": "^2.1026.0",
"axios": "^0.24.0",
"fs-extra": "^10.0.0",
"yargs": "^17.3.0"
}
}

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -0,0 +1,110 @@
const axios = require('axios');
const fs = require('fs').promises;
const { API_KEY } = require('../batch-config.json');

const getDataDictionaryOptions = (providerUoi, recipientUoi, data) => {
  if (!providerUoi) throw new Error('providerUoi is required!');
  if (!recipientUoi) throw new Error('recipientUoi is required!');
  if (!data) throw new Error('data is required!');
  return {
    method: 'post',
    baseURL: 'https://certification.reso.org',
    url: `/api/v1/certification_reports/data_dictionary/${providerUoi}`,
    headers: {
      'Authorization': `ApiKey ${API_KEY}`,
      'recipientUoi': recipientUoi,
      'Content-Type': 'application/json',
      'User-Agent': 'CommanderBatchProcess/0.1',
      'Accept': '*/*',
      'Cache-Control': 'no-cache',
      'Host': 'certification.reso.org',
      'Accept-Encoding': 'gzip, deflate',
      'Connection': 'keep-alive'
    },
    data
  };
};

const getDataAvailabilityOptions = (metadataReportId, data) => {
  if (!metadataReportId) throw new Error('metadataReportId is required!');
  if (!data) throw new Error('data is required!');
  return {
    method: 'post',
    baseURL: 'https://certification.reso.org',
    url: `/api/v1/payload/data_availability/${metadataReportId}`,
    headers: {
      'Authorization': `ApiKey ${API_KEY}`,
      'Content-Type': 'application/json',
      'User-Agent': 'CommanderBatchProcess/0.1',
      'Accept': '*/*',
      'Cache-Control': 'no-cache',
      'Host': 'certification.reso.org',
      'Accept-Encoding': 'gzip, deflate',
      'Connection': 'keep-alive'
    },
    data
  };
};

const buildDataDictionaryResultsPath = (providerUoi, recipientUoi) => `${providerUoi}/${recipientUoi}/metadata-report.json`;
const buildDataAvailabilityResultsPath = (providerUoi, recipientUoi) => `${providerUoi}/${recipientUoi}/data-availability-report.json`;

const postDataDictionaryResultsToApi = async (providerUoi, recipientUoi) => {
  if (!providerUoi) throw new Error('providerUoi is required!');
  if (!recipientUoi) throw new Error('recipientUoi is required!');
  try {
    const data = await fs.readFile(buildDataDictionaryResultsPath(providerUoi, recipientUoi), 'utf8');
    const { data: responseData } = await axios(getDataDictionaryOptions(providerUoi, recipientUoi, data));
    if (!responseData || !responseData.id) throw new Error('Did not receive the required id parameter from the response!');
    return responseData.id;
  } catch (err) {
    throw new Error('Could not post data dictionary results to API!\n' + err);
  }
};

const postDataAvailabilityResultsToApi = async (metadataReportId, providerUoi, recipientUoi) => {
  if (!metadataReportId) throw new Error('metadataReportId is required!');
  try {
    const data = await fs.readFile(buildDataAvailabilityResultsPath(providerUoi, recipientUoi), 'utf8');
    const { data: responseData } = await axios(getDataAvailabilityOptions(metadataReportId, data));
    if (!responseData || !responseData.success) throw new Error('API did not report a successful response!');
    return responseData.id;
  } catch (err) {
    throw new Error('Could not post data availability results to API!\n' + err);
  }
};

const snooze = ms => new Promise(resolve => setTimeout(resolve, ms));

const processDataDictionaryResults = async (providerUoi, recipientUoi) => {
  try {
    await snooze(5 * 1000); //wait 5s for the dust to settle to avoid thrashing the server
    console.log('Posting Data Dictionary results...');
    const reportId = await postDataDictionaryResultsToApi(providerUoi, recipientUoi);
    console.log('Results posted, reportId: ' + reportId);
    await snooze(5 * 1000); //wait 5s for the dust to settle to avoid thrashing the server
    if (reportId) {
      console.log('Posting data availability results for reportId: ' + reportId);
      return await postDataAvailabilityResultsToApi(reportId, providerUoi, recipientUoi);
    }
  } catch (err) {
    throw new Error('Could not process data dictionary results!\nError: ' + err);
  }
  return null;
};

module.exports = {
  processDataDictionaryResults
};
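The throttling between API calls relies on the one-line `snooze` helper defined in this file; a self-contained sketch of that pattern (the 100 ms delay below is illustrative only — the batch script itself waits 5 s between posts):

```javascript
// Promise-based sleep: resolves after roughly `ms` milliseconds,
// letting async code pause between requests without blocking the event loop.
const snooze = ms => new Promise(resolve => setTimeout(resolve, ms));

const main = async () => {
  const start = Date.now();
  await snooze(100); // pause between posts to avoid thrashing the server
  console.log(`waited ~${Date.now() - start}ms`);
};

main();
```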

View File

@@ -0,0 +1,324 @@
const fs = require('fs').promises;
const { standardMeta } = require('../references/standardMeta');
const { lookupMap } = require('../references/lookupMap.js');
/**
* Defines the bins template for stats.
* @returns bins template with all bins initialized to 0.
*/
const getBinsTemplate = () => {
return {
eq0: 0,
gt0: 0,
gte25: 0,
gte50: 0,
gte75: 0,
eq100: 0
};
};
/**
* Defines the totals template for stats.
* @returns totals template with all bins initialized to 0.
*/
const getTotalsTemplate = () => {
return {
total: getBinsTemplate(),
reso: getBinsTemplate(),
idx: getBinsTemplate(),
local: getBinsTemplate()
};
};
/**
* Defines the availability template for stats. This is the structure of the processed results.
* @returns availability template with all totals and bins initialized to 0.
*/
const getAvailabilityTemplate = () => {
return {
fields: [],
lookups: [],
lookupValues: [],
resources: [],
availability: {
fields: getTotalsTemplate(),
lookups: getTotalsTemplate(),
resources: {},
resourcesBinary: {}
}
};
};
/**
* Builds a standard field cache from a list of standard fields.
* @param {Array} fields an array of standard fields.
* @returns map of all standard fields addressable by cache[resourceName][fieldName]
* or an empty map if there are none.
*/
const createStandardFieldCache = (fields = []) => {
const resourceFieldCache = {};
fields.forEach(field => {
if (!resourceFieldCache[field.resourceName]) {
resourceFieldCache[field.resourceName] = {};
}
resourceFieldCache[field.resourceName][field.fieldName] = field;
});
return resourceFieldCache;
};
/**
* Builds a lookup value cache from a list of individual lookup value items.
* @param {Array} lookupValues the lookup values to create the cache from.
* @returns map of all lookups addressable by cache[resourceName][fieldName][lookupValue]
* or an empty map if there are none.
*/
const createLookupValueCache = (lookupValues = []) => {
const resourceFieldLookupCache = {};
lookupValues.forEach(lookupValue => {
if (!resourceFieldLookupCache[lookupValue.resourceName]) {
resourceFieldLookupCache[lookupValue.resourceName] = {};
}
if (!resourceFieldLookupCache[lookupValue.resourceName][lookupValue.fieldName]) {
resourceFieldLookupCache[lookupValue.resourceName][lookupValue.fieldName] = {};
}
resourceFieldLookupCache[lookupValue.resourceName][lookupValue.fieldName][lookupValue.lookupValue] = lookupValue;
});
return resourceFieldLookupCache;
}
/**
* Determines whether a given field is an IDX field.
* TODO: The performance could be improved here in that there's a filter being done on each payloads array.
* There's potential speedup if each payload were turned into a nested property rather than an array.
* @param {String} resourceName the name of the resource for the field.
* @param {String} fieldName the name of the field.
* @param {Object} standardFieldCache a field cache created by createStandardFieldCache().
* @returns true if the given field is an IDX field, false otherwise.
*/
const isIdxField = (resourceName, fieldName, standardFieldCache = {}) => resourceName && fieldName
&& isResoField(resourceName, fieldName, standardFieldCache)
&& standardFieldCache[resourceName][fieldName].payloads.some(payload => payload === 'IDX');
/**
* Determines whether a given field is a RESO field.
* @param {String} resourceName the name of the resource for the field.
* @param {String} fieldName the name of the field.
* @param {Object} standardFieldCache a field cache created by createStandardFieldCache().
* @returns true if the given field is a RESO field, false otherwise.
*/
const isResoField = (resourceName, fieldName, standardFieldCache = {}) => resourceName && fieldName
&& standardFieldCache[resourceName] && !!standardFieldCache[resourceName][fieldName];
/**
* Determines if a given lookup is a RESO lookup.
* @param {*} resourceName the name of the resource for the field.
* @param {*} fieldName the name of the field.
* @param {*} lookupValue the name of the lookup to test.
* @param {*} standardFieldCache a field cache created by createStandardFieldCache().
* @returns the RESO lookup, if found, otherwise null.
*/
const findResoLookup = (resourceName, fieldName, lookupValue, standardFieldCache = {}) => {
if (resourceName && fieldName && standardFieldCache[resourceName] && standardFieldCache[resourceName][fieldName]) {
const field = standardFieldCache[resourceName][fieldName];
if (field && field.simpleDataType.includes('String List') && field.type.includes('.')) {
const lookupName = field.type.substring(field.type.lastIndexOf('.') + 1, field.type.length);
const lookup = lookupMap[lookupName] && lookupMap[lookupName].find(x => x.lookupValue === lookupValue || x.lookupDisplayName === lookupValue);
//TODO: turn the lookup map into its own inverted hash by lookup values and display names
return lookup ? { lookupName, lookup } : null;
}
}
return null;
};
/**
* Computes availability from existing bins.
* @param {Number} availability the current availability value.
* @param {Object} bins existing bins containing past availability values.
* @returns a new object following the getBinsTemplate structure that contains updated availabilities for each bin.
*/
const computeBins = (availability, bins) => {
if (!bins) return getBinsTemplate();
return {
eq0: availability === 0 ? bins.eq0 + 1 : bins.eq0 || 0,
gt0: availability > 0 ? bins.gt0 + 1 : bins.gt0 || 0,
gte25: availability >= 0.25 ? bins.gte25 + 1 : bins.gte25 || 0,
gte50: availability >= 0.5 ? bins.gte50 + 1 : bins.gte50 || 0,
gte75: availability >= 0.75 ? bins.gte75 + 1 : bins.gte75 || 0,
eq100: availability === 1 ? bins.eq100 + 1 : bins.eq100 || 0
}
};
/**
* Translates existing numeric bins into booleans.
* @param {Object} bins existing bins object.
* @returns the resulting bins object with values transformed to booleans.
*/
const computeBooleanBins = bins => {
const booleanBins = {};
Object.entries(bins).forEach( ([bin, value]) => booleanBins[bin] = !!value);
return booleanBins;
}
/**
* Computes availability from discrete bins, meaning ones with integer values (tallies).
* @param {Object} discreteBins bins using the getBinsTemplate() structure with integer availability values.
* @param {Number} resourceSampleCount the count of the number of sampled records for a given resource.
* @returns a bins object with the decimal availabilities computed.
*/
const computeAvailabilityFromDiscreteBins = (discreteBins = getBinsTemplate(), resourceSampleCount = 0) => {
if (!resourceSampleCount) return discreteBins;
const availabilities = {};
Object.entries(discreteBins).forEach(([binName, value]) => availabilities[binName] = 1.0 * value / resourceSampleCount);
return availabilities;
};
/**
* Processes a RESO Data Availability Report and creates aggregates and rollups.
* TODO: individual totals calculations could be tidied up a bit.
* @param {Object} availabilityReport the RESO availability report JSON to process.
* @returns a JSON availability report with the appropriate rollups and aggregates.
*/
const process = async availabilityReport => {
//iterate over each field and lookup and compute their availabilities
const { resources, fields, lookups, lookupValues } = availabilityReport;
const transformed = getAvailabilityTemplate();
const standardFieldCache = createStandardFieldCache(standardMeta.fields);
const resourceCounts = {};
resources.forEach(resource => resourceCounts[resource.resourceName] = resource.numRecordsFetched);
const processedFields = [], processedLookupValues = [], lookupValueCache = createLookupValueCache(lookupValues);
//binary resource availability cache
const resourcesBinary = {};
//process fields
fields.forEach(field => {
const availability = resourceCounts[field.resourceName] !== 0 ? 1.0 * field.frequency / resourceCounts[field.resourceName] : 0;
const fieldBins = computeBins(availability, getBinsTemplate());
//update field availability
transformed.availability.fields.total = computeBins(availability, transformed.availability.fields.total);
//add totals template for this resource name if it doesn't already exist
if (!resourcesBinary[field.resourceName]) {
resourcesBinary[field.resourceName] = { fields: getTotalsTemplate(), lookups: getTotalsTemplate() };
}
//update binary resource bins
resourcesBinary[field.resourceName].fields.total = computeBins(availability, resourcesBinary[field.resourceName].fields.total);
if (isResoField(field.resourceName, field.fieldName, standardFieldCache)) {
//update RESO totals
transformed.availability.fields.reso = computeBins(availability, transformed.availability.fields.reso);
resourcesBinary[field.resourceName].fields.reso = computeBins(availability, resourcesBinary[field.resourceName].fields.reso);
if (isIdxField(field.resourceName, field.fieldName, standardFieldCache)) {
//update IDX totals
transformed.availability.fields.idx = computeBins(availability, transformed.availability.fields.idx);
resourcesBinary[field.resourceName].fields.idx = computeBins(availability, resourcesBinary[field.resourceName].fields.idx);
}
} else {
//otherwise, update local totals
transformed.availability.fields.local = computeBins(availability, transformed.availability.fields.local);
resourcesBinary[field.resourceName].fields.local = computeBins(availability, resourcesBinary[field.resourceName].fields.local);
}
//only process if there are lookups for this field
const lookupsForField = lookupValueCache[field.resourceName] && lookupValueCache[field.resourceName][field.fieldName];
if (lookupsForField) {
Object.values(lookupsForField).forEach(lookupValue => {
if (lookupValue.lookupValue !== 'null' && lookupValue.lookupValue !== 'NULL_VALUE') {
const lookupAvailability = !!lookupValue.frequency && !!resourceCounts[field.resourceName]
? 1.0 * lookupValue.frequency / resourceCounts[field.resourceName] : 0;
const lookupBins = computeBins(lookupAvailability, getBinsTemplate());
//lookup rollups tally the lookup's own availability rather than the parent field's
transformed.availability.lookups.total = computeBins(lookupAvailability, transformed.availability.lookups.total);
resourcesBinary[field.resourceName].lookups.total = computeBins(lookupAvailability, resourcesBinary[field.resourceName].lookups.total);
if (isResoField(lookupValue.resourceName, lookupValue.fieldName, standardFieldCache) &&
findResoLookup(lookupValue.resourceName, lookupValue.fieldName, lookupValue.lookupValue, standardFieldCache)) {
transformed.availability.lookups.reso = computeBins(lookupAvailability, transformed.availability.lookups.reso);
resourcesBinary[field.resourceName].lookups.reso = computeBins(lookupAvailability, resourcesBinary[field.resourceName].lookups.reso);
if (isIdxField(lookupValue.resourceName, lookupValue.fieldName, standardFieldCache)) {
transformed.availability.lookups.idx = computeBins(lookupAvailability, transformed.availability.lookups.idx);
resourcesBinary[field.resourceName].lookups.idx = computeBins(lookupAvailability, resourcesBinary[field.resourceName].lookups.idx);
}
} else {
transformed.availability.lookups.local = computeBins(lookupAvailability, transformed.availability.lookups.local);
resourcesBinary[field.resourceName].lookups.local = computeBins(lookupAvailability, resourcesBinary[field.resourceName].lookups.local);
}
processedLookupValues.push({ ...lookupValue, lookupAvailability, ...computeBooleanBins(lookupBins) });
}
});
}
processedFields.push({ ...field, availability, ...computeBooleanBins(fieldBins) });
});
transformed.resources = resources;
transformed.fields = processedFields;
transformed.lookups = lookups;
transformed.lookupValues = processedLookupValues;
transformed.availability.resourcesBinary = resourcesBinary;
//compute resource availability rollups from the discrete bins
const resourceAvailability = {};
Object.entries(resourcesBinary).forEach(([resourceName, value]) => {
if (!resourceAvailability[resourceName]) resourceAvailability[resourceName] = {};
const { fields = getTotalsTemplate(), lookups = getTotalsTemplate() } = value;
const resourceCount = resourceCounts[resourceName] || 0;
resourceAvailability[resourceName].fields = {
total: computeAvailabilityFromDiscreteBins(fields.total, resourceCount),
reso: computeAvailabilityFromDiscreteBins(fields.reso, resourceCount),
idx: computeAvailabilityFromDiscreteBins(fields.idx, resourceCount),
local: computeAvailabilityFromDiscreteBins(fields.local, resourceCount)
};
resourceAvailability[resourceName].lookups = {
total: computeAvailabilityFromDiscreteBins(lookups.total, resourceCount),
reso: computeAvailabilityFromDiscreteBins(lookups.reso, resourceCount),
idx: computeAvailabilityFromDiscreteBins(lookups.idx, resourceCount),
local: computeAvailabilityFromDiscreteBins(lookups.local, resourceCount)
};
});
transformed.availability.resources = resourceAvailability;
return transformed;
}
/**
* Processes a RESO data availability report at the given path and writes it to a local file
* in the current path called 'availability-processed.json'.
* @param {String} pathToDataAvailabilityReport the path to the data availability report to process.
*/
const processDataAvailabilityReport = async pathToDataAvailabilityReport => {
try {
const availabilityReport = JSON.parse(await fs.readFile(pathToDataAvailabilityReport, 'utf8'));
const startTime = new Date();
await fs.writeFile('./availability-processed.json', JSON.stringify(await process(availabilityReport)));
console.log('Time taken: ', new Date() - startTime, 'ms');
} catch (err) {
console.error(err);
}
}
module.exports = {
processDataAvailabilityReport
};
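The binning scheme above amounts to a set of threshold tallies; a standalone re-implementation (for illustration only — the module itself exports just `processDataAvailabilityReport`) shows how repeated calls accumulate counts:

```javascript
// Re-implementation of the availability bins for illustration.
const getBinsTemplate = () => ({ eq0: 0, gt0: 0, gte25: 0, gte50: 0, gte75: 0, eq100: 0 });

// Increment each threshold counter that the given availability satisfies.
const computeBins = (availability, bins = getBinsTemplate()) => ({
  eq0: availability === 0 ? bins.eq0 + 1 : bins.eq0 || 0,
  gt0: availability > 0 ? bins.gt0 + 1 : bins.gt0 || 0,
  gte25: availability >= 0.25 ? bins.gte25 + 1 : bins.gte25 || 0,
  gte50: availability >= 0.5 ? bins.gte50 + 1 : bins.gte50 || 0,
  gte75: availability >= 0.75 ? bins.gte75 + 1 : bins.gte75 || 0,
  eq100: availability === 1 ? bins.eq100 + 1 : bins.eq100 || 0
});

// Tally three sampled field availabilities: 0%, 50%, and 100%.
const bins = [0, 0.5, 1].reduce((acc, a) => computeBins(a, acc), getBinsTemplate());
console.log(bins); // { eq0: 1, gt0: 2, gte25: 2, gte50: 2, gte75: 1, eq100: 1 }
```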

View File

@@ -83,17 +83,6 @@
</ClientSettings>
<!--
############################################################
Parameters Section - add your testing variables here
############################################################-->
<Parameters>
<!-- OPTIONAL: Useful for testing the OData Format Parameter - Value="?$format=application/xml" -->
<Parameter Name="OptionalMetadataFormatParameter" Value="" />
</Parameters>
<!--
############################################################
Requests Section - Queries used during testing,
@@ -103,8 +92,8 @@
<Request
RequestId="metadata-request"
OutputFile="metadata-metadata-request.xml"
Url="*ClientSettings_WebAPIURI*/$metadata*Parameter_OptionalMetadataFormatParameter*"
OutputFile="metadata-request.xml"
Url="*ClientSettings_WebAPIURI*/$metadata?$format=application/xml"
/>
</Requests>

View File

@@ -202,10 +202,6 @@
Value="*Parameter_MultipleValueLookupNamespace*'*Parameter_MultipleLookupValue1*'"/>
<Parameter Name="MultipleValueLookupValue2"
Value="*Parameter_MultipleValueLookupNamespace*'*Parameter_MultipleLookupValue2*'"/>
<!-- OPTIONAL: Useful for testing the OData Format Parameter - Value="?$format=application/xml" -->
<Parameter Name="OptionalMetadataFormatParameter" Value="?$format=application/xml"/>
</Parameters>
<!--
@@ -218,7 +214,7 @@
<Request
RequestId="metadata-request"
OutputFile="metadata-request.xml"
Url="*ClientSettings_WebAPIURI*/$metadata*Parameter_OptionalMetadataFormatParameter*"
Url="*ClientSettings_WebAPIURI*/$metadata?$format=application/xml"
/>
<Request

View File

@@ -20,6 +20,7 @@ import static org.junit.Assert.assertTrue;
import static org.reso.certification.codegen.WorksheetProcessor.WELL_KNOWN_DATA_TYPES.*;
import static org.reso.certification.codegen.WorksheetProcessor.WELL_KNOWN_FIELD_HEADERS.COLLECTION;
import static org.reso.certification.codegen.WorksheetProcessor.WELL_KNOWN_FIELD_HEADERS.STANDARD_NAME;
import static org.reso.commander.common.DataDictionaryMetadata.v1_7.LOOKUP_FIELDS_AND_VALUES;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
public abstract class WorksheetProcessor {
@@ -301,7 +302,8 @@ public abstract class WorksheetProcessor {
}
String getDirectoryName() {
return startTimestamp + "-" + REFERENCE_WORKSHEET.toLowerCase().substring(0, REFERENCE_WORKSHEET.lastIndexOf("."));
return startTimestamp + "-" + getReferenceResource()
.toLowerCase().substring(0, getReferenceResource().lastIndexOf("."));
}
public String getReferenceResource() {
@@ -331,11 +333,7 @@
}
public void buildEnumerationMap() {
final String ENUMERATION_TAB_NAME = "Lookup Fields and Values";
final int LOOKUP_NAME_INDEX = 0, STANDARD_NAME_INDEX = 1;
DataFormatter formatter = new DataFormatter();
Sheet sheet = getReferenceWorkbook().getSheet(ENUMERATION_TAB_NAME);
Sheet sheet = getReferenceWorkbook().getSheet(LOOKUP_FIELDS_AND_VALUES);
buildWellKnownStandardEnumerationHeaderMap(sheet);
AtomicReference<ReferenceStandardLookup> standardEnumeration = new AtomicReference<>();
@@ -350,7 +348,6 @@
standardEnumerationsMap.get(standardEnumeration.get().getLookupField()).add(standardEnumeration.get());
}
});
//enumerations.forEach((key, items) -> LOG.info("key: " + key + " , items: " + items.toString()));
}
//TODO: convert to parallel stream

View File

@@ -27,17 +27,20 @@ import org.reso.commander.common.DataDictionaryMetadata;
import org.reso.commander.common.TestUtils;
import org.reso.models.*;
import java.io.File;
import java.io.InputStream;
import java.net.URI;
import java.util.*;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.Collectors;
import static org.junit.Assert.*;
import static org.reso.commander.Commander.*;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
import static org.reso.commander.common.TestUtils.HEADER_ODATA_VERSION;
import static org.reso.commander.common.TestUtils.JSON_VALUE_PATH;
import static org.reso.models.Request.loadFromRESOScript;
/**
* Encapsulates Commander Requests and Responses during runtime
@@ -98,6 +101,7 @@ public final class WebAPITestContainer implements TestContainer {
private final AtomicReference<ODataEntitySetRequest<ClientEntitySet>> clientEntitySetRequest = new AtomicReference<>();
private final AtomicReference<ODataRetrieveResponse<ClientEntitySet>> clientEntitySetResponse = new AtomicReference<>();
private final AtomicReference<ClientEntitySet> clientEntitySet = new AtomicReference<>();
private static final String WEB_API_CORE_REFERENCE_REQUESTS = "reference-web-api-core-requests.xml";
//singleton variables
private static final AtomicReference<Map<String, Map<String, CsdlProperty>>> fieldMap = new AtomicReference<>();
@@ -110,6 +114,13 @@
Commander.Builder builder = new Commander.Builder().useEdmEnabledClient(true);
if (!isUsingMetadataFile.get()) {
//overwrite any requests loaded with the reference queries
//TODO: make the reference requests something that can be passed in during initialization
getSettings().setRequests(loadFromRESOScript(new File(Objects.requireNonNull(
getClass().getClassLoader().getResource(WEB_API_CORE_REFERENCE_REQUESTS)).getPath()))
.stream().map(request -> Settings.resolveParameters(request, getSettings())).collect(Collectors.toList()));
setServiceRoot(getSettings().getClientSettings().get(ClientSettings.SERVICE_ROOT));
//TODO: add base64 un-encode when applicable
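The hunk above resolves the bundled `reference-web-api-core-requests.xml` through the classloader. A standalone sketch of that lookup (hypothetical class name, not the Commander's actual code) shows the pattern and its main caveat:

```java
import java.io.File;
import java.net.URL;

// Sketch of the classpath lookup used above for the reference requests file.
// Caveat: URL#getPath() only maps to a usable File for resources that are
// NOT packaged inside a jar; jar-internal resources need getResourceAsStream.
public class ReferenceRequestsDemo {
    public static void main(String[] args) {
        final String name = "reference-web-api-core-requests.xml";
        URL url = ReferenceRequestsDemo.class.getClassLoader().getResource(name);
        // getResource() returns null when the file is not on the classpath
        System.out.println(url != null
                ? new File(url.getPath()).getName()
                : "not on classpath");
    }
}
```

Run standalone (without the Commander's resources on the classpath), this prints `not on classpath`.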
@ -125,7 +136,7 @@ public final class WebAPITestContainer implements TestContainer {
setRedirectUri(getSettings().getClientSettings().get(ClientSettings.REDIRECT_URI));
setScope(getSettings().getClientSettings().get(ClientSettings.CLIENT_SCOPE));
LOG.info("Service root is: " + getServiceRoot());
LOG.debug("Service root is: " + getServiceRoot());
builder
.clientId(getClientId())


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: ContactListingNotes
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: ContactListings
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Contacts
Background:


@ -0,0 +1,38 @@
# This file was autogenerated on: 20211212171220893
Feature: Field
Background:
Given a RESOScript or Metadata file is provided
When a RESOScript file is provided
Then Client Settings and Parameters can be read from the RESOScript
And a test container was successfully created from the given RESOScript file
And the test container uses an Authorization Code or Client Credentials for authentication
And valid metadata were retrieved from the server
When a metadata file is provided
Then a test container was successfully created from the given metadata file
And valid metadata are loaded into the test container
@Field
Scenario: FieldKey
When "FieldKey" exists in the "Field" metadata
Then "FieldKey" MUST be "String" data type
@Field
Scenario: ResourceName
When "ResourceName" exists in the "Field" metadata
Then "ResourceName" MUST be "String" data type
@Field
Scenario: FieldName
When "FieldName" exists in the "Field" metadata
Then "FieldName" MUST be "String" data type
@Field
Scenario: DisplayName
When "DisplayName" exists in the "Field" metadata
Then "DisplayName" MUST be "String" data type
@Field
Scenario: ModificationTimestamp
When "ModificationTimestamp" exists in the "Field" metadata
Then "ModificationTimestamp" MUST be "Timestamp" data type
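The generated Field scenarios above each assert one name/type pair; collected into a single table (a hypothetical sketch, not generated code), the expected shape is:

```java
import java.util.Map;

// Expected Data Dictionary data types for the Field resource,
// mirroring the generated scenarios above.
public class FieldTypeTable {
    static final Map<String, String> EXPECTED = Map.of(
            "FieldKey", "String",
            "ResourceName", "String",
            "FieldName", "String",
            "DisplayName", "String",
            "ModificationTimestamp", "Timestamp");

    public static void main(String[] args) {
        System.out.println(EXPECTED.get("FieldKey") + " "
                + EXPECTED.get("ModificationTimestamp"));
    }
}
```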


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: HistoryTransactional
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: InternetTracking
Background:


@ -0,0 +1,43 @@
# This file was autogenerated on: 20211212171220893
Feature: Lookup
Background:
Given a RESOScript or Metadata file is provided
When a RESOScript file is provided
Then Client Settings and Parameters can be read from the RESOScript
And a test container was successfully created from the given RESOScript file
And the test container uses an Authorization Code or Client Credentials for authentication
And valid metadata were retrieved from the server
When a metadata file is provided
Then a test container was successfully created from the given metadata file
And valid metadata are loaded into the test container
@Lookup
Scenario: LookupKey
When "LookupKey" exists in the "Lookup" metadata
Then "LookupKey" MUST be "String" data type
@Lookup
Scenario: LookupName
When "LookupName" exists in the "Lookup" metadata
Then "LookupName" MUST be "String" data type
@Lookup
Scenario: LookupValue
When "LookupValue" exists in the "Lookup" metadata
Then "LookupValue" MUST be "String" data type
@Lookup
Scenario: StandardLookupValue
When "StandardLookupValue" exists in the "Lookup" metadata
Then "StandardLookupValue" MUST be "String" data type
@Lookup
Scenario: LegacyODataValue
When "LegacyODataValue" exists in the "Lookup" metadata
Then "LegacyODataValue" MUST be "String" data type
@Lookup
Scenario: ModificationTimestamp
When "ModificationTimestamp" exists in the "Lookup" metadata
Then "ModificationTimestamp" MUST be "Timestamp" data type
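A quick way to read the Lookup scenarios above is as a presence-and-type check on each record; a minimal sketch (field names from the feature file, record values invented for illustration):

```java
import java.util.List;
import java.util.Map;

// Checks a sample Lookup record for the fields asserted above.
public class LookupRecordCheck {
    public static void main(String[] args) {
        List<String> required = List.of(
                "LookupKey", "LookupName", "LookupValue", "ModificationTimestamp");
        Map<String, Object> record = Map.of(
                "LookupKey", "12345",
                "LookupName", "StandardStatus",
                "LookupValue", "Active",
                "ModificationTimestamp", "2021-12-12T00:00:00Z");
        // true when every required field is present in the record
        System.out.println(record.keySet().containsAll(required));
    }
}
```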


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Media
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Member
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Office
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: OpenHouse
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: OtherPhone
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: OUID
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Property
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: PropertyGreenVerification
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: PropertyPowerProduction
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: PropertyRooms
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: PropertyUnitTypes
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Prospecting
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Queue
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Rules
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: SavedSearch
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Showing
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: SocialMedia
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: TeamMembers
Background:


@ -1,4 +1,4 @@
# This file was autogenerated on: 20210717230753113
# This file was autogenerated on: 20211212171220893
Feature: Teams
Background:


@ -25,13 +25,13 @@ Feature: Payloads Sampling (Web API)
Given that valid metadata have been requested from the server
And the metadata contains RESO Standard Resources
And "payload-samples" has been created in the build directory
Then up to 10000 records are sampled from each resource with payload samples stored in "payload-samples"
Then up to 100000 records are sampled from each resource with payload samples stored in "payload-samples"
@local-resource-sampling @dd-1.7 @payloads-sampling
Scenario: Non Standard Resource Sampling - Request Data from Each Server Resource
Given that valid metadata have been requested from the server
And the metadata contains local resources
Then up to 10000 records are sampled from each local resource
Then up to 100000 records are sampled from each local resource
@payloads-sampling @dd-1.7
Scenario: A Data Availability Report is Created from Sampled Records


@ -19,7 +19,7 @@ Feature: Web API Server Add/Edit Endorsement
#
# This is without the prefer header and minimal value
#
@create @create-succeeds @add-edit-endorsement @rcp-010 @1.0.2
@create @create-succeeds @add-edit-endorsement @rcp-010 @2.0.0
Scenario: Create operation succeeds using a given payload
Given valid metadata have been retrieved
And request data has been provided in "create-succeeds.json"
@ -64,7 +64,7 @@ Feature: Web API Server Add/Edit Endorsement
# OData-Version: 4.01
# Content-Type: application/json
# Accept: application/json
@create @create-fails @add-edit-endorsement @rcp-010 @1.0.2
@create @create-fails @add-edit-endorsement @rcp-010 @2.0.0
Scenario: Create operation fails using a given payload
Given valid metadata have been retrieved
And request data has been provided in "create-fails.json"


@ -8,7 +8,7 @@ Feature: Web API Server Core Endorsement
And a test container was successfully created from the given RESOScript
And the test container uses an authorization_code or client_credentials for authentication
@metadata-validation @core-endorsement @add-edit-endorsement @1.0.2
@metadata-validation @core-endorsement @add-edit-endorsement @2.0.0
Scenario: metadata-validation - Request and Validate Server Metadata
When XML Metadata are requested from the service root in "ClientSettings_WebAPIURI"
Then the server responds with a status code of 200
@ -22,7 +22,7 @@ Feature: Web API Server Core Endorsement
And the metadata contains the "Parameter_EndpointResource" resource
And the metadata contains at least one resource from "Parameter_WebAPI102_RequiredResourceList"
@service-document @core-endorsement @1.0.2
@service-document @core-endorsement @2.0.0
Scenario: service-document - Service Document Request
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "service-document"
@ -30,7 +30,7 @@ Feature: Web API Server Core Endorsement
And the server has an OData-Version header value of "4.0" or "4.01"
And the response is valid JSON
@fetch-by-key @core-endorsement @1.0.2
@fetch-by-key @core-endorsement @2.0.0
Scenario: fetch-by-key - fetch by Key Field
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "fetch-by-key"
@ -40,7 +40,7 @@ Feature: Web API Server Core Endorsement
And the response has singleton results in "Parameter_KeyField"
And the provided "Parameter_KeyValue" is returned in "Parameter_KeyField"
@select @core-endorsement @1.0.2
@select @core-endorsement @2.0.0
Scenario: select - Query Support: $select
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "select"
@ -51,7 +51,7 @@ Feature: Web API Server Core Endorsement
And resource metadata for "Parameter_EndpointResource" contains the fields in the given select list
And data are present for fields contained within the given select list
@top @core-endorsement @1.0.2
@top @core-endorsement @2.0.0
Scenario: top - Query Support: $top
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "top"
@ -63,7 +63,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And the number of results is less than or equal to "Parameter_TopCount"
@skip @core-endorsement @1.0.2
@skip @core-endorsement @2.0.0
Scenario: skip - Query Support: $skip
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "skip"
@ -82,7 +82,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And data in the "Parameter_Key" fields are different in the second request than in the first
@count @core-endorsement @1.0.2
@count @core-endorsement @2.0.0
Scenario: count - Query Support: $count=true
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "count"
@ -96,7 +96,7 @@ Feature: Web API Server Core Endorsement
# INTEGER COMPARISONS
##############################################
@filter-int-and @core-endorsement @1.0.2
@filter-int-and @core-endorsement @2.0.0
Scenario: filter-int-and - $filter - Integer Comparison: and
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-and"
@ -108,7 +108,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "gt" "Parameter_IntegerValueLow" "and" "lt" "Parameter_IntegerValueHigh"
@filter-int-or @core-endorsement @1.0.2
@filter-int-or @core-endorsement @2.0.0
Scenario: filter-int-or - $filter - Integer Comparison: or
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-or"
@ -120,7 +120,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "gt" "Parameter_IntegerValueLow" "or" "lt" "Parameter_IntegerValueHigh"
@filter-int-not @core-endorsement @1.0.2
@filter-int-not @core-endorsement @2.0.0
Scenario: filter-int-not - $filter - Integer Comparison: not() operator
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-not"
@ -132,7 +132,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_FilterNotField" "ne" "Parameter_FilterNotValue"
@filter-int-eq @core-endorsement @1.0.2
@filter-int-eq @core-endorsement @2.0.0
Scenario: filter-int-eq - $filter - Integer Comparison: eq
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-eq"
@ -144,7 +144,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "eq" "Parameter_IntegerValueLow"
@filter-int-ne @core-endorsement @1.0.2
@filter-int-ne @core-endorsement @2.0.0
Scenario: filter-int-ne - $filter - Integer Comparison: ne
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-ne"
@ -156,7 +156,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "ne" "Parameter_IntegerValueLow"
@filter-int-gt @core-endorsement @1.0.2
@filter-int-gt @core-endorsement @2.0.0
Scenario: filter-int-gt - $filter - Integer Comparison: gt
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-gt"
@ -168,7 +168,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "gt" "Parameter_IntegerValueLow"
@filter-int-ge @core-endorsement @1.0.2
@filter-int-ge @core-endorsement @2.0.0
Scenario: filter-int-ge - $filter - Integer Comparison: ge
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-ge"
@ -180,7 +180,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "ge" "Parameter_IntegerValueLow"
@filter-int-lt @core-endorsement @1.0.2
@filter-int-lt @core-endorsement @2.0.0
Scenario: filter-int-lt - $filter - Integer Comparison: lt
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-lt"
@ -192,7 +192,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Integer data in "Parameter_IntegerField" "lt" "Parameter_IntegerValueHigh"
@filter-int-le @core-endorsement @1.0.2
@filter-int-le @core-endorsement @2.0.0
Scenario: filter-int-le - $filter - Integer Comparison: le
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-int-le"
@ -209,7 +209,7 @@ Feature: Web API Server Core Endorsement
# DECIMAL COMPARISONS
##############################################
@filter-decimal-ne @core-endorsement @1.0.2
@filter-decimal-ne @core-endorsement @2.0.0
Scenario: filter-decimal-ne - $filter - Decimal Comparison: ne
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-decimal-ne"
@ -221,7 +221,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Decimal data in "Parameter_DecimalField" "ne" "Parameter_DecimalValueLow"
@filter-decimal-gt @core-endorsement @1.0.2
@filter-decimal-gt @core-endorsement @2.0.0
Scenario: filter-decimal-gt - $filter - Decimal Comparison: gt
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-decimal-gt"
@ -233,7 +233,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Decimal data in "Parameter_DecimalField" "gt" "Parameter_DecimalValueLow"
@filter-decimal-ge @core-endorsement @1.0.2
@filter-decimal-ge @core-endorsement @2.0.0
Scenario: filter-decimal-ge - $filter - Decimal Comparison: ge
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-decimal-ge"
@ -245,7 +245,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Decimal data in "Parameter_DecimalField" "ge" "Parameter_DecimalValueLow"
@filter-decimal-lt @core-endorsement @1.0.2
@filter-decimal-lt @core-endorsement @2.0.0
Scenario: filter-decimal-lt - $filter - Decimal Comparison: lt
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-decimal-lt"
@ -257,7 +257,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Decimal data in "Parameter_DecimalField" "lt" "Parameter_DecimalValueHigh"
@filter-decimal-le @core-endorsement @1.0.2
@filter-decimal-le @core-endorsement @2.0.0
Scenario: filter-decimal-le - $filter - Decimal Comparison: le
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-decimal-le"
@ -274,7 +274,7 @@ Feature: Web API Server Core Endorsement
# ISO 8601 DATES IN 'yyyy-mm-dd' FORMAT
##############################################
@filter-date-eq @core-endorsement @1.0.2
@filter-date-eq @core-endorsement @2.0.0
Scenario: filter-date-eq - DateField eq 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-eq"
@ -286,7 +286,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Date data in "Parameter_DateField" "eq" "Parameter_DateValue"
@filter-date-ne @core-endorsement @1.0.2
@filter-date-ne @core-endorsement @2.0.0
Scenario: filter-date-ne - DateField ne 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-ne"
@ -298,7 +298,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Date data in "Parameter_DateField" "ne" "Parameter_DateValue"
@filter-date-gt @core-endorsement @1.0.2
@filter-date-gt @core-endorsement @2.0.0
Scenario: filter-date-gt - DateField gt 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-gt"
@ -310,7 +310,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Date data in "Parameter_DateField" "gt" "Parameter_DateValue"
@filter-date-ge @core-endorsement @1.0.2
@filter-date-ge @core-endorsement @2.0.0
Scenario: filter-date-ge - DateField ge 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-ge"
@ -322,7 +322,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Date data in "Parameter_DateField" "ge" "Parameter_DateValue"
@filter-date-lt @core-endorsement @1.0.2
@filter-date-lt @core-endorsement @2.0.0
Scenario: filter-date-lt - DateField lt 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-lt"
@ -334,7 +334,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Date data in "Parameter_DateField" "lt" "Parameter_DateValue"
@filter-date-le @core-endorsement @1.0.2
@filter-date-le @core-endorsement @2.0.0
Scenario: filter-date-le - DateField le 'yyyy-mm-dd' date value
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-date-le"
@ -351,7 +351,7 @@ Feature: Web API Server Core Endorsement
# ISO 8601 Timestamps
##############################################
@filter-datetime-gt @core-endorsement @1.0.2
@filter-datetime-gt @core-endorsement @2.0.0
Scenario: filter-datetime-gt - TimestampField gt DateTimeOffset
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-datetime-gt"
@ -363,7 +363,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" "gt" "Parameter_DateTimeValue"
@filter-datetime-ge @core-endorsement @1.0.2
@filter-datetime-ge @core-endorsement @2.0.0
Scenario: filter-datetime-ge - TimestampField ge DateTimeOffset
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-datetime-ge"
@ -375,7 +375,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" "ge" "Parameter_DateTimeValue"
@filter-datetime-lt-now @core-endorsement @1.0.2
@filter-datetime-lt-now @core-endorsement @2.0.0
Scenario: filter-datetime-lt-now - TimestampField lt now() DateTimeOffset
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-datetime-lt"
@ -387,7 +387,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" "lt" now()
@filter-datetime-le-now @core-endorsement @1.0.2
@filter-datetime-le-now @core-endorsement @2.0.0
Scenario: filter-datetime-le-now - TimestampField le now() DateTimeOffset
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-datetime-le"
@ -399,7 +399,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" "le" now()
@filter-datetime-ne-now @core-endorsement @1.0.2
@filter-datetime-ne-now @core-endorsement @2.0.0
Scenario: filter-datetime-ne-now - TimestampField ne now() DateTimeOffset
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-datetime-ne"
@ -416,7 +416,7 @@ Feature: Web API Server Core Endorsement
# ISO 8601 TIMESTAMP SORTING TESTS
##############################################
@orderby-timestamp-asc @core-endorsement @1.0.2
@orderby-timestamp-asc @core-endorsement @2.0.0
Scenario: orderby-timestamp-asc - Query Support: $orderby ascending
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "orderby-timestamp-asc"
@ -428,7 +428,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" is sorted in "asc" order
@orderby-timestamp-desc @core-endorsement @1.0.2
@orderby-timestamp-desc @core-endorsement @2.0.0
Scenario: orderby-timestamp-desc - Query Support: $orderby timestamp descending
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "orderby-timestamp-desc"
@ -445,7 +445,7 @@ Feature: Web API Server Core Endorsement
# ISO 8601 TIMESTAMP + INTEGER COMPARISONS
##############################################
@orderby-timestamp-asc-filter-int-gt @core-endorsement @1.0.2
@orderby-timestamp-asc-filter-int-gt @core-endorsement @2.0.0
Scenario: orderby-timestamp-asc-filter-int-gt - Query Support: $orderby timestamp asc
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "orderby-timestamp-asc-filter-int-gt"
@ -457,7 +457,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And DateTimeOffset data in "Parameter_TimestampField" is sorted in "asc" order
@orderby-timestamp-desc-filter-int-gt @core-endorsement @1.0.2
@orderby-timestamp-desc-filter-int-gt @core-endorsement @2.0.0
Scenario: orderby-timestamp-desc-filter-int-gt - Query Support: $orderby desc filtered
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "orderby-timestamp-desc-filter-int-gt"
@ -474,7 +474,7 @@ Feature: Web API Server Core Endorsement
# SINGLE VALUE ENUMERATIONS
##############################################
@filter-enum-single-has @core-endorsement @1.0.2
@filter-enum-single-has @core-endorsement @2.0.0
Scenario: filter-enum-single-has - Support Single Value Lookups
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-enum-single-has"
@ -486,7 +486,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Single Valued Enumeration Data in "Parameter_SingleValueLookupField" "has" "Parameter_SingleLookupValue"
@filter-enum-single-eq @core-endorsement @1.0.2
@filter-enum-single-eq @core-endorsement @2.0.0
Scenario: filter-enum-single-eq - Query Support: Single Edm.EnumType, eq
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-enum-single-eq"
@ -496,7 +496,7 @@ Feature: Web API Server Core Endorsement
And the response has results
And Single Valued Enumeration Data in "Parameter_SingleValueLookupField" "eq" "Parameter_SingleLookupValue"
@filter-enum-ne @core-endorsement @1.0.2
@filter-enum-ne @core-endorsement @2.0.0
Scenario: filter-enum-single-ne - Query Support: Single Edm.EnumType, ne
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-enum-single-ne"
@ -510,7 +510,7 @@ Feature: Web API Server Core Endorsement
# MULTI-VALUE ENUMERATIONS - IsFlags
##############################################
@filter-enum-multi-has @core-endorsement @1.0.2
@filter-enum-multi-has @core-endorsement @2.0.0
Scenario: filter-enum-multi-has - Support Multi Value Lookups
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-enum-multi-has"
@ -522,7 +522,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Multiple Valued Enumeration Data in "Parameter_MultipleValueLookupField" has "Parameter_MultipleLookupValue1"
@filter-enum-multi-has-and @core-endorsement @1.0.2
@filter-enum-multi-has-and @core-endorsement @2.0.0
Scenario: filter-enum-multi-has-and - Support Multi Value Lookups multiple values
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "filter-enum-multi-has-and"
@ -540,7 +540,7 @@ Feature: Web API Server Core Endorsement
# MULTI-VALUE ENUMERATIONS - Collections
##############################################
@filter-coll-enum-any @core-endorsement @1.0.2
@filter-coll-enum-any @core-endorsement @2.0.0
Scenario: filter-coll-enum-any - Collections for Multi-Enumerations: any()
Given valid metadata have been retrieved
And field "Parameter_MultipleValueLookupField" in "Parameter_EndpointResource" has Collection of Enumeration data type
@ -553,7 +553,7 @@ Feature: Web API Server Core Endorsement
And data are present for fields contained within the given select list
And Multiple Valued Enumeration Data in "Parameter_MultipleValueLookupField" has "Parameter_MultipleLookupValue1"
@filter-coll-enum-all @core-endorsement @1.0.2
@filter-coll-enum-all @core-endorsement @2.0.0
Scenario: filter-coll-enum-all - Collections of Multi-Enumerations: all()
Given valid metadata have been retrieved
And field "Parameter_MultipleValueLookupField" in "Parameter_EndpointResource" has Collection of Enumeration data type
@ -572,7 +572,7 @@ Feature: Web API Server Core Endorsement
# RESPONSE CODE TESTING
##############################################
@response-code-400 @core-endorsement @1.0.2
@response-code-400 @core-endorsement @2.0.0
Scenario: response-code-400 - 400 Bad Request
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "response-code-400"
@ -580,7 +580,7 @@ Feature: Web API Server Core Endorsement
# Disable this check for now until Olingo-1380 is fixed - see: https://issues.apache.org/jira/browse/OLINGO-1380
# And the server has an OData-Version header value of "4.0" or "4.01"
@response-code-404 @core-endorsement @1.0.2
@response-code-404 @core-endorsement @2.0.0
Scenario: response-code-404 - 404 Not Found Request
Given valid metadata have been retrieved
When a GET request is made to the resolved Url in "response-code-404"


@ -59,16 +59,35 @@ public class DataAvailability {
private static final String POSTAL_CODE_FIELD = "PostalCode";
private static final int TOP_COUNT = 100;
private static final int MAX_RETRIES = 3;
private static final int MAX_TIMESTAMP_RETRIES = 3;
private static final String BUILD_DIRECTORY_PATH = "build";
private static final String CERTIFICATION_PATH = BUILD_DIRECTORY_PATH + File.separator + "certification";
private static final String DATA_AVAILABILITY_REPORT_PATH = BUILD_DIRECTORY_PATH + File.separator + "certification" + File.separator + "results";
private static final String SAMPLES_DIRECTORY_TEMPLATE = BUILD_DIRECTORY_PATH + File.separator + "%s";
private static final String PATH_TO_RESOSCRIPT_KEY = "pathToRESOScript";
private static final String USE_STRICT_MODE_ARG = "strict";
private static final String A_B_TESTING_MODE_ARG = "abTesting";
final String REQUEST_URI_TEMPLATE = "?$filter=%s" + " lt %s&$orderby=%s desc&$top=" + TOP_COUNT;
final String COUNT_REQUEST_URI_TEMPLATE = "?$count=true";
//strict mode is enabled by default
private final boolean strictMode =
System.getProperty(USE_STRICT_MODE_ARG) == null || Boolean.parseBoolean(System.getProperty(USE_STRICT_MODE_ARG));
//abTesting mode is disabled by default
private final boolean abTestingMode =
System.getProperty(A_B_TESTING_MODE_ARG) != null && Boolean.parseBoolean(System.getProperty(A_B_TESTING_MODE_ARG));
//TODO: read from params
final String ORIGINATING_SYSTEM_FIELD = "OriginatingSystemName";
final String ORIGINATING_SYSTEM_FIELD_VALUE = EMPTY_STRING;
final boolean USE_ORIGINATING_SYSTEM_QUERY = ORIGINATING_SYSTEM_FIELD.length() > 0 && ORIGINATING_SYSTEM_FIELD_VALUE.length() > 0;
final String ORIGINATING_SYSTEM_QUERY = ORIGINATING_SYSTEM_FIELD + " eq '" + ORIGINATING_SYSTEM_FIELD_VALUE + "'";
final String REQUEST_URI_TEMPLATE = "?$filter="
+ (USE_ORIGINATING_SYSTEM_QUERY ? ORIGINATING_SYSTEM_QUERY + " and " : EMPTY_STRING)
+ "%s" + " lt %s&$orderby=%s desc&$top=" + TOP_COUNT;
final String COUNT_REQUEST_URI_TEMPLATE = "?" + (USE_ORIGINATING_SYSTEM_QUERY ? "$filter=" + ORIGINATING_SYSTEM_QUERY + "&": EMPTY_STRING) + "$count=true";
//TODO: get this from the parameters
private final static boolean DEBUG = false;
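With `ORIGINATING_SYSTEM_FIELD_VALUE` left empty, `USE_ORIGINATING_SYSTEM_QUERY` is false and the templates above reduce to their plain forms. A small sketch (timestamp value illustrative only) shows the resulting request URI and the default handling of the `strict` flag:

```java
public class RequestUriDemo {
    public static void main(String[] args) {
        final int TOP_COUNT = 100;
        // Reduced form of REQUEST_URI_TEMPLATE when no OriginatingSystemName
        // filter is configured.
        final String REQUEST_URI_TEMPLATE =
                "?$filter=%s lt %s&$orderby=%s desc&$top=" + TOP_COUNT;
        System.out.println(String.format(REQUEST_URI_TEMPLATE,
                "ModificationTimestamp", "2021-12-12T00:00:00Z", "ModificationTimestamp"));

        // strict mode defaults to true: enabled unless -Dstrict=false is passed
        System.out.println(System.getProperty("strict") == null
                || Boolean.parseBoolean(System.getProperty("strict")));
    }
}
```

The first line printed is the OData query the sampler issues per page: `?$filter=ModificationTimestamp lt 2021-12-12T00:00:00Z&$orderby=ModificationTimestamp desc&$top=100`.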
@ -100,7 +119,10 @@ public class DataAvailability {
@Inject
public DataAvailability(WebAPITestContainer c) {
container.set(c);
if (container.get() == null) {
container.set(c);
LOG.info("Using strict mode: " + strictMode);
}
}
@Before
@@ -132,6 +154,7 @@ public class DataAvailability {
gsonBuilder.registerTypeAdapter(PayloadSampleReport.class, payloadSampleReport);
Utils.createFile(DATA_AVAILABILITY_REPORT_PATH, reportName, gsonBuilder.create().toJson(payloadSampleReport));
}
/**
@@ -227,6 +250,7 @@ public class DataAvailability {
final AtomicReference<Map<String, String>> encodedSample = new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
final AtomicReference<ODataTransportWrapper> transportWrapper = new AtomicReference<>();
final AtomicReference<ResWrap<EntityCollection>> entityCollectionResWrap = new AtomicReference<>();
final AtomicReference<ResWrap<EntityCollection>> lastEntityCollectionResWrap = new AtomicReference<>();
final AtomicReference<String> timestampField = new AtomicReference<>();
final AtomicBoolean hasRecords = new AtomicBoolean(true);
final AtomicReference<PayloadSample> payloadSample = new AtomicReference<>();
@@ -236,7 +260,7 @@ public class DataAvailability {
boolean hasStandardTimestampField = false;
String requestUri;
int recordsProcessed = 0;
int numRetries = 0;
int numTimestampRetries = 0;
int lastTimestampCandidateIndex = 0;
container.get().getEdm().getSchemas().forEach(edmSchema ->
@@ -297,17 +321,17 @@ public class DataAvailability {
// immediately, but retry a couple of times before we bail
if (recordsProcessed == 0 && transportWrapper.get().getResponseData() == null) {
//only count retries if we're constantly making requests and not getting anything
numRetries += 1;
numTimestampRetries += 1;
} else {
numRetries = 0;
numTimestampRetries = 0;
}
if (numRetries >= MAX_RETRIES) {
if (numTimestampRetries >= MAX_TIMESTAMP_RETRIES) {
if (timestampCandidateFields.size() > 0 && (lastTimestampCandidateIndex < timestampCandidateFields.size())) {
LOG.info("Trying next candidate timestamp field: " + timestampCandidateFields.get(lastTimestampCandidateIndex));
numRetries = 0;
numTimestampRetries = 0;
} else {
LOG.info("Could not fetch records from the " + resourceName + " resource after " + MAX_RETRIES
LOG.info("Could not fetch records from the " + resourceName + " resource after " + MAX_TIMESTAMP_RETRIES
+ " tries from the given URL: " + requestUri);
break;
}
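The renamed counters support a fallback scheme: after `MAX_TIMESTAMP_RETRIES` consecutive empty responses, the runner advances to the next candidate timestamp field and gives up only when the candidates are exhausted. A simplified sketch of that selection step (names are illustrative):

```java
import java.util.List;

public class TimestampFallback {
  // Returns the next candidate timestamp field to try, or null when the
  // candidate list is exhausted and the caller should stop fetching.
  static String nextCandidate(List<String> candidates, int lastIndex) {
    if (lastIndex < candidates.size()) {
      return candidates.get(lastIndex);
    }
    return null;
  }

  public static void main(String[] args) {
    List<String> fields = List.of("ModificationTimestamp", "StatusChangeTimestamp");
    System.out.println(nextCandidate(fields, 0)); // ModificationTimestamp
    System.out.println(nextCandidate(fields, 2)); // null (exhausted)
  }
}
```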
@@ -331,12 +355,22 @@ public class DataAvailability {
try {
payloadSample.get().setResponseSizeBytes(transportWrapper.get().getResponseData().getBytes().length);
lastEntityCollectionResWrap.set(entityCollectionResWrap.get());
entityCollectionResWrap.set(container.get().getCommander().getClient()
.getDeserializer(ContentType.APPLICATION_JSON)
.toEntitySet(new ByteArrayInputStream(transportWrapper.get().getResponseData().getBytes())));
if (entityCollectionResWrap.get().getPayload().getEntities().size() > 0) {
LOG.info("Hashing " + resourceName + " payload values...");
if (lastEntityCollectionResWrap.get() != null && entityCollectionResWrap.get() != null
&& lastEntityCollectionResWrap.get().getPayload().hashCode() == entityCollectionResWrap.get().getPayload().hashCode()) {
//if the payload is the same between pages, we need to skip it and subtract some more time
LOG.info("Found identical pages. Subtracting one day from the time...");
lastFetchedDate.set(lastFetchedDate.get().minus(1, ChronoUnit.DAYS));
break;
} else if (entityCollectionResWrap.get().getPayload().getEntities().size() > 0) {
LOG.debug("Hashing " + resourceName + " payload values...");
entityCollectionResWrap.get().getPayload().getEntities().forEach(entity -> {
encodedSample.set(Collections.synchronizedMap(new LinkedHashMap<>()));
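The new branch above compares the previous and current page payloads by hash code and backs the timestamp window off by one day when they match, so the fetcher cannot loop forever on an identical page. A minimal sketch of that comparison, using plain lists in place of Olingo entity collections:

```java
import java.util.List;

public class PageGuard {
  // Two consecutive pages are considered duplicates when both exist and
  // their payloads produce the same hash code.
  static boolean isDuplicatePage(List<String> lastPage, List<String> currentPage) {
    return lastPage != null && currentPage != null
        && lastPage.hashCode() == currentPage.hashCode();
  }

  public static void main(String[] args) {
    List<String> p1 = List.of("a", "b");
    List<String> p2 = List.of("a", "b");
    List<String> p3 = List.of("c");
    System.out.println(isDuplicatePage(p1, p2));   // true: equal payloads
    System.out.println(isDuplicatePage(p1, p3));   // false
    System.out.println(isDuplicatePage(null, p1)); // false: no previous page yet
  }
}
```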
@@ -378,11 +412,17 @@ public class DataAvailability {
ArrayList<String> values = new ArrayList<>();
if (value == null || value.contentEquals("[]")) {
values.add("null");
if (value == null) {
values.add("NULL_VALUE");
} else if (value.contentEquals("[]")) {
values.add("EMPTY_LIST");
} else {
if (property.isCollection()) {
property.asCollection().forEach(v -> values.add(v.toString()));
if (property.asCollection().size() > 0) {
property.asCollection().forEach(v -> values.add(v.toString()));
} else {
values.add("EMPTY_LIST");
}
} else {
if (value.contains(",")) {
values.addAll(Arrays.asList(value.split(",")));
@@ -422,14 +462,14 @@ public class DataAvailability {
});
payloadSample.get().addSample(encodedSample.get());
});
LOG.info("Values encoded!");
LOG.debug("Values encoded!");
recordsProcessed += entityCollectionResWrap.get().getPayload().getEntities().size();
LOG.info("Records processed: " + recordsProcessed + ". Target record count: " + targetRecordCount + "\n");
payloadSample.get().setResponseTimeMillis(transportWrapper.get().getElapsedTimeMillis());
if (encodedResultsDirectoryName != null) {
if (abTestingMode && encodedResultsDirectoryName != null) {
//serialize results once resource processing has finished
Utils.createFile(String.format(SAMPLES_DIRECTORY_TEMPLATE, encodedResultsDirectoryName),
resourceName + "-" + Utils.getTimestamp() + ".json",
@@ -585,8 +625,17 @@ public class DataAvailability {
LOG.info("No resource payload samples found! Skipping...");
assumeTrue(true);
}
LOG.info("\n\nCreating data availability report!");
createDataAvailabilityReport(resourcePayloadSampleMap.get(), reportFileName, resourceCounts.get(), resourceFieldLookupTallies.get());
try {
LOG.info("\n\nCreating data availability report!");
createDataAvailabilityReport(resourcePayloadSampleMap.get(), reportFileName, resourceCounts.get(), resourceFieldLookupTallies.get());
} catch (Exception ex) {
final String errorMsg = "Data Availability Report could not be created.\n" + ex;
if (strictMode) {
failAndExitWithErrorMessage(errorMsg, scenario);
} else {
LOG.error(errorMsg);
}
}
}
@And("{string} has been created in the build directory")


@@ -65,7 +65,7 @@ public class DataDictionary {
private static final AtomicReference<Map<String, Map<String, Set<String>>>> ignoredItems = new AtomicReference<>(new LinkedHashMap<>());
private static XMLMetadata referenceMetadata = null;
private static boolean areMetadataValid = false;
private static boolean isMetadataValid = false;
//named args
private static final String SHOW_RESPONSES_ARG = "showResponses";
@@ -76,7 +76,10 @@ public class DataDictionary {
//extract any params here
private final boolean showResponses = Boolean.parseBoolean(System.getProperty(SHOW_RESPONSES_ARG));
private final boolean strictMode = Boolean.parseBoolean(System.getProperty(USE_STRICT_MODE_ARG));
//strict mode is enabled by default
private final boolean strictMode = System.getProperty(USE_STRICT_MODE_ARG) == null || Boolean.parseBoolean(System.getProperty(USE_STRICT_MODE_ARG));
private final String pathToMetadata = System.getProperty(PATH_TO_METADATA_ARG);
private final String pathToRESOScript = System.getProperty(PATH_TO_RESOSCRIPT_ARG);
@@ -181,7 +184,7 @@ public class DataDictionary {
//if we have gotten to this point without exceptions, then metadata are valid
container.validateMetadata();
areMetadataValid = container.hasValidMetadata();
isMetadataValid = container.hasValidMetadata();
//create metadata report
Commander.generateMetadataReport(container.getEdm());
@@ -214,7 +217,15 @@ public class DataDictionary {
container.setShouldValidateMetadata(false);
//if we have gotten to this point without exceptions, then metadata are valid
areMetadataValid = container.hasValidMetadata();
isMetadataValid = container.hasValidMetadata();
if (!isMetadataValid) {
failAndExitWithErrorMessage("OData XML Metadata MUST be valid!", scenario);
}
//save metadata locally
Utils.createFile("build" + File.separator + "certification" + File.separator + "results",
"metadata.xml", container.getXMLResponseData());
//create metadata report
Commander.generateMetadataReport(container.getEdm());
@@ -224,7 +235,7 @@ public class DataDictionary {
@When("{string} exists in the {string} metadata")
public void existsInTheMetadata(String fieldName, String resourceName) {
if (strictMode && !areMetadataValid) {
if (strictMode && !isMetadataValid) {
failAndExitWithErrorMessage("Metadata validation failed, but is required to pass when using strict mode!", scenario);
}


@@ -52,7 +52,7 @@ import static org.reso.commander.common.TestUtils.Operators.*;
import static org.reso.models.Request.loadFromRESOScript;
/**
* Contains the glue code for Web API Core 1.0.2 Certification as well as previous Platinum tests,
* Contains the glue code for Web API Core 2.0.0 Certification as well as previous Platinum tests,
* which will be converted to standalone endorsements, where applicable.
*/
public class WebAPIServerCore implements En {
@@ -77,7 +77,7 @@ public class WebAPIServerCore implements En {
//TODO: change this to allow passing of a given set of testing queries
//for now this assumes the requests will always be Web API Core Server test queries, but could be $expand, for instance
private static final String WEB_API_CORE_REFERENCE_REQUESTS = "reference-web-api-core-requests.xml";
//private static final String WEB_API_CORE_REFERENCE_REQUESTS = "reference-web-api-core-requests.xml";
@Before
public void beforeStep(Scenario scenario) {
@@ -90,10 +90,11 @@ public class WebAPIServerCore implements En {
if (!container.get().getIsInitialized()) {
container.get().setSettings(Settings.loadFromRESOScript(new File(System.getProperty(PATH_TO_RESOSCRIPT_KEY))));
//overwrite any requests loaded with the reference queries
container.get().getSettings().setRequests(loadFromRESOScript(new File(Objects.requireNonNull(
getClass().getClassLoader().getResource(WEB_API_CORE_REFERENCE_REQUESTS)).getPath()))
.stream().map(request -> Settings.resolveParameters(request, container.get().getSettings())).collect(Collectors.toList()));
//moved to container initialization
// //overwrite any requests loaded with the reference queries
// container.get().getSettings().setRequests(loadFromRESOScript(new File(Objects.requireNonNull(
// getClass().getClassLoader().getResource(WEB_API_CORE_REFERENCE_REQUESTS)).getPath()))
// .stream().map(request -> Settings.resolveParameters(request, container.get().getSettings())).collect(Collectors.toList()));
container.get().initialize();
}
@@ -311,21 +312,18 @@ public class WebAPIServerCore implements En {
assertNotNull(getDefaultErrorMessage("request was null! \nCheck RESOScript to make sure requestId exists."), container.get().getRequest());
try {
LOG.info("Asserted Response Code: " + assertedResponseCode + ", Server Response Code: " + container.get().getResponseCode());
String errorMessage = "";
if (container.get().getODataClientErrorException() != null) {
if (container.get().getODataClientErrorException().getODataError().getMessage() != null) {
LOG.error(getDefaultErrorMessage("Request failed with the following message:",
container.get().getODataClientErrorException().getODataError().getMessage()));
errorMessage = container.get().getODataClientErrorException().getODataError().getMessage();
} else if (container.get().getODataClientErrorException().getMessage() != null) {
LOG.error(getDefaultErrorMessage("Request failed with the following message:",
container.get().getODataClientErrorException().getMessage()));
errorMessage = container.get().getODataClientErrorException().getMessage();
}
}
if (container.get().getODataServerErrorException() != null) {
LOG.error(getDefaultErrorMessage("Request failed with the following message:",
container.get().getODataServerErrorException().toString()));
} else if (container.get().getODataServerErrorException() != null) {
errorMessage = container.get().getODataServerErrorException().getMessage();
scenario.log(getDefaultErrorMessage("Request failed with the following message:", errorMessage));
if (container.get().getODataServerErrorException().toString().contains(String.valueOf(HttpStatus.SC_INTERNAL_SERVER_ERROR))) {
container.get().setResponseCode(HttpStatus.SC_INTERNAL_SERVER_ERROR);
}
@@ -333,12 +331,18 @@ public class WebAPIServerCore implements En {
//TODO: clean up logic
if (container.get().getResponseCode() != null && assertedResponseCode.intValue() != container.get().getResponseCode().intValue()) {
fail(getAssertResponseCodeErrorMessage(assertedResponseCode, container.get().getResponseCode()));
final String responseCodeErrorMessage = getAssertResponseCodeErrorMessage(assertedResponseCode, container.get().getResponseCode());
if (errorMessage.length() > 0) {
scenario.log(errorMessage);
}
scenario.log(responseCodeErrorMessage);
fail(responseCodeErrorMessage + "\n" + errorMessage);
}
//if we make it through without failing, things are good
assertTrue(container.get().getResponseCode() > 0 && assertedResponseCode > 0);
} catch (Exception ex) {
scenario.log(ex.toString());
fail(getDefaultErrorMessage(ex));
}
});
@@ -616,18 +620,21 @@ public class WebAPIServerCore implements En {
from(container.get().getResponseData()).getList(JSON_VALUE_PATH, ObjectNode.class).forEach(item -> {
fieldValue.set(item.get(fieldName).toString());
String assertMessage = EMPTY_STRING;
if (useCollections) {
if (item.get(fieldName).isArray()) {
result.set(result.get() && TestUtils.testAnyOperator(item, fieldName, assertedValue.get()));
LOG.info("Assert True: " + fieldValue.get() + " contains " + assertedValue.get() + " ==> " + result.get());
assertTrue(result.get());
assertMessage = "Assert True: " + fieldValue.get() + " contains " + assertedValue.get() + " ==> " + result.get();
LOG.info(assertMessage);
assertTrue(assertMessage, result.get());
} else {
fail(getDefaultErrorMessage(fieldName, "MUST contain an array of values but found:", item.get(fieldName).toString()));
}
} else {
result.set(fieldValue.get().contains(assertedValue.get()));
LOG.info("Assert True: " + fieldValue.get() + " has " + assertedValue.get() + " ==> " + result.get());
assertTrue(result.get());
assertMessage = "Assert True: " + fieldValue.get() + " has " + assertedValue.get() + " ==> " + result.get();
LOG.info(assertMessage);
assertTrue(assertMessage, result.get());
}
});
} catch (Exception ex) {


@@ -3,6 +3,7 @@ package org.reso.commander;
import org.apache.commons.cli.*;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.edm.Edm;
import org.apache.olingo.commons.api.format.ContentType;
import org.reso.certification.codegen.*;
import org.reso.models.ClientSettings;
@@ -13,10 +14,12 @@ import org.reso.models.Settings;
import java.io.File;
import java.util.Arrays;
import java.util.Date;
import java.util.Map;
import static org.reso.commander.Commander.*;
import static org.reso.commander.common.ErrorMsg.getDefaultErrorMessage;
import static org.reso.commander.common.Utils.getTimestamp;
import static org.reso.commander.common.XMLMetadataToJSONSchemaSerializer.convertEdmToJsonSchemaDocuments;
/**
* Entry point of the RESO Web API Commander, which is a command line OData client that uses the Java Olingo
@@ -79,7 +82,7 @@ public class App {
//if we're running a batch, initialize variables from the settings file rather than from command line options
Settings settings = null;
LOG.debug("Service Root is:" + commanderBuilder.serviceRoot);
LOG.debug("Service Root is: " + commanderBuilder.serviceRoot);
//If the RESOScript option was passed, then the correct commander instance should exist at this point
if (cmd.hasOption(APP_OPTIONS.ACTIONS.RUN_RESOSCRIPT)) {
@@ -242,6 +245,14 @@ public class App {
} catch (Exception ex) {
LOG.error(getDefaultErrorMessage(ex));
}
} else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_JSON_SCHEMAS_FROM_XML_METADATA)) {
try {
Edm edm = deserializeEdmFromPath(inputFilename, commander.getClient());
final Map<String, String> jsonSchemaMap = convertEdmToJsonSchemaDocuments(edm);
//jsonSchemaMap.forEach((model, jsonSchema) -> LOG.info("Model is: " + model + "\nSchema is: " + jsonSchema));
} catch (Exception ex) {
LOG.error(getDefaultErrorMessage(ex));
}
} else if (cmd.hasOption(APP_OPTIONS.ACTIONS.GENERATE_QUERIES)) {
APP_OPTIONS.validateAction(cmd, APP_OPTIONS.ACTIONS.GENERATE_QUERIES);
@@ -424,6 +435,8 @@ public class App {
}
} else if (action.matches(ACTIONS.GENERATE_QUERIES)) {
validationResponse = validateOptions(cmd, INPUT_FILE);
} else if (action.matches(ACTIONS.GENERATE_JSON_SCHEMAS_FROM_XML_METADATA)) {
validationResponse = validateOptions(cmd, INPUT_FILE);
}
if (validationResponse != null) {
@@ -522,6 +535,8 @@ public class App {
.desc("Runs commands in RESOScript file given as <inputFile>.").build())
.addOption(Option.builder().argName("t").longOpt(ACTIONS.GENERATE_DD_ACCEPTANCE_TESTS)
.desc("Generates acceptance tests in the current directory.").build())
.addOption(Option.builder().argName("j").longOpt(ACTIONS.GENERATE_JSON_SCHEMAS_FROM_XML_METADATA)
.desc("Generates JSON Schema documents from the given XML metadata.").build())
.addOption(Option.builder().argName("i").longOpt(ACTIONS.GENERATE_RESOURCE_INFO_MODELS)
.desc("Generates Java Models for the Web API Reference Server in the current directory.").build())
.addOption(Option.builder().argName("r").longOpt(ACTIONS.GENERATE_REFERENCE_EDMX)
@@ -560,6 +575,7 @@ public class App {
public static final String GENERATE_DD_ACCEPTANCE_TESTS = "generateDDAcceptanceTests";
public static final String GENERATE_REFERENCE_EDMX = "generateReferenceEDMX";
public static final String GENERATE_REFERENCE_DDL = "generateReferenceDDL";
public static final String GENERATE_JSON_SCHEMAS_FROM_XML_METADATA = "generateJSONSchemasFromXMLMetadata";
public static final String GENERATE_QUERIES = "generateQueries";
public static final String RUN_RESOSCRIPT = "runRESOScript";
public static final String GET_METADATA = "getMetadata";


@@ -11,182 +11,191 @@ import static org.reso.certification.containers.WebAPITestContainer.EMPTY_STRING
import static org.reso.commander.common.DataDictionaryMetadata.v1_7.WELL_KNOWN_RESOURCE_KEYS.*;
public class DataDictionaryMetadata {
private static final Logger LOG = LogManager.getLogger(DataDictionaryMetadata.class);
private static final Logger LOG = LogManager.getLogger(DataDictionaryMetadata.class);
public static final class v1_7 {
//TODO: clean up
public static final Set<String> WELL_KNOWN_RESOURCES = new LinkedHashSet<>(Arrays.asList(
PROPERTY,
MEMBER,
OFFICE,
CONTACTS,
CONTACT_LISTINGS,
HISTORY_TRANSACTIONAL,
INTERNET_TRACKING,
MEDIA,
OPEN_HOUSE,
OUID,
PROSPECTING,
QUEUE,
RULES,
SAVED_SEARCH,
SHOWING,
TEAMS,
TEAM_MEMBERS,
CONTACT_LISTING_NOTES,
OTHER_PHONE,
PROPERTY_GREEN_VERIFICATION,
PROPERTY_POWER_PRODUCTION,
PROPERTY_ROOMS,
PROPERTY_UNIT_TYPES,
SOCIAL_MEDIA
));
public static final String LOOKUP_FIELDS_AND_VALUES = "Lookup Fields and Values";
public static final class v1_7 {
//TODO: clean up
public static final Set<String> WELL_KNOWN_RESOURCES = new LinkedHashSet<>(Arrays.asList(
PROPERTY,
MEMBER,
OFFICE,
CONTACTS,
CONTACT_LISTINGS,
HISTORY_TRANSACTIONAL,
INTERNET_TRACKING,
MEDIA,
OPEN_HOUSE,
OUID,
PROSPECTING,
QUEUE,
RULES,
SAVED_SEARCH,
SHOWING,
TEAMS,
TEAM_MEMBERS,
CONTACT_LISTING_NOTES,
OTHER_PHONE,
PROPERTY_GREEN_VERIFICATION,
PROPERTY_POWER_PRODUCTION,
PROPERTY_ROOMS,
PROPERTY_UNIT_TYPES,
SOCIAL_MEDIA,
FIELD,
LOOKUP
));
public static final String LOOKUP_FIELDS_AND_VALUES = "Lookup Fields and Values";
//TODO: clean up
public static class WELL_KNOWN_RESOURCE_KEYS {
public static final String
PROPERTY = "Property",
MEMBER = "Member",
OFFICE = "Office",
CONTACTS = "Contacts",
CONTACT_LISTINGS = "ContactListings",
HISTORY_TRANSACTIONAL = "HistoryTransactional",
INTERNET_TRACKING = "InternetTracking",
MEDIA = "Media",
OPEN_HOUSE = "OpenHouse",
OUID = "OUID",
PROSPECTING = "Prospecting",
QUEUE = "Queue",
RULES = "Rules",
SAVED_SEARCH = "SavedSearch",
SHOWING = "Showing",
TEAMS = "Teams",
TEAM_MEMBERS = "TeamMembers",
CONTACT_LISTING_NOTES = "ContactListingNotes",
OTHER_PHONE = "OtherPhone",
PROPERTY_GREEN_VERIFICATION = "PropertyGreenVerification",
PROPERTY_POWER_PRODUCTION = "PropertyPowerProduction",
PROPERTY_ROOMS = "PropertyRooms",
PROPERTY_UNIT_TYPES = "PropertyUnitTypes",
SOCIAL_MEDIA = "SocialMedia";
}
public static Boolean isPrimaryKeyField(String resource, String fieldName) {
return getKeyFieldForResource(resource).contentEquals(fieldName);
}
public static String getKeyFieldForResource(String resourceName) {
switch (resourceName) {
case PROPERTY:
return "ListingKey";
case MEMBER:
return "MemberKey";
case OFFICE:
return "OfficeKey";
case CONTACTS:
case CONTACT_LISTING_NOTES:
return "ContactKey";
case CONTACT_LISTINGS:
return "ContactListingsKey";
case HISTORY_TRANSACTIONAL:
return "HistoryTransactionalKey";
case INTERNET_TRACKING:
return "EventKey";
case MEDIA:
return "MediaKey";
case OPEN_HOUSE:
return "OpenHouseKey";
case OUID:
return "OrganizationUniqueIdKey";
case PROSPECTING:
return "ProspectingKey";
case QUEUE:
return "QueueTransactionKey";
case RULES:
return "RuleKey";
case SAVED_SEARCH:
return "SavedSearchKey";
case SHOWING:
return "ShowingKey";
case TEAMS:
return "TeamKey";
case TEAM_MEMBERS:
return "TeamMemberKey";
case OTHER_PHONE:
return "OtherPhoneKey";
case PROPERTY_GREEN_VERIFICATION:
return "GreenBuildingVerificationKey";
case PROPERTY_POWER_PRODUCTION:
return "PowerProductionKey";
case PROPERTY_ROOMS:
return "RoomKey";
case PROPERTY_UNIT_TYPES:
return "UnitTypeKey";
case SOCIAL_MEDIA:
return "SocialMediaKey";
default:
LOG.error("Cannot find key name for resource: " + resourceName);
return EMPTY_STRING;
}
}
public static Boolean isPrimaryKeyNumericField(String resource, String fieldName) {
return getKeyNumericFieldForResource(resource).contentEquals(fieldName);
}
public static String getKeyNumericFieldForResource(String resourceName) {
switch (resourceName) {
case PROPERTY:
return "ListingKeyNumeric";
case MEMBER:
return "MemberKeyNumeric";
case OFFICE:
return "OfficeKeyNumeric";
case CONTACTS:
case CONTACT_LISTING_NOTES:
return "ContactKeyNumeric";
case CONTACT_LISTINGS:
return "ContactListingsKeyNumeric";
case HISTORY_TRANSACTIONAL:
return "HistoryTransactionalKeyNumeric";
case INTERNET_TRACKING:
return "EventKeyNumeric";
case MEDIA:
return "MediaKeyNumeric";
case OPEN_HOUSE:
return "OpenHouseKeyNumeric";
case OUID:
return "OrganizationUniqueIdKeyNumeric";
case PROSPECTING:
return "ProspectingKeyNumeric";
case QUEUE:
return "QueueTransactionKeyNumeric";
case RULES:
return "RuleKeyNumeric";
case SAVED_SEARCH:
return "SavedSearchKeyNumeric";
case SHOWING:
return "ShowingKeyNumeric";
case TEAMS:
return "TeamKeyNumeric";
case TEAM_MEMBERS:
return "TeamMemberKeyNumeric";
case OTHER_PHONE:
return "OtherPhoneKeyNumeric";
case PROPERTY_GREEN_VERIFICATION:
return "GreenBuildingVerificationKeyNumeric";
case PROPERTY_POWER_PRODUCTION:
return "PowerProductionKeyNumeric";
case PROPERTY_ROOMS:
return "RoomKeyNumeric";
case PROPERTY_UNIT_TYPES:
return "UnitTypeKeyNumeric";
case SOCIAL_MEDIA:
return "SocialMediaKeyNumeric";
default:
LOG.error("Cannot find key name for resource: " + resourceName);
return EMPTY_STRING;
}
}
//TODO: clean up
public static class WELL_KNOWN_RESOURCE_KEYS {
public static final String
PROPERTY = "Property",
MEMBER = "Member",
OFFICE = "Office",
CONTACTS = "Contacts",
CONTACT_LISTINGS = "ContactListings",
HISTORY_TRANSACTIONAL = "HistoryTransactional",
INTERNET_TRACKING = "InternetTracking",
MEDIA = "Media",
OPEN_HOUSE = "OpenHouse",
OUID = "OUID",
PROSPECTING = "Prospecting",
QUEUE = "Queue",
RULES = "Rules",
SAVED_SEARCH = "SavedSearch",
SHOWING = "Showing",
TEAMS = "Teams",
TEAM_MEMBERS = "TeamMembers",
CONTACT_LISTING_NOTES = "ContactListingNotes",
OTHER_PHONE = "OtherPhone",
PROPERTY_GREEN_VERIFICATION = "PropertyGreenVerification",
PROPERTY_POWER_PRODUCTION = "PropertyPowerProduction",
PROPERTY_ROOMS = "PropertyRooms",
PROPERTY_UNIT_TYPES = "PropertyUnitTypes",
SOCIAL_MEDIA = "SocialMedia",
FIELD = "Field",
LOOKUP = "Lookup";
}
public static Boolean isPrimaryKeyField(String resource, String fieldName) {
return getKeyFieldForResource(resource).contentEquals(fieldName);
}
public static String getKeyFieldForResource(String resourceName) {
switch (resourceName) {
case PROPERTY:
return "ListingKey";
case MEMBER:
return "MemberKey";
case OFFICE:
return "OfficeKey";
case CONTACTS:
case CONTACT_LISTING_NOTES:
return "ContactKey";
case CONTACT_LISTINGS:
return "ContactListingsKey";
case HISTORY_TRANSACTIONAL:
return "HistoryTransactionalKey";
case INTERNET_TRACKING:
return "EventKey";
case MEDIA:
return "MediaKey";
case OPEN_HOUSE:
return "OpenHouseKey";
case OUID:
return "OrganizationUniqueIdKey";
case PROSPECTING:
return "ProspectingKey";
case QUEUE:
return "QueueTransactionKey";
case RULES:
return "RuleKey";
case SAVED_SEARCH:
return "SavedSearchKey";
case SHOWING:
return "ShowingKey";
case TEAMS:
return "TeamKey";
case TEAM_MEMBERS:
return "TeamMemberKey";
case OTHER_PHONE:
return "OtherPhoneKey";
case PROPERTY_GREEN_VERIFICATION:
return "GreenBuildingVerificationKey";
case PROPERTY_POWER_PRODUCTION:
return "PowerProductionKey";
case PROPERTY_ROOMS:
return "RoomKey";
case PROPERTY_UNIT_TYPES:
return "UnitTypeKey";
case SOCIAL_MEDIA:
return "SocialMediaKey";
case FIELD:
return "FieldKey";
case LOOKUP:
return "LookupKey";
default:
LOG.error("Cannot find key name for resource: " + resourceName);
return EMPTY_STRING;
}
}
public static Boolean isPrimaryKeyNumericField(String resource, String fieldName) {
return getKeyNumericFieldForResource(resource).contentEquals(fieldName);
}
public static String getKeyNumericFieldForResource(String resourceName) {
switch (resourceName) {
case PROPERTY:
return "ListingKeyNumeric";
case MEMBER:
return "MemberKeyNumeric";
case OFFICE:
return "OfficeKeyNumeric";
case CONTACTS:
case CONTACT_LISTING_NOTES:
return "ContactKeyNumeric";
case CONTACT_LISTINGS:
return "ContactListingsKeyNumeric";
case HISTORY_TRANSACTIONAL:
return "HistoryTransactionalKeyNumeric";
case INTERNET_TRACKING:
return "EventKeyNumeric";
case MEDIA:
return "MediaKeyNumeric";
case OPEN_HOUSE:
return "OpenHouseKeyNumeric";
case OUID:
return "OrganizationUniqueIdKeyNumeric";
case PROSPECTING:
return "ProspectingKeyNumeric";
case QUEUE:
return "QueueTransactionKeyNumeric";
case RULES:
return "RuleKeyNumeric";
case SAVED_SEARCH:
return "SavedSearchKeyNumeric";
case SHOWING:
return "ShowingKeyNumeric";
case TEAMS:
return "TeamKeyNumeric";
case TEAM_MEMBERS:
return "TeamMemberKeyNumeric";
case OTHER_PHONE:
return "OtherPhoneKeyNumeric";
case PROPERTY_GREEN_VERIFICATION:
return "GreenBuildingVerificationKeyNumeric";
case PROPERTY_POWER_PRODUCTION:
return "PowerProductionKeyNumeric";
case PROPERTY_ROOMS:
return "RoomKeyNumeric";
case PROPERTY_UNIT_TYPES:
return "UnitTypeKeyNumeric";
case SOCIAL_MEDIA:
return "SocialMediaKeyNumeric";
default:
LOG.error("Cannot find key name for resource: " + resourceName);
return EMPTY_STRING;
}
}
}
}
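The switch-based mapping above now covers the new Field and Lookup resources. A condensed sketch of the lookup pattern and its empty-string fallback (only a few cases shown; the full class maps every well-known resource):

```java
public class KeyFieldLookup {
  // Minimal version of getKeyFieldForResource: map a resource name to its
  // primary key field, falling back to the empty string when unknown.
  static String keyFieldFor(String resourceName) {
    switch (resourceName) {
      case "Property": return "ListingKey";
      case "Field":    return "FieldKey";
      case "Lookup":   return "LookupKey";
      default:         return ""; // unknown resource: caller logs an error
    }
  }

  public static void main(String[] args) {
    System.out.println(keyFieldFor("Field"));   // FieldKey
    System.out.println(keyFieldFor("Lookup"));  // LookupKey
    System.out.println(keyFieldFor("Unknown")); // (empty string)
  }
}
```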


@@ -0,0 +1,144 @@
package org.reso.commander.common;
import com.google.gson.JsonElement;
import com.google.gson.JsonSerializationContext;
import com.google.gson.JsonSerializer;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.edm.Edm;
import org.apache.olingo.commons.api.edm.EdmElement;
import org.apache.olingo.commons.api.edm.EdmPrimitiveTypeKind;
import java.lang.reflect.Type;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
public class XMLMetadataToJSONSchemaSerializer implements JsonSerializer<XMLMetadataToJSONSchemaSerializer> {
private static final Logger LOG = LogManager.getLogger(XMLMetadataToJSONSchemaSerializer.class);
private XMLMetadataToJSONSchemaSerializer() {
//should not use default constructor
}
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
* specified type.
*
* <p>In the implementation of this call-back method, you should consider invoking
* {@link JsonSerializationContext#serialize(Object, Type)} method to create JsonElements for any
* non-trivial field of the {@code src} object. However, you should never invoke it on the
* {@code src} object itself since that will cause an infinite loop (Gson will call your
* call-back method again).</p>
*
* @param src the object that needs to be converted to Json.
* @param typeOfSrc the actual type (fully genericized version) of the source object.
* @param context the serialization context.
* @return a JsonElement corresponding to the specified object.
*/
@Override
public JsonElement serialize(XMLMetadataToJSONSchemaSerializer src, Type typeOfSrc, JsonSerializationContext context) {
return null;
}
/**
* Converts an OData Entity Data Model into a collection of JSON Schema (draft 2020-12) documents
*
* @param edm the Entity Data Model to convert.
* @return HashMap containing a collection of resource name, JSON Schema pairs
*/
public static Map<String, String> convertEdmToJsonSchemaDocuments(Edm edm) {
final Map<String, String> jsonSchemas = Collections.synchronizedMap(new LinkedHashMap<>());
final String
JSON_SCHEMA_RESOURCE_VALUE_WRAPPER =
"{\n" +
"  \"$id\": \"https://reso.org/data-dictionary/schemas/1.7/%s\",\n" + /* resource name */
"  \"$schema\": \"https://json-schema.org/draft/2020-12/schema\",\n" +
"  \"type\": \"object\",\n" +
"  \"required\": [ \"value\", \"@odata.context\" ],\n" +
"  \"properties\" : {\n" +
"    \"@odata.context\" : { \"type\": \"string\" },\n" +
"    \"value\": { \"type\": \"array\",\n" +
"      \"items\": { \"$ref\": \"#/$defs/%s\" }\n" + /* resource name */
"    }\n" +
"  },\n", /* root object intentionally left open; JSON_SCHEMA_TEMPLATE_DEFS closes it */
JSON_SCHEMA_TEMPLATE_DEFS =
"  \"$defs\": {\n" +
"    \"%s\": {\n" + /* resource name, string */
"      \"type\": \"object\",\n" +
"      \"required\" : [ %s ],\n" + /* key fields, string list with quotes */
"      \"properties\" : {\n" +
"        %s\n" + /* comma-separated JSON Schema type definition fragments */
"      }\n" +
"    }\n" +
"  }\n" +
"}\n";
edm.getSchemas().forEach(edmSchema -> {
StringBuilder schemaDocument = new StringBuilder();
//serialize entities (resources) and members (fields)
edmSchema.getEntityTypes().forEach(edmEntityType -> {
edmEntityType.getPropertyNames().forEach(propertyName -> {
final String jsonSchemaFragment = getJsonSchemaType(edmEntityType.getProperty(propertyName));
if (jsonSchemaFragment != null) {
schemaDocument
.append(schemaDocument.length() > 0 ? ",\n" : "")
.append(" \"")
.append(propertyName)
.append("\": ")
.append(jsonSchemaFragment);
}
});
final String schemaString = String.format(JSON_SCHEMA_RESOURCE_VALUE_WRAPPER, edmEntityType.getName(), edmEntityType.getName())
+ String.format(JSON_SCHEMA_TEMPLATE_DEFS, edmEntityType.getName(),
"\"" + String.join("\", \"", edmEntityType.getKeyPredicateNames()) + "\"",
schemaDocument.toString());
jsonSchemas.put(edmEntityType.getName(), schemaString);
});
// //serialize enum types
// edmSchema.getEnumTypes().forEach(edmEnumType -> {
// edmEnumType.getMemberNames().forEach(memberName -> {
//
// });
// });
});
return jsonSchemas;
}
private static String getJsonSchemaType(EdmElement element) {
final String fullyQualifiedName = element.getType().getFullQualifiedName().getFullQualifiedNameAsString();
final String
EDM_STRING = EdmPrimitiveTypeKind.String.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_BINARY = EdmPrimitiveTypeKind.Binary.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_SBYTE = EdmPrimitiveTypeKind.SByte.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_DATE_TIME_OFFSET = EdmPrimitiveTypeKind.DateTimeOffset.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_DATE = EdmPrimitiveTypeKind.Date.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_DECIMAL = EdmPrimitiveTypeKind.Decimal.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_INT_64 = EdmPrimitiveTypeKind.Int64.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_INT_32 = EdmPrimitiveTypeKind.Int32.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_INT_16 = EdmPrimitiveTypeKind.Int16.getFullQualifiedName().getFullQualifiedNameAsString(),
EDM_BOOLEAN = EdmPrimitiveTypeKind.Boolean.getFullQualifiedName().getFullQualifiedNameAsString();
if (fullyQualifiedName.contentEquals(EDM_STRING)
|| fullyQualifiedName.contentEquals(EDM_SBYTE)
|| fullyQualifiedName.contentEquals(EDM_BINARY)) {
return "{ \"type\" : \"string\" }";
} else if (fullyQualifiedName.contentEquals(EDM_DATE_TIME_OFFSET)) {
return "{ \"type\": \"string\", \"format\": \"date-time\" }";
} else if (fullyQualifiedName.contentEquals(EDM_DATE)) {
return "{ \"type\": \"string\", \"format\": \"date\" }";
} else if (fullyQualifiedName.contentEquals(EDM_DECIMAL)
|| fullyQualifiedName.contentEquals(EDM_INT_64)
|| fullyQualifiedName.contentEquals(EDM_INT_32)
|| fullyQualifiedName.contentEquals(EDM_INT_16)) {
return "{ \"type\": \"number\" }";
} else if (fullyQualifiedName.contentEquals(EDM_BOOLEAN)) {
return "{ \"type\": \"boolean\" }";
} else {
LOG.error("Unsupported type mapping! Type: " + fullyQualifiedName);
return null;
}
}
}
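The EDM-to-JSON-Schema mapping above can be exercised without Olingo. This simplified sketch is an assumption-laden stand-in for getJsonSchemaType: the class name and plain type-name keys are hypothetical, and the schema fragments mirror the branches in the method above.

```java
import java.util.Map;

// Simplified stand-in for getJsonSchemaType: maps fully qualified EDM
// primitive type names to JSON Schema fragments, mirroring the branches above.
public class EdmToJsonSchema {
    private static final Map<String, String> TYPE_MAP = Map.of(
        "Edm.String", "{ \"type\": \"string\" }",
        "Edm.SByte", "{ \"type\": \"string\" }",
        "Edm.Binary", "{ \"type\": \"string\" }",
        "Edm.DateTimeOffset", "{ \"type\": \"string\", \"format\": \"date-time\" }",
        "Edm.Date", "{ \"type\": \"string\", \"format\": \"date\" }",
        "Edm.Decimal", "{ \"type\": \"number\" }",
        "Edm.Int64", "{ \"type\": \"number\" }",
        "Edm.Int32", "{ \"type\": \"number\" }",
        "Edm.Int16", "{ \"type\": \"number\" }",
        "Edm.Boolean", "{ \"type\": \"boolean\" }"
    );

    // Returns null for unmapped types, as the serializer does.
    public static String toJsonSchemaType(String fullyQualifiedName) {
        return TYPE_MAP.get(fullyQualifiedName);
    }
}
```

Note that, as in the serializer, integer EDM types collapse to JSON Schema "number" rather than "integer".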

View File

@ -5,12 +5,12 @@ import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.olingo.commons.api.edm.Edm;
import org.apache.olingo.commons.api.edm.EdmElement;
import org.apache.regexp.RE;
import org.reso.commander.common.Utils;
import java.lang.reflect.Type;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.*;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
@ -42,12 +42,12 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
/**
* FieldsJson uses a JSON payload with the following structure:
*
* <p>
* {
* "resourceName": "Property",
* "fieldName": "AboveGradeFinishedArea",
* "availability": 0.1
* }
*/
private static final class FieldsJson implements JsonSerializer<FieldsJson> {
static final String
@ -95,11 +95,11 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
}
/**
* resourceName: "Property",
* fieldName: "StateOrProvince",
* lookupName: "StateOrProvince",
* lookupValue: "CA",
* availability: 0.03
*/
private static final class LookupValuesJson implements JsonSerializer<LookupValuesJson> {
final String resourceName, fieldName, lookupValue;
@ -147,12 +147,15 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
}
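The availability figures shown in the FieldsJson and LookupValuesJson examples (0.1, 0.03) are ratios of non-null tallies to total sample counts. A minimal sketch of that calculation follows; the class and method names are hypothetical, not Commander API.

```java
// Availability is the share of samples where a field (or lookup value)
// was present and non-null: nonNullCount / totalSamples.
public class Availability {
    public static double of(int nonNullCount, int totalSamples) {
        // Guard against division by zero when a resource had no samples.
        return totalSamples == 0 ? 0.0 : (double) nonNullCount / totalSamples;
    }
}
```

A field present in 1 of 10 sampled records reports 0.1, matching the example payload above.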
private static Map<String, Map<String, Integer>> createResourceFieldTallies(Map<String, List<PayloadSample>> resourcePayloadSamplesMap) {
AtomicReference<Map<String, Map<String, Integer>>> resourceTallies = new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));
AtomicInteger numSamples = new AtomicInteger(0);
resourcePayloadSamplesMap.keySet().forEach(resourceName -> {
LOG.info("Processing resource: " + resourceName);
//if there are samples for the given resource, sum the tallies, otherwise 0.
numSamples.set(resourcePayloadSamplesMap.get(resourceName) != null
? resourcePayloadSamplesMap.get(resourceName).stream().reduce(0, (a, f) -> a + f.getSamples().size(), Integer::sum) : 0);
LOG.info("Sample size: " + numSamples.get());
//for each resource, go through the keys and tally the data presence counts for each field
@ -163,11 +166,11 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
.forEach(payloadSample -> payloadSample.getSamples()
.forEach(sample -> sample
.forEach((fieldName, encodedValue) -> {
if (encodedValue != null) {
resourceTallies.get().get(resourceName).putIfAbsent(fieldName, 0);
resourceTallies.get().get(resourceName).put(fieldName, resourceTallies.get().get(resourceName).get(fieldName) + 1);
}
})));
}
});
return resourceTallies.get();
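The tally pass above boils down to counting non-null values per field across samples. A self-contained sketch for a single resource follows; the types are simplified from the Commander's PayloadSample and the names are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Counts, per field, how many samples carried a non-null encoded value:
// the same tally createResourceFieldTallies builds for each resource.
public class FieldTally {
    public static Map<String, Integer> tally(List<Map<String, String>> samples) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        samples.forEach(sample -> sample.forEach((fieldName, encodedValue) -> {
            if (encodedValue != null) {
                counts.merge(fieldName, 1, Integer::sum);
            }
        }));
        return counts;
    }
}
```

Dividing each count by the total sample size then yields the per-field availability reported above.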
@ -214,8 +217,8 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
//serialize lookup values
JsonArray lookupValues = new JsonArray();
lookupValueFrequencyMap.get().forEach((lookupValue, frequency) -> {
LookupValuesJson lookupValuesJson = new LookupValuesJson(lookupValue.getResourceName(), lookupValue.getFieldName(), lookupValue.getLookupValue(), frequency);
lookupValues.add(lookupValuesJson.serialize(lookupValuesJson, LookupValuesJson.class, null));
});
JsonObject availabilityReport = new JsonObject();
@ -247,34 +250,41 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
AtomicReference<OffsetDateTime> offsetDateTime = new AtomicReference<>();
resourcePayloadSamplesMap.get().get(resourceName).forEach(payloadSample -> {
resourcesJson.totalBytesReceived.getAndAdd(payloadSample.getResponseSizeBytes());
resourcesJson.totalResponseTimeMillis.getAndAdd(payloadSample.getResponseTimeMillis());
resourcesJson.numSamplesProcessed.getAndIncrement();
resourcesJson.numRecordsFetched.getAndAdd(payloadSample.encodedSamples.size());
payloadSample.encodedSamples.forEach(encodedSample -> {
try {
offsetDateTime.set(OffsetDateTime.parse(encodedSample.get(payloadSample.dateField)));
if (offsetDateTime.get() != null) {
if (resourcesJson.dateLow.get() == null) {
resourcesJson.dateLow.set(offsetDateTime.get());
} else if (offsetDateTime.get().isBefore(resourcesJson.dateLow.get())) {
resourcesJson.dateLow.set(offsetDateTime.get());
}
if (resourcesJson.dateHigh.get() == null) {
resourcesJson.dateHigh.set(offsetDateTime.get());
} else if (offsetDateTime.get().isAfter(resourcesJson.dateHigh.get())) {
resourcesJson.dateHigh.set(offsetDateTime.get());
}
}
if (encodedSample.containsKey(POSTAL_CODE_KEY)) {
postalCodes.add(encodedSample.get(POSTAL_CODE_KEY));
}
} catch (DateTimeParseException dateTimeParseException) {
LOG.error("Could not parse date for field " + payloadSample.dateField + ", with value: "
+ encodedSample.get(payloadSample.dateField) + ". Expected ISO 8601 timestamp format!"
);
throw dateTimeParseException;
}
});
if (resourcesJson.pageSize.get() == 0) resourcesJson.pageSize.set(payloadSample.getSamples().size());
});
}
if (postalCodes.size() > 0) {
resourcesJson.postalCodes.set(postalCodes);
@ -302,9 +312,9 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
}
final String
RESOURCE_NAME_KEY = "resourceName",
FIELD_NAME_KEY = "fieldName",
NUM_LOOKUPS_TOTAL = "numLookupsTotal";
/**
* Gson invokes this call-back method during serialization when it encounters a field of the
@ -404,7 +414,7 @@ public class PayloadSampleReport implements JsonSerializer<PayloadSampleReport>
? src.dateLow.get().format(DateTimeFormatter.ISO_INSTANT) : null);
totals.addProperty(DATE_HIGH_KEY, src.dateHigh.get() != null
? src.dateHigh.get().format(DateTimeFormatter.ISO_INSTANT) : null);
JsonArray keyFields = new JsonArray();
src.keyFields.get().forEach(keyFields::add);
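The dateLow/dateHigh bookkeeping above reduces to a running min/max over parsed timestamps, failing fast on values that are not ISO 8601. A compact sketch under those assumptions follows; the class name is hypothetical and the timestamps in the usage are illustrative.

```java
import java.time.OffsetDateTime;
import java.time.format.DateTimeParseException;

// Tracks the earliest and latest timestamps seen, as the report's
// dateLow/dateHigh fields do; malformed values fail fast like the code above.
public class DateRange {
    OffsetDateTime low, high;

    public void accept(String value) {
        try {
            OffsetDateTime parsed = OffsetDateTime.parse(value);
            if (low == null || parsed.isBefore(low)) low = parsed;
            if (high == null || parsed.isAfter(high)) high = parsed;
        } catch (DateTimeParseException dateTimeParseException) {
            // Surface bad data rather than silently skipping the sample.
            System.err.println("Could not parse timestamp: " + value);
            throw dateTimeParseException;
        }
    }
}
```

OffsetDateTime comparisons are instant-based, so samples with mixed UTC offsets still order correctly before formatting with DateTimeFormatter.ISO_INSTANT.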

View File

@ -1,7 +1,7 @@
{
"description": "RESO Data Dictionary Metadata Report",
"version": "1.7",
"generatedOn": "2021-12-13T01:44:51.102Z",
"fields": [
{
"resourceName": "Property",
@ -29170,6 +29170,204 @@
"value": "The website URL or ID of social media site or account of the member. This is a repeating element. Replace [Type] with any of the options from the SocialMediaType field to create a unique field for that type of social media. For example: SocialMediaFacebookUrlOrID, SocialMediaSkypeUrlOrID, etc."
}
]
},
{
"resourceName": "Field",
"fieldName": "FieldKey",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Field Key"
},
{
"term": "Core.Description",
"value": "The key used to uniquely identify the Field."
}
]
},
{
"resourceName": "Field",
"fieldName": "ResourceName",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Resource Name"
},
{
"term": "Core.Description",
"value": "The name of the resource the field belongs to. This will be a RESO Standard Name, when applicable, but may also be a local resource name, for example \"Property.\""
}
]
},
{
"resourceName": "Field",
"fieldName": "FieldName",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Field Name"
},
{
"term": "Core.Description",
"value": "The name of the field as expressed in the payload. For OData APIs, this field MUST meet certain naming requirements and should be consistent with what\u0027s advertised in the OData XML metadata (to be verified in certification). For example, \"ListPrice.\""
}
]
},
{
"resourceName": "Field",
"fieldName": "DisplayName",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Display Name"
},
{
"term": "Core.Description",
"value": "The display name for the field. SHOULD be provided in all cases where the use of display names is needed, even if the display name is the same as the underlying field name. The DisplayName MAY be a RESO Standard Display Name or a local one."
}
]
},
{
"resourceName": "Field",
"fieldName": "ModificationTimestamp",
"type": "Edm.DateTimeOffset",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Modification Timestamp"
},
{
"term": "Core.Description",
"value": "The timestamp when the field metadata item was last modified. This is used to help rebuild caches when metadata items change so consumers don\u0027t have to re-pull and reprocess the entire set of metadata when only a small number of changes have been made."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "LookupKey",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Lookup Key"
},
{
"term": "Core.Description",
"value": "The key used to uniquely identify the Lookup entry."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "LookupName",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Lookup Name"
},
{
"term": "Core.Description",
"value": "It is called a \"LookupName\" in this proposal because more than one field can have a given lookup, so it refers to the name of the lookup rather than a given field. For example, Listing with CountyOrParish and Office with OfficeCountyOrParish having the same CountyOrParish LookupName. This MUST match the Data Dictionary definition in cases where the lookup is defined. Vendors MAY add their own enumerations otherwise. The LookupName a given field uses is required to be annotated at the field level in the OData XML Metadata, as outlined later in this proposal."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "LookupValue",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Lookup Value"
},
{
"term": "Core.Description",
"value": "The human-friendly display name the data consumer receives in the payload and uses in queries. This MAY be a local name or synonym for a given RESO Data Dictionary lookup item."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "StandardLookupValue",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Standard Lookup Value"
},
{
"term": "Core.Description",
"value": "The Data Dictionary LookupDisplayName of the enumerated value. This field is required when the LookupValue for a given item corresponds to a RESO standard value, meaning a standard lookup display name, known synonym, local name, or translation of that value. Local lookups MAY omit this value if they don\u0027t correspond to an existing RESO standard lookup value."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "LegacyODataValue",
"type": "Edm.String",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Legacy OData Value"
},
{
"term": "Core.Description",
"value": "The Legacy OData lookup value that the server vendor provided in their OData XML Metadata. This value is optional, and has been included in order to provide a stable mechanism for translating OData lookup values to RESO standard lookup display names, as well as for historical data that might have included the OData value at some point, even after the vendor had converted to human friendly display names."
}
]
},
{
"resourceName": "Lookup",
"fieldName": "ModificationTimestamp",
"type": "Edm.DateTimeOffset",
"nullable": true,
"isCollection": false,
"unicode": true,
"annotations": [
{
"term": "RESO.OData.Metadata.StandardName",
"value": "Modification Timestamp"
},
{
"term": "Core.Description",
"value": "The timestamp for when the enumeration value was last modified."
}
]
}
],
"lookups": [

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

View File

@ -38,7 +38,7 @@
<Request
RequestId="metadata-request"
OutputFile="metadata-request.xml"
Url="*ClientSettings_WebAPIURI*/$metadata?$format=application/xml"
/>
<Request