# E2E tests

There are four different sets of e2e tests in this folder. They are all testing different translation scenarios, but they are all built with Ivy enabled.
## runtime

A new `polyfills.ts` file is provided (`polyfills-runtime.ts`) which is swapped in by a file
replacement in the `angular.json` configuration. In this new file (sketched below):

- Runtime translations are provided (`loadTranslations()`).
- The current locale is set (`$localize.locale = 'fr'`) and loaded (`registerLocaleData(localeFr);`).
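The actual file contents are not reproduced in this README, so the following is only a minimal sketch, assuming the public `@angular/localize` and `@angular/common` APIs, of how a `polyfills-runtime.ts` along these lines is typically wired together. The message id and translation string are placeholders, not the ones used by this app:

```ts
// polyfills-runtime.ts (illustrative sketch, not the actual file in this repo)
import '@angular/localize/init';
import {loadTranslations} from '@angular/localize';
import {registerLocaleData} from '@angular/common';
import localeFr from '@angular/common/locales/fr';

// 1. Provide the runtime translations before the app code runs.
//    (Placeholder message id and translation.)
loadTranslations({
  '8885497375401000416': 'Bonjour, {$INTERPOLATION}!',
});

// 2. Set the current locale...
$localize.locale = 'fr';

// 3. ...and load its locale data so pipes such as DatePipe format correctly.
registerLocaleData(localeFr);
```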
## de and fr

The application is built (into the `dist` folder) and then two sets of translations
(`src/locales/messages.(de|fr).json`) are used to generate two copies of the app, which have
been translated (compile-time inlined).

These translated apps are stored in `tmp/translations/(de|fr)`.
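The spec files themselves are not shown in this README. Purely as an illustration of the kind of assertion these e2e suites make against a translated copy, a hypothetical Protractor spec might look like the sketch below; the selector and the expected French string are invented for the example:

```ts
import {browser, by, element} from 'protractor';

// Hypothetical spec: runs against the statically served copy in tmp/translations/fr.
describe('compile-time translated app (fr)', () => {
  beforeEach(() => browser.get('/'));

  it('renders the inlined French translation', async () => {
    // 'Bonjour' is a placeholder; the real expectation would come from
    // the translations in src/locales/messages.fr.json.
    expect(await element(by.css('h1')).getText()).toContain('Bonjour');
  });
});
```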
## legacy

The legacy `ng xi18n` tool extracts the messages from the Angular templates into the XLIFF 1.2
format with legacy message ids (`tmp/legacy-locales/messages.legacy.xlf`).

The translation file is modified to apply a simple translation.

The app must be compiled with the `i18nLegacyMessageIdFormat` option set to ensure that the correct
message ids are used to match those in the translation files.

The app is translated using the compile-time inlining tool to generate a copy that has the translated message in it.
## Hosting

Since the CLI hosts from an in-memory file-system, the compile-time inliner is not able to
translate the output files. So the `de`, `fr` and `legacy` apps must be statically built to
disk and translated there.

Since the translated app is now on disk, we cannot use the CLI to serve it. Instead, we use a simple static HTTP server, as sketched below.
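Any simple static server will do, and the scripts in this folder decide which one is actually used. The following is only a hypothetical, minimal Node/TypeScript illustration of serving one translated copy; the directory and port are arbitrary assumptions:

```ts
// serve-static.ts — hypothetical minimal static server for one translated build.
import * as http from 'http';
import {promises as fs} from 'fs';
import * as path from 'path';

const root = path.resolve('tmp/translations/fr'); // assumed output directory
const port = 4200;                                // assumed port

const mimeTypes: Record<string, string> = {
  '.html': 'text/html',
  '.js': 'text/javascript',
  '.css': 'text/css',
};

http.createServer(async (req, res) => {
  const urlPath = (req.url ?? '/').split('?')[0];
  const file = path.join(root, urlPath === '/' ? 'index.html' : urlPath);
  try {
    const contents = await fs.readFile(file);
    res.writeHead(200, {'Content-Type': mimeTypes[path.extname(file)] ?? 'application/octet-stream'});
    res.end(contents);
  } catch {
    // Fall back to index.html so the app's client-side routes still resolve.
    res.writeHead(200, {'Content-Type': 'text/html'});
    res.end(await fs.readFile(path.join(root, 'index.html')));
  }
}).listen(port, () => console.log(`Serving ${root} at http://localhost:${port}`));
```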