### Build

```
yarn bazel build //packages/core/test/render3/perf:${BENCHMARK}_lib.min_debug.es2015.js --config=ivy
```
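In the commands throughout this README, `${BENCHMARK}` is a shell variable naming one of the benchmark directories in this folder. A minimal sketch (the variable name is whatever you choose to export; only the target layout is fixed):

```shell
# Pick a benchmark by directory name; ${BENCHMARK} expands into the
# bazel target and output paths used by the build/run commands below.
export BENCHMARK=noop_change_detection
echo "//packages/core/test/render3/perf:${BENCHMARK}_lib.min_debug.es2015.js"
```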

### Run

```
node dist/bin/packages/core/test/render3/perf/${BENCHMARK}_lib.min_debug.es2015.js
```

### Profile

```
node --no-turbo-inlining --inspect-brk dist/bin/packages/core/test/render3/perf/${BENCHMARK}_lib.min_debug.es2015.js
```

Then connect with a debugger (the `--inspect-brk` option ensures that benchmark execution doesn't start until a debugger is connected and code execution is manually resumed).

The actual benchmark code contains calls that start (`console.profile`) and stop (`console.profileEnd`) a profiling session.

### Deoptigate

```
yarn add deoptigate
yarn deoptigate dist/bin/packages/core/test/render3/perf/${BENCHMARK}_lib.min_debug.es2015.js
```

### Run All

To run all of the benchmarks, use the `profile_all.js` script:

```
node packages/core/test/render3/perf/profile_all.js
```

NOTE: This command builds all of the benchmarks, so there is no need to do so manually.

Optionally, use the `--write` option to save the run results to a file for later comparison:

```
node packages/core/test/render3/perf/profile_all.js --write baseline.json
```

### Comparing Runs

If you have saved a baseline (as described in the step above), you can use it to see the change in performance like so:

```
node packages/core/test/render3/perf/profile_all.js --read baseline.json
```

The resulting output should look something like this:

```
┌────────────────────────────────────┬─────────┬──────┬───────────┬───────────┬───────┐
│              (index)               │  time   │ unit │ base_time │ base_unit │   %   │
├────────────────────────────────────┼─────────┼──────┼───────────┼───────────┼───────┤
│       directive_instantiate        │ 276.652 │ 'ms' │  286.292  │   'ms'    │ -3.37 │
│        element_text_create         │ 262.868 │ 'ms' │  260.031  │   'ms'    │ 1.09  │
│           interpolation            │ 257.733 │ 'us' │  260.489  │   'us'    │ -1.06 │
│             listeners              │  1.997  │ 'us' │   1.985   │   'us'    │  0.6  │
│ map_based_style_and_class_bindings │  10.07  │ 'ms' │   9.786   │   'ms'    │  2.9  │
│       noop_change_detection        │ 93.256  │ 'us' │  91.745   │   'us'    │ 1.65  │
│          property_binding          │ 290.777 │ 'us' │  280.586  │   'us'    │ 3.63  │
│      property_binding_update       │ 588.545 │ 'us' │  583.334  │   'us'    │ 0.89  │
│      style_and_class_bindings      │  1.061  │ 'ms' │   1.047   │   'ms'    │ 1.34  │
│           style_binding            │ 543.841 │ 'us' │  545.385  │   'us'    │ -0.28 │
└────────────────────────────────────┴─────────┴──────┴───────────┴───────────┴───────┘
```
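The `%` column is the relative change of `time` against `base_time` (negative means faster than the baseline). A sketch of the arithmetic, using the `directive_instantiate` row above (the script's actual rounding may differ):

```javascript
// Relative change of a run against its baseline, rounded to two
// decimal places, matching the % column in the comparison table.
function percentChange(time, baseTime) {
  return Math.round(((time - baseTime) / baseTime) * 10000) / 100;
}

// directive_instantiate: 276.652 ms against a 286.292 ms baseline
console.log(percentChange(276.652, 286.292));  // -3.37
```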

### Notes

To run a benchmark, use `bazel run <benchmark_target>`, for example:

- `yarn bazel run --config=ivy //packages/core/test/render3/perf:noop_change_detection`

To profile, append `_profile` to the target name and attach a debugger via `chrome://inspect`, for example:

- `yarn bazel run --config=ivy //packages/core/test/render3/perf:noop_change_detection_profile`

To interactively edit and rerun benchmarks, use `ibazel` instead of `bazel`.

To debug:

- `yarn bazel build --config=ivy //packages/core/test/render3/perf:noop_change_detection`
- `node --inspect-brk bazel-out/darwin-fastbuild/bin/packages/core/test/render3/perf/noop_change_detection.min_debug.es2015.js`