Details
- Type: Bug
- Status: Open
- Priority: P3
- Resolution: Unresolved
Description
We currently have some performance benchmarks in Beam; however, they are not consistently monitored, so we may not notice when a regression happens, see [1].
Filing this issue to improve the tooling. Note that if the automation becomes too noisy (flags too many false regressions), people likely won't pay attention to it either.
As a first step, we could monitor regressions in the 50th percentile of the performance metrics we collect for benchmarks that run repeatedly.
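As an illustration only, a minimal sketch of such a median-based check might look like the following. The function name, the 10% threshold, and the runtime inputs are all hypothetical, not part of any existing Beam tooling:

```python
# Hypothetical sketch: flag a regression when the median (50th percentile)
# of recent benchmark runtimes exceeds the baseline median by a threshold.
from statistics import median


def is_regression(baseline_runtimes, recent_runtimes, threshold=0.10):
    """Return True if the recent median runtime is more than `threshold`
    (e.g. 0.10 = 10%) slower than the baseline median."""
    baseline = median(baseline_runtimes)
    recent = median(recent_runtimes)
    return recent > baseline * (1 + threshold)


# Baseline median 100, recent median 115 -> 15% slower -> flagged.
print(is_regression([98, 100, 102], [114, 115, 116]))
```

Using the median rather than the mean makes the check robust to single outlier runs, which should help keep the noise (false alarms) down.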
cc: kamilwu kasiak markflyhigh