Difference between mispredicted branches retired and mispredicted branches executed

I have an unusual case where I’m examining a relatively trivial Java program that should have well-predicted branching behavior, but likwid-perfctr reports a huge (50%) branch mispredict rate. However, the runtime is not consistent with that misprediction rate: CPI is very low, at ~0.3, and stalls on both the front end and the back end are low. A similar C++ program suffers no mispredicts but actually has a slightly longer runtime.

It seems that the discrepancy is between the BR_MISP_RETIRED_ALL_BRANCHES and BR_MISP_EXEC_ANY events. The former, which is used to calculate the mispredict ratio, is large, at one event per loop. The latter is very small, consistent both with the expected mispredict rate and with the runtime.
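For anyone reproducing this: likwid-perfctr accepts custom event sets, so both counters can be read side by side in a single run. This is a sketch; the event and counter names below match the ones above as spelled on Sandy/Ivy Bridge-era cores, and `./branchy` is a placeholder for the benchmark binary.

```shell
# Pin to core 0 and program both mispredict events at once.
# Run `likwid-perfctr -e` to list the event/counter names your CPU supports.
likwid-perfctr -C 0 \
  -g BR_MISP_RETIRED_ALL_BRANCHES:PMC0,BR_MISP_EXEC_ANY:PMC1 \
  ./branchy
```

Comparing the two PMC columns directly in one run rules out run-to-run variation as the source of the 25x gap.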

For other sample programs that I tried, which do exhibit serious mispredicts (e.g., branching on random values in an array), the two counters report very similar values.

Is anyone familiar with the exact meaning of these two counters? How can the “retired” counter be 25x larger than the “exec” counter for mispredicted branches, when any branch must be executed before it can retire?