To measure performance, I made a test application with a timer that changes 54 Boolean values on each tick. The values are data-bound to a WPF (XAML) window and shown as simple TextBlocks, and this works just fine.
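For reference, here is a minimal sketch of what the setup looks like (the class and property names are illustrative, not my exact code): a DispatcherTimer toggles the 54 values, which the XAML shows via an ItemsControl of TextBlocks bound to `Value`.

    // Minimal sketch of the test setup (illustrative names, not the exact code).
    // XAML side: an ItemsControl bound to Items, each item rendered as
    // <TextBlock Text="{Binding Value}"/>.
    using System;
    using System.Collections.ObjectModel;
    using System.ComponentModel;
    using System.Windows.Threading;

    public class BoolItem : INotifyPropertyChanged
    {
        private bool _value;
        public bool Value
        {
            get => _value;
            set
            {
                _value = value;
                PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Value)));
            }
        }
        public event PropertyChangedEventHandler PropertyChanged;
    }

    public class TestViewModel
    {
        public ObservableCollection<BoolItem> Items { get; } = new ObservableCollection<BoolItem>();
        private readonly DispatcherTimer _timer = new DispatcherTimer();

        public TestViewModel(TimeSpan interval)
        {
            for (int i = 0; i < 54; i++)
                Items.Add(new BoolItem());

            _timer.Interval = interval;      // e.g. 25, 50, 100, 250, 500 or 1000 ms
            _timer.Tick += (s, e) =>
            {
                foreach (var item in Items)  // toggle all 54 values on every tick
                    item.Value = !item.Value;
            };
            _timer.Start();
        }
    }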
However, if the timer runs every 100 ms, the process uses approximately 15-20 times more CPU than if it runs every 50 ms (measured as % Processor Time for the process in PerfMon).
This seems weird. Does anyone have an approximate explanation?
It seems like the 100 ms interval is "unreasonably bad" while the 50 ms interval is "unreasonably good".
Some more (approximate) numbers:
Interval = 25 ms, load = 12%
Interval = 50 ms, load = 0.25%
Interval = 100 ms, load = 4%
Interval = 250 ms, load = 0.6%
Interval = 500 ms, load = 0.5%
Interval = 1000 ms, load = 0.1%