I'm switching sides, guys. After watching bhoese's video this morning (someone I highly respect), I decided to take another, harder look at my dots. They do change speeds! I don't know why I didn't notice it before, but I can see them go from normal to slow to normal to slow. I went to BPB single-hole play on hole #14, knowing it has a back-to-front slope.
I see the dots change particularly well when 30 feet left of the front pin. I have tons of dots on the screen as the grid is 8 squares wide by 20 squares long. So I did some quick tests using Chrome with its FPS counter enabled and with the Windows Resource Monitor open to watch CPU usage.
30 ft left of front pin, normal speed dots: CPU usage about 30%, a steady 60 FPS.
30 ft left of front pin, slow dots: CPU usage in the high teens to low 20s (about 18 to 22%), erratic FPS in the 40 to 45 range.
I saw no difference turning hardware acceleration on/off in Flash.
I saw no difference enabling/disabling my Nvidia GPU vs the integrated Intel 4600.
On Waterfox (optimized for 64-bit OS):
Similar results to Chrome.
30 ft left of front pin, normal speed dots: CPU usage about 19 to 21%, no FPS counter on WF (rough console workaround below).
30 ft left of front pin, slow dots: CPU usage in the low teens (about 12 to 14%), no FPS counter on WF.
No difference turning hardware acceleration on/off in Flash.
No difference enabling/disabling my Nvidia GPU vs the integrated Intel 4600.
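For anyone who wants a rough FPS readout in Waterfox (or any browser without a built-in counter), pasting something like the snippet below into the browser console should give a ballpark number. It counts requestAnimationFrame callbacks per second, which tracks the page's animation loop rather than the Flash plugin's internal frame rate, so treat it only as a rough cross-check, not the same thing as Chrome's meter.

```typescript
// Rough FPS estimate for a browser with no built-in counter.
// Counts requestAnimationFrame callbacks per second and logs the result.
// (Valid TypeScript; drop the type annotations to paste it as plain JS.)
let frames = 0;
let last: number = performance.now();

function tick(now: number): void {
  frames++;
  if (now - last >= 1000) {
    console.log(`~${frames} FPS`); // one log line per second
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```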
The CPU load dropping when the dots go slow, combined with seeing no difference from any of my PC reconfiguration, suggests something in the Game Client itself is changing the frame rate sent to the screen. It could be the Game Client doing other work before drawing each frame, or it could be built into however WGT changed the frame rate a while back.
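Just to make that hypothesis concrete, here is what a client-side frame cap could look like. This is only an illustration, not WGT's actual code; the names (drawDots, the 30 FPS cap, the capped flag) are made up for the example. The point is that when a cap kicks in, the loop simply skips drawing most frames, so lower FPS and lower CPU show up together, which matches the pattern I measured above.

```typescript
// NOT WGT's code -- a hypothetical sketch of a client-side frame cap.
// When `capped` is true the loop skips drawing most frames, so the CPU
// spends more time idle: lower FPS and lower CPU usage go hand in hand.

const FULL_RATE_MS = 1000 / 60;   // ~60 FPS, the normal case
const CAPPED_RATE_MS = 1000 / 30; // made-up reduced rate for "slow dots"

let capped = false; // imagine the client flipping this on certain greens
let lastDraw = 0;

function drawDots(): void {
  // placeholder for the expensive work: redrawing the putting grid and dots
}

function renderLoop(now: number): void {
  const interval = capped ? CAPPED_RATE_MS : FULL_RATE_MS;
  if (now - lastDraw >= interval) {
    lastDraw = now;
    drawDots();
  }
  requestAnimationFrame(renderLoop);
}

requestAnimationFrame(renderLoop);
```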
**edit**
I see normal-speed dots and no slowdown when using Chip mode to view the dots. At least in Waterfox; I haven't checked Chrome yet.