I have my own little theory on this, but as I don't know how the image on a screen is actually produced, it would probably fall apart under scrutiny.
But I reckon it's all to do with timing and screen refresh rates. If images are produced on a screen line by line, one pixel high, from top to bottom, then my theory might be able to hold water.
When you use the vertical meter, the screen only has to draw a short line across the width of the meter on each scanline, and that takes just a fraction of a second. With the horizontal meter, though, the screen has to draw a longer horizontal line, one pixel high, out to a certain point on the meter. Then it has to draw another one underneath that, and another, and another, until the meter's full height is filled with green lines up to that point. And it has to do all of that again every time the meter moves a little bit further forward. Now, if the meter is advancing faster than the screen can refresh itself, maybe it appears to jump forward because the screen can't draw the lines fast enough. The lines drawn across the vertical meter, on the other hand, are so much shorter that they can be drawn more quickly, and the screen refresh rate doesn't have as much trouble keeping up.
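Just to put some rough numbers on that idea: here's a toy sketch that models the screen as drawing one horizontal scanline at a time, and counts how long the green run is on each scanline that crosses the meter. The dimensions are made up purely for illustration.

```python
# Toy model: a progress meter drawn scanline by scanline.
# All numbers are hypothetical, just to compare the two orientations.

METER_LENGTH = 300   # pixels along the meter's long axis
METER_THICK = 20     # pixels across the meter
fill = 150           # how far the meter has progressed, in pixels

# Horizontal meter: each of its 20 scanlines contains one long run
# of 'fill' green pixels.
horizontal_runs = [fill] * METER_THICK

# Vertical meter: the filled part spans 'fill' scanlines, but each
# one only holds a short run of METER_THICK green pixels.
vertical_runs = [METER_THICK] * fill

print("longest run per scanline:", max(horizontal_runs), "vs", max(vertical_runs))
print("total green pixels:", sum(horizontal_runs), "vs", sum(vertical_runs))
```

Interestingly, the total number of green pixels comes out the same either way; the difference the theory rests on is that the horizontal meter packs them into much longer runs on each individual scanline.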
(It's only a layman's theory, so no need to shoot me down in flames, hehe.)