I successfully completed the Greyhound program; however, I noticed that although my dogs progress a random amount between 1 and 4 pixels per tick (an average of 2.5 pixels), the race takes more time than expected to complete.
I therefore added a stopwatch and a tick counter to Timer1, roughly as sketched below.
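Simplified, my instrumentation looks something like this (picDog1 and FinishLineX are placeholder names for one dog's PictureBox and the finish-line position, not necessarily what the real project uses):

```vb
Private raceWatch As New System.Diagnostics.Stopwatch()
Private tickCount As Integer = 0
Private rng As New Random()
Private Const FinishLineX As Integer = 430   ' placeholder finish-line position

Private Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick
    ' Start the stopwatch on the first tick, then count every tick.
    If Not raceWatch.IsRunning Then raceWatch.Start()
    tickCount += 1

    ' Each dog advances a random 1-4 pixels per tick (average of 2.5).
    picDog1.Left += rng.Next(1, 5)

    If picDog1.Left >= FinishLineX Then
        Timer1.Stop()
        raceWatch.Stop()
        MessageBox.Show(String.Format("Ticks: {0}  Elapsed: {1:F2} s",
                                      tickCount, raceWatch.Elapsed.TotalSeconds))
    End If
End Sub
```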
I now consistently get, for each race:
* c. 490 iterations of the timer
* c. 7.5 seconds of race time according to the stopwatch.
This surprises me, as Timer1 is set at 10 ms, which means that an average race should take about 4.9 seconds to complete!
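Working the numbers backwards: 490 ticks × 10 ms = 4.9 s expected, yet the measured 7.5 s over ~490 ticks comes out to roughly 15.3 ms per tick, so each tick seems to take about 50% longer than the interval I set.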
Is it because of how I programmed things, or because of the computer I am working on? Even then, this program is so simple that I'd be shocked to learn a modern computer cannot get through one tick's worth of work in under 10 ms!
Any ideas or clues as to what is happening, or where I should look to improve the program?