Animation is one of the hardest kinds of programming to get right, but it's so obvious when it's not. Other kinds of programs might have errors that nobody knows about, but animation is in your face.
For full animation control you need a separate image for each frame. GIF also lets you do animation within a single file, but that's a different game. For your program to control it, you need separate images. I made eight beachball images, each slightly different from the next (but all identical in size), which you can see here:
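Because every frame must be on hand before the animation starts, it helps to preload the images and to have the frame numbers wrap around. Here is a minimal sketch of that bookkeeping, assuming the eight files are named Ball0.gif through Ball7.gif in a Ballz folder as above (the function name nextFrame is mine, chosen for illustration):

```javascript
// Build the list of frame file names: Ballz/Ball0.gif .. Ballz/Ball7.gif
var nFrames = 8;
var frameNames = [];
for (var i = 0; i < nFrames; i++) {
  frameNames.push("Ballz/Ball" + i + ".gif");
}

// Advance to the next frame, wrapping back to 0 after the last one,
// so the same eight images repeat as the ball rolls along.
function nextFrame(current) {
  return (current + 1) % nFrames;
}

// In the browser you would preload each file into an Image object;
// this part needs a browser, so it is guarded here:
if (typeof Image !== "undefined") {
  var frames = frameNames.map(function (name) {
    var img = new Image();
    img.src = name; // starts the download
    return img;
  });
}
```

The modulo arithmetic is what lets eight images carry the ball all the way across the screen.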
For your animation to be credible, you need to observe how real balls (or whatever you are animating) move, so you can reproduce the same flow of motion. You also want the differences between successive images to be a lot smaller than mine, but these will do to demonstrate the techniques. Here I tried to get a little precession (wobble) into the ball's rotation, but have it come back to the starting point so I can re-use the same eight images as the ball rolls across the screen.
In HTML we display an image thus:
<img SRC="Ballz/Ball0.gif" BORDER=0 height=29 width=29>

where the height and width are the actual dimensions of your image. When you output this text using document.write() inside your script, the picture appears at that place in the window. If you give some other sizes, the image will be stretched or shrunk to fit, but that takes extra time you might prefer to spend on a higher frame rate. Images in HTML are placed next in the normal text flow of a standard web page. We deal with that in this program by filling the lines above and the space to the left with something else -- in my case another image that is plain white, stretched to the size I want. As before, I will pass the image state in the hash code attached to the URL.
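Putting those two ideas together -- writing the tag from script and carrying the state in the URL hash -- might look like this sketch. The helper names (imgTag, frameFromHash) are mine, not part of any library:

```javascript
// Build the <img> tag text for a given frame number, using the same
// attributes as the hand-written tag above.
function imgTag(frame) {
  return '<img SRC="Ballz/Ball' + frame + '.gif" BORDER=0 height=29 width=29>';
}

// The frame number rides along in the hash code attached to the URL,
// e.g. MyPage.html#3 would mean "start at frame 3". An empty or
// malformed hash falls back to frame 0.
function frameFromHash(hash) {
  var n = parseInt(hash.replace("#", ""), 10);
  return isNaN(n) ? 0 : n;
}

// In the browser you would then write:
//   document.write(imgTag(frameFromHash(location.hash)));
```

Note that the sizes in the tag match the real image dimensions, so the browser does no scaling.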
This should get you started. Have fun!
rev. 2010 December 23
* Note on Frame rate. I think the perception of flicker depends mostly on how much time the screen spends showing something other than what the viewer is supposed to see. Movie projectors spend about 2/3 of their time showing the film, and 1/3 of the time (about a hundredth of a second) in the dark while moving the film to the next frame. Old-style TV tubes had a tiny spot of electron brightness flying all over the screen, so 99% of the time any one part of the screen was dark. They fixed that by alternating odd and even lines (called "interlace") so the dark time was 1/60th of a second, then they used high-persistence phosphors in the tube which did not go out so quickly after the electron beam moved on. As a result, the image was not dark any longer than in the movies. HDTV projectors are "progressive scan" (not interlace), so the dark time is essentially doubled, requiring a much higher frame rate to overcome the flicker. Modern liquid-crystal (LCD) displays (including most so-called LED displays) depend on the sluggish quality of the organic LCD goo to keep the image there long after the electronics has gone on to refresh the other pixels -- in fact they work hard to speed that up, because otherwise rapidly moving images tend to blur; you will see more flicker from the fluorescent backlight than from the LCD itself.
On a computer, the screen can briefly show something else while the display driver and image rasterizer are working to replace it with the new image. Clever programmers often "double-buffer" the image -- that is, they leave the old image up on the display while preparing the new display invisibly in memory; then, when it's ready, they blit (a fast transfer of pixels) from the memory image to the screen. This way the user sees no dark (or white, as the case may be) screen at all, only the actual pixels. If they are not too different, the eye integrates them very well. For now we depend on the browser to do that for us (or not).
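The double-buffering idea can be sketched without any graphics library at all, using two plain arrays as stand-in pixel buffers (the buffer size and the "drawing" here are made up for illustration). The point is that all the drawing happens in the hidden buffer, and the visible one changes only in one quick swap:

```javascript
// Double-buffering sketch, with arrays standing in for pixel buffers.
var SIZE = 16;
var front = new Array(SIZE).fill(0); // what the viewer "sees"
var back  = new Array(SIZE).fill(0); // drawn invisibly in memory

function drawFrame(buffer, frameNo) {
  // Paint the whole frame into the hidden buffer, pixel by pixel.
  for (var i = 0; i < buffer.length; i++) {
    buffer[i] = frameNo; // stand-in for real pixel values
  }
}

function blit() {
  // The fast transfer: swap the two buffer references in one step,
  // so the viewer never sees a half-drawn frame.
  var t = front;
  front = back;
  back = t;
}

drawFrame(back, 1); // viewer still sees the old frame (all zeros)
blit();             // now frame 1 appears all at once
```

A real blit copies pixels to the screen, but the principle is the same: nothing visible changes until the new frame is complete.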