I've heard a lot of discussion about the accuracy of IQ tests when applied to adults, particularly around establishing a "baseline" to compare against. Most of the "basic" IQ tests were designed for children, and the score is simply the ratio of mental age (developmentally speaking), as determined by the test, to chronological age, multiplied by 100 (hence the name Intelligence Quotient). So an 8-year-old who thinks like a 10-year-old scores an IQ of 125, while a 10-year-old who thinks like an 8-year-old scores an 80. That's why 100 is considered the baseline.
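For the curious, the classic ratio formula is trivial to write down. A minimal sketch in Python, reproducing the two examples above:

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Classic ratio IQ: mental age over chronological age, scaled by 100."""
    return 100.0 * mental_age / chronological_age

# The two examples from above:
print(ratio_iq(10, 8))   # 8-year-old who thinks like a 10-year-old -> 125.0
print(ratio_iq(8, 10))   # 10-year-old who thinks like an 8-year-old -> 80.0
```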
Unfortunately, such tests don't make much sense for adults, since development ceases at some point: a 20-year-old should be just as "developed" as a 45-year-old or an 80-year-old. So for adults, they had to readjust the testing procedures, gear them more toward advanced logical thinking, vocabulary, mathematics, etc., and "invent" a baseline based on a bunch of people taking the tests. As a result, a lot of the common "adult" IQ tests are as much measurements of social variables (education, upbringing, etc.) as they are of actual intelligence.
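That "invented" baseline is what's usually called a deviation IQ: your raw score is placed on a normal curve fit to a norming sample, with the mean pinned to 100 and the standard deviation typically set to 15. Here's a minimal sketch of the idea; the norming sample below is completely made up for illustration:

```python
import statistics

def deviation_iq(raw_score: float, norm_scores: list[float]) -> float:
    """Deviation IQ: express a raw score as standard deviations from the
    norming sample's mean, rescaled so the mean is 100 and one SD is 15."""
    mu = statistics.mean(norm_scores)
    sigma = statistics.stdev(norm_scores)
    z = (raw_score - mu) / sigma
    return 100.0 + 15.0 * z

# Hypothetical norming sample of raw test scores:
norms = [38, 42, 45, 47, 50, 52, 55, 58, 61]
print(round(deviation_iq(58, norms)))  # ~116: about one SD above the mean
```

The catch, of course, is that the result is only as meaningful as the norming sample, which is exactly where those social variables sneak in.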
I actually had a real IQ test administered when I was a kid as part of a "gifted" (*snort*) program entry requirement, and scored a 142 overall. The web test gave me a 135. But they're different tests, so I don't know if the results can be compared in any meaningful way. Then again, that site probably gives everyone good scores, so that people will buy the full IQ report and brag to their friends. It'd be an interesting experiment to take it blindfolded and see if the score still comes out above 100.