Huge samples are very nice if you can get 'em, though such is not
always the case, alas.
So one thing that I would like to see from people who do have a
gigantic N is an analysis to determine at what point the data
reach an asymptote. In other words, if you've collected
1,000,000 people, at what earlier point in your sampling could
you have stopped and still reached the same conclusions with
valid statistics?
Obviously, the answer to this question will differ across
studies, depending on the variance of the measures and so forth.
But having the large N allows one to perform this calculation,
so that the next time one does a similar study, one could
reasonably stop after reaching a smaller and more manageable
sample size.
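
What I have in mind is roughly the sketch below (in Python). To
be clear, everything specific in it is a placeholder of my own
choosing rather than anyone's actual pipeline: the simulated x
and y, the correlation as the effect measure, the tolerance, and
the grid of candidate n's. The idea is simply to ask, for
progressively smaller subsamples, how often the subsample
reproduces the full-sample result.

# Sketch: find the smallest n at which a subsample reproduces
# the full-sample result within a chosen tolerance. The effect
# measure, tolerance, and grid of n's are placeholders;
# substitute whatever statistic your study reports.
import numpy as np

rng = np.random.default_rng(0)

def effect(x, y):
    """Placeholder effect measure: Pearson correlation."""
    return np.corrcoef(x, y)[0, 1]

# x, y stand in for the full dataset of N participants.
N = 1_000_000
x = rng.normal(size=N)
y = 0.1 * x + rng.normal(size=N)   # true r of about 0.1

full = effect(x, y)                # "ground truth" from the full N
tolerance = 0.01                   # how close counts as "identical"
n_resamples = 100                  # subsamples per candidate n

for n in [500, 1_000, 5_000, 10_000, 50_000, 100_000]:
    ests = []
    for _ in range(n_resamples):
        idx = rng.choice(N, size=n, replace=False)
        ests.append(effect(x[idx], y[idx]))
    ests = np.asarray(ests)
    # Fraction of subsamples whose estimate lands within
    # tolerance of the full-sample value.
    hit = np.mean(np.abs(ests - full) < tolerance)
    print(f"n={n:>7}: mean estimate {ests.mean():.3f}, "
          f"within tolerance {100 * hit:.0f}% of the time")

The smallest n at which that percentage is acceptably high would
be the point past which the extra participants were, for this
particular question, not changing the conclusion.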
Has anybody already done this for those large samples that were
recently discussed? It would be really helpful for those who cannot
always collect such samples.
Best
Robert
-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
Robert J. Zatorre, Ph.D.
Montreal Neurological Institute
3801 University St.
Montreal, QC Canada H3A 2B4
phone: 1-514-398-8903
fax: 1-514-398-1338
e-mail: robert.zatorre@xxxxxxxxx
web site: www.zlab.mcgill.ca