Sun 8 Jun, 2014 03:32 pm
I took a test whose score scale runs from 70 to 130. A score of 100 represents the average for my peer group, and they say the standard deviation is 15. My score was 105.
I am wondering how accurate the test is, i.e., what range of true scores my result could correspond to on a hypothetical perfectly devised test. For example, could a score of 105 really reflect anything from 95 to 115? From 85 to 125? I think what I want to know is the "margin of error." Is that correct?
But I have read that the margin of error is usually twice the standard deviation. If both of those things are true, then could a score of 100 really reflect anything from 70 to 130 on a hypothetical perfectly devised test?
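To spell out the arithmetic I have in mind (assuming the margin of error really is twice the standard deviation, as I read):

margin of error = 2 × 15 = 30
score of 100: true score anywhere from 100 − 30 = 70 up to 100 + 30 = 130
score of 105: true score anywhere from 105 − 30 = 75 up to 105 + 30 = 135, which would run past the top of the 70-130 scale

Is that the right way to read it?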