I have studied the linearity of a method on two different occasions and got different results for the y-intercept: on one occasion the 95% confidence interval of the y-intercept included the zero value, on the other it did not. Why does this happen? Could someone give me a hint?
Another question: when checking the linearity of an assay method from 80% to 120% of the target concentration, must the 95% confidence interval of the y-intercept include the zero value?
Thanks in advance.
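For anyone who wants to check this numerically, here is a minimal sketch of how the 95% confidence interval of the y-intercept is obtained from an ordinary least-squares fit of the five linearity levels. The response values are hypothetical example data, not from this thread; with only five points, whether zero falls inside the interval depends on the random scatter of that particular run, which is why two occasions can give different answers.

```python
# Minimal sketch: 95% confidence interval of the y-intercept from an
# ordinary least-squares fit of 5 linearity levels (80-120% of target).
# The response values below are hypothetical example data.
import numpy as np
from scipy import stats

x = np.array([80.0, 90.0, 100.0, 110.0, 120.0])   # % of target concentration
y = np.array([79.1, 90.4, 99.8, 110.9, 119.6])    # hypothetical responses

n = len(x)
slope, intercept = np.polyfit(x, y, 1)

# Residual standard error of the fit
residuals = y - (slope * x + intercept)
s = np.sqrt(np.sum(residuals**2) / (n - 2))

# Standard error of the intercept
sxx = np.sum((x - x.mean())**2)
se_intercept = s * np.sqrt(1.0 / n + x.mean()**2 / sxx)

# 95% CI uses the t-distribution with n-2 degrees of freedom
t_crit = stats.t.ppf(0.975, df=n - 2)
ci = (intercept - t_crit * se_intercept, intercept + t_crit * se_intercept)

print(f"intercept = {intercept:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
print("CI includes zero:", ci[0] <= 0.0 <= ci[1])
```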
By bookoon on Tuesday, September 30, 2003 - 05:21 am:
Hi,
I don't think the zero value should be included.
By "zero value" I assume you mean the point with zero in both coordinates, i.e. (0, 0)?
If so, that point is hypothetical, not a measured one, whereas all the other points are real measurements (80, 90, 100, 110 and 120%).
If you include zero, you are adding one more point to the fit, and that is why the linearity result changes.
I hope that was clear enough.
By Anonymous on Tuesday, September 30, 2003 - 09:24 pm:
If the confidence interval of the y-intercept includes zero, you can justify the use of single-point calibration.
It's hard to say much when you have only done it twice... the difference could be due to pipetting error, instrument error, or a high baseline...
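To illustrate the single-point calibration point with a sketch (all numbers below are hypothetical, not data from this thread): if the intercept is statistically indistinguishable from zero, the response can be treated as proportional to concentration, so one standard at the 100% level is enough to fix the calibration line, and results back-calculated from that single standard should agree closely with those from the full regression.

```python
# Sketch: single-point calibration vs. the full linear fit when the
# intercept is statistically zero. All numbers are hypothetical examples.
import numpy as np

x = np.array([80.0, 90.0, 100.0, 110.0, 120.0])   # % of target
y = np.array([79.1, 90.4, 99.8, 110.9, 119.6])    # hypothetical responses

slope, intercept = np.polyfit(x, y, 1)             # full linear fit

# Single-point calibration: response factor from the 100% standard only
rf = 99.8 / 100.0

# Back-calculate a hypothetical sample response with both approaches
y_sample = 105.3
print("full fit:        %.2f%%" % ((y_sample - intercept) / slope))
print("single standard: %.2f%%" % (y_sample / rf))
```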