Linearity-Method Validation

Chromatography Forum: LC Archives: Linearity-Method Validation
By Anonymous on Tuesday, July 8, 2003 - 04:28 am:

Does anyone know how to interpret the residual sum of squares for linearity? What does it mean?

Thanks


By Anonymous on Tuesday, July 8, 2003 - 07:18 am:

The residual for each data point is the difference between the measured value and the calculated value using the slope and intercept determined by a fit of all data.

Square and sum the residuals and you get the residual sum of squares.

I think it's best to plot the residuals against analyte concentration. The residuals should then be scattered both above and below the zero-residual line (reflecting the random precision of the method), with no obvious outliers. This approach is useful for determining whether a systematic error exists in the method.
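As a sketch, the residual calculation and residual sum of squares described above might look like this in Python (the concentrations and areas are made-up illustrative numbers):

```python
import numpy as np

# Hypothetical calibration data: concentration vs. peak area
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
area = np.array([10.2, 19.8, 50.5, 99.0, 201.0, 498.0])

# Straight-line fit of ALL data: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Residual for each point: measured value minus calculated value
residuals = area - (slope * conc + intercept)

# Residual sum of squares
rss = np.sum(residuals ** 2)

# Plotting residuals against conc (e.g. with matplotlib) should show
# random scatter above and below zero if no systematic error exists
print(slope, intercept, rss)
```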

Look at: Snyder, L.R., Kirkland, J.J., Glajch, J.L. Practical HPLC Method Development, page 693.


By Tom Mizukami on Tuesday, July 8, 2003 - 11:49 am:

In HPLC the residuals will never be equally distributed over a large concentration range. Area or height precision is normally injector-limited and shows a constant %RSD, not a constant SD. The result is a wedge-shaped residual plot.
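A quick simulation illustrates the wedge shape (synthetic data, with an assumed 1 %RSD injection noise proportional to signal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Wide calibration range with noise proportional to signal:
# constant 1 %RSD, not constant SD
conc = np.linspace(1.0, 100.0, 20)
area = 10.0 * conc * (1.0 + rng.normal(0.0, 0.01, conc.size))

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)

# Absolute residuals grow with concentration: the wedge shape
low_spread = np.abs(residuals[conc < 20]).mean()
high_spread = np.abs(residuals[conc > 80]).mean()
print(low_spread, high_spread)
```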

The practical consequence is that random error at the high concentration levels will dominate the statistic and give meaningless intercept values.


By Anonymous on Wednesday, July 9, 2003 - 05:51 am:

Thanks everyone.

That was clear. However, do you present a residual-vs-concentration plot or a residual sum of squares value in the validation section of the submission document (e.g. an NDA)?


By Tom Mizukami on Wednesday, July 9, 2003 - 10:02 am:

We don't. We just give the equation of the line and the correlation coefficient. If we are validating a method over a wide range (assay and impurity), we force the calibration curves through zero and give a short statistical justification.

If you wanted to include one or the other, I think a plot of the residuals is more informative.


By Anonymous on Thursday, July 10, 2003 - 12:07 am:

I agree that a plot of residuals is more informative. For me, the residual sum of squares value alone doesn't provide any additional information about linearity.

I would still say your method might be anything but linear if you only look at the equation of the line and the correlation coefficient. There are several references out there, e.g.:

1. Green, J.M. Analytical Chemistry (1996), 68(9), 305A-309A

2. Boque, R., Rius, F. X., Massart, D.L. Journal of Chemical Education (1994), 71(3), 230-2.

3. Dorschel, C.A., Ekmanis, J.L., Oberholtzer, J.E., Warren, F.V.J., Bidlingmeyer, B.A. Analytical Chemistry (1989), 61(17), 951A-954A, 956A, 958A, 960A, 962A, 966A, 968A.

Another simple, additional method to check for linearity is to plot the response factors against concentration. Ideally this gives a straight line with a slope of zero.
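A sketch of that response-factor check in Python (illustrative numbers; any spreadsheet works just as well):

```python
import numpy as np

# Hypothetical calibration data
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
area = np.array([10.2, 19.8, 50.5, 99.0, 201.0, 498.0])

# Response factor at each level: response per unit concentration
rf = area / conc

# Plotted against conc, these should scatter around a horizontal
# line (slope of zero); a trend indicates non-linearity
pct_dev = 100.0 * (rf - rf.mean()) / rf.mean()
print(rf, pct_dev)
```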

These rarely give perfect results but still provide valuable additional information. I'm not saying you should set them as acceptance criteria for the validation. Plotting them takes five minutes, and I would say you don't really understand what you are doing if you ignore them.

For example, you might be validating over a wide range with a correlation coefficient of 0.9999, yet the response factor and residual plots look terrible. Split your range in two and the plots look much better. Doesn't this mean you should use two different ranges for quantification?

Don't lose your chance to really learn about the method. You can still keep the regulators happy.


By Tom Mizukami on Saturday, July 12, 2003 - 05:22 pm:

I agree regarding the response factors. We take a good look at our calibration model in our pre-validation studies. We always use plots of specific response to determine if a single point calibration is appropriate.


By Anonymous on Monday, July 14, 2003 - 08:05 am:

I totally agree with Anonymous of July 10.
We also plot response factor vs. concentration and compare it graphically with the average response factor.
In a second approach, we calculate a regression line and compare the known concentration of each standard with the value we would get from that line if the standards were unknown samples.
Excel is very helpful for both.
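The back-calculation check in the second approach can be sketched like this (illustrative numbers; the posters used Excel):

```python
import numpy as np

# Known standard concentrations and their measured areas (made up)
known = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
area = np.array([6.1, 11.0, 51.2, 100.8, 501.5, 1001.0])

# Regression line over all standards
slope, intercept = np.polyfit(known, area, 1)

# Treat each standard as if it were an unknown sample:
# back-calculate its concentration from the line
back_calc = (area - intercept) / slope

# Percent deviation from the known concentration at each level
pct_dev = 100.0 * (back_calc - known) / known
print(np.round(pct_dev, 2))
```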

You can get a 50% deviation between the calculated and known standard concentration and still have a 0.9999 regression; eliminate two low concentration levels and you still have 0.9999, but now the response factor plot and the deviations look fine.
Both checks will bring you to the question of how people can be happy with a regression of 0.9999 alone.
PETER


By syx on Friday, September 5, 2003 - 03:38 am:

Which is best for a linearity test: serial dilution of a stock standard solution, or weighing out each standard individually?


By Alex on Sunday, September 7, 2003 - 10:45 pm:

syx,
The important thing for the linearity test is getting the right concentration of your analyte in each solution. How you prepare your solutions depends on your skills. We usually dilute a stock solution.
Alex

