Is there a rule of thumb relating film thickness, elution temperature (which is length dependent) and internal diameter?
I often see "fast GC" separations ascribed to a reduction in column diameter, but often a much faster oven ramp is used as well. So which factor is responsible for the fast GC: the smaller ID, or the faster oven ramp?
By Jason Ellis on Monday, March 4, 2002 - 03:22 pm:
Fast GC can be a complicated discussion for a number of reasons. "Fast" is a relative term -- what's fast to one person might be slow to another. There are also many ways to speed up an analysis in GC: reduce film thickness, increase oven temperature, reduce column length, change phase type, increase gas velocity, etc. For this discussion, I'll assume that the method is already optimized on a given column dimension, that the relative elution pattern must be preserved, and that no resolution can be lost.
It is very easy to speed up methods by reducing column length, but if we do this without changing the column inner diameter, we reduce the total plate count. Smaller-diameter columns generate more plates per meter, so we can achieve the same total plate count on a shorter piece of 100um ID column than on a 320um ID column. To maintain the relative elution order on the shorter column, the temperature program must be "scaled" as well: to preserve the elution pattern, the solutes must experience the same "thermal environment" in the column. This is where increased programming rates come into play: if the column is shorter, the ramp rate must be increased in order to maintain that "thermal environment".
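To put rough numbers on the trade-off above, here is a back-of-envelope sketch (not Agilent's algorithm). It leans on the common Golay-theory approximation that the minimum plate height of an open-tubular column is roughly its inner diameter, so total plates N ~ L / d_c; the specific column dimensions are just illustrative.

```python
# Back-of-envelope plate-count estimate for open-tubular GC columns,
# assuming minimum plate height H_min ~ inner diameter d_c, so N ~ L / d_c.

def total_plates(length_m: float, id_um: float) -> float:
    """Approximate total theoretical plates (length in m, ID in um)."""
    return (length_m * 1e6) / id_um

def equivalent_length(orig_len_m: float, orig_id_um: float, new_id_um: float) -> float:
    """Column length at the new ID that preserves the total plate count."""
    return orig_len_m * new_id_um / orig_id_um

n_320 = total_plates(30.0, 320.0)                 # a 30 m x 320 um column
L_100 = equivalent_length(30.0, 320.0, 100.0)     # matching 100 um column
print(f"30 m x 320 um: ~{n_320:,.0f} plates")
print(f"Same plates on 100 um ID: ~{L_100:.1f} m")
```

Under this approximation, a roughly 9 m piece of 100um column delivers the plate count of a 30 m, 320um column, which is why the shorter run then needs a proportionally faster temperature ramp.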
A good way to play with these conditions is to try the Agilent Method Translation software. It allows you to see offline what changes are required to scale a temperature program when column dimensions are changed, and also what those changes will do to the total analysis time. Keep in mind that accurate method scaling requires maintaining the phase ratio (beta). This tool is available free on the Agilent website.
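On the phase-ratio point: for a wall-coated column, beta = d_c / (4 * d_f), so keeping beta constant when the ID shrinks means scaling the film thickness down in proportion. A minimal sketch of that arithmetic (the example dimensions are assumptions, not from the thread):

```python
# Phase ratio of a wall-coated open-tubular column: beta = d_c / (4 * d_f).
# Method scaling preserves retention behaviour by keeping beta constant,
# i.e. film thickness scales in proportion to inner diameter.

def phase_ratio(id_um: float, film_um: float) -> float:
    """Phase ratio beta from inner diameter and film thickness (both in um)."""
    return id_um / (4.0 * film_um)

def matched_film(orig_id_um: float, orig_film_um: float, new_id_um: float) -> float:
    """Film thickness at the new ID that keeps beta unchanged."""
    return orig_film_um * new_id_um / orig_id_um

beta = phase_ratio(320.0, 0.25)    # e.g. 320 um ID with a 0.25 um film
print(f"beta = {beta:.0f}")
print(f"film for 100 um ID at same beta: {matched_film(320.0, 0.25, 100.0):.3f} um")
```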
If you would like to discuss this in more detail, please contact me directly.
GC Column Technical Support
By Anonymous on Wednesday, March 6, 2002 - 12:24 pm:
An excellent answer but here is an additional 2 cents worth on a related issue for discussion:
One reason Fast GC has not replaced conventional GC analysis is not a lack of speed; it is the raising of detection limits, or the need for a more expensive detector to see the same level of analyte. What would have been a simple TCD analysis now requires a pHID; an FID analysis now requires an MSD or quadrupole MS. Fast GC meant to save time and money often costs an arm and a leg.
One example: USP residual solvent headspace analysis in pharmaceuticals has a resolution specification. Fast GC (microbore column) may perform this analysis in 3-4 minutes, and a megabore column analysis can be done in 8 minutes (optimized: in 5 minutes), but headspace equilibration requires 6-15 minutes per sample.
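The arithmetic behind that example is worth making explicit. Assuming the headspace sampler can equilibrate the next vial while the current run is in progress (a common setup, but an assumption here), the per-sample cycle time is set by whichever step is slower, so shaving GC minutes below the equilibration time buys nothing:

```python
# Simple throughput arithmetic for overlapped headspace sampling: if the
# next vial equilibrates while the current GC run proceeds, cycle time per
# sample is the slower of the two steps; otherwise the steps add up.

def cycle_time(gc_min: float, equil_min: float, overlapped: bool = True) -> float:
    """Minutes per sample for a headspace-GC sequence."""
    return max(gc_min, equil_min) if overlapped else gc_min + equil_min

for gc in (8.0, 4.0):   # megabore vs. fast-GC run times from the post
    print(f"GC {gc} min, equil 15 min -> cycle {cycle_time(gc, 15.0)} min/sample")
```

With a 15-minute equilibration, the 8-minute and 4-minute runs give the same 15 min/sample throughput, which is exactly the poster's point.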
So why do it faster? Do you have the technical training to keep a Fast GC system running properly? There are simple apps like air monitoring where Fast GC can be useful.
Just be certain you need Fast GC before you invest your money into it.
By Jason Ellis on Wednesday, March 6, 2002 - 01:14 pm:
Before converting a method to "fast GC", there are certainly many issues to consider. Some methods make sense to speed up, others do not. When moving down to a microbore (100um or smaller) column there are issues surrounding the ability to get sample onto the column. There are also issues associated with detector sampling rate and the "dynamic range" of the microbore system. I think the answer is that it really needs to be evaluated on a case-by-case basis.
Most FID methods can easily be scaled down to small-ID columns because the FID provides very high data acquisition rates as well as sensitivity. Other detectors can be trickier. Again, the costs and benefits of high-speed analysis need to be considered per application. Often, analysis times can be improved substantially without much increase in hardware cost. It really depends on what you need to accomplish; it doesn't always require special hardware or fancy detectors.
By H W Mueller on Thursday, March 7, 2002 - 12:15 am:
How does Flash GC (still Thermedics Detection.... and/or...?) fit into this, given that it operates with normal-diameter columns?
By Jason Ellis on Thursday, March 7, 2002 - 04:44 pm:
Total theoretical plates are decreased any time column length is reduced without a corresponding reduction in inner diameter. Most of the flash GC applications that I have seen use "standard ID" columns of short lengths. This, combined with very rapid temperature programming, speeds up the chromatography; however, it comes at the expense of resolution. This approach is generally not an option for complex samples where resolution cannot be sacrificed.
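The size of that resolution penalty is easy to estimate. Since resolution scales roughly with the square root of the plate count (Rs ~ sqrt(N)) and N is proportional to length at a fixed ID, shortening a standard-ID column by some factor cuts resolution by the square root of that factor. A sketch with illustrative lengths:

```python
# Resolution penalty for shortening a column at fixed inner diameter:
# Rs ~ sqrt(N) and N ~ L, so Rs scales with sqrt(new_length / old_length).

import math

def resolution_factor(orig_len_m: float, new_len_m: float) -> float:
    """Fraction of the original resolution retained after shortening."""
    return math.sqrt(new_len_m / orig_len_m)

# e.g. cutting a 30 m column down to 5 m for a flash-GC-style run:
print(f"30 m -> 5 m: {resolution_factor(30.0, 5.0):.2f}x original resolution")
```

So a six-fold cut in length retains only about 40% of the original resolution, which is why this route works only when the starting separation has resolution to spare.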
There are a range of compromises that must be made when addressing high-speed GC techniques. The traditional compromises are between speed, capacity and resolution. There is no "free lunch".
By H W Mueller on Friday, March 8, 2002 - 12:19 am:
Thanks, I figured on something like that. There is a similarity in HPLC: some fast HPLC is an academic solution for people who do not have enough resolution as it is.
By Leon on Friday, March 8, 2002 - 02:38 pm:
Of course there is no “free lunch”. But it might well be that someone has already paid for the lunch. It might also be that the available payment goes down the drain instead of being utilized to pay for the “lunch”.
A method in question might have been developed a long time ago, when instruments with the currently available oven heating rates (over 100 °C/min, not counting flash GC), data acquisition rates (such as 200 Hz), high column pressures (such as 10 bar), more sensitive detectors, low-volume sample introduction, etc. were not commercially available. Now they are (a GC manufacturer might have already paid for the “lunch”). A list of specific steps to reduce the analysis time, and some steps for new method development with the shortest possible analysis time, were described in my January 17, 2002 posting. A quantitative measure of the expected improvement for each step was also provided.
Speaking of a “free lunch”, one should also recognize that not all existing methods are optimized. Examining many published GC methods (hundreds and hundreds of them), I found that not all are optimal (i.e. the existing resources are not fully utilized). Some have too low or too high a carrier gas flow rate, or too low or too high a heating rate (see, for example, my November 2, 2001 posting). It might also be that the signal-to-noise ratio is much higher than necessary for all peaks in a chromatogram. Eliminating these shortfalls and excesses can be translated into a faster analysis with acceptable or no loss in other performance characteristics. (For the default, nearly optimal conditions, see my above-mentioned January 17, 2002 posting.)
No one claims, of course, that a reduction in the analysis time is always possible or necessary, but, in many cases, it can be very useful, readily available, and, sometimes, YES, “FREE OF CHARGE”.
One thing is VERY IMPORTANT (Jason Ellis also mentioned it in his posting). To preserve the peak elution pattern resulting from an existing method,
ALWAYS ACCOMPANY A CHANGE IN METHOD PARAMETERS (column dimensions, carrier gas type and flow rate, etc.) WITH THE APPROPRIATE CHANGES (THE TRANSLATION) IN THE TEMPERATURE PROGRAM!
Use Agilent’s GC Method Translation software to find the translated temperature program for changes in the other method parameters. To the best of my knowledge (I do not work for Agilent), the GC Method Translation software can be downloaded free of charge from www.chem.agilent.com/cag/servsup/usersoft/main.html#mxlator.
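The core idea behind such translation can be sketched in a few lines. This is a much-simplified illustration of the principle, not the actual Agilent software: to preserve the elution pattern, each ramp rate is scaled by the ratio of the original to the new column holdup time, so solutes see the same temperature profile per holdup time. The holdup times here are assumed to be known (measured, or computed from column dimensions and flow).

```python
# Simplified method-translation principle: scale each temperature-program
# ramp rate by the ratio of old to new holdup time, so the column's
# "thermal environment" per holdup time is preserved.

def translate_ramps(ramps_c_per_min, t_m_orig_min, t_m_new_min):
    """Return ramp rates (deg C/min) translated for the new holdup time."""
    scale = t_m_orig_min / t_m_new_min
    return [r * scale for r in ramps_c_per_min]

# e.g. holdup time drops from 1.2 min to 0.3 min (a 4x faster column),
# so a two-ramp program of 10 and 25 deg C/min becomes 40 and 100:
print(translate_ramps([10.0, 25.0], 1.2, 0.3))
```

Hold times and the pressure/flow settings would need corresponding adjustment too; the real software handles those, gas-type changes, and outlet-pressure effects together.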