I am using thermal desorption tubes as a means of monitoring occupational exposure to solvents and was wondering if anyone has experience of the same. In particular I would be most interested in hearing about the preparation of standards. I am currently using gas bags to make up standards (component b.p. range 50 °C to 150 °C). I have also heard that standards can be made up by direct injection of solvent onto the front end of the tube while pumping air through it. Does anyone have pros and cons for both methods, or reasons for preferring one over the other? I would appreciate any advice.
Thanks Sean
By Mike on Thursday, August 31, 2000 - 04:40 am:
Gas bags and glass containers are OK up to about an 80 °C boiling point. Above that you would need to check for surface adsorption in some way, e.g. with a low-boiling internal standard to compare detector response. I sometimes use this method, sometimes certified cylinders, occasionally generated test atmospheres, but mostly liquid spiking, except for very volatile materials. Even though it is my favourite method there are many problems. First, you must decide whether to purge the dilution solvent (my vote) or leave it on. If you purge, are you washing out the analyte? You have to check these things. Second, do you really know what the microlitre syringe is delivering? Everything depends on it. Forget manufacturer certificates, and forget mercury or water (unless that is your solvent): certificates, except your own, have no value here due to surface tension effects etc. I prefer methanol as the solvent, inject 5 µL, and calibrate gravimetrically by injecting the solvent into very small septum-sealed flasks, weighing to an uncertainty of ±0.03 mg.
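To put rough numbers on that calibration step, the arithmetic is simple; here is a minimal Python sketch, where the replicate weighings and the density figure are assumptions for illustration, not measured values:

```python
# Gravimetric check of a 5 uL methanol injection. All replicate
# weighings below are invented for illustration.
RHO_MEOH = 0.7914  # g/mL (numerically mg/uL), methanol near 20 C

# Mass gain (mg) of the septum-sealed flask after each 5 uL injection,
# read from a balance with roughly +/-0.03 mg uncertainty:
mass_gains_mg = [3.92, 3.97, 3.95, 3.99, 3.94]

volumes_uL = [m / RHO_MEOH for m in mass_gains_mg]
mean_vol = sum(volumes_uL) / len(volumes_uL)
sd = (sum((v - mean_vol) ** 2 for v in volumes_uL)
      / (len(volumes_uL) - 1)) ** 0.5

print(f"delivered volume: {mean_vol:.3f} uL "
      f"(sd {sd:.3f} uL, {100 * sd / mean_vol:.1f}% RSD)")
# Note the floor: +/-0.03 mg on ~3.96 mg of methanol is already ~0.8%,
# so the balance, not the syringe, can limit what you can prove.
```

The attraction of working in mass is exactly that last comment: the balance contribution is easy to document and defend.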
Thermal desorption has many advantages, but one of the downsides is that you must make meticulous, traceable small volume/mass measurements at some stage. No current ISO standard deals with this yet. The volumetric apparatus standards say you must allow for evaporation by timed weighings, but evaporation is so fast at the µL level that it must not be allowed to happen at all in liquid syringe calibration. Perhaps someone can put me right on whether there is an ASTM standard for calibration at the 5-10 µL level. 1 µL plunger-in-needle syringes might be very good on absolute delivery, but try proving it. I have used bromoform to get the weight up, but it is not really convincing.
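For comparison, this is what the timed-weighing correction from the volumetric standards looks like in practice; a sketch with invented numbers, not data from my lab:

```python
# Timed-weighing evaporation correction, as the volumetric apparatus
# standards describe it: weigh the open flask repeatedly after delivery
# and extrapolate the mass back to t = 0. All numbers are invented.
times_s = [10.0, 20.0, 30.0, 40.0]    # s after a 5 uL methanol injection
masses_mg = [3.80, 3.64, 3.49, 3.33]  # observed mass gain at each time

# Ordinary least-squares line m(t) = m0 + b*t; m0 is the corrected mass.
n = len(times_s)
t_bar = sum(times_s) / n
m_bar = sum(masses_mg) / n
b = (sum((t - t_bar) * (m - m_bar) for t, m in zip(times_s, masses_mg))
     / sum((t - t_bar) ** 2 for t in times_s))
m0 = m_bar - b * t_bar

print(f"mass at t=0: {m0:.2f} mg, losing {-b * 1000:.0f} ug/s")
# Losing tens of ug/s from a ~4 mg delivery means the extrapolation,
# not the weighing, dominates the uncertainty; hence sealed flasks.
```

At 5 µL the fitted loss term rivals the quantity being measured, which is why I seal the flask rather than correct for evaporation.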
If there are so many problems, why do I usually prefer liquid spiking? The main reason is easier traceability by mass: I depend on only a single outside contractor to certify the balance, plus my own documented microsyringe calibration. Certified gas cylinders of high-boiling VOCs must be treated with caution. The suppliers know what they put in, but you cannot always be sure what comes out of your delivery system (and a certificate of "analysis" usually means analysis of what went in, despite the impression to the contrary).
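For context on the loadings involved: whichever method you use, the spike you are trying to mimic follows from the exposure limit and the pumped sample volume. A minimal sketch; the toluene-like figures are only an example, not anything specific to your components:

```python
# Loading needed to mimic a workplace sample. All figures are
# illustrative (toluene-like MW, a 50 ppm limit), not from this thread.
MW = 92.14          # g/mol
MOLAR_VOL = 24.45   # L/mol at 25 C, 1 atm
ppm = 50.0          # target air concentration
sample_L = 10.0     # air volume pumped through the tube

conc_mg_m3 = ppm * MW / MOLAR_VOL  # ppm (v/v) -> mg/m3
spike_ug = conc_mg_m3 * sample_L   # mg/m3 times L is numerically ug
print(f"{conc_mg_m3:.0f} mg/m3 x {sample_L:.0f} L "
      f"-> {spike_ug:.0f} ug on the tube")
```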
What size tubes are you using? If the Perkin-Elmer type, there is only one true certified reference material so far (BCR-112). Secondary RMs are available through http://www.markes.com. If you are using any other tube type, sorry, there are no true primary RMs available.
If you use the Perkin-Elmer type, consider joining a proficiency testing scheme, probably the UK WASP scheme in your case, or the AIHA/PAT scheme for the USA. The AIHA/PAT tubes are supplied from WASP, so it doesn't matter which.