Date: Fri, 16 Feb 2007 15:46:40 +0530
Hi all
We have been routinely analysing carbonates in a gas bench (Delta plus XP)
for the last 1.5 years. We obtained good precision of ~0.01 to 0.03 per mil for
oxygen (in a single measurement) over a mass-44 signal range of 3000 mV to
20000 mV (from 100 ug to several hundred ug of pure carbonate). Over the last
month the precision has been poor (>0.1 per mil), particularly at low signal
(<7000 mV) or very high signal (>25000 mV). The precision is generally still
good between 8000 mV and 20000 mV. For water samples at ~12000 mV, excellent
precision is still obtained. My questions are:
1) Why does the precision degrade at >25000 mV signal? Is it that the Nafion
tube cannot cope with so much water being produced, eventually affecting the
source?
2) Why has the precision gone down for <7000 mV signals when we were earlier
getting good precision even for 100 ug (~3000 mV) samples? In fact, we were
getting ready to push the machine to analyse even 50 ug samples without
cryofocussing, in the normal gas bench configuration (i.e. the normal large
vial). Does this indicate a deteriorating filament after 1.5 years of use?
The Trap/Box currents, though, do not suggest that.
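
(As a side note, one quick way to see where the precision breaks down is to
bin replicate standard analyses by mass-44 amplitude and compute the 1-sigma
scatter in each bin. Below is a rough Python sketch of that idea; the file
name "standards.csv" and the column names "amp44_mV" and "d18O" are only
placeholders for whatever form the exported data takes.)

# Rough sketch: 1-sigma scatter of replicate d18O values per mass-44 bin.
import csv
from collections import defaultdict
from statistics import stdev

bins = defaultdict(list)
with open("standards.csv", newline="") as f:
    for row in csv.DictReader(f):
        amp = float(row["amp44_mV"])        # mass-44 signal, mV
        d18o = float(row["d18O"])           # delta 18O, per mil
        bins[int(amp // 5000) * 5000].append(d18o)   # 5000 mV wide bins

for lo in sorted(bins):
    vals = bins[lo]
    sd = stdev(vals) if len(vals) > 1 else float("nan")
    print(f"{lo:6d}-{lo + 5000:6d} mV  n={len(vals):3d}  1-sigma = {sd:.3f} per mil")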
---------------------------------------------------------------------
Anindya Sarkar
Associate Professor & PI, National Stable Isotope Facility
Department of Geology and Geophysics
Indian Institute of Technology
Kharagpur 721302
West Bengal, INDIA
Tel.: 0091-3222-283392 (O) 283393 (R) 220184 (R)
Cell: 09434043377
Fax.: 0091-3222-282268
http://anindya-sarkar.tripod.com
------------------------------------------------------------------