My reference cylinder hydrogen has a value (-270) closer to my light
working standard (-332) than to my heavy standard (-17). As a result, the
fluctuation is larger for the heavy standard. If this is true, then the
differences may not always be the same: the machine may have a "bias"
toward these two standards. The same thing happens when I use the high
temperature glassy carbon method for dD analysis.
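One way this could arise (my own back-of-envelope illustration, not a
claim about the instrument, assuming a small drift in the effective gain):
a raw delta shifts roughly in proportion to its distance from the
reference gas, so the standard farther from the reference fluctuates more.

    # Distances of the standards from the reference gas (-270), values above
    d_light = abs(-332 - (-270))   # 62 delta units
    d_heavy = abs(-17 - (-270))    # 253 delta units
    # An assumed 2% gain drift would move each raw value roughly by
    drift = 0.02
    print(drift * d_light)   # ~1.2 delta units for the light standard
    print(drift * d_heavy)   # ~5.1 delta units for the heavy standard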
----- Original Message -----
Sent: Friday, December 05, 2003 2:33
Subject: Re: [ISOGEOCHEM] Raw dD value
We have seen a similar effect here using an FM H/Device, and I know
another researcher using a Micromass continuous flow system who has seen a
similar effect.
I assume the results you show below are then normalized to the correct
values of the standards. If so, the difference between them should stay
the same. Yet your data show that the difference between the standards has
changed from -281 to -289, an 8 delta unit change, while I presume the
precision of your system is better than 1 delta unit.
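Those differences follow directly from the raw values quoted at the bottom
of this message; a quick check:

    # Raw working standard values from the original message below
    day1_diff = -360 - (-79)      # -281
    day2_diff = -354 - (-65)      # -289
    print(day2_diff - day1_diff)  # -8: an 8 delta unit change between days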
I assume that most analysts are using a two point calibration for D
analysis, as recommended in:
W. A. Brand, T. B. Coplen. 2001. An interlaboratory study to test
instrument performance of hydrogen dual-inlet isotope-ratio mass
spectrometers. Fresenius J. Anal. Chem. 370:358-362.
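For anyone not already doing it, a two point calibration is simply a
straight line anchored at the two standards. A minimal Python sketch
(function and variable names are my own; the true values of -332 and -17
are taken from the reply at the top of this thread):

    def two_point_calibrate(raw, raw_a, true_a, raw_b, true_b):
        # Linear mapping from the raw scale to the true scale,
        # anchored at two standards measured in the same run.
        slope = (true_a - true_b) / (raw_a - raw_b)
        return true_a + slope * (raw - raw_a)

    # Day 1: raw standards -360 and -79, true values -332 and -17
    print(two_point_calibrate(-200.0, -360.0, -332.0, -79.0, -17.0))
    # Day 2: raw standards shift to -354 and -65; a sample whose raw
    # value shifted with them still calibrates to nearly the same
    # result, which is why calculated dD values can match day to day
    # even when raw values do not.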
Doing a two point calibration should take care of any change in each
standard if the change is consistent over the course of the analysis run.
However, what seems to happen during an analysis using chromium is that,
if one calibrates with one standard, the other standard of a different
isotope ratio does not always drift or change at the same rate. I
calibrate with a +3.5 standard and then, when analyzing samples from 0 to
-80, which is our usual range, use a -95 standard for the two point
calibration. I originally chose two standards so close together because
memory effects from the syringe in the autosampler were a problem, a
problem we have since solved. However, when calibrating with just the 3.5
standard, the -95 standard would change over the course of a 23 hour, 100
injection analysis, as shown in the figure below.
One can see there that the -95 standard has a tendency to be more negative
at the beginning of a run than at the end, but not consistently. I
eventually created a spreadsheet that fits a curve to both the 3.5 and -95
standards, and then for every injection does a two point fit between the
3.5 and -95 curves. Quality controls since I started this procedure have
been excellent: better than plus or minus 0.4 delta units long-term
external precision.
[Figure: raw values of the -95 standard plotted against run number, from
the beginning to the end of a run]
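In code, the spreadsheet logic amounts to something like the sketch below.
This is my own reconstruction in Python rather than the actual
spreadsheet, and the choice of a low-order polynomial for the drift curves
is an assumption:

    import numpy as np

    def drift_corrected_dD(inj, raw, inj_hi, raw_hi, inj_lo, raw_lo,
                           true_hi=3.5, true_lo=-95.0, deg=2):
        # Fit a smooth curve to each standard's raw values versus
        # injection number, evaluated at every sample injection.
        fit_hi = np.polyval(np.polyfit(inj_hi, raw_hi, deg), inj)
        fit_lo = np.polyval(np.polyfit(inj_lo, raw_lo, deg), inj)
        # Two point calibration between the two fitted curves,
        # done separately at each injection.
        slope = (true_hi - true_lo) / (fit_hi - fit_lo)
        return true_lo + slope * (raw - fit_lo)

Evaluating each standard's curve at the sample's own injection number is
what absorbs the two standards drifting at different rates.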
I was able to modify my spreadsheet to work with the 30 hour analysis runs
a colleague was doing on a Micromass continuous flow system, and the same
effect appeared there. The effect would therefore not seem to come from
the instrument, but from some effect of the chromium.
I would be interested in hearing from any other researchers who have seen
a similar effect.
AM 12/5/03 -0800, you wrote:
I use an IsoPrime to run dD of water by the Cr reduction method. The raw
values of the working standards sometimes fluctuate day by day: values of
-360 and -79 on the first day can be -354 and -65 the next day. There is
no big change in the machine condition except a slight shift of the peak
center. However, the calculated dD values of repeated samples match
perfectly from day to day, so it does not seem to be a real problem.
Still, I would feel better if more people told me they have had similar
experiences.
Paul D. Brooks,
Center for Stable Isotope Biogeochemistry,
Dept. Integrative Biology MC3140,
3060 Valley Life Sciences Building,
UC Berkeley, Ca. 94720-3140.