Greetings all. I'm spinning our D/H GC-TC-IRMS back up after a month of
disuse, and I've been running into a perplexing problem when checking
standards.

I'm seeing massive offsets in the raw delta values (those calculated against the
ref gas injections) from one group of runs to the next. For example: if I run a
sequence of standard injections (a range of Schimmelmann alkanes) for 24 hours,
about half the runs come out roughly where they should be, and half come out up
to 50(!) permil too negative. If I plot offsets from known values versus
retention time, each compound splits into two clusters of 'high' and 'low'
delta values. The two run 'types' seem to alternate randomly every 1 to 5 runs
(a run takes about 70 minutes).
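For reference, this is roughly how I'm slicing the data to see the split (a quick
pandas sketch; the column names, file name, and the -25 permil cutoff are just
placeholders, not our actual export format):

    # Sketch: compute per-peak offsets and label each run 'high' or 'low'.
    # 'standard_runs.csv' and its columns are hypothetical stand-ins.
    import pandas as pd

    df = pd.read_csv("standard_runs.csv")              # one row per peak
    df["offset"] = df["d2h_measured"] - df["d2h_known"]

    # Mean offset per run, then split runs at an arbitrary -25 permil cutoff
    run_offsets = df.groupby("run")["offset"].mean()
    labels = (run_offsets < -25).map({True: "low", False: "high"})

    print(run_offsets.describe())
    print(labels.value_counts())        # roughly half 'high', half 'low'
    print(labels.to_string())           # shows the random 1-5 run alternation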

Now, I've seen instances of things like this happening with a visible pattern
(regular, quasi-sinusoidal offsets in a GC-C-IRMS over the course of the day
that appeared to track the room's air conditioning, for example), but never
anything this random and abrupt. Does anyone have experience with something
like this, or thoughts on what to dig into?

Context: The chromatography is fine; retention times are stable; peak height
and area are stable; the H3+ factor is nice and low; the source has recently
been tuned; air leaks (judging by the argon signal) are negligible; the reactor
is newly installed and conditioned; the GC column just had a meter trimmed off
the end and the inlet was re-assembled (new liner, new ferrules, etc.); the K
factor for the GC is about where one would expect; ref gas values are stable
and reproducible within a run; column bleed/background is where it's always
been.

Setup: Thermo Trace GC Ultra -> Thermo GC-TC interface (furnace/reactor) ->
Thermo GC-C III (open splits and backflush plumbing) -> Delta V Plus IRMS.

Thoughts anyone? Thanks!

Dr. Matthew Wolhowe
Research Scientist/Engineer III
Lab Manager, Oceanographic and Geochemical Stable Isotope Facility
School of Oceanography
College of the Environment
University of Washington