
Re: long term variation and instrumental drift



Michael and all,

Uhhhhh!  Michael, how do you do this?  My present way of running is to put 
all the frames from one 56-exposure run in a directory.  I then point the 
pipeline to that directory and process it to completion.

OK, here is some detail of how I have been running; a script sketch that 
automates these steps follows the list.
0) I set the Mark IV driver parameters to do everything but calculate the 
darks and flats
1) I arrange the setup parameters to point to the usual files, and get the 
input from a directory where I put one set of 56 exposures from a single 
area of the sky.
2) I have saved a file containing the darks, flats and the catalog stars.
3) I erase whatever is in the output file: rm -f ../output
4) I transfer the darks, flats, and catalog into the output file
5) I run the script: tclsh < cmd.in >& cmd.out
6) I transfer the .cal file to a directory full of same.
7) I have a directory full of copies of the setup parameters, each pointing 
to a different directory for a single set of images.  I transfer the next 
copy into setup.param and continue from 1) above until done.
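
For what it is worth, here is a rough sketch of a shell script (call it 
run_sets.sh) that automates steps 1) through 6) and the loop in 7).  All 
of the directory names (param_sets, calib, calfiles) are invented for the 
example; substitute whatever your layout actually uses.

   #!/bin/sh
   # One pass per saved copy of setup.param, each copy pointing at a
   # different directory of 56 raw exposures.
   for p in ../param_sets/setup.param.*; do
       cp "$p" setup.param               # step 1: select the next parameter set
       rm -rf ../output                  # step 3: clear the output area
       mkdir ../output
       cp ../calib/* ../output           # step 4: darks, flats, and catalog
       tclsh < cmd.in > cmd.out 2>&1     # step 5: run the pipeline script
       mv ../output/*.cal ../calfiles/   # step 6: collect the .cal file
   done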

On my dual-processor machine, I have two pipeline directories and two sets 
of scripts to do the above.  I just start a process going in each directory 
and go to bed.  This works fine as long as I do not copy the wrong disk 
when reloading everything after the machine crashes.  ;^(
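
Something like the following starts the two copies (run_sets.sh being the 
loop script sketched above, and the pipe1/pipe2 paths invented):

   ( cd /home/tass/pipe1 && ./run_sets.sh > run1.log 2>&1 ) &
   ( cd /home/tass/pipe2 && ./run_sets.sh > run2.log 2>&1 ) &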

I think this results in the color terms being computed separately for each 
frame set of the same region of the sky.
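
For reference, the usual form of such a transformation (my notation, not 
necessarily what the pipeline uses internally) is something like

   V = v + Z + c * (B - V)

where v is the instrumental magnitude, Z is the zero-point for the frame, 
and c is the color term.  As I read Michael's note below, he holds c fixed 
over a night and lets only Z vary from frame to frame.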

Note that I am not saving the color terms.  We will have to think about the 
best way to do such bookkeeping.  If I were to ask for a solution today, I 
would ask for a header on the .cal file where such stuff was put.  This is 
probably the wrong way to do it.  But note that the .cal file has a unique 
name, whereas the astrometry file does not.  So how about giving the 
astrometry file the same name as the .cal file, with a suitable 
extension?  Then it would be easy to save a set of them.  The same holds 
for everything else that should be saved from the processing.
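
As a crude version of that naming scheme, something like this would do 
(the file names run123.cal and astrom.out are invented for the example):

   cal=../calfiles/run123.cal                 # the uniquely named .cal file
   cp ../output/astrom.out "${cal%.cal}.ast"  # save astrometry under the same base name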

Note that we are presently getting a 300:1 reduction between the raw 
storage files and the .cal files.  So it would not be a burden to keep 
more.  But probably not all of the .ast, .coo, .fits, and .clt files.
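
To put a number on that: at 300:1, a full 160 GB disk of raw frames boils 
down to roughly

   160 GB / 300 = 0.53 GB

of .cal files, so saving one or two more products of similar size per run 
would cost almost nothing.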

Probably someone other than me should be trying to process large quantities 
of raw files so that they can work out what needs to be saved along the 
way.  I have all the source disks, so perhaps that will be the easiest thing.

At the moment, the biggest pain is just loading the CDs onto disk.  I 
have more CDs than disk space, even though I have 160 GB on the production 
machine.  I doubt that I will be able to keep up with the hard disk space 
needed to hold all the raw data.
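
The loading itself is at least scriptable.  A sketch, assuming the usual 
Linux mount point and an invented staging directory:

   mkdir -p /data/raw/set_042
   mount /dev/cdrom /mnt/cdrom               # may need root or an fstab entry
   cp -r /mnt/cdrom/. /data/raw/set_042/     # stage one CD's worth of frames
   umount /mnt/cdrom
   # once a set is processed, the raw copy can go; only the .cal files stay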

Some things to think about.

Tom Droege

At 07:55 PM 6/9/02 -0400, you wrote:
>   The transformation coefficients are solved once per NIGHT, not
>once per run.  I allow the zero-points to vary, but keep the
>coefficients fixed over the entire night.  Yes, garbage in will
>still yield garbage out, of course.