Everett and Howell Paper
The Everett and Howell paper is nice because it shows two professionals (I
assume) doing what we are trying to do. I like papers that I can just sit
down and read and understand everything they are doing. I recommend it to
you all. This is a very readable paper! I congratulate the authors for
taking the time to make it readable.
Their camera has 8k x 8k pixels, the equivalent of 16 of our cameras.
Wow! So they get data 8x as fast as one of our dual cameras. At first
glance, it would seem that they can really outproduce us. But wait, they
got the camera for only 5 days, and during "bright time". I assume "bright
time" is when the moon is around. I suspect that this might be their
year's quota of time to do something like this. I will soon have three
systems running (six 2k x 2k cameras in all). I can run every clear night. We
also cover 16 square degrees to their one. So while they get more stars
per degree and go deeper, we have a chance to cover the whole
sky. Further, there are three more of you with systems, and a fourth is on
the way. So, collectively, we can take data at roughly the rate of a big
camera at a big observatory.
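A quick back-of-the-envelope check of those numbers (a sketch assuming "8k" means 8192 pixels and "2k" means 2048; the exact chip dimensions may differ slightly):

```python
# Compare their single big CCD against our small cameras, pixel for pixel.
# Assumption: "8k x 8k" = 8192 x 8192 pixels, "2k x 2k" = 2048 x 2048.
big = 8192 * 8192          # pixels in their camera
small = 2048 * 2048        # pixels in one of our cameras

print(big // small)        # -> 16 camera equivalents
print(big // (2 * small))  # -> 8, i.e. 8x one of our dual-camera systems

# Sky coverage is the other side of the ledger: we cover 16 square
# degrees to their one, so per night the total coverage rates are
# roughly comparable, even before counting multiple TASS systems.
```

The pixel ratio is where the "16 of our cameras" and "8x as fast as one dual camera" figures in the text come from.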
So while professionals at a good site with a better camera can outproduce
TASS on a few nights of the year, I doubt that they will keep it up. They
have demonstrated what can be done. Their data reduction appears to be
just what we should be doing. The results look very nice. We will have
more camera noise and more sky noise, etc., so I don't expect we will
achieve their precision, but we should be close.
Their exposure times are similar to what I have been taking. Their duty
cycle is not quite as good as ours, so we can achieve a similar duty cycle
even with the shorter exposures that seem appropriate for our conditions.
What the people who want this data tell me (e.g. Bohdan Paczynski) is
that they want someone to stick with it for a long time. I don't think
that most professionals can afford to do this. Doing it once makes a nice
paper. They have shown that they can get to 0.002 mag for a single
exposure, and to 0.00019 for the series of measurements. We should be able
to get close to these results as limited by our sky and camera noise.
Their (b) curve data is where we should be able to excel. They state that
"This phenomenon is relatively common in our data (occurring in a few
percent of the stars)." A few percent is a lot of new variables to be found
and cataloged. To get these measured well, we just have to stick with it
night after night. Again, I think that this is something that
professionals cannot afford to do. I already have about 50 fields exposed
56 times with many overlaps since starting serious running in
September. Depending on what a "few percent" means, I should have a few
thousand type (b) stars. It just means cranking the data through a
pipeline to get them. I plan to start doing that this winter, when the
viewing is lousy. OK, I do not underestimate this work. It will take a
lot of fussing. But I plan to stick with it 10 years or so.
I would bet that Everett and Howell will now go off to do specific science
things, like searching for planets or whatever the latest science dictates.
I would again like to encourage you all to keep at this. It is worthwhile
science. Probably no one else will do what we have set out to do until
some satellite goes up that does it. That day may be receding into the
future, as NASA sinks more and more money into the space station and
less into real science.
Again, a very nice paper; it demonstrates what we should be doing.