
Re: color table trouble



I wrote (rather too flippantly) just a moment ago:

> I know, I know. I'm a sucker for these color questions, but I
> LOVE this question!
> 
>    IDL> Device, Decomposed=0

Here is a modest proposal for RSI: why not turn color
decomposition OFF by default? Then people like
Gary won't get confused, and I won't have to answer
this question ten times a week (multiplied by the
number of technical support engineers at RSI).
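
In the meantime, anyone who agrees can get much the
same effect with a line or two in an IDL startup file
(the batch file named by the IDL_STARTUP environment
variable). A rough sketch, for a 24-bit display:

   ; In the IDL_STARTUP batch file: begin every session
   ; with color decomposition turned off, so color tables
   ; and 8-bit code behave the way most of us expect.
   Device, Decomposed=0
   LoadCT, 0    ; and load a known color table while we're at it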

But here is the real kicker for me: if I
*care* about displaying my 24-bit images correctly
(and color decomposition really only makes sense
if I have 24-bit images, it seems to me), then--as
I posted last week--I HAVE to set the DECOMPOSED
keyword anyway:

   Device, Decomposed=1
   TV, image24

If I don't do this, I run the risk that my
image will not be displayed properly, since with
color decomposition off, ALL values, even the
values in a 24-bit image, are run through the
color table.
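
For what it's worth, here is the full pattern I have
in mind (a rough sketch; I'm assuming image24 is an
[m, n, 3] true-color array and image8 is an ordinary
2D byte array):

   Device, Decomposed=1    ; pixel values interpreted as 24-bit RGB
   TV, image24, True=3     ; true-color image displays correctly
   Device, Decomposed=0    ; back to indexed color for 8-bit work
   LoadCT, 5               ; now the color table matters again...
   TV, image8              ; ...and 8-bit images go through it as usual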

I understand the argument that if people have
24-bit hardware they expect it to behave like 24-bit
hardware, but the reality for the vast majority of
IDL users, I feel sure, is that color decomposition
is something they can safely put off learning for
many months and years to come. There is just
way too much code out there (including much of the
code supplied with IDL) that forces us to think
in terms of undecomposed color.

Cheers,

David

-- 
David Fanning, Ph.D.
Fanning Software Consulting
Phone: 970-221-0438 E-Mail: davidf@dfanning.com
Coyote's Guide to IDL Programming: http://www.dfanning.com/
Toll-Free IDL Book Orders: 1-888-461-0155