??? 12/13/08 00:10
#160892 - Not Really Responding to: ???'s previous message
For years it's been obvious that some video displays were generated with interlace. That's the only way some of them could present sufficient vertical resolution, but the artifact was often noticeable. Some people are VERY annoyed with the vertical "jiggle" that interlace introduces.
Some educational videos were annoyingly obvious in their interlace, often generated with AMIGA hardware (according to the credits). No doubt it can be done right, and wrong as well.

The assumption underlying the acceptance of interlace in U.S. (NTSC) video is that the combination of phosphor persistence and retinal persistence produces no flicker. PAL systems use nearly the same horizontal rate but have more scan lines per frame, so the field rate drops to 50 Hz (~20 ms per field); that is evidently still acceptable to a large portion of the population. This isn't really worth arguing, since we're getting away from that whole scheme in a couple of months.

I just wanted the "frame rate" to be discussed some, since it can be disappointing to get it all "working" only to observe the flicker firsthand. Flicker visibility depends on brightness, timing, and ambient lighting, but I'm not at all sure to what extent each has an impact. My own experience has been that small displays refreshed at 60 Hz work just fine, and no flicker is observed (or complained about). I've not used LEDs recently, and certainly not for long displays. I'm contemplating it, just out of curiosity, but ... that's low-priority.

RE