Color TV scan rate question revisited

Post by Gary Glaenze » Fri, 19 Oct 2001 10:30:16



Finally had a chance to dig the old books out of the basement.

The reasons given for the change in H freq from 15750 to 15734.264 and V
freq from 60 to 59.94, and the selection of 3.579545 MHz as the color
subcarrier frequency are twofold:

1) To interleave the color information into the spectrum between the H-freq
multiples of the luminance information (color subcarrier frequency is an odd
multiple (455th) of one-half the H scan frequency)

while simultaneously:

2) minimizing the herringbone effect caused by a beat (approximately 920
kHz)  between the color subcarrier and the 4.5 MHz sound IF (this is
accomplished by making the subcarrier frequency such that the BEAT frequency
between the 4.5 MHz IF and the subcarrier is also an odd multiple (117th) of
one-half the H scan frequency).

By choosing these frequencies, any chrominance information that shows up in
the luminance channel is of opposite polarities on successive scans, and is
effectively cancelled out by persistence of vision.

Likewise, any beat-frequency interference at the 920.455 kHz point (A-B mix
of the 4.5 MHz IF and the 3.579545 MHz subcarrier frequency) is also
cancelled by persistence of vision.

These changes did not cause any problems for B&W sets, as the change
amounted to 1/10 of one percent of the 'old' standards.
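The relationships in this post are easy to verify numerically. A quick check in Python, using only the frequencies quoted above (my own verification, not from the original post):

```python
# Check the NTSC frequency relationships described in the post above.
FH = 15_734.264          # horizontal scan frequency, Hz
FSC = 3_579_545.0        # color subcarrier frequency, Hz
SOUND_IF = 4_500_000.0   # intercarrier sound IF, Hz

half_h = FH / 2          # one-half the H scan frequency

# (1) subcarrier is the 455th (odd) multiple of half the line rate
print(FSC / half_h)                   # ~455.0

# (2) the sound/chroma beat is the 117th (odd) multiple of half the line rate
beat = SOUND_IF - FSC                 # 920,455 Hz -- the ~920 kHz beat
print(beat / half_h)                  # ~117.0

# and the change from the old 15,750 Hz standard really is ~1/10 of 1%
print((15_750 - FH) / 15_750 * 100)   # ~0.0999 percent
```

Both ratios come out a whisker off exact odd integers only because the 15,734.264 figure is itself rounded.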

 
 
 

Color TV scan rate question revisited

Post by Mark Nelson » Fri, 19 Oct 2001 10:54:54


I'm still amazed at the really clever engineering that went into
creating the NTSC color system back in the very early '50s.  They really
understood the visual effects of everything they did, and were equally
creative in the time and the frequency domains.  Not until HDTV's
"rectangular" broadcast spectrum have I seen the like.

--
Mark Nelson  AJ2X (TV engineer for 20 years+)
  near Boston

A collector of TV signal boosters and UHF converters -- God help me!    
http://tv-boxes.com

 
 
 

Color TV scan rate question revisited

Post by Gary Glaenze » Fri, 19 Oct 2001 11:06:42


I'm elbow-deep into another 'lesson' from my school days that explains how
the relative bandwidths of the 'I' and 'Q' signals that modulated the
subcarrier were set... one had more bandwidth because of the
eye's ability to discern fine detail only in certain colors.

When I get through it and have it straight in my mind, I'll post it.

But you're right about the engineering... when I re-read this
stuff for the first time since early 1970, I had the same thoughts as back
then: 'whoa... pretty tricky!'


Quote:

> I'm still amazed at the really clever engineering that went into
> creating the NTSC color system back in the very early '50s.  They really
> understood the visual effects of everything they did, and were equally
> creative in the time and the frequency domains.  Not until HDTV's
> "rectangular" broadcast spectrum have I seen the like.

> --
> Mark Nelson  AJ2X (TV engineer for 20 years+)
>   near Boston

> A collector of TV signal boosters and UHF converters -- God help me!
> http://tv-boxes.com

 
 
 

Color TV scan rate question revisited

Post by John Byrns » Sat, 20 Oct 2001 05:07:53



Quote:

> Finally had a chance to dig the old books out of the basement.

> The reasons given for the change in H freq from 15750 to 15734.264 and V
> freq from 60 to 59.94, and the selection of 3.579545 MHz as the color
> subcarrier frequency are twofold:

> 1) To interleave the color information into the spectrum between the H-freq
> multiples of the luminance information (color subcarrier frequency is an odd
> multiple (455th) of one-half the H scan frequency)

> while simultaneously:

> 2) minimizing the herringbone effect caused by a beat (approximately 920
> kHz)  between the color subcarrier and the 4.5 MHz sound IF (this is
> accomplished by making the subcarrier frequency such that the BEAT frequency
> between the 4.5 IF and the subcarrier is also an odd multiple (117th) of
> one-half the H scan frequency).

> By choosing these frequencies, any chrominance information that shows up in
> the luminance channel is of opposite polarities on successive scans, and is
> effectively cancelled out by persistence of vision.

> Likewise, any beat-frequency interference at the 920.455 kHz point (A-B mix
> of the 4.5 MHz IF and the 3.579545 MHz subcarrier frequency) is also
> cancelled by persistence of vision.

> These changes did not cause any problems for B&W sets, as the change
> amounted to 1/10 of one percent of the 'old' standards.

But this doesn't explain why the choice was made to change the horizontal
scanning frequency, rather than changing the sound intercarrier
frequency.  The horizontal scanning frequency could have been left at
15,750 Hz, which would have given a color subcarrier frequency of
3.583125 MHz.  The sound carrier would then have to be moved up 4,500 Hz,
giving the 4.5045 MHz intercarrier frequency necessary to minimize the
herringbone effect.  I can think of at least four reasons for changing the
horizontal scanning frequency, rather than the intercarrier frequency, but
it would be interesting to know what the official reasoning was.
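The counterfactual arithmetic in this post checks out. A small sketch (my own verification, not from the thread):

```python
# John Byrns's alternative: keep the B&W line rate, move the sound carrier.
FH_OLD = 15_750.0              # original horizontal scan rate, Hz
half_h = FH_OLD / 2            # 7,875 Hz

# subcarrier must still be the 455th (odd) multiple of half the line rate
fsc = 455 * half_h
print(fsc)                     # 3583125.0 Hz = 3.583125 MHz

# sound IF must still beat with it at the 117th (odd) multiple
sound_if = fsc + 117 * half_h
print(sound_if)                # 4504500.0 Hz = 4.5045 MHz

# i.e. the sound carrier would have had to move up by:
print(sound_if - 4_500_000)    # 4500.0 Hz
```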

Regards,

John Byrns

Surf my web pages at,  http://www.FoundCollection.com/~jbyrns/index.html

 
 
 

Color TV scan rate question revisited

Post by Jeff Goldsmith » Sat, 20 Oct 2001 05:53:31


   I have a book that was the FCC presentation for the color TV
standards somewhere in my boxes.  Must be 800 pages or more.  I recall
tests detailed with different scan and chroma burst frequencies along
with a few other systems besides NTSC-type.  These unique test
televisions are also detailed with pictures and schematics.  Times and
dates of the various test transmissions are listed.  

   This could easily contain the reasoning for the choice of the
15750/3579545 system.   I will remember to post some pages when it turns
up.

                Jeff Goldsmith

Quote:



> > Finally had a chance to dig the old books out of the basement.

> > The reasons given for the change in H freq from 15750 to 15734.264 and V
> > freq from 60 to 59.94, and the selection of 3.579545 MHz as the color
> > subcarrier frequency are twofold:

> > 1) To interleave the color information into the spectrum between the H-freq
> > multiples of the luminance information (color subcarrier frequency is an odd
> > multiple (455th) of one-half the H scan frequency)

> > while simultaneously:

> > 2) minimizing the herringbone effect caused by a beat (approximately 920
> > kHz)  between the color subcarrier and the 4.5 MHz sound IF (this is
> > accomplished by making the subcarrier frequency such that the BEAT frequency
> > between the 4.5 IF and the subcarrier is also an odd multiple (117th) of
> > one-half the H scan frequency).

> > By choosing these frequencies, any chrominance information that shows up in
> > the luminance channel is of opposite polarities on successive scans, and is
> > effectively cancelled out by persistence of vision.

> > Likewise, any beat-frequency interference at the 920.455 kHz point (A-B mix
> > of the 4.5 MHz IF and the 3.579545 MHz subcarrier frequency) is also
> > cancelled by persistence of vision.

> > These changes did not cause any problems for B&W sets, as the change
> > amounted to 1/10 of one percent of the 'old' standards.

> But this doesn't explain why the choice was made to change the horizontal
> scanning frequency, rather than changing the sound intercarrier
> frequency?  The horizontal scanning frequency could have been left at
> 15,750 Hz, which would have given a color sub carrier frequency of
> 3.583125 MHz.  The sound carrier would then have to be moved up 4,500 Hz,
> giving the 4.5045 MHz intercarrier frequency necessary to minimize the
> herringbone effect.  I can think of at least four reasons for changing the
> horizontal scanning frequency, rather than the intercarrier frequency, but
> it would be interesting to know what the official reasoning was?

> Regards,

> John Byrns

> Surf my web pages at,  http://www.FoundCollection.com/~jbyrns/index.html

 
 
 

Color TV scan rate question revisited

Post by Gary Glaenze » Sat, 20 Oct 2001 05:55:40



Quote:


> > Finally had a chance to dig the old books out of the basement.

> > The reasons given for the change in H freq from 15750 to 15734.264 and V
> > freq from 60 to 59.94, and the selection of 3.579545 MHz as the color
> > subcarrier frequency are twofold:

> > 1) To interleave the color information into the spectrum between the H-freq
> > multiples of the luminance information (color subcarrier frequency is an odd
> > multiple (455th) of one-half the H scan frequency)

> > while simultaneously:

> > 2) minimizing the herringbone effect caused by a beat (approximately 920
> > kHz)  between the color subcarrier and the 4.5 MHz sound IF (this is
> > accomplished by making the subcarrier frequency such that the BEAT frequency
> > between the 4.5 MHz IF and the subcarrier is also an odd multiple (117th) of
> > one-half the H scan frequency).

> > By choosing these frequencies, any chrominance information that shows up in
> > the luminance channel is of opposite polarities on successive scans, and is
> > effectively cancelled out by persistence of vision.

> > Likewise, any beat-frequency interference at the 920.455 kHz point (A-B mix
> > of the 4.5 MHz IF and the 3.579545 MHz subcarrier frequency) is also
> > cancelled by persistence of vision.

> > These changes did not cause any problems for B&W sets, as the change
> > amounted to 1/10 of one percent of the 'old' standards.

> But this doesn't explain why the choice was made to change the horizontal
> scanning frequency, rather than changing the sound intercarrier
> frequency?  The horizontal scanning frequency could have been left at
> 15,750 Hz, which would have given a color sub carrier frequency of
> 3.583125 MHz.  The sound carrier would then have to be moved up 4,500 Hz,
> giving the 4.5045 MHz intercarrier frequency necessary to minimize the
> herringbone effect.  I can think of at least four reasons for changing the
> horizontal scanning frequency, rather than the intercarrier frequency, but
> it would be interesting to know what the official reasoning was?

> Regards,

> John Byrns

In a word, compatibility.

The NTSC committee was charged with developing a COMPATIBLE system;
included in the definitions of 'compatibility' was that the sound carrier
would be 4.5 MHz above the video carrier, in order not to make obsolete or
require retuning of the B&W sets then in operation.

That 4.5 MHz spacing, along with the video carrier 1.25 MHz above the low end of
the channel, vestigial-sideband video transmission, FM audio transmission
with a +/- 25 kHz swing, 4:3 aspect ratio, and 525 lines, interlaced, were
ironclad; no variations were allowed, to prevent problems with the B&W sets.

The frame and field frequencies, however, were required only to be 'within
the allowable tolerances for monochrome transmission'; they were made
flexible to allow the color information to be interleaved in the same
spectrum as the luminance information.
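The thread never states it outright, but given that the 4.5 MHz sound spacing was ironclad, the usual textbook derivation (my addition here, consistent with the figures quoted in this thread) picks the new line rate so that 4.5 MHz is exactly the 286th harmonic of the horizontal frequency:

```python
# Derive the color-standard frequencies from the fixed 4.5 MHz sound spacing.
# The divide-by-286 step is the standard textbook derivation, not from the post.
SOUND_SPACING = 4_500_000.0    # sound carrier offset, Hz (ironclad per NTSC)

fh = SOUND_SPACING / 286       # new horizontal scan rate
print(fh)                      # ~15734.27 Hz (thread quotes 15734.264)

fv = fh * 2 / 525              # 525 interlaced lines -> field rate
print(fv)                      # ~59.94 Hz

fsc = 455 * fh / 2             # color subcarrier, 455th odd multiple of fh/2
print(fsc)                     # ~3579545 Hz = 3.579545 MHz
```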

 
 
 

Color TV scan rate question revisited

Post by dwight elv » Sat, 20 Oct 2001 08:41:53


Hi
 You'll notice that on convergence, very little effort is
put into exact blue images. This is because the eye doesn't
focus well on blue, so as long as it is close, most people don't
notice it. Greens to yellow are the colors used primarily
for focusing. Red alignment is critical because red is
the strongest gun: any mis-aim and you get a lot of the
wrong color. Focus of red isn't as important as the green's,
but more important than blue's.
Dwight
Quote:

> I'm elbow-deep into another 'lesson' from my school days that explains how
> the relative bandwidths of the 'I' and 'Q' signals that modulated the
> subcarrier were set... one had more bandwidth because of the
> eye's ability to discern fine detail only in certain colors.

> When I get through it and have it straight in my mind, I'll post it.

> But you're right about the engineering... when I re-read this
> stuff for the first time since early 1970, I had the same thoughts as back
> then: 'whoa... pretty tricky!'



> > I'm still amazed at the really clever engineering that went into
> > creating the NTSC color system back in the very early '50s.  They really
> > understood the visual effects of everything they did, and were equally
> > creative in the time and the frequency domains.  Not until HDTV's
> > "rectangular" broadcast spectrum have I seen the like.

> > --
> > Mark Nelson  AJ2X (TV engineer for 20 years+)
> >   near Boston

> > A collector of TV signal boosters and UHF converters -- God help me!
> > http://tv-boxes.com

 
 
 

Color TV scan rate question revisited

Post by Mark Nelson » Sat, 20 Oct 2001 11:53:39


I seem to recall some demo of a projection color TV display years ago,
when such things were very rare, that had only two CRTs -- red and
green.  To get "full color" they added a fixed blue field with a
projection lamp, at about 10% of the full brightness of the other two
colors.  No attempt at all was made to vary it, and the results were
pretty good for most broadcast images.  And I recall a homemade color TV
camera article in the late '60s (Radio-Electronics Magazine maybe?) that
used two vidicons with a similar arrangement, just adding a fixed blue
bias into the color matrix.  They fudged the optical color filters a bit
too.  "Good enough" results were claimed.

Though I eventually was on the design team for several RCA broadcast
color cameras and (later) Magnavox projection TVs, where getting the
color just right was a top priority, I never forgot these early examples
of the dictum, "blue is overrated."

--
Mark Nelson  AJ2X
  near Boston

A collector of TV signal boosters and UHF converters -- God help me!    
http://tv-boxes.com

 
 
 

Color TV scan rate question revisited

Post by Robert Case » Sun, 21 Oct 2001 12:52:54


Quote:

> But this doesn't explain why the choice was made to change the horizontal
> scanning frequency, rather than changing the sound intercarrier
> frequency?  The horizontal scanning frequency could have been left at
> 15,750 Hz, which would have given a color sub carrier frequency of
> 3.583125 MHz.  The sound carrier would then have to be moved up 4,500 Hz,
> giving the 4.5045 MHz intercarrier frequency necessary to minimize the
> herringbone effect.  I can think of at least four reasons for changing the
> horizontal scanning frequency, rather than the intercarrier frequency, but
> it would be interesting to know what the official reasoning was?

A major reason for not changing the sound carrier was to maintain
compatibility with existing B&W TV sets out in the field.  The
intercarrier sound IF is fixed at 4.5 MHz and is not user adjustable.
The horizontal and vertical oscillators were at the time user adjustable
(hold controls) and had more than enough range to compensate
for the frequency change.  As it turned out, almost no one needed
to fiddle with the horizontal and vertical holds, as the circuits had a wide
enough lock-in range.