  • 1. Submitted in partial fulfilment of the award of the Degree of Bachelor of Technology. TELEVISION STANDARD AND COMMUNICATION SYSTEM USED IN DOORDARSHAN, AT DOORDARSHAN KENDRA PATNA, FROM 4 DEC TO 24 DEC 2012. ABHISHEK PRASAD (9910005003)
  • 2. In-Plant Training Report submitted by ABHISHEK PRASAD (9910005003) in partial fulfilment of the Bachelor of Technology in Electronics and Communication Engineering, from 4/12/2012 to 24/12/2012. Kalasalingam University (Kalasalingam Academy of Research and Education), Krishnankoil-626190
  • 4. COMPANY PROFILE
DOORDARSHAN (literally "distant vision") is an Indian public service broadcaster, a division of Prasar Bharati. It is one of the largest broadcasting organizations in India in terms of studio and transmitter infrastructure. Recently, it has also started digital terrestrial transmitters. On September 15, 2009, Doordarshan celebrated its 50th anniversary. DD provides television, radio, online and mobile services throughout metropolitan and regional India, as well as overseas through the Indian Network and Radio India.
Doordarshan had a modest beginning with an experimental telecast starting in Delhi on 15 September 1959, with a small transmitter and a makeshift studio. Regular daily transmission started in 1965 as a part of All India Radio. The television service was extended to Bombay (now Mumbai) and Amritsar in 1972. Up until 1975, only seven Indian cities had a television service, and Doordarshan remained the sole provider of television in India. Television services were separated from radio on 1 April 1976, and each office of All India Radio and Doordarshan was placed under the management of two separate Directors General in New Delhi. Finally, in 1982, Doordarshan came into existence as a national broadcaster.
In 1982, colour TV was introduced in the Indian market with the live telecast of the Independence Day speech by then prime minister Indira Gandhi on 15 August 1982, followed by the 1982 Asian Games held in Delhi. Today there are more than 1,400 terrestrial transmitters and about 46 Doordarshan studios producing TV programmes. Presently, Doordarshan operates 21 channels: two all-India channels (DD National and DD News), 11 Regional Language Satellite Channels (RLSC), four State Networks (SN), an international channel, a sports channel (DD Sports) and two channels, Rajya Sabha TV and DD-Lok Sabha, for live broadcast of parliamentary proceedings.
TELEVISION STANDARDS
  • 5. There are three main television standards used throughout the world.
NTSC (National Television Standards Committee): Developed in the US and first used in 1954, NTSC is the oldest existing broadcast standard. It uses 525 horizontal lines and a 60 Hz field rate. Only one type exists, known as NTSC M. It is sometimes irreverently referred to as "Never Twice the Same Color."
SECAM (Système Électronique pour Couleur avec Mémoire): Developed in France and first used in 1967. It uses a 625-line, 50 Hz display.
PAL (Phase Alternating Line): Developed in Germany and first used in 1967. A variant of NTSC, PAL uses a 625-line, 50 Hz display.
THE TELEVISION STANDARD USED IN INDIA IS PAL.
PAL ENCODER AND DECODER
The gamma-corrected RGB signals are combined in the Y matrix to form the Y signal. The U-V matrix combines the R, B and Y signals to obtain R−Y and B−Y, which are weighted to obtain the U and V signals. Weighting by the factors 0.493 for B−Y and 0.877 for R−Y prevents overmodulation on saturated colours. This gives:
Y = 0.30R + 0.59G + 0.11B, U = 0.493(B−Y), V = 0.877(R−Y)
The PAL encoder input is the primary colours fed into the matrix circuit, which generates the (R−Y) and (B−Y) signals. The matrix circuit also includes a delay line of 0.6 µs. The distinguishing feature of the PAL encoder is the alternation of the phase of the clock pulses generated by the sync pulse generator. The (R−Y) and (B−Y) signals are passed through a low-pass filter circuit to band-limit them. The modulation scheme used in encoding these signals is balanced quadrature amplitude modulation, which generates the Chroma-V and Chroma-U signals that are summed in the adder block. The sync pulse generator produces the 4.43 MHz clock frequency, the 7.8 kHz ident signal, the burst key and the sync pulse. The 4.43 MHz signal and the ident signal are passed through a 180-degree phase shifter, but before that a colour subcarrier is added to the 4.43 MHz video signal.
The output of the 180-degree phase shifter is passed through a ±45-degree phase shifter, which gives outputs at 135
  • 6. and 225 degrees respectively, and this signal is fed to the burst generator, where the burst key generated by the sync pulse generator is added. The final output is obtained from the adder block, whose inputs are the delay-line signal, the burst signal, the sync signal and the signal from the balanced quadrature modulator.
The generation of the burst key is important because the burst carries the colour reference for the signal. The burst generates an envelope which creates an area/loop for Chroma-U and Chroma-V. In the absence of the burst, the signal turns completely black and white: all the colour information is lost.
The PAL decoder works in just the opposite way to the encoder. The input to the encoder is the primary colour signals, while the input to the decoder is the signal transmitted by the encoding system. The only difference between encoder and decoder is that the encoder uses phase shifters to generate the phase-angle differences, while the decoder uses "gates" to generate the phase angles.
Spectrum of a System I television channel with PAL
TELEVISION PRINCIPLES AND SCANNING
  • 7. The basic principle rests on the electromagnetic spectrum: a range of energy that starts with low-frequency radio waves and runs through VHF TV, FM radio and UHF TV (which now includes the new digital TV band of frequencies) all the way to X-rays. The visible-light portion of the electromagnetic spectrum consists of all the colours of the rainbow, which combine to produce white light. The fact that white light consists of all colours of light added together can be demonstrated with the help of a prism.
Displaying an image
A cathode-ray tube (CRT) television displays an image by scanning a beam of electrons across the screen in a pattern of horizontal lines known as a raster. At the end of each line the beam returns to the start of the next line; at the end of the last line it returns to the top of the screen. As it passes each point, the intensity of the beam is varied, varying the luminance of that point. A colour television system is identical except that an additional signal known as chrominance controls the colour of the spot. Raster scanning is shown in a slightly simplified form below.
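The interlaced raster can be made concrete with a short sketch. This is hypothetical helper code using 0-based line numbers; broadcast standards number lines from 1 and actually split one line between the two fields.

```python
def interlaced_fields(total_lines):
    """Split a frame's scan lines into the two interlaced fields:
    one field carries the even-numbered lines, the other the odd ones.
    Together the two fields cover every line of the frame."""
    field1 = list(range(0, total_lines, 2))
    field2 = list(range(1, total_lines, 2))
    return field1, field2

f1, f2 = interlaced_fields(625)  # a 625-line PAL frame
```

Displaying f1 and then f2 in succession doubles the apparent image rate (50 fields per second from 25 frames per second in PAL) without increasing the bandwidth.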
  • 8. When analog television was developed, no affordable technology for storing video signals existed; the luminance signal has to be generated and transmitted at the same time as it is displayed on the CRT. It is therefore essential to keep the raster scanning in the camera (or other device producing the signal) in exact synchronization with the scanning in the television.
The physics of the CRT require that a finite time interval be allowed for the spot to move back to the start of the next line (horizontal retrace) or the start of the screen (vertical retrace). The timing of the luminance signal must allow for this.
The human eye has a characteristic called persistence of vision. Quickly displaying successive scanned images creates the illusion of smooth motion. Flickering of the image can be partially solved using a long-persistence phosphor coating on the CRT, so that successive images fade slowly. However, slow phosphor has the negative side effect of causing image smearing and blurring when there is a large amount of rapid on-screen motion.
The maximum frame rate depends on the bandwidth of the electronics and the transmission system, and on the number of horizontal scan lines in the image. A frame rate of 25 or 30 hertz is a satisfactory compromise, and the picture is built by interlacing two video fields per frame. This process doubles the apparent number of images per second and further reduces flicker and other defects in transmission.
Receiving signals
The television system for each country specifies a number of television channels within the UHF or VHF frequency ranges.
A channel actually consists of two signals: the picture information is transmitted using amplitude modulation on one frequency, and the sound is transmitted with frequency modulation at a fixed offset (typically 4.5 to 6 MHz) from the picture signal.
The channel frequencies chosen represent a compromise between allowing enough bandwidth for video (and hence satisfactory picture resolution) and allowing enough channels to be packed into the available frequency band. In practice a technique
  • 9. called vestigial sideband is used to reduce the channel spacing, which would otherwise be at least twice the video bandwidth if pure AM were used.
Signal reception is invariably done via a superheterodyne receiver: the first stage is a tuner which selects a television channel and frequency-shifts it to a fixed intermediate frequency (IF). The IF stages then amplify the signal from the microvolt range to fractions of a volt.
Extracting the sound
At this point the IF signal consists of a video carrier wave at one frequency and the sound carrier at a fixed offset. A demodulator recovers the video signal, and the sound as an FM signal at the offset frequency (this is known as intercarrier sound). The FM sound carrier is then demodulated, amplified, and used to drive a loudspeaker. Until the advent of the NICAM and MTS systems, television sound transmissions were invariably monophonic.
COLOUR COMPOSITE VIDEO SIGNAL
  • 10. The video carrier is demodulated to give a composite video signal containing luminance, chrominance and synchronization signals; this is identical to the video signal format used by analog video devices such as VCRs or CCTV cameras. Note that the RF signal modulation is inverted compared to conventional AM: the minimum video signal level corresponds to maximum carrier amplitude, and vice versa. The carrier is never shut off altogether; this ensures that intercarrier sound demodulation can still occur.
Each line of the displayed image is transmitted using a signal as shown above. The same basic format (with minor differences mainly related to timing and the encoding of colour) is used for the PAL, NTSC and SECAM television systems. A monochrome signal is identical to a colour one, except that the elements shown in colour in the diagram (the colour burst and the chrominance signal) are not present.
The front porch is a brief (about 1.5 microsecond) period inserted between the end of each transmitted line of picture and the leading edge of the next line's sync pulse. Its purpose was to allow voltage levels to stabilize in older televisions, preventing interference between picture lines. The front porch is the first component of the horizontal blanking interval, which also contains the horizontal sync pulse and the back porch.
  • 11. The back porch is the portion of each scan line between the end (rising edge) of the horizontal sync pulse and the start of active video. It is used to restore the black-level (300 mV) reference in analog video. In signal-processing terms, it compensates for the fall time and settling time following the sync pulse.
In colour television systems such as PAL and NTSC, this period also includes the colour burst signal. In the SECAM system it contains the reference subcarrier for each consecutive colour-difference signal, in order to set the zero-colour reference. In some professional systems, particularly satellite links between locations, the audio is embedded within the back porch of the video signal, to save the cost of renting a second channel.
Monochrome video signal extraction
The luminance component of a composite video signal varies between 0 V and approximately 0.7 V above the "black" level. In the NTSC system there is a blanking signal level used during the front porch and back porch, and a black signal level 75 mV above it; in PAL and SECAM these are identical.
In a monochrome receiver the luminance signal is amplified to drive the control grid in the electron gun of the CRT. This changes the intensity of the electron beam and therefore the brightness of the spot being scanned.
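The horizontal-blanking components can be tallied for the 625-line PAL system. The front-porch and sync-pulse widths below follow figures given in this report; the back-porch width and the 64 µs line period are typical published values, included here as assumptions rather than measurements.

```python
# Approximate 625-line PAL horizontal timing (microseconds).
LINE_US = 64.0          # total line period (15 625 lines per second); assumed typical value
FRONT_PORCH_US = 1.5    # settling time before the sync pulse (from the text)
HSYNC_US = 4.7          # horizontal sync pulse width (from the text)
BACK_PORCH_US = 5.8     # black-level reference, carries the colour burst; assumed typical value

BLANKING_US = FRONT_PORCH_US + HSYNC_US + BACK_PORCH_US
ACTIVE_US = LINE_US - BLANKING_US   # time left for picture content on each line

print(f"blanking {BLANKING_US:.1f} us, active video {ACTIVE_US:.1f} us")
```

With these figures, roughly 12 µs of each 64 µs line is blanking, leaving about 52 µs for active video, which is why the usable horizontal resolution is set by the video bandwidth over that interval.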
Brightness and contrast controls determine the DC shift and amplification, respectively.
Colour video signal extraction
A colour signal conveys picture information for each of the red, green, and blue components of an image. However, these are not simply transmitted as three separate signals, because:
such a signal would not be compatible with monochrome receivers (an important consideration when colour broadcasting was first introduced);
it would occupy three times the bandwidth of existing television, requiring a decrease in the number of television channels available; and
typical problems with signal transmission (such as differing received signal levels between different colours) would produce unpleasant side effects.
  • 12. Instead, the RGB signals are converted into Y, U, V form, where the Y signal represents the overall brightness and can be transmitted as the luminance signal. This ensures a monochrome receiver will display a correct picture. The U and V signals are the differences between the Y signal and the B and R signals respectively. The U signal then represents how "blue" the colour is, and the V signal how "red" it is. The advantage of this scheme is that the U and V signals are zero when the picture has no colour content. Since the human eye is more sensitive to errors in luminance than in colour, the U and V signals can be transmitted in a relatively lossy (specifically, bandwidth-limited) way with acceptable results. The G signal is not transmitted in the YUV system; rather, it is recovered electronically at the receiving end.
The two signals (U and V) modulate both the amplitude and phase of the colour carrier, so to demodulate them it is necessary to have a reference signal against which to compare them. For this reason, a short burst of reference signal known as the colour burst is transmitted during the back porch (retrace period) of each scan line.
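This colour pipeline can be sketched end to end in a few lines: converting RGB to Y, U, V (using the luminance weights quoted in this report and the standard PAL weighting factors for the colour-difference signals), quadrature-modulating U and V onto the subcarrier, and recovering them by product detection against a locked reference. This is an idealized numerical sketch, not the broadcast hardware: the "low-pass filter" is a plain average over one subcarrier period, and burst, sync and band-limiting are omitted.

```python
import math

FSC = 4.43361875e6  # standard PAL colour subcarrier frequency, Hz

def rgb_to_yuv(r, g, b):
    """Y carries the brightness; U and V are weighted colour differences."""
    y = 0.30 * r + 0.59 * g + 0.11 * b
    u = 0.493 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def chroma(u, v, t, v_switch):
    """Quadrature-modulate U and V onto the subcarrier; v_switch is +1 or -1
    and alternates from line to line (the PAL switch)."""
    w = 2 * math.pi * FSC
    return u * math.sin(w * t) + v_switch * v * math.cos(w * t)

def demodulate(samples, times, v_switch):
    """Product detection: multiply by the in-phase and quadrature reference
    carriers and average (a crude stand-in for the low-pass filter)."""
    w = 2 * math.pi * FSC
    n = len(samples)
    u = 2 * sum(s * math.sin(w * t) for s, t in zip(samples, times)) / n
    v = 2 * v_switch * sum(s * math.cos(w * t) for s, t in zip(samples, times)) / n
    return u, v

# A saturated orange, sampled over exactly one subcarrier period.
y, u0, v0 = rgb_to_yuv(1.0, 0.5, 0.0)
N = 1000
times = [i / (N * FSC) for i in range(N)]
samples = [chroma(u0, v0, t, v_switch=+1) for t in times]
u, v = demodulate(samples, times, v_switch=+1)
```

A grey picture (R = G = B) gives U = V = 0, so the chroma signal vanishes entirely, which is exactly why monochrome receivers are unaffected by the colour information.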
A reference oscillator in the receiver locks onto this signal (see phase-locked loop) to achieve a phase reference, and uses its amplitude to set an AGC system to achieve an amplitude reference. The U and V signals are then demodulated by band-pass filtering to retrieve the colour subcarrier, mixing it with the in-phase and quadrature signals from the reference oscillator, and low-pass filtering the results.
Synchronization
Synchronizing pulses added to the video signal at the end of every scan line and video frame ensure that the sweep oscillators in the receiver remain locked in step with the transmitted signal, so that the image can be reconstructed on the receiver screen. A sync separator circuit detects the sync voltage levels and sorts the pulses into horizontal and vertical sync.
Horizontal synchronization
The horizontal synchronization pulse (horizontal sync, HSYNC) separates the scan lines. It is a single short pulse which indicates the start of every line. The rest of the scan line follows, with the signal ranging from 0.3 V (black) to 1 V (white), until the next horizontal or vertical synchronization pulse.
  • 13. The format of the horizontal sync pulse varies. In the 525-line NTSC system it is a 4.85 µs-long pulse at 0 V. In the 625-line PAL system it is a 4.7 µs synchronization pulse at 0 V. This is lower than the amplitude of any video signal ("blacker than black"), so it can be detected by the level-sensitive "sync stripper" circuit of the receiver.
Vertical synchronization
Vertical synchronization (also vertical sync or VSYNC) separates the video fields. In PAL and NTSC, the vertical sync pulse occurs within the vertical blanking interval. The vertical sync pulses are made by prolonging the length of HSYNC pulses through almost the entire length of the scan line.
The vertical sync signal is a series of much longer pulses, indicating the start of a new field. The sync pulses occupy the whole line interval of a number of lines at the beginning and end of a scan; no picture information is transmitted during vertical retrace. The pulse sequence is designed to allow horizontal sync to continue during vertical retrace; it also indicates whether each field represents even or odd lines in interlaced systems (depending on whether it begins at the start of a horizontal line, or midway through).
The format of such a signal in 525-line NTSC is:
pre-equalizing pulses (6 to start scanning odd lines, 5 to start scanning even lines);
long sync pulses (5 pulses);
post-equalizing pulses (5 to start scanning odd lines, 4 to start scanning even lines).
Each pre- or post-equalizing pulse consists of half a scan line of black signal: 2 µs at 0 V, followed by 30 µs at 0.3 V. Each long sync pulse consists of an equalizing pulse with timings inverted: 30 µs at 0 V, followed by 2 µs at 0.3 V.
In video production and computer graphics, changes to the image are often kept in step with the vertical synchronization pulse to avoid visible discontinuity of the image.
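The 525-line field-sync sequence can be assembled mechanically from those pulse counts and the 2 µs / 30 µs half-line timings, taken exactly as stated above (they are themselves rounded figures):

```python
def field_sync_sequence(odd_field):
    """Build the vertical-sync pulse train for one 525-line NTSC field as
    a list of (level_volts, duration_us) segments, using the pulse counts
    and half-line timings given in the text."""
    equalizing = [(0.0, 2.0), (0.3, 30.0)]   # half a scan line of black signal
    long_sync = [(0.0, 30.0), (0.3, 2.0)]    # equalizing pulse with timings inverted
    pre = 6 if odd_field else 5
    post = 5 if odd_field else 4
    return equalizing * pre + long_sync * 5 + equalizing * post

odd = field_sync_sequence(odd_field=True)
even = field_sync_sequence(odd_field=False)
```

The odd field gets 16 half-line pulses (6 + 5 + 5) and the even field 14, and it is this half-line difference that offsets the start of scanning between the two interlaced fields.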
Since the frame buffer of a computer graphics display imitates the dynamics of a cathode-ray display, if it is updated with a new image while the image is being transmitted to the display, the display shows a mishmash of both frames, producing a tearing artifact partway down the image.
  • 14. Vertical synchronization eliminates this by timing frame-buffer fills to coincide with the vertical blanking interval, thus ensuring that only whole frames are seen on screen. Software such as video games and computer-aided design (CAD) packages often offers vertical synchronization as an option, because it delays the image update until the vertical blanking interval. This produces a small penalty in latency, because the program has to wait until the video controller has finished transmitting the image to the display before continuing. Triple buffering reduces this latency significantly.
Two timing intervals are defined: the front porch between the end of displayed video and the start of the sync pulse, and the back porch after the sync pulse and before displayed video. These and the sync pulse itself are called the horizontal blanking (or retrace) interval, and represent the time that the electron beam in the CRT is returning to the start of the next display line.
CAMERAS
  • 15. Studio Cameras
The studio television camera is the beginning of the video signal. It is here that visible light is transformed, or transduced, into electrical energy. The video signal remains in the form of electrical energy, either analog or digital, for most of the remaining process, until a picture monitor (TV set) converts the electrical signal back into visible light. The principal parts of the studio camera are the camera head (including lens, imaging device and viewfinder), the camera mount, and the studio pedestal.
The Parts of the Camera
Lens: The external optics are designed to collect and focus the light onto the face of the imaging device. The lens contains focusing, focal-length and aperture controls. The first two adjustments are made by the camera operator at the camera head, while the aperture is typically controlled by the video engineer at the CCU. The lenses at KTSC-TV have servo controls for zoom and manual controls for focus. The servo zoom control, which provides smooth and variable-speed zooms with a little practice, is located on the right pan handle, while the focus control is located on the left pan handle. On a properly maintained camera and lens, focus should be set with the lens at maximum focal length. Once set, the lens will maintain accurate focus throughout the zoom range as long as the distance between subject and lens does not change.
Imaging Devices: The internal optics, including the beam splitter, are housed in the camera body. KTSC-TV's Hitachi Z-One B cameras employ CCD (charge-coupled device) imaging devices and are immune to the problems of image retention and burn-in.
Viewfinder: The monochrome (black-and-white) monitor on top of the camera head is your window on the world. While it provides no information about the colours being reproduced, it is an accurate display for the purposes of framing, focus and composition. The angle of the VF is adjustable to provide optimum viewing regardless of the height of the camera or the height of the operator.
The VF has contrast and brightness controls and should be adjusted for your particular situation. These controls do not in any way affect the video output of the camera.
The Camera Mount
  • 16. The camera is attached to a head, which is in turn attached to the camera support: in our case a tripod-and-dolly combination. Types of professional camera heads include cam heads and fluid heads. Both allow for smooth pans and tilts. However, the smoothness of these movements is determined in part by the operator's proficiency and muscular coordination. Hours of practice are necessary before one can be fully proficient with camera moves worthy of "on-air" service. Please be aware of the location and use of the pan and tilt locks and tension adjustments. Never try to operate the camera head with the locks engaged, or with the tension adjustments tightened. Whenever the operator is at the camera, both the pan and tilt adjustments should be unlocked and loose enough that camera movements can be executed smoothly and quickly according to the director's wishes. Before the operator leaves the camera, even for a moment, the pan and tilt should be locked securely. Please follow these directions carefully!
TV STUDIO
The main parts of a TV studio are:
Action area
Production control room (PCR)
Master switching room (MSR)
Control apparatus room (CAR)
Character generator (CG)
Video tape recording (VTR)
Earth station
ACTION AREA
It is the area where the artists perform the action, i.e. the programme which is to be broadcast. The main parts of the action area are as follows:
Camera head unit
Floor preparation
Audio connector boxes
Lighting drum, lighting stands and colour lights
Resonance minimizers, i.e. corrugated walls and blankets
Talk-back management
  • 17. The camera head unit consists of the following:
1. Lens assembly
2. Dichroic mirror
3. Focusing arrangement
4. Servo management
5. Back flow
PRODUCTION CONTROL ROOM
The PCR is also known as the studio control room. The PCR includes:
Video monitors, which monitor the PROGRAM, VTRs, CAMERAS, GRAPHICS and other video sources. The video monitor wall consists of a series of television sets or computer monitors capable of displaying multiple sources.
A vision mixer, a large control panel used to select the video sources to be seen on air and, in many cases, on any monitors on the set.
An audio mixing console and other audio equipment, to add the audio to the video signals.
Digital video effects (DVE), for manipulation of video sources. In newer vision mixers the DVE is integrated into the vision mixer.
The technical director watches the waveforms of the video signals on the CCUs' waveform monitors and vectorscope, and directs the camera operators accordingly to maintain high-quality video signals.
MASTER CONTROL ROOM
  • 18. The MCR houses equipment that is too noisy or runs too hot for the production control room. It also keeps wire lengths and installation requirements manageable, since most high-quality wiring runs only between devices in these rooms. This includes:
the actual circuitry and connection boxes of the vision mixer, DVE and character generator;
camera control units;
VTRs;
patch panels for the reconfiguration of the wiring between the various pieces of equipment.
It also controls the on-air signal. It may include controls to play back programs, switch local or network feeds and satellite feeds, and monitor the transmitter.
CHARACTER GENERATOR
It creates the majority of names and graphics that are to be inserted into programs.
CONTROL APPARATUS ROOM
It includes the power control room, the UPS room and a generator for uninterrupted power supply.
VIDEO TAPE RECORDER
A video tape recorder is a tape recorder that can record video material, usually on magnetic tape. VTRs originated as individual tape reels, serving as a replacement for motion picture film stock and making recording for television applications cheaper and quicker. An improved form encloses the tape within a cassette, used by the video cassette recorder (VCR).
VIDEO CASSETTE RECORDER
The video cassette recorder is a type of electromechanical device that uses a removable video cassette containing magnetic tape to record audio and video from a television broadcast, so that image and sound can be played back at a
  • 19. convenient time. This facility afforded by a VCR machine is commonly referred to as television program TIME SHIFT.
EARTH STATION
An earth station is a terrestrial terminal station designed for extraplanetary telecommunication with spacecraft, and for reception of radio waves from an astronomical radio source. Earth stations are located either on the surface of the earth or within the earth's atmosphere. Earth stations communicate with spacecraft by transmitting and receiving radio waves in the super-high-frequency or extremely-high-frequency bands. When an earth station successfully transmits radio waves to a spacecraft, it establishes a telecommunication link.
Specialized satellite earth stations are used to telecommunicate with satellites, chiefly communication satellites. Other earth stations communicate with manned space stations or unmanned space probes. An earth station that primarily receives telemetry data, or that follows a satellite not in geostationary orbit, is called a tracking station.
When a satellite is within an earth station's line of sight, the earth station is said to have a view of the satellite. It is possible for a satellite to communicate with more than one earth station at a time. A pair of earth stations is said to have a satellite in mutual view when the stations share simultaneous, unobstructed line-of-sight contact with the satellite.
  • 20. UPLINK
In satellite communications, an uplink (UL or U/L) is the portion of a communications link used for the transmission of signals from an earth terminal to a satellite or to an airborne platform. An uplink is the inverse of a downlink. An uplink or downlink is distinguished from a reverse link or forward link.
In GSM and cellular networks, the radio uplink is the transmission path from the mobile station (cell phone) to a base station (cell site). Traffic and signalling flows within the BSS and NSS may also be identified as uplink and downlink.
In computer networks, an uplink is a connection from data communications equipment toward the network core. This is also known as an upstream connection.
DOWNLINK
In the context of satellite communications, a downlink (DL) is the link from a satellite to a ground station.
In cellular networks, the radio downlink is the transmission path from a cell site to the cell phone. Traffic and signalling flows within the base station subsystem (BSS) and network switching subsystem (NSS) may also be identified as uplink and downlink.
In computer networks, a downlink is a connection from data communications equipment towards data terminal equipment. This is also known as a downstream connection.
  • 21. TRANSMITTER
In electronics and telecommunications, a transmitter or radio transmitter is an electronic device which, with the aid of an antenna, produces radio waves. The transmitter itself generates a radio-frequency alternating current, which is applied to the antenna. When excited by this alternating current, the antenna radiates radio waves. In addition to their use in broadcasting, transmitters are necessary component parts of many electronic devices that communicate by radio, such as cell phones, wireless computer networks, Bluetooth-enabled devices, garage door openers, two-way radios in aircraft, ships, and spacecraft, radar sets, and navigational beacons. The term transmitter is usually limited to equipment that generates radio waves for communication purposes or for radiolocation, such as radar and navigational transmitters. Generators of radio waves for heating or industrial purposes, such as microwave ovens or diathermy equipment, are not usually called transmitters even though they often have similar circuits.
The term is popularly used more specifically to refer to a broadcast transmitter, a transmitter used in broadcasting, as in FM radio transmitter or television transmitter. This usage usually includes the transmitter proper, the antenna, and often the building it is housed in. An unrelated use of the term is in industrial process control, where a "transmitter" is a telemetry device which
  • 22. converts measurements from a sensor into a signal, and sends it, usually via wires, to be received by some display or control device located a distance away.
Block diagram of a TV transmitter (intercarrier method).
  • 23. CONCLUSION
It was a wonderful experience to be a part of the in-plant training at PRASAR BHARATI, DOORDARSHAN KENDRA, Patna. Being a part of this training, I was very pleased to see the technology that is able to broadcast video to almost every home in India. I learnt how video is captured, how it is transmitted without any distortion of the video signal, the different ways to transmit these signals, what a channel is, and many such things. I thank my guide at DOORDARSHAN KENDRA, PATNA, Er. N. K. SINGH, for teaching me and demonstrating practicals while teaching.
  • 24. REFERENCES
www.shareslides.com
www.google.com
www.howstuffworks.com
http://www.scribd.com/doc/44694977/Modern-Communication-Systems-PART-1-TELEVISION