Glossary


To learn more about Pinnacle Studio in general, click here for a list of the tutorials available on our website (internet connection needed).


This glossary covers multimedia, computer and video terminology. The most important terms are defined below. Cross-references are indicated by &.

720p: A high-definition (HD) video format with a resolution of 1280×720 and progressive (non-interlaced) frames.

1080i: A high-definition (HD) video format with a resolution of 1440×1080 and interlaced frames.

ActiveMovie: Software interface by Microsoft for the control of multimedia devices under Windows. & DirectShow, DirectMedia

ADPCM: Acronym for Adaptive Differential Pulse Code Modulation, a method of storing audio information in a digital format. This is the audio encoding and compression method used in CD-I and & CD-ROM production.

Address: All available storage locations in a computer are numbered (addressed). Each storage location can be accessed by means of its address. Some addresses are reserved for the exclusive use of particular hardware components. If two components use the same address, this is called an “address conflict”.

Aliasing: An inaccurate display of an image due to the limitations of the output device. Typically, aliasing appears in the form of jagged edges along curves and angled shapes.

Anti-aliasing: A method of smoothing out jagged edges in bitmap images. This is usually accomplished by shading the edges with pixels intermediate in color between the edge and the background, making the transition less apparent. Another method of anti-aliasing involves using higher resolution output devices.

Aspect ratio: The ratio of width to height in an image or graphic. Keeping the aspect ratio fixed means that any change to one value is immediately reflected in the other.
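
For illustration, a minimal Python sketch of keeping the aspect ratio fixed while resizing; the 1920×1080 frame and target width below are example values, not Studio settings:

    def scale_to_width(width, height, new_width):
        """Return (new_width, new_height), preserving the width/height ratio."""
        ratio = width / height
        return new_width, round(new_width / ratio)

    print(scale_to_width(1920, 1080, 1280))   # -> (1280, 720)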

AVI: Audio Video Interleaved, a standard format for digital video (and & Video for Windows).

Batch capture: An automated process that uses an & edit decision list to locate and recapture specific clips from a videotape, usually at a higher data rate than that at which the clips were originally captured.

BIOS: Acronym for Basic Input Output System, which refers to basic input and output commands saved in a & ROM, PROM or & EPROM. The essential task of the BIOS is the control of input and output. When the system is started, the ROM-BIOS carries out some tests. & Parallel port, IRQ, I/O


Bit: Abbreviation of “BInary digiT”, the smallest element of a computer’s memory. Among other things, bits are used to store the color values of pixels in an image. The more bits used for each & pixel, the greater the number of available colors. For example:

1-bit: each pixel is either black or white.

4-bit: allows 16 colors or gray shades.

8-bit: allows 256 colors or gray shades.

16-bit: allows 65,536 colors.

24-bit: allows about 16.7 million colors.

Bitmap: An image format made up of a collection of dots or “pixels” arranged in rows. & Pixel

Blacking: The process of preparing a videotape for insert editing by recording video black and continuous control track on the entire tape. If the recording deck supports timecode, continuous timecode will be recorded simultaneously (also called “striping”).

Brightness: Also “luminance”. Indicates the brightness of video.

Byte: One byte corresponds to eight & bits. With one byte, exactly one alphanumeric character (e.g. a letter or number) can be stored.

CD-ROM: Mass storage media for digital data, such as digital video. CD-ROMs can be read from but not written (recorded) onto: & ROM is an acronym for Read-Only Memory.

Channel: Classifications of information within a data file to isolate a particular aspect of the file. For example, color images use different channels to classify the color components in the image. Stereo audio files use channels to identify the sounds intended for the left and right speakers. Video files use combinations of the channels used for image and audio files.

Clip: In Studio, any media type that goes on the Movie Window Storyboard or Timeline, including video images, trimmed video scenes, images, audio files and disc menus.

Clipboard: A temporary storage area shared by all Windows programs, used to hold data during cut, copy, and paste operations. Any new data you place onto the clipboard immediately replaces the existing data.

Closed GOP: & GOP

Codec: Contraction of compressor/decompressor – an algorithm that compresses (packs) and decompresses (unpacks) image data. Codecs can be implemented in either software or hardware.

Color depth: Number of bits delivering the color information for each pixel. A 1-bit color depth allows 2¹ = 2 colors, an 8-bit depth allows 2⁸ = 256 colors, and a 24-bit depth allows 2²⁴ = 16,777,216 colors.
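
As a quick Python illustration of this arithmetic, the number of available colors is 2 raised to the number of bits:

    # Number of distinct colors for a given color depth: 2 ** bits.
    for bits in (1, 4, 8, 16, 24):
        print(f"{bits}-bit: {2 ** bits:,} colors")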

Color model: A way to mathematically describe and define colors and the way they relate to each other. Each color model has its own strengths. The two most common color models are & RGB and & YUV.

Color saturation: Intensity of a color.

Complementary color: Complementary colors are opposite in value to primary colors. If you were to combine a color with its complement, the result would be white. For example, the complementary colors of red, green and blue are cyan, magenta and yellow respectively.
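
A minimal Python sketch, assuming 8-bit color channels, showing that a color and its complement add up to white:

    def complement(r, g, b):
        """Complement of an 8-bit RGB color; color + complement = white."""
        return 255 - r, 255 - g, 255 - b

    print(complement(255, 0, 0))   # red -> cyan (0, 255, 255)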

COM Port: A serial port located on the back of your computer for attaching a modem, plotter, printer or mouse to the system.

Composite video: Composite video encodes luminance and chrominance information into one signal. & VHS and 8mm are formats that record and play back composite video.

Compression: A method for making files smaller in size. There are two types of compression: lossless and lossy. Files compressed with a lossless scheme can be restored to their original state unchanged. Lossy schemes discard data during compression, so some image quality is sacrificed. The loss of quality may be negligible or severe depending on the amount of compression.

Cropping: Choosing the area of an image to be displayed.

Data rate: The quantity of data transmitted per unit time; for example, the number of bytes read from or written to a hard drive per second, or the amount of video data processed per second.
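
As a rough Python illustration, the data rate of uncompressed video can be estimated from frame size, bytes per pixel and frame rate; the 4:2:2, 8-bit PAL figures below are illustrative assumptions:

    width, height = 720, 576        # PAL frame size
    bytes_per_pixel = 2             # 4:2:2 sampling, 8 bits per sample
    frames_per_second = 25          # PAL frame rate

    bytes_per_second = width * height * bytes_per_pixel * frames_per_second
    print(bytes_per_second / 1_000_000, "MB per second")   # about 20.7 MB/s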

Data transfer rate: The measurement of the speed at which information passes between the storage device (e.g. & CD-ROM or hard drive) and the display device (e.g. monitor or & MCI device). Depending on the devices used, some transfer rates may offer better performance than others.

DCT: Discrete Cosine Transformation – part of & JPEG image data compression and related algorithms. The brightness and color information is saved as frequency coefficients.

DirectShow: System extension by Microsoft for multimedia applications under Windows. & ActiveMovie

DirectMedia: System extension by Microsoft for multimedia applications under Windows. & ActiveMovie

DirectX: A bundle of several system extensions developed by Microsoft for Windows 95 and its successors to enable video and game acceleration.

Dissolve: A transitional effect in which the video is faded from one scene to the next.

Dithering: Increasing the number of apparent colors in an image by the application of color patterns.

Decibel (dB): A unit of measurement of the loudness of sound. An increase of 3 dB corresponds to a doubling of the sound intensity.
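
A small Python sketch of the underlying power-ratio formula (ten times the base-10 logarithm of the ratio), showing that doubling the power adds about 3 dB:

    import math

    def db_difference(power, reference_power):
        """Level difference in decibels between two sound power values."""
        return 10 * math.log10(power / reference_power)

    print(db_difference(2.0, 1.0))   # doubling the power -> about +3.01 dB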

Digital8: Digital videotape format that records & DV-coded audio and video data on & Hi8 tapes. Currently sold only by Sony, Digital8 camcorders and VCRs can play both Hi8 and 8mm cassettes.

Digital video: Digital video stores information & bit by bit in a file (in contrast to analog storage media).

DMA: Direct Memory Access.

Driver: A file containing information needed to operate peripherals. The video capture driver operates a video capture board, for example.

DV: Digital videotape format for recording digital audio and video on ¼”-wide metal evaporated tape. Mini-DV tapes hold up to 60 minutes of content, while standard DV tapes can hold up to 270 minutes.

ECP: “Extended Capabilities Port”. Enables accelerated bi-directional data transfer via the & parallel port. & EPP

Edit decision list (EDL): A list of clips and effects in a particular order that will be recorded onto your output tape, disc or file. Studio allows you to create and edit your own edit decision list by adding, deleting and reordering clips and effects in the Movie Window.

EPP: “Enhanced Parallel Port”. Enables accelerated bi-directional data transfer via the & parallel port; recommended for Studio DV. & ECP

EPROM: “Erasable Programmable Read-Only Memory”. Memory chip that, once programmed, retains its data without a power supply. The memory contents can be erased with ultraviolet light and rewritten.

Fade to/from black: A digital effect that fades up from black at the beginning of a clip or down to black at the end.

Field: A & frame of video consists of horizontal lines and is divided into two fields. The odd-numbered lines in the frame make up Field 1; the even-numbered lines make up Field 2.

File format: The organization of information within a computer file such as an image or word processor document. The format of a file is usually indicated by its “file extension” (e.g. doc, avi or wmf).

Filters: Tools that alter data to produce special effects.

FireWire: Apple Computer’s trademarked name for the & IEEE-1394 serial data protocol.

Frame: A single image in a video or animation sequence. If using full NTSC or PAL resolution, one frame consists of two interlaced fields. & NTSC, PAL, field, resolution

Frame rate: The frame rate defines how many frames of a video sequence are played in one second. The frame rate for & NTSC video is 30 frames per second. The frame rate for & PAL video is 25 frames per second.

Frame size: The maximum size for displaying image data in a video or animation sequence. If an image intended for the sequence is larger than the frame size, it must be cropped or scaled to fit.

Frequency: The number of repetitions in a periodic process (like a sound wave or an alternating voltage) per unit of time. Usually measured in repetitions per second, or Hertz (Hz).

GOP: In & MPEG compression the data stream is first divided into “Groups Of Pictures” – sections of several frames each. Each GOP contains three types of frames: I-Frames, P-Frames (pictures) and B-Frames.

GOP size: The GOP size defines how many I-Frames, B-Frames and P-Frames are included in one & GOP. Typical GOP sizes are 9 or 12.

Hardware codec: Compression method that uses special hardware to create and play back compressed digital video sequences. A hardware codec may offer better encoding speed and image quality than a codec implemented completely in software. & Codec, Software codec

HD: High Definition video. Most HD formats in use have a resolution of either 1920×1080 or 1280×720. A substantial difference exists between the 1080 and 720 standards: the larger format uses 2.25 times as many pixels per frame (1920×1080 = 2,073,600 pixels versus 1280×720 = 921,600 pixels). This difference substantially increases the requirements for processing 1080 content in terms of encoding time, decoding speed and storage. The 720 formats are all progressive, while the 1080 formats include both progressive and interlaced frame types. Computers and their displays are inherently progressive, whereas television broadcasting has been based on interlaced techniques and standards. In HD terminology, progressive is indicated with the letter “p” and interlaced with the letter “i”.

HDV: A format for recording and playing back high-definition video on a DV cassette tape. Instead of the “DV” codec, HDV uses a flavor of MPEG-2. There are two varieties of HDV: HDV1 and HDV2. HDV1 is 1280×720 resolution with progressive frames (720p) and an MPEG transport stream of 19.7 Mbits/sec. HDV2 is 1440×1080 resolution with interlaced frames (1080i) and an MPEG transport stream of 25 Mbits/sec.

Hi8: Improved version of & Video8 using & S-Video recorded on metal particle or metal evaporated tape. Because of higher luminance resolution and wider bandwidth, the result is sharper pictures than Video8.

HiColor: For images, this normally means a 16‑bit (5‑6‑5) data type that can contain up to 65,536 colors. TGA file formats support images of this type. Other file formats require prior conversion of a HiColor image into & TrueColor. For displays, HiColor normally refers to 15-bit (5‑5‑5) display adapters that can display up to 32,768 colors. & Bit

Huffman coding: Technique used in & JPEG and other data compression methods in which seldom occurring values receive a long code, while frequently-occurring values receive a short code.
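
For illustration only, a minimal Python sketch of Huffman code construction over simple symbol frequencies; JPEG itself uses predefined tables and a more involved scheme:

    import heapq
    from collections import Counter

    def huffman_codes(data):
        """Map each symbol to a bit string; frequent symbols get short codes."""
        freq = Counter(data)
        # Heap entries: (frequency, tie-breaker, {symbol: code so far}).
        heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)    # two least frequent subtrees
            f2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]

    print(huffman_codes("abracadabra"))   # 'a' receives the shortest code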

IDE: “Integrated Device Electronics” – a hard-drive interface that combines all drive control electronics on the drive itself, rather than on the adapter connecting the drive to the expansion bus.

IEEE-1394: Developed by Apple Computer and introduced as FireWire, this is a serial data transmission protocol with rates up to 400 Mbits/sec. Sony offers a slightly modified version for transmitting DV signals named i.LINK, providing transmission speeds up to 100 Mbits/sec.

Image: An image is a reproduction, or picture of something. The term is often applied to digitized pictures, consisting of pixels, that can be shown on a computer display and manipulated by software.

Image compression: Method of reducing the amount of data required to store digital image and video files.

Interlaced: The screen refresh method used by television systems. The & PAL TV image consists of two interleaved image halves (& fields) of 312½ lines each. The & NTSC TV image consists of two image halves of 262½ lines each. The fields are displayed alternately to produce a blended image.

Interleave: An arrangement of audio and video to promote smoother playback and synchronization or compression. The standard & AVI format equally spaces audio and video.

I/O: Input/Output.

IRQ: “Interrupt Request”. An “interrupt” is a temporary break in the main processing stream of a computer so that housekeeping or background tasks can be performed. Interrupts can be requested by either hardware (e.g. keyboard, mouse) or software.

JPEG: Joint Photographic Experts Group, and the standard developed by them for compressing digital frames based on & DCT.

Kbyte (also KB): One Kbyte (kilobyte) contains 1024 & bytes. The “K” here stands for the number 1024 (2¹⁰), and not 1000 as in the metric prefix.

Key color: A color whose display is suppressed so that a background image can show through. Most commonly used when overlaying one video sequence on top of another, allowing the underlying video to display wherever the key color appears.
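
A minimal per-pixel Python sketch, assuming pure green as the key color; real keyers use tolerance ranges and edge softening rather than an exact match:

    KEY_COLOR = (0, 255, 0)   # pure green, an illustrative key color

    def key_pixel(foreground, background, key=KEY_COLOR):
        """Where the foreground matches the key color, the background shows."""
        return background if foreground == key else foreground

    print(key_pixel((0, 255, 0), (10, 20, 30)))     # key color -> background
    print(key_pixel((200, 50, 50), (10, 20, 30)))   # other colors -> foreground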

Key frames: In some compression methods, such as & MPEG, the video data of certain frames – the key frames – is stored completely in the compressed file, while any intervening frames are only partially saved. On decompression these partial frames reconstruct their data from the key frames.

Laser disc: Medium that stores analog video. Information on laser discs cannot be modified.

LPT: & Parallel port

Luminance: & Brightness

M1V: (File extension for) an MPEG file that contains video data only. & MPA, MPEG, MPG

Mbyte (also MB): One Mbyte (megabyte) corresponds to 1024 & Kbytes – 1024 x 1024 bytes.

Mark In / Mark Out: In video editing, the mark in and mark out times refer to the starting and ending timecodes that identify the portions of clips to be included in the project.

MCI: Media Control Interface. Programming interface developed by Microsoft for recording and playing back audio and video data. It is also used to connect a computer to an external video source such as a VCR or laser disc.

Modulation: The encoding of information upon an empty carrier signal.

Motion-JPEG (M-JPEG): A & Video for Windows format, specified by Microsoft, for encoding video sequences. & JPEG compression is used to compress each frame individually.

MPA: (File extension for) an MPEG file that contains audio data only. & M1V, MPEG, MPG

MPEG: Moving Picture Experts Group, and the standard developed by them for the compression of moving images. Compared to M-JPEG, it offers 75-80% data reduction with the same visual quality.

MPG: (File extension for) an MPEG file that contains both video and audio data. & M1V, MPEG, MPA

MPV: (File extension for) an MPEG file that contains video data only. & MPA, MPEG, MPG

Non-interlaced: Describes an image refresh method in which the complete image is generated as a single field without skipping lines. A non-interlaced image (most computer monitors) flickers much less than an interlaced image (most TVs).

NTSC: National Television Standards Committee, and the color TV standard created by them in 1953. NTSC video has 525 lines per frame and 60 image fields per second. It is used in North and Central America, Japan and other countries. & PAL, SECAM

PAL: “Phase Alternation Line”, a color TV standard developed in Germany and used throughout most of Europe. PAL video has 625 lines per frame and 50 image fields per second. & NTSC, SECAM

Parallel port: Parallel port data is transmitted via an 8‑bit data line. This means that eight & bits (one & byte) can be transmitted at once. This kind of transmission is much faster than serial transmission, but is not appropriate for long-distance connections. Parallel ports are often named “LPTn”, where n is a number (e.g. “LPT1”). & Serial port

Pixel: The smallest element of a monitor image. The word is an abbreviation of “picture element”.

Port: Electrical transfer point for the transmission of audio, video, control or other data between two devices. & Serial port, Parallel port

Primary colors: The colors that are the basis of the RGB color model: red, green, and blue. It is possible to create most other colors on a computer screen by varying the blend of these primaries.

QSIF: Quarter Standard Image Format. An MPEG-1 format specifying a resolution of 176 x 144 under PAL and 176 x 120 under NTSC. & MPEG, SIF

Quantization: One part of the & JPEG image data compression strategy. Relevant details are represented precisely, while details that are less relevant for the human eye are represented with less precision.

Raster: The area of a video display that is covered by sweeping the electron beam of the display in a series of horizontal lines from upper left to lower right (from the viewer’s perspective).

Redundancy: This trait of images is exploited by compression algorithms. Superfluous information can be eliminated during compression and restored without loss during decompression.

Resolution: The number of pixels that can be displayed on the monitor horizontally and vertically. The higher the resolution, the more details can be displayed. & Pixel

RGB: Red, Green and Blue: the primary colors in additive color mixing. RGB designates the method used in computer technology of encoding image information in pixels, each containing some combination of the three primaries.

ROM: Read Only Memory: Memory storage that, having been programmed once, retains its data without requiring electrical power. & EPROM

Run Length Encoding (RLE): A technique used in many image compression methods, including & JPEG. Repeating values are not stored separately but with a counter to indicate how many times the value occurs in succession – the length of the “run”.
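
A minimal Python sketch of the idea (the exact packing used inside JPEG differs):

    def rle_encode(values):
        """Collapse runs of repeated values into (value, run length) pairs."""
        runs = []
        for v in values:
            if runs and runs[-1][0] == v:
                runs[-1][1] += 1          # extend the current run
            else:
                runs.append([v, 1])       # start a new run
        return [tuple(run) for run in runs]

    print(rle_encode([7, 7, 7, 7, 0, 0, 3]))   # -> [(7, 4), (0, 2), (3, 1)]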

Scaling: Adaptation of an image to a desired size.

SCSI: Small Computer System Interface. SCSI was long preferred as the hard drive interface for some high-performance PCs because of its high data rate. Up to eight SCSI devices can be connected to a computer at the same time.

SECAM: “Séquentiel Couleur à Mémoire”, a color TV transmission system used in France and Eastern Europe. Like PAL, SECAM video has 625 lines per frame and 50 image fields per second. & NTSC, PAL

Serial port: Data transmitted via a serial port is processed one & bit at a time; that is, “serially” – one after another. The transmission rate is much slower than that of a parallel port, where parallel data lines allow multiple bits to be sent simultaneously. Serial ports are named “COMn”, where n is a number (e.g. “COM2”). & Parallel port

SIF: Standard Image Format. An MPEG-1 format specifying a resolution of 352 x 288 under PAL and 352 x 240 under NTSC. & MPEG, QSIF

Single frame: A single & frame is part of a series or sequence. When this series is viewed at sufficient speed, the illusion of a “moving picture” is created.

Software codec: Compression method that can create and play back compressed digital video sequences without special hardware. The quality of the sequences depends on the performance of the complete system. & Codec, Hardware codec

Still video: Still images (or “freeze-frames”) extracted from video.

S-VHS: Improved version of VHS using S-Video and metal particle tape to deliver higher luminance resolution, resulting in sharper pictures than VHS. & VHS, S-Video

S-Video: With S-Video (Y/C) signals, the brightness (luminance or “Y”) and the color (chrominance or “C”) information are transferred separately using multiple wires, avoiding modulating and demodulating the video and the resulting loss of picture quality.

Timecode: Timecode identifies the position of each frame in a video sequence with respect to a starting point (normally the beginning of the shot). The usual format is H:M:S:F (hours, minutes, seconds, frames), e.g. “01:22:13:21”. Unlike a tape counter (which can be “zeroed” or reset at any point in a tape), timecode is an electronic signal written onto videotape, and is permanent once it is assigned.
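
As an illustration, a small Python sketch converting a frame count to H:M:S:F timecode at 25 frames per second; drop-frame NTSC timecode is more involved:

    def frames_to_timecode(frame_count, fps=25):
        """Convert a frame count to an H:M:S:F timecode string."""
        frames = frame_count % fps
        seconds = (frame_count // fps) % 60
        minutes = (frame_count // (fps * 60)) % 60
        hours = frame_count // (fps * 3600)
        return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

    print(frames_to_timecode(123346))   # -> "01:22:13:21"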

Transition: The visual connection between adjacent video clips, ranging from a simple “cut” to a showy animated effect. The common transitions like cuts, fades, dissolves, wipes and slides are part of the visual language of film and video. They can convey passages of time and changes of viewpoint concisely – and often subliminally.

TrueColor: The name indicates an image with enough color resolution to appear “true to life”. In practice, TrueColor normally refers to 24-bit RGB color, which allows about 16.7 million combinations of the red, green and blue primary colors. & Bit, HiColor

TWAIN driver: TWAIN is a standardized software interface allowing graphics and capture programs to communicate with devices that supply graphical data. If the TWAIN driver is installed, the capture function of a graphics application can be used to load images directly from your video source into the program. The driver supports 32-bit programs only and captures images in 24-bit mode.

VCR: “Video cassette recorder”.

VHS: “Video Home System” – Popular video standard for home VCRs. Half-inch tape is used to store “composite” signals incorporating both brightness and color information.

VISCA: Protocol used with certain devices for controlling external video sources from computers.

Video8: Analog video system using 8mm tape. Video8 recorders generate composite signals.

Video CD: CD-ROM standard that uses & MPEG compressed videos.

Video decoder: Converts digital information into analog video signals.

Video encoder: Converts analog video signals into digital information.

Video for Windows: A Microsoft Windows system extension that can record digital video sequences to files on a hard drive and subsequently play them back.

Video scan rate: Frequency with which the video signal is scanned onto an image display. The higher the video scan rate, the higher the image quality and the less noticeable the flicker.

WAV: (File extension for) a popular file format for digitized audio signals.

White balance: In an electronic camera, this is the adjustment of the amplifiers for the three color channels (red, green and blue) so that white areas of the scene do not show a color cast.

Y/C: Y/C is a color signal with two components: brightness information (Y) and color information (C).

YUV: The color model of a video signal where Y delivers the brightness information and U and V the color information.
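
For illustration, a Python sketch converting an RGB pixel to YUV-style values using the common BT.601 luma weights; actual video systems differ in exact scaling and offsets:

    def rgb_to_yuv(r, g, b):
        """Brightness (Y) plus two color-difference components (U, V)."""
        y = 0.299 * r + 0.587 * g + 0.114 * b
        u = 0.492 * (b - y)
        v = 0.877 * (r - y)
        return y, u, v

    print(rgb_to_yuv(255, 255, 255))   # white: full brightness, zero U and V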
