This glossary defines the most important computer and video terms used in multimedia work. Cross-references are indicated by &.
ADPCM: Acronym for Adaptive Differential Pulse Code Modulation, a method of storing audio information in a digital format. It is the audio encoding and compression method used in CD-I and & CD-ROM production.
Address: All available memory locations in a computer are numbered (addressed). Each location can be accessed by means of its address. Some addresses are reserved for the exclusive use of particular hardware components; if two components use the same address, this is called an “address conflict”.
Anti-aliasing: A method of smoothing out jagged edges in bitmap images. This is usually accomplished by shading the edges with pixels intermediate in color between the edge and the background, making the transition less apparent. Another method of anti-aliasing involves using higher resolution output devices.
BIOS: Acronym for Basic Input Output System, which refers to basic input and output commands saved in a & ROM, PROM or & EPROM. The essential task of the BIOS is the control of input and output. When the system is started, the ROM-BIOS carries out some tests. & Parallel port, IRQ, I/O
Bit: Abbreviation of “BInary digiT”, the smallest element of a computer’s memory. Among other things, bits are used to store the color values of pixels in an image. The more bits used for each & pixel, the greater the number of available colors. For example:
1-bit: each pixel is either black or white.
4-bit: allows 16 colors or gray shades.
8-bit: allows 256 colors or gray shades.
16-bit: allows 65,536 colors.
24-bit: allows about 16.7 million colors.
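The relationship between bit count and available colors is simply two raised to the number of bits. A minimal illustration (not part of the original glossary):

```python
def colors_for_depth(bits: int) -> int:
    """Return how many distinct values a pixel of the given bit depth can hold."""
    return 2 ** bits

# Reproduces the examples listed above:
for bits in (1, 4, 8, 16, 24):
    print(f"{bits}-bit: {colors_for_depth(bits):,} colors")
```

The 24-bit result, 16,777,216, is the “about 16.7 million colors” cited above.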
Blacking: The process of preparing a videotape for insert editing by recording video black and continuous control track on the entire tape. If the recording deck supports timecode, continuous timecode will be recorded simultaneously (also called “striping”).
Channel: Classifications of information within a data file to isolate a particular aspect of the file. For example, color images use different channels to classify the color components in the image. Stereo audio files use channels to identify the sounds intended for the left and right speakers. Video files use combinations of the channels used for image and audio files.
Clipboard: A temporary storage area shared by all Windows programs, used to hold data during cut, copy, and paste operations. Any new data you place onto the clipboard immediately replaces the existing data.
Color depth: Number of bits delivering the color information for each pixel. A 1‑bit color depth allows 2¹ = 2 colors, an 8‑bit depth allows 2⁸ = 256 colors, and a 24‑bit depth allows 2²⁴ = 16,777,216 colors.
Complementary color: Complementary colors are opposite in value to primary colors. If you were to combine a color with its complement, the result would be white. For example, the complementary colors of red, green and blue are cyan, magenta and yellow respectively.
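In 8‑bit‑per‑channel additive RGB, a color’s complement can be sketched by inverting each channel; adding a color to its complement then yields white. A hypothetical illustration (the function name is my own):

```python
def complement(rgb):
    """Complement of an 8-bit-per-channel RGB color: invert each channel."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

red = (255, 0, 0)
print(complement(red))   # (0, 255, 255) -- cyan, the complement of red

# Combining a color with its complement gives white:
white = tuple(a + b for a, b in zip(red, complement(red)))
print(white)             # (255, 255, 255)
```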
Compression: A method for making files smaller in size. There are two types of compression: lossless and lossy. Files compressed with a lossless scheme can be restored unchanged from their original state. Lossy schemes discard data during compression, so some image quality is sacrificed. The loss of quality may be negligible or severe depending on the amount of compression.
Data transfer rate: The measurement of the speed at which information passes between the storage device (e.g. & CD-ROM or hard drive) and the display device (e.g. monitor or & MCI device). Depending on the devices used, some transfer rates may offer better performance than others.
DV: Digital videotape format for recording digital audio and video on ¼”-wide metal evaporated tape. Mini-DV tapes hold up to 60 minutes of content, while standard DV tapes can hold up to 270 minutes.
Edit decision list (EDL): A list of clips and effects in a particular order that will be recorded onto your output tape, disc or file. Studio allows you to create and edit your own edit decision list by adding, deleting and reordering clips and effects in the Movie Window.
File format: The organization of information within a computer file such as an image or word processor document. The format of a file is usually indicated by its “file extension” (e.g. doc, avi or wmf).
Frame rate: The frame rate defines how many frames of a video sequence are played in one second. The frame rate for & NTSC video is 30 frames per second. The frame rate for & PAL video is 25 frames per second.
GOP: In & MPEG compression the data stream is first divided into “Groups Of Pictures” – sections of several frames each. Each GOP contains three types of frames: I-Frames, P-Frames (pictures) and B-Frames.
Hardware codec: Compression method that uses special hardware to create and play back compressed digital video sequences. A hardware codec may offer better encoding speed and image quality than a codec implemented completely in software. & Codec, Software codec
HD: High-definition video. Most HD formats in use have a resolution of either 1920×1080 or 1280×720. A substantial difference exists between the 1080 and 720 standards: the larger format uses 2.25 times as many pixels per frame. This difference substantially increases the requirements for processing 1080 content in terms of encoding time, decoding speed and storage. The 720 formats are all progressive, while the 1080 format mixes progressive and interlaced frame types. Computers and their displays are inherently progressive, whereas television broadcasting is based on interlaced techniques and standards. In HD terminology, progressive is indicated with the letter “p” and interlaced with the letter “i”.
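The 2.25 ratio follows directly from the two frame sizes; a quick check (illustration only, not from the original glossary):

```python
# Pixel counts for the two common HD frame sizes mentioned above.
full_hd = 1920 * 1080   # 2,073,600 pixels per frame
hd_720 = 1280 * 720     # 921,600 pixels per frame

print(full_hd / hd_720)  # 2.25 -- why 1080 content is costlier to process
```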
HDV: A format for recording and playing back high-definition video on a DV cassette tape. Instead of the DV codec, HDV uses a flavor of MPEG-2. There are two varieties of HDV: HDV1 and HDV2. HDV1 is 1280×720 resolution with progressive frames (720p); its MPEG transport stream runs at 19.7 Mbit/s. HDV2 is 1440×1080 resolution with interlaced frames (1080i); its MPEG transport stream runs at 25 Mbit/s.
Hi8: Improved version of & Video8 using & S-Video recorded on metal particle or metal evaporated tape. Because of higher luminance resolution and wider bandwidth, the result is sharper pictures than Video8.
HiColor: For images, this normally means a 16‑bit (5‑6‑5) data type that can contain up to 65,536 colors. TGA file formats support images of this type. Other file formats require prior conversion of a HiColor image into & TrueColor. For displays, HiColor normally refers to 15-bit (5‑5‑5) display adapters that can display up to 32,768 colors. & Bit
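The 16‑bit (5‑6‑5) layout can be illustrated by packing an 8‑bit‑per‑channel color into 16 bits, keeping the top 5 bits of red, 6 of green and 5 of blue. A sketch (the function name is hypothetical, not from any file-format API):

```python
def pack_565(r: int, g: int, b: int) -> int:
    """Pack 8-bit-per-channel RGB into a 16-bit 5-6-5 HiColor value.
    Red occupies bits 11-15, green bits 5-10, blue bits 0-4."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_565(255, 255, 255)))  # 0xffff -- white
print(hex(pack_565(255, 0, 0)))      # 0xf800 -- pure red
```

Green gets the extra bit because the eye is most sensitive to green, which is why 5‑6‑5 reaches 65,536 colors while the 5‑5‑5 display variant stops at 32,768.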
IEEE-1394: Developed by Apple Computer and introduced as FireWire, this is a serial data transmission protocol with rates up to 400 Mbit/s. Sony offers a slightly modified version for transmitting DV signals named i.LINK, providing transmission speeds up to 100 Mbit/s.
Interlaced: The screen refresh method used by television systems. The & PAL TV image consists of two interleaved image halves (& fields) of 312½ lines each. The & NTSC TV image consists of two image halves of 262½ lines each. The fields are displayed alternately to produce a blended image.
IRQ: “Interrupt Request”. An “interrupt” is a temporary break in the main processing stream of a computer so that housekeeping or background tasks can be performed. Interrupts can be requested by either hardware (e.g. keyboard, mouse) or software.
Key color: A color whose display is suppressed so that a background image can show through. Most commonly used when overlaying one video sequence on top of another, allowing the underlying video to display wherever the key color appears.
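As a per-pixel sketch of the idea (real keyers match within a tolerance and soften edges rather than comparing exactly; the names here are my own):

```python
def overlay_with_key(fg_pixel, bg_pixel, key=(0, 255, 0)):
    """Where the foreground pixel equals the key color (pure green here,
    a common choice), the underlying background pixel shows through;
    everywhere else the foreground pixel is kept."""
    return bg_pixel if fg_pixel == key else fg_pixel

print(overlay_with_key((0, 255, 0), (30, 30, 90)))    # background shows through
print(overlay_with_key((200, 180, 150), (30, 30, 90)))  # foreground kept
```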
Key frames: In some compression methods, such as & MPEG, the video data of certain frames – the key frames – is stored completely in the compressed file, while any intervening frames are only partially saved. On decompression these partial frames reconstruct their data from the key frames.
MCI: Media Control Interface. A programming interface developed by Microsoft for recording and playing back audio and video data. It is also used to connect a computer to an external video source such as a VCR or laser disc.
Non-interlaced: Describes an image refresh method in which the complete image is generated as a single field without skipping lines. A non-interlaced image (most computer monitors) flickers much less than an interlaced image (most TVs).
NTSC: National Television Standards Committee, and the color TV standard created by them in 1953. NTSC video has 525 lines per frame and 60 image fields per second. It is used in North and Central America, Japan and other countries. & PAL, SECAM
Parallel port: Parallel port data is transmitted via an 8‑bit data line. This means that eight & bits (one & byte) can be transmitted at once. This kind of transmission is much faster than serial transmission, but is not appropriate for long-distance connections. Parallel ports are often named “LPTn”, where n is a number (e.g. “LPT1”). & Serial port
Quantization: One part of the & JPEG image data compression strategy. Relevant details are represented precisely, while details that are less relevant for the human eye are represented with less precision.
RGB: Red, Green and Blue: the primary colors in additive color mixing. RGB designates the method used in computer technology of encoding image information in pixels, each containing some combination of the three primaries.
Run Length Encoding (RLE): A technique used in many image compression methods, including & JPEG. Repeating values are not stored separately but with a counter to indicate how many times the value occurs in succession – the length of the “run”.
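A minimal sketch of the idea (illustration only; the byte layout in real formats such as JPEG differs):

```python
def rle_encode(data):
    """Run-length encode a sequence into (value, count) pairs."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [tuple(run) for run in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original sequence."""
    return [value for value, count in runs for _ in range(count)]

pixels = [255, 255, 255, 0, 0, 7]
encoded = rle_encode(pixels)
print(encoded)                        # [(255, 3), (0, 2), (7, 1)]
assert rle_decode(encoded) == pixels  # lossless round trip
```

Long runs of identical values (large areas of one color, for example) compress well; data with no repetition can actually grow, which is why RLE is one stage among several in schemes like JPEG.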
SCSI: Small Computer System Interface. SCSI was long preferred as the hard drive interface for some high-performance PCs because of its high data rate. Up to eight SCSI devices can be connected to a computer at the same time.
Serial port: Data transmitted via a serial port is processed one & bit at a time; that is, “serially” – one after another. The transmission rate is much slower than that of a parallel port, where parallel data lines allow multiple bits to be sent simultaneously. Serial ports are named “COMn”, where n is a number (e.g. “COM2”). & Parallel port
Software codec: Compression method that can create and play back compressed digital video sequences without special hardware. The quality of the sequences depends on the performance of the complete system. & Codec, Hardware codec
S-Video: With S-Video (Y/C) signals, the brightness (luminance or “Y”) and the color (chrominance or “C”) information are transferred separately using multiple wires, avoiding modulating and demodulating the video and the resulting loss of picture quality.
Timecode: Timecode identifies the position of each frame in a video sequence with respect to a starting point (normally the beginning of the shot). The usual format is H:M:S:F (hours, minutes, seconds, frames), e.g. “01:22:13:21”. Unlike a tape counter (which can be “zeroed” or reset at any point in a tape), timecode is an electronic signal written onto videotape, and is permanent once it is assigned.
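As a sketch, an absolute frame number can be converted to H:M:S:F timecode by successive division (assuming 25 fps PAL throughout; NTSC drop-frame timecode is more involved and not handled here):

```python
def frame_to_timecode(frame: int, fps: int = 25) -> str:
    """Convert an absolute frame number to H:M:S:F timecode (25 fps PAL)."""
    f = frame % fps
    total_seconds = frame // fps
    s = total_seconds % 60
    m = (total_seconds // 60) % 60
    h = total_seconds // 3600
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

print(frame_to_timecode(123346))  # 01:22:13:21 -- the example timecode above
```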
Transition: The visual connection between adjacent video clips, ranging from a simple “cut” to a showy animated effect. The common transitions like cuts, fades, dissolves, wipes and slides are part of the visual language of film and video. They can convey passages of time and changes of viewpoint concisely – and often subliminally.
TrueColor: The name indicates an image with enough color resolution to appear “true to life”. In practice, TrueColor normally refers to 24-bit RGB color, which allows about 16.7 million combinations of the red, green and blue primary colors. & Bit, HiColor
TWAIN driver: TWAIN is a standardized software interface allowing graphics and capture programs to communicate with devices that supply graphical data. If the TWAIN driver is installed, the capture function of a graphics application can be used to load images directly from your video source into the program. The driver supports 32-bit programs only and captures images in 24-bit mode.