Recent Developments in Computer Peripherals, Including an In-Depth Look at Multimedia Input Devices

A peripheral device is any external device attached to a computer. Without peripherals, a computer is just a box full of wires, transistors and circuits, which is able to:
1. Respond to a specific set of instructions in a well-defined manner.
2. Execute a prerecorded list of instructions (a program).
The only problem is that without any input peripherals you cannot tell the computer to do either of the above, and even if you could, without an output device of some kind the computer has no way of delivering the result to the user.
Examples of peripherals include printers, disk drives, display monitors, keyboards and mice. These can be separated into two categories. An input device is any machine that feeds data into a computer; a keyboard, for example, is an input device. Input devices other than the keyboard are sometimes called alternate input devices: mice, trackballs and light pens are all alternate input devices. An output device is any machine capable of representing information from a computer.
This includes display screens, printers, plotters and synthesizers.

Developments in Peripherals in the Last Few Years

There have been many advances in the field of peripherals over the last few years. Even the humble keyboard and mouse have been reinvented to produce the ergonomic keyboard and the cordless and laser mouse. There have also been advances in monitors, such as flat-screen displays and LCD screens.
There have also been advances in technology which, although not new, have been made commercially available for home use, such as the digital camera, the scanner, the digital video camera and the colour printer. To look at some of these advances in detail, we should put them into their categories. Printers have developed from the daisy-wheel printer to the thermal printers of today. Another advancement in printers has been the laser printer (the same technology as photocopiers), which is commonly used in offices as it produces very high-quality text and graphics. In general, the printer has developed in four areas: the quality of type, the speed at which it can print, the quality of graphics and the number of fonts now available.
Monitors a few years ago were usually monochrome, displaying only two colours, e.g. orange and black or green and black.
Another type of old-style monitor is the greyscale monitor, which displays different shades of grey. These monitors are now in the minority. Most monitors used today are colour monitors, capable of displaying anything from 16 to over one million different colours.
There are a number of different types of colour monitor, as shown in the table below.

Video Standard                        Resolution    Simultaneous Colours
VGA  (Video Graphics Array)           640 x 480     16
SVGA (Super Video Graphics Array)     800 x 600     16
XGA  (Extended Graphics Array)        640 x 480     65,536

A mouse (Fig 2) is the device that controls the movement of the cursor or pointer on a display screen. The mouse has been around for about 40 years. It was invented by Douglas Engelbart of the Stanford Research Institute in 1963, and pioneered by Xerox in the 1970s.
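The table also hints at the memory a video adapter needs: each pixel must hold enough bits to encode one of the simultaneous colours. A minimal sketch in Python (the function name and framing are mine, for illustration only):

```python
import math

def framebuffer_bytes(width, height, colours):
    """Bytes of video memory needed for one frame:
    each pixel needs ceil(log2(colours)) bits."""
    bits_per_pixel = math.ceil(math.log2(colours))
    return width * height * bits_per_pixel // 8

# VGA: 640x480 at 16 colours -> 4 bits per pixel
print(framebuffer_bytes(640, 480, 16))      # 153600 bytes (150 KB)
# SVGA: 800x600 at 16 colours
print(framebuffer_bytes(800, 600, 16))      # 240000 bytes
# XGA high-colour mode: 640x480 at 65,536 colours -> 16 bits per pixel
print(framebuffer_bytes(640, 480, 65536))   # 614400 bytes (600 KB)
```

This is why the XGA high-colour mode, despite its VGA-sized resolution, demands four times the video memory of plain VGA.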
The mouse is one of the great breakthroughs in computer ergonomics because it frees the user, to a large extent, from using the keyboard. In particular, the mouse is important for graphical user interfaces because you can simply point to options and objects and click a mouse button. Such applications are often called point-and-click programs. A significant advance in the use of the mouse is the cordless mouse. This mouse isn't physically connected to the computer at all; instead it relies on infrared or radio waves to communicate with the computer.
Cordless mice are more expensive than normal mice, but they do eliminate the cord, which can sometimes get in the way. Other advances in mice include the laser mouse, mentioned earlier. The keyboard is the most commonly used peripheral for inputting information into a computer. The standard layout of letters, numbers and punctuation is known as a QWERTY keyboard because the first six keys on the top row of letters spell QWERTY. The QWERTY keyboard was designed in the 1800s for mechanical typewriters and was actually designed to slow typists down to avoid jamming the keys. This arrangement is still used today and has not advanced much, with the exception of the ergonomic keyboard, which was designed to allow more effective use of the QWERTY layout.
But the above peripherals have not seen nearly as much advancement as multimedia peripherals. Multimedia is the use of computers to present text, graphics, video, animation and sound in an integrated way. Long touted as the future revolution in computing, multimedia applications were, until the mid-1990s, uncommon due to the expensive hardware required. With increases in performance and decreases in price, however, multimedia is now commonplace. Nearly all PCs are capable of displaying video, though the resolution available depends on the power of the computer's video adapter and CPU. Because of the storage demands of multimedia applications, the most effective media are CD-ROMs.
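To see why storage is the bottleneck, consider the raw data rate of uncompressed video. A rough back-of-the-envelope calculation (the frame size, colour depth and frame rate here are illustrative assumptions, not figures from any particular system):

```python
def video_bytes_per_second(width, height, bytes_per_pixel, fps):
    """Raw (uncompressed) video data rate in bytes per second."""
    return width * height * bytes_per_pixel * fps

# Assumed example: 640x480 frames, 24-bit colour (3 bytes per pixel),
# 25 frames per second
rate = video_bytes_per_second(640, 480, 3, 25)
print(rate)                     # 23040000 bytes/s, roughly 22 MB per second

cd_rom = 650 * 1024 * 1024      # a 650 MB CD-ROM
print(cd_rom / rate)            # under 30 seconds of raw video per disc
```

Even a whole CD-ROM holds under half a minute of such raw video, which is why multimedia depends so heavily on compression.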
Listed below are some of the devices which are now commonplace in our homes. Scanners are used to scan in images and are used in conjunction with graphics applications. Scanners can also scan text by making use of OCR (Optical Character Recognition) technology; the result can then be manipulated in word-processing applications. Most home scanners are flatbed scanners. A flatbed scanner is like a photocopier: it consists of a glass bed on which you lay the books, magazines and other documents that you want to scan.
A digital camera stores images digitally rather than recording them on film. Once a picture has been taken, it can be downloaded to a computer system, manipulated with a graphics program and printed. Unlike film photographs, which have a very high effective resolution, digital photos are limited by the amount of memory in the camera, the optical resolution of the digitizing mechanism and, finally, the resolution of the final output device. Even the best digital cameras connected to the best printers cannot produce film-quality photos.
However, if the final output device is a laser printer, it doesn't really matter whether you take a conventional photo and then scan it, or take a digital photo: in both cases, the image must eventually be reduced to the resolution of the printer. The big advantage of digital cameras is that making photos is both inexpensive and fast, because there is no film processing. Interestingly, one of the biggest boosters of digital photography is Kodak, the largest producer of film. Kodak developed the Kodak Photo CD format, which has become the de facto standard for storing digital photographs. A webcam is similar to a digital camera, but instead of taking still shots it streams images much like a video camera.
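The point about output resolution can be made concrete: an image with a fixed pixel count can only be printed so large before each pixel spans more than one printer dot. A small sketch (the image dimensions and printer resolution are illustrative assumptions):

```python
def print_size_inches(px_width, px_height, printer_dpi):
    """Largest print, in inches, at which every image pixel still
    maps to at least one printer dot at the given resolution."""
    return px_width / printer_dpi, px_height / printer_dpi

# Assumed example: a 1024x768 digital photo printed on a 300 dpi
# laser printer
w, h = print_size_inches(1024, 768, 300)
print(f"{w:.1f} x {h:.1f} inches")   # about 3.4 x 2.6 inches
```

Printed any larger than this, the image's pixels become visible, regardless of how good the printer is.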
This enables the user to send live video from their computer across the Internet. The history of the digital camera starts with the evolution of television, back in the 1940s and 1950s. When television was first broadcast, it was all live, so a way had to be found to record the images being broadcast. In 1951 Bing Crosby Laboratories introduced the VTR (video tape recorder), which recorded the electrical impulses onto magnetic tape.
By 1956 VTR technology worked well, and it began to have a large impact on the television industry. This, tied in with the development of computers in the 1950s, started the digital age. The next large step occurred with NASA in the 1960s. Before NASA sent astronauts to the moon, probes were sent to map its surface. These probes sent back analogue signals to Earth, but NASA engineers found that the transmissions were too weak to compete with natural radio sources in the cosmos. Television receivers of the time could not decipher the images sent back from the moon, so NASA engineers had to find a way to enhance and sharpen them.
Images were processed through a computer and turned into a digital signal, and the noise and corruption in the data were removed. By the time Apollo went to the moon, transmissions were coming back crystal clear. After that, the Cold War accelerated the development of digital imaging, mostly for spy satellites and imaging systems. In 1995 Kodak released the DC40; at under $1,000 it was the first digital camera marketed to consumers. The Apple QuickTake 100 was made available at about the same time. Both connected to the computer via a serial cable.
No two digital cameras are exactly the same, so this explanation of how a digital camera works is merely a generalization of the internal process by which a digital camera takes a picture. As the internal workings of each camera differ slightly, the following information may not apply exactly to your own digital camera. A digital camera is similar to a 35 mm camera in the way it takes pictures, but it stores them on some sort of memory card rather than on film; what differs is what's inside.
When you prepare to take a picture on your digital camera (by pressing the shutter release button halfway down), the autofocus (if you are using it) is applied, the CCD (charge-coupled device) charges up and becomes prepared for the picture to be taken, and the flash may prepare to fire. Once the shutter button is fully depressed, the shutter opens, allowing light to enter the camera and strike the CCD. The light is measured electronically on the CCD and is then sent to the internal memory of the camera, called the buffer. Once the image information reaches the buffer, it is then compressed (if selec...
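The half-press/full-press sequence described above can be sketched as a toy simulation. Everything here, from the class name to the clipping value, is an illustrative assumption and not a real camera API; it simply mirrors the steps in the text:

```python
class ToyCamera:
    """Toy model of the capture sequence: half-press prepares the
    camera, full-press exposes the CCD and buffers the image."""

    def __init__(self, autofocus=True, use_flash=False):
        self.autofocus = autofocus
        self.use_flash = use_flash
        self.ccd_charged = False
        self.buffer = []              # the camera's internal memory

    def half_press(self):
        """Shutter release half-pressed: prepare to take the picture."""
        if self.autofocus:
            pass                      # focus the lens (not modelled here)
        self.ccd_charged = True       # CCD charges, ready for exposure
        if self.use_flash:
            pass                      # flash charges (not modelled here)

    def full_press(self, scene):
        """Shutter fully pressed: expose the CCD, read out the charge,
        and send the image data to the buffer."""
        if not self.ccd_charged:
            self.half_press()
        # Light strikes the CCD; the charge on each cell is measured.
        # Cells saturate, so very bright light clips at a maximum value.
        image = [min(255, light) for light in scene]
        self.buffer.append(image)     # image data lands in the buffer
        self.ccd_charged = False      # CCD must recharge for the next shot
        return image

cam = ToyCamera()
cam.half_press()
shot = cam.full_press([10, 120, 300, 45])
print(shot)              # [10, 120, 255, 45] -- the bright cell clips
print(len(cam.buffer))   # 1 image waiting in the buffer
```

A real camera would follow the buffering step with the compression stage the text describes; that stage is omitted here.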