Image sensor
[Image: A CCD image sensor on a flexible circuit board]
[Image: An American Microsystems, Inc. (AMI) 1-kilobit DRAM chip (center chip with glass window) used as an image sensor on the Cromemco Cyclops camera board]

An image sensor or imager is a sensor that detects and conveys information used to make an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, including digital cameras, camera modules, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, digital imaging tends to replace analog imaging.

Early analog sensors for visible light were video camera tubes. The types currently in use are semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal–oxide–semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. Analog sensors for invisible radiation tend to involve vacuum tubes of various kinds. Digital sensors include flat panel detectors.

In February 2018, researchers at Dartmouth College announced a new image sensing technology that they call QIS, for Quanta Image Sensor. Instead of pixels, QIS chips have what the researchers call "jots." Each jot can detect a single particle of light, called a photon ("Super Sensitive Sensor Sees What You Can't", npr.org, 28 April 2018).
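The jot idea above amounts to oversampled binary sensing: each jot reports only whether it caught at least one photon, and many fast readouts are summed into an intensity value. The following is a minimal sketch of that idea, assuming a toy Poisson photon-arrival model; the array sizes, photon rates, and function name are invented for illustration and are not taken from the Dartmouth QIS design.

```python
import numpy as np

def simulate_qis_frame(mean_photons_per_jot, num_readouts=1000, rng=None):
    """Toy model of a quanta image sensor (QIS).

    Each 'jot' registers at most one photon per readout (a binary value).
    Summing many fast binary readouts gives an intensity estimate, which
    is the basic idea behind oversampled single-photon sensing.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Photon arrivals per readout are Poisson; a jot saturates at 1.
    arrivals = rng.poisson(mean_photons_per_jot / num_readouts,
                           size=(num_readouts,) + mean_photons_per_jot.shape)
    binary_readouts = np.minimum(arrivals, 1)
    # The summed bit-planes track the light level (with mild compression
    # as individual jots start to saturate in bright regions).
    return binary_readouts.sum(axis=0)

# Hypothetical 4x4 scene: dim background with one bright corner.
scene = np.full((4, 4), 20.0)
scene[0, 0] = 120.0
print(simulate_qis_frame(scene))
```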

CCD vs CMOS technology

[Image: A micrograph of the corner of the photosensor array of a webcam]
[Image: Image sensor (upper left) on the motherboard of a Nikon Coolpix L2 6 MP]

Cameras integrated in small consumer products generally use CMOS sensors, which are usually cheaper and have lower power consumption in battery-powered devices than CCDs ("CMOS Is Winning the Camera Sensor Battle, and Here's Why", techhive.com, 29 December 2011). CCD sensors are used in high-end broadcast-quality video cameras, while CMOS sensors dominate in still photography and consumer goods where overall cost is a major concern. Both types of sensor accomplish the same task of capturing light and converting it into electrical signals.

Each cell of a CCD image sensor is an analog device. When light strikes the chip, it is held as a small electrical charge in each photo sensor. The charges in the line of pixels nearest to the (one or more) output amplifiers are amplified and output; then each line of pixels shifts its charges one line closer to the amplifier(s), filling the empty line closest to the amplifier(s). This process is repeated until all lines of pixels have had their charge amplified and output ("CCD and CMOS sensors", Canon Professional Network).

A CMOS image sensor has an amplifier for each pixel, compared to the few amplifiers of a CCD. This leaves less area for the capture of photons than in a CCD, but the problem has been overcome by placing microlenses in front of each photodiode, which focus light into the photodiode that would otherwise have hit the amplifier and gone undetected.
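The line-shift readout just described for CCDs can be made concrete with a toy simulation. This is a minimal sketch, not a model of any real device: the frame contents, gain value, and function name are invented, and a real CCD shifts analog charge packets rather than numbers in an array.

```python
import numpy as np

def ccd_readout(charge, gain=2.0):
    """Toy model of CCD line-shift readout.

    'charge' is a 2D array of accumulated photo-charge. The row nearest
    the output amplifier is amplified and read out, then every remaining
    row shifts one line toward the amplifier, leaving an empty line at
    the far end, until the whole frame has been read.
    """
    charge = charge.astype(float).copy()
    rows, _ = charge.shape
    output = np.zeros_like(charge)
    for i in range(rows):
        # Amplify and output the line closest to the output amplifier.
        output[i] = gain * charge[-1]
        # Shift all lines one row toward the amplifier.
        charge[1:] = charge[:-1].copy()
        charge[0] = 0.0
    return output

frame = np.arange(12, dtype=float).reshape(3, 4)  # hypothetical exposure
print(ccd_readout(frame))
```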
Some CMOS imaging sensors also use back-side illumination to increase the number of photons that hit the photodiode ("What is a backlit CMOS sensor?", techradar.com, 2 July 2012). CMOS sensors can potentially be implemented with fewer components, use less power, and/or provide faster readout than CCD sensors (Tom Moynihan, "CMOS Is Winning the Camera Sensor Battle, and Here's Why"). They are also less vulnerable to static electricity discharges.

Another design, a hybrid CCD/CMOS architecture (sold under the name "sCMOS"), consists of CMOS readout integrated circuits (ROICs) that are bump-bonded to a CCD imaging substrate, a technology that was developed for infrared staring arrays and has been adapted to silicon-based detector technology (scmos.com). Another approach is to use the very fine dimensions available in modern CMOS technology to implement a CCD-like structure entirely in CMOS: such structures can be achieved by separating individual polysilicon gates by a very small gap. Though still a product of research, hybrid sensors can potentially harness the benefits of both CCD and CMOS imagers (Padmakumar R. Rao et al., "CCD structures implemented in standard 0.18 µm CMOS technology").
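The potential readout-speed advantage of CMOS sensors mentioned above is largely architectural: a CCD clocks every pixel serially through one (or a few) output amplifiers, while many CMOS designs digitize an entire row at once with column-parallel circuits. A back-of-envelope comparison, using invented figures that do not describe any particular sensor:

```python
# All numbers below are illustrative assumptions, not real sensor specs.
rows, cols = 3000, 4000          # hypothetical 12-megapixel sensor
ccd_pixel_rate = 50e6            # pixels/s through a single CCD output amplifier
cmos_row_rate = 100e3            # rows/s with column-parallel ADCs

ccd_time = rows * cols / ccd_pixel_rate   # every pixel leaves serially
cmos_time = rows / cmos_row_rate          # one whole row converts at a time

print(f"CCD frame readout : {ccd_time * 1e3:.0f} ms")   # 240 ms
print(f"CMOS frame readout: {cmos_time * 1e3:.0f} ms")  # 30 ms
```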

Performance

See also: EMVA 1288.

There are many parameters that can be used to evaluate the performance of an image sensor, including dynamic range, signal-to-noise ratio, and low-light sensitivity. For sensors of comparable types, the signal-to-noise ratio and dynamic range improve as the size increases.
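As a rough illustration of two of these figures, the sketch below estimates dynamic range and signal-to-noise ratio from a pixel's full-well capacity and read noise, using the standard photon-shot-noise model. All numbers are invented for illustration; comparing a small and a large pixel of the same design shows why bigger sensors tend to score better on both.

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    """Dynamic range: ratio of the largest storable signal (full-well
    capacity, in electrons) to the noise floor (read noise, in electrons)."""
    return 20 * math.log10(full_well_e / read_noise_e)

def snr_db(signal_e, read_noise_e, dark_e=0.0):
    """Signal-to-noise ratio for a given signal, combining photon shot
    noise (variance equal to the signal), dark-charge shot noise, and
    read noise added in quadrature."""
    noise = math.sqrt(signal_e + dark_e + read_noise_e ** 2)
    return 20 * math.log10(signal_e / noise)

# Illustrative numbers only: a small pixel vs a larger pixel, same read noise.
pixels = {"small pixel": 10_000, "large pixel": 60_000}   # full-well capacity, electrons
read_noise = 3.0                                          # electrons RMS
for name, full_well in pixels.items():
    print(f"{name}: DR = {dynamic_range_db(full_well, read_noise):.1f} dB, "
          f"SNR at full well = {snr_db(full_well, read_noise):.1f} dB")
```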

Color separation

[Image: Bayer pattern on sensor]
[Image: Foveon's scheme of vertical filtering for color sensing]

There are several main types of color image sensors, differing by the type of color-separation mechanism:
  • Bayer filter sensor, low-cost and most common, using a color filter array that passes red, green, and blue light to selected pixel sensors. Each individual sensor element is made sensitive to red, green, or blue by means of a color gel made of chemical dye placed over each individual element. Although inexpensive to manufacture, this technique lacks the color purity of dichroic filters. Because each color gel segment must be separated from the others by a frame (similar to stained glass windows), less of the areal density of a Bayer filter sensor is available to capture light, making the Bayer filter sensor less sensitive than other color sensors of similar size. This loss can be negated by using a larger sensor, albeit at greater cost. The most common Bayer filter matrix uses two green pixels, and one each for red and blue. This results in less resolution for red and blue, corresponding to the human eye's reduced sensitivity at the limits of the visual spectrum. The missing color samples may be interpolated using a demosaicing algorithm (a minimal interpolation sketch follows this list), or ignored altogether by lossy compression. To improve color information, techniques like color co-site sampling use a piezo mechanism to shift the color sensor in pixel steps.
  • Foveon X3 sensor, using an array of layered pixel sensors, separating light via the inherent wavelength-dependent absorption property of silicon, such that every location senses all three color channels. This method is similar to how color film for photography works.
  • 3CCD, using three discrete image sensors, with the color separation done by a dichroic prism. The dichroic elements provide a sharper color separation, thus improving color quality. Because each sensor is equally sensitive within its passband and operates at full resolution, 3CCD sensors produce better color quality and better low-light performance. 3CCD sensors produce a full 4:4:4 signal, which is preferred in television broadcasting, video editing, and chroma key visual effects.
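To make the demosaicing step mentioned in the Bayer filter entry concrete, here is a minimal sketch of a bilinear-style reconstruction for an RGGB mosaic. The pattern layout, image size, and function names are illustrative assumptions, not any camera's actual pipeline, and real demosaicing algorithms are considerably more sophisticated.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image through an RGGB Bayer pattern (one color per pixel)."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # red
    masks[0::2, 1::2, 1] = True   # green on red rows
    masks[1::2, 0::2, 1] = True   # green on blue rows
    masks[1::2, 1::2, 2] = True   # blue
    for c in range(3):
        mosaic[masks[..., c]] = rgb[..., c][masks[..., c]]
    return mosaic, masks

def demosaic_bilinear(mosaic, masks):
    """Keep each measured sample and fill the missing samples of each
    channel by averaging the available samples in the 3x3 neighborhood."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3))
    for c in range(3):
        vals = np.where(masks[..., c], mosaic, 0.0)
        cnt = masks[..., c].astype(float)
        vsum = np.zeros_like(vals)
        csum = np.zeros_like(cnt)
        vp = np.pad(vals, 1)
        cp = np.pad(cnt, 1)
        for dy in range(3):
            for dx in range(3):
                vsum += vp[dy:dy + h, dx:dx + w]
                csum += cp[dy:dy + h, dx:dx + w]
        filled = vsum / np.maximum(csum, 1.0)
        out[..., c] = np.where(masks[..., c], mosaic, filled)
    return out

# Hypothetical 4x4 flat gray scene: the reconstruction should stay near 0.5.
scene = np.full((4, 4, 3), 0.5)
mosaic, masks = bayer_mosaic(scene)
print(demosaic_bilinear(mosaic, masks).round(2))
```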

Specialty sensors

[Image: Infrared view of the Orion Nebula taken by ESO]

Special sensors are used in various applications such as thermography, creation of multi-spectral images, video laryngoscopes, gamma cameras, sensor arrays for X-rays, and other highly sensitive arrays for astronomy.

While digital cameras in general use a flat sensor, Sony prototyped a curved sensor in 2014 to reduce or eliminate the Petzval field curvature that occurs with a flat sensor. A curved sensor allows a shorter lens with a smaller diameter, fewer elements and components, a greater aperture, and reduced light fall-off at the edge of the photo (Steve Dent, "Sony's first 'curved sensor' photo may herald better images, cheaper lenses", 8 July 2014).

Sensors used in digital cameras

A listing of sensors used in digital cameras is maintained as a separate article.

