Thank you for an interesting video. I worked on CCDs, using them as analogue data storage in 1972, and found out the hard way that they were light sensitive when my test circuits failed as the sun came out. I haven't had cause to use them since, though I did have a Nokia N95 phone for a while. I always wondered why they were being replaced by CMOS.
There are global shutter CMOS chips and have been for several years now. CMOS has basically taken over the amateur astrophotography market as well because of superior noise characteristics - all my astronomy cameras have been CMOS. CCDs still have some relevance for scientific purposes (I think?) but CCDs are kind of dinosaur tech at the amateur level.
Two errors I found: 1. The diagram at 2:13 is not related to the photoelectric effect; it is Hertz's other experiment, the one that proved the existence of electromagnetic waves. 2. You mentioned that PMTs have 5 to 7 dynodes. Actually they may have many more; 10- and 12-dynode PMTs are easily available. We use 12-stage PMTs in our lab.
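For anyone curious how much those extra dynode stages matter, here is a rough back-of-the-envelope sketch in Python. The secondary-emission ratio of 4 per dynode is an assumed, illustrative number; real values depend on the dynode material and the inter-stage voltage.

```python
# Back-of-the-envelope photomultiplier gain vs. dynode count.
# Assumes an illustrative secondary-emission ratio of ~4 electrons
# released per incident electron at each dynode; real values depend
# on the dynode material and the inter-stage voltage.
delta = 4.0  # assumed secondary-emission ratio per dynode

for n_dynodes in (5, 7, 10, 12):
    gain = delta ** n_dynodes
    print(f"{n_dynodes:2d} dynodes -> overall gain ~ {gain:.1e}")

# With 12 dynodes the gain is on the order of 10^7, which is why a
# single photoelectron ends up as an easily measurable current pulse.
```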
Seeing how the CCD transfers data, as opposed to crossbar AM arrays, reminded me of racetrack memory. Racetrack memory stores information using electron spins: a correlated electron spin gives you a low-resistance path, and an opposing electron spin gives a high-resistance path. Then you can use spin-orbit coupling, apply current, transfer the spin, etc... take a look if interested :P
CCDs are a current-based technology, whereas CMOS is a voltage-based technology. As resolution increases, the heat from the current required for CCDs increases. This became a real problem for professional video camera makers in the early days of HD: the increased resolution of HD made the smaller CCD chips impossible because of the excess heat. While CMOS was a possible solution, the rolling-shutter effect of CMOS was disliked by professional video producers. Because it was the only way to shoot HD on smaller chips, it took time for the professional video market to accept this limitation.
I recently, and happily, acquired a used Phase One IQ280 digital back and XF body. I chose that over the Fujifilm GFX 100 II. I already have the Sony a7R IV... which is close enough to the GFX camera.
I'm sorry, but you are confusing bubble memory, which works by having segments of magnetic fields on the material, with CCDs, which use charges. CCDs were, BTW, used well into the 1980s as storage devices.
Eric Fossum is still active on the DPReview Photographic Science and Technology discussion group... he has many interesting threads about his research.
Modern CMOS image sensors (CIS) solved the fill-factor issue with back-side illumination (BSI) circa 2010. That approach then led to stacking (two-layer semiconductors) circa 2015, and now a triple stack is expected to be the big news of the 2023 iPhone release. This 3D semiconductor approach is specific to CIS for now but has much wider implications, such as memory stacking. It could be another story covered by this wonderful channel.
This is such a great channel. You might drop the Asian-focused byline and just do tech worldwide (which you mostly seem to do). You are really good at it. Thanks!
This makes me think of how in computer rendering you need to average many samples together per pixel to create a reasonable-quality image that isn't a steamy pile of garbage. If this was done at the hardware level in image sensors, I bet that would be awesome.
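A minimal numpy sketch of that averaging idea, with made-up numbers: averaging N independent noisy reads of the same pixel value shrinks the random noise roughly as 1/sqrt(N), which is the same statistics behind multi-frame averaging in cameras and sample accumulation in renderers.

```python
import numpy as np

# Average N independent noisy reads of the same "true" pixel value and
# watch the residual noise fall roughly as 1/sqrt(N). Numbers are made up.
rng = np.random.default_rng(0)
true_value = 100.0        # clean pixel intensity, arbitrary units
noise_sigma = 20.0        # assumed per-sample noise

for n_samples in (1, 4, 16, 64):
    # 100k independent pixels, each averaged over n_samples noisy reads
    reads = true_value + rng.normal(0.0, noise_sigma, size=(100_000, n_samples))
    averaged = reads.mean(axis=1)
    print(f"N={n_samples:3d}  measured noise {averaged.std():5.2f}  "
          f"theory {noise_sigma / np.sqrt(n_samples):5.2f}")
```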
I remember Photobit. I applied for a job with them at the end of the 1990s, after I finished my Master's. They had their offices near the University of Oslo, Blindern, in Oslo, Norway (I didn't get the job though...)
I was doing some reading on the Henschel Hs 117 surface-to-air missile and noticed its detonation system was described as "detonated by acoustic and photoelectric proximity fuses". I suspect they may have used an early photomultiplier tube, but I need to do more research.
I would argue that even today CCDs are far better sensors than CMOS, and their "shutter" can be done electronically as well, because the electrons on a back-side illuminated CCD have to migrate toward the wells under an applied electric field. The big reason why CMOS has "mostly" replaced CCDs is that it fits better with the fact that most modern microchips use CMOS technology in their design, and thus it is easier to manufacture for most fabs. Almost all modern astronomy cameras still use CCD technology as they have better dynamic range, higher quantum efficiency and lower noise. In the last few years I used a modern industrial CCD camera that could still outperform almost any CMOS camera in terms of low light and high QE for a particular application.
I wonder why manufacturers don't use some form of 3D layering (perhaps like AMD is doing with its 3D V-Cache?) to get the benefit of 100% coverage by the photo layer. It seems so obvious, and with massive potential benefits, so I have to assume it must be hellishly hard to manufacture.
Sony has been doing it recently, for about 2 years. From a December 16, 2021 press release: "Sony Develops World's First Stacked CMOS Image Sensor Technology with 2-Layer Transistor Pixel", which it says "widens dynamic range and reduces noise by approximately doubling saturation signal level".
If the sensor is basically capturing photons to displace electrons into wells why doesn’t it charge the battery? Does the amplifier drain the charge in excess of additional charge? Maybe the capture process can power the amplifier before the signal is transmitted to the decoder…
Could you do a video on CQDs (colloidal quantum dots) and SPADs (single-photon avalanche diodes)? Also, what is the potential of such technologies to replace CMOS?
Get 25% off Blinkist premium and enjoy 2 memberships for the price of 1! Start your 7-day free trial by clicking on this link: www.blinkist.com/asianometry
The pictures at 2:35 and 3:00 are heavily shopped. 📺
I am the blinkist! **starts blinking really hard** see?? **blinks even harder** aaarrrrrggghhhhh!!!! My eye lid muscles are massive!
Plus, I was tired. Bumbaclot!😅
Image sensors are some of the most interesting parts of digital technology today and something which most people take for granted, especially with how cheap these sensors have become.
I fly FPV so I definitely appreciate how good they have become. Transmitting HD video only requires 7-8 grams of hardware which can go 30 miles with the right antennas/conditions. Today it's all about latency, weight and image quality. I can't wait to see what the market looks like in 5 to 10 years.
No one takes it for granted; in fact most people don't even know it exists.
They are not digital at all 😅
When I was little, and this is in the early 1990s, the textbooks said that video cameras used vidicon tubes (look up what those are). When I was in high school they started having DV camcorders, not VHS ones or anything. These were fairly small (by the standards of the day) and took very clear videos. In those days a decent camcorder was about 1000 dollars, and the good ones had 3 CCD sensors, one for each color.
I imagine camcorders are a much smaller market today, for YouTubers or whatever, and most people will just be using cell phones. In the early 2000s cell phone cameras were utter shit; they took very grainy images at the best of times. But today their image quality is better than camcorders of the late 1990s.
@@yash_kambli that's literally what people mean when they say "take it for granted" lol.
CCDs were once used as high-speed serial memory devices. Tektronix back in the early 1980s produced a digital oscilloscope using CCD memory which could capture up to 500 million samples per second, faster than any other digital oscilloscope at the time.
Cool! Also check out the Tektronix 7912 from the 1970s. It was based on a special scan-converter CRT and was capable of recording 512 points in 5 nanoseconds. Very good for digitizing fast transient processes in nuclear physics and other similar applications.
@@cogoid Yeah I heard about that. They apparently had a trailer full of them to record data for underground nuclear testing.
@@douro20 Yes, they were using these digitizers to record the actual course of the chain reaction -- how fast the neutrons multiply in the bomb during the explosion. The main engineering difficulty in this is the enormous dynamic range of the signal. To record the entire transient with good accuracy, they used many different sensors to get good resolution at both the low end and the high end of expected values, with a bunch of digitizers working in parallel.
@@cogoid No I'm not checking out anything you suggest anymore - we both know what happened last time
@@douro20 Heard from who? Please send me their contact info
CCDs do not use the photoelectric effect (5:52). The photoelectric effect is when a photon exceeds the work function of a surface and creates a free electron. Meanwhile CCDs or CMOS image sensors are like solar cells: they create electron-hole pairs in the *bulk* of the crystal. The photon generates an exciton state, which is a weakly coupled electron and hole; in silicon this exciton is broken up by thermal means, and the electron and hole are separated by diffusion or field transport (classic diode transport).
OmGgg yessss thank you glad I'm not the only one
Jesus ...you really understand this magic ....I am impressed....
you are a god
It goes both ways 😉
So yes, the CIA are probably watching you through your displays. Don't even get me started on recording homes in 3D using wifi.
@@defeatSpace Oh no now they know I eat my cats hair I am going to burn all my devices
One thing that should be made clear in the video is the difference between the photoelectric effect and the photovoltaic effect.
The photoelectric effect is when a photon knocks an electron off a metal and that electron can conduct a current. This doesn't require a junction; Einstein explained it in 1905, the work that earned him the 1921 Nobel Prize. The energy of visible light can knock electrons off alkali metals.
The photovoltaic effect is when a photon creates an electron-hole pair in a doped PN junction. The junction separates the two and creates an electric current.
The latter is the effect utilised in CCD and CMOS image sensors.
The consumer market may have been taken over by CMOS, but for specialty applications like astronomy and spacecraft, CCD still dominates. Big telescopes, on Earth or in space, almost exclusively use CCDs for their superior image quality.
The smearing issue mentioned at 7:45 is a drawback for consumer products, but is actually a feature for ground-observing satellites. It happens because when the charges in one CCD pixel move to the next pixel, that pixel is still sensitive to light, so the readout process of a CCD is essentially like moving the entire sensor across the image field. But on a satellite, if the readout speed of the sensor matches the speed at which the image of the ground moves across the sensor, you can stabilize the image and get longer integration time for free. The CCD then operates like a scanner, and can churn out an image tens of thousands of pixels long by thousands wide. Many spin-stabilized spacecraft also take advantage of this feature, for example NASA's Juno and ESA's Gaia.
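A toy numpy sketch of that drift-scan / time-delay-integration idea, with invented dimensions and no noise model. It only shows how shifting the charge rows in step with the moving scene lets every ground line integrate over all the stages; it is not how any real satellite's readout works.

```python
import numpy as np

# Toy drift-scan / TDI model: the ground image moves down the columns by
# one row per clock, and the CCD's charge rows are shifted at the same
# rate, so each ground line accumulates signal over every TDI stage
# before it reaches the readout register.
rng = np.random.default_rng(1)
n_stages, n_cols, n_lines = 32, 6, 100           # TDI stages, columns, ground lines

scene = rng.uniform(0.0, 1.0, size=(n_lines, n_cols))   # ground radiance per line
charge = np.zeros((n_stages, n_cols))                    # charge currently on the CCD
image = []                                               # lines read out so far

for t in range(n_lines + n_stages - 1):
    # Exposure: stage i currently "sees" ground line (t - i).
    for i in range(n_stages):
        line = t - i
        if 0 <= line < n_lines:
            charge[i] += scene[line]
    # Readout: the last stage is clocked into the output register.
    bottom_line = t - (n_stages - 1)
    if 0 <= bottom_line < n_lines:
        image.append(charge[-1].copy())
    # Shift the charge pattern down one row, in step with the scene.
    charge = np.roll(charge, 1, axis=0)
    charge[0] = 0.0

image = np.asarray(image)
# Every line has integrated n_stages times longer than a single exposure:
print(np.allclose(image, n_stages * scene))   # True
```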
That's interesting, especially considering CMOS apparently got partially made viable by NASA.
To be fair though, does that mean that CCDs are inherently better for big telescopes, or that CMOS hasn't reached full viability for big-telescope sensors? Most of the push for CMOS seems to come from commercial and military cameras, which usually rely on smaller, lighter, more efficient sensors. Maybe research lags behind at large scale.
@@termitreter6545 Of note: yeah, they studied it and didn't pick it up. Yes, CCD needs more external hardware and so on, but the quality was better, especially compared to early CMOS.
One issue with CMOS, for example, is heat. Those transistors flipping on the pixel? That generates heat. Not much heat, but enough that it matters for the highest-end scientific use, since heat means thermal electrons, and thermal electrons mean noise. We are talking "we are cooling this in a liquid-nitrogen cryogenic dewar" levels of "we don't want thermal electrons". In that case the "needs external hardware" is a benefit, not a hindrance. Needs external hardware? Fantastic, we were planning anyway to move as much of the equipment and electronics as far away from the sensor chip as possible to isolate noise.
Plus those amplifiers on each CMOS pixel? Tiny and packed into a tight space, which again often means more noise. They have gotten better at it, but it is still a game of trying to stuff in lots of tiny amplifiers. CCD? It uses one big main amplifier, which can be made as close to perfect as possible with little regard for size.
That single amplifier brings another benefit: consistency. One wants a scientific sensor to be flat; not just "looks similar" flat, but perfectly flat and linear in response across the pixels, since one often uses those pixels for comparative measurements. If one pixel's amplifier responds a little differently from another, that is a problem; the response is not linear and predictable. So again, a "problem" of CCD is a benefit in this specialty case, since one can be sure the amplifier and analog-to-digital response is the same for all pixels, because there is only one amplifier and ADC circuit.
Again, though, this isn't a mere military- or industrial-grade situation. Those users are fine with the little noise and quibbles of CMOS; "the picture looks fine" and so on. It works.
This is "we hunt single photons; if we have 2 photons' worth of noise on the pixel, that is a problem". Nobody else would care about 2 photons' worth of noise, but a big telescope hunting for single photons from another galaxy does care.
That said, CMOS is also used in astronomy, where its features are a benefit. For example lucky imaging, which is based on taking lots of images quickly and picking the best ones with respect to atmospheric conditions. Obviously a fast-reading CMOS is better there.
So it's all about what the desired effect is, and for scientific cases the CCD's quirks are sometimes benefits. Plus the unbeaten raw quantum efficiency, where CMOS might be "as good as CCD", but in science "as good as" is not enough; one wants the best.
@@aritakalo8011 Counterpoint: all of the IR focal plane instruments flying on the JWST are CMOS based, in principle. The photosensitive absorber layer is read out by something similar to (if not exactly the same as) the CMOS pixel amplifier found in any modern commercial CMOS sensor.
The LEO satellites that were launched with scanning arrays are "old" in the grand scheme of things. Sure, they work fantastically well, but it can be readily assumed that if it can be replaced by a 2D focal plane, it will. Eventually. Or maybe it already has. Less integration time and more frames can make up for the loss in SNR by averaging.
Generally, any nonlinearities related to the amplifier can be calibrated out with a dark frame (or Correlated Double Sampling, as mentioned in the video) or by sweeping the reset voltage and checking the output of the amplifier. Yes, it's not convenient to have to perform a calibration, but it's generally good for a while, and it can be done as easily as taking an image (no physical access is required).
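A minimal sketch of the correlated double sampling idea mentioned above, with invented noise numbers: sampling each pixel right after reset and again after the exposure, then taking the difference, cancels the per-pixel reset offset while leaving only the (much smaller) read noise.

```python
import numpy as np

# Correlated double sampling, schematically: sample each pixel right after
# reset and again after integration; the difference removes the per-pixel
# reset (kTC) offset, leaving only the smaller read noise. Invented numbers.
rng = np.random.default_rng(2)
n_pixels = 100_000

true_signal = 500.0                                # electrons collected from light
reset_offset = rng.normal(0.0, 50.0, n_pixels)     # random reset level, pixel by pixel

sample_after_reset = reset_offset + rng.normal(0.0, 5.0, n_pixels)
sample_after_exposure = reset_offset + true_signal + rng.normal(0.0, 5.0, n_pixels)

single_sample = sample_after_exposure                  # dominated by reset noise
cds = sample_after_exposure - sample_after_reset       # reset offset cancels

print(f"single-sample noise ~ {single_sample.std():.1f} e-")   # ~50 e-
print(f"CDS noise           ~ {cds.std():.1f} e-")             # ~7 e- (sqrt(2) x read noise)
```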
What I have seen so far, at least when it comes to the big telescopes, is that they have both CCD and CMOS instruments, depending on the wavelengths and use cases.
For visible light CMOS is preferred, not only because of bigger sensor sizes but also because of higher resolution/smaller pixels.
You didn't mention Kodak among the CCD producers. They were a long-time leader in the field until Sony took over.
You didn't mention Sony in CCD producers. They were a long time leader in the field until Sony took over.
Kodak built the first digital camera. It was decided that digital cameras would take away from the extremely lucrative film business...
@@milantrcka121 Yes, that might be true in the consumer field, but Kodak's high-end CCDs were the best-performing devices until the late 2000s. By "the leader" I didn't mean in volume, but in performance. I would say somewhere in the mid-2000s Sony started to make CCD sensors of similar performance to Kodak's (whose imaging sensor division was later taken over by ON Semiconductor). Kodak sensors at the time were mostly too expensive for consumer cameras.
@@ti75425 No conspiracy present or implied. Just a fact supported by history.
@@100brsta Indeed! Kodak engineering was the best or second to none in many fields especially optics (satellite telescope mirrors) and commercial optics (Instamatic). Also special magnetics and materials. And of course CCDs...
As someone who majored in astrophysics, this is awesome! Most of the telescopes we used during our labs were CCD based if I'm remembering correctly...
Still are from what I'm told. Way more sensitive still.
No
Still are - CCDs utterly thrash CMOS for sensitivity - something telescopes require. CCDs are preferred in movie cameras too - rolling shutter from CMOS can ruin an action scene.
@@JohnnyWednesday All modern digital movie cameras use CMOS with global shutters. Scientific work, like telescopes, is the last bastion of CCDs
Great video! Worked 20 years of my life on CCDs. Seen their rise and decline.
Very well told!
Jon, another excellent essay. Next in the imaging category... stacked image sensors similar to the Foveon. Initial patents on the technology have run out or are about to. Newer process nodes with a better understanding of how to stack components, along with more sophisticated image-processing software and hardware, portend this to be the next 'big thing'.
There's a simpler cousin to the photomultiplier tube. A simple phototube, which works similarly, but does not have the intermediate "dynodes". Their most common use was to sample the audio soundtrack, on old movie film, and convert it to a voltage. The soundtrack was a strip next to the image fields, which would block more or less light, according to the recorded audio level. The light source was fairly bright, so a photomultiplier was not required.
Photomultipliers were, and still are, used in mostly scientific instruments. There was one consumer app, from the 1950s, where GM had several high end car models with actively dimming headlights. The photomultiplier was used to know when to momentarily power off the high beams, reducing the light going to the oncoming car. It was called the Autronic-Eye.
I remember those complicated Autronic Eye systems....
I NEVER did understand why the car manufacturers chose to fuck about with a high-voltage vacuum tube for that, instead of a MUCH simpler system using a CdS cell... 🤷🏻
Wow, how the *heck* do you manage to do the research your vids require, and still work a day job?! Just incredible content!
I was in the consumer imaging business for many years (still am, on a semi-retired basis). My understanding of what really killed off CCDs in consumer cameras was HD video. CCD image quality at the time was markedly superior, but you couldn’t clock them fast enough to do HD video on higher-resolution chips. You could clock data off a ~3 megapixel CCD chip fast enough to one way or another end up with VGA video, but as chips got bigger with more pixels CCDs couldn’t keep up.
It might have been more the power required for such fast readout rather than the absolute maximum clock speed; I just remember from talking with camera company engineers at the time that it was HD video that was the death knell. (There's some rough pixel-rate arithmetic after this comment.)
I remember being chagrined at the sharp drop in image quality that happened as a result; as you said, the noise was the problem and became particularly evident at higher ISO (light sensitivity) levels.
Another fantastic video, thanks!
(My one criticism: As someone else noted below, it’s not actually the photoelectric effect that’s at work, it’s that photons make hole-electron pairs, the same phenomenon that drives solar cells. Rather than turning the energy into a continuous output current though, sensor pixels simply collect the charge in the potential wells of diode structures. - Hence, photodiodes :-)
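Rough pixel-rate arithmetic for the HD-video point above, with illustrative numbers only; real sensors use multiple output taps, binning and cropping, so this is just to show the order of magnitude a single CCD output register would have to sustain.

```python
# Pixel-rate arithmetic: how fast data has to come off the sensor for video.
# Illustrative numbers; real sensors use several output taps, binning, etc.
def pixel_rate(width, height, fps):
    return width * height * fps        # pixels per second

for name, w, h, fps in [("VGA 30p", 640, 480, 30),
                        ("720p30", 1280, 720, 30),
                        ("1080p30", 1920, 1080, 30)]:
    print(f"{name:8s} ~ {pixel_rate(w, h, fps) / 1e6:5.1f} Mpixel/s")

# A CCD with a single output register must clock charge out at roughly this
# rate, which costs power and adds noise; a CMOS sensor reads columns in parallel.
```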
He has a team of researchers helping him out.
This channel is an absolute gold mine, thank you for all the work you do!
Many thanks for taking on this topic! 😊
"And because I was tired."
Understandable, have a nice day.
Preparing for an exam that includes CCD and CMOS image sensors, thank you. This is a life saver!
I suggest an episode focused solely on Canon's CMOS achievements. They committed to the technology early and produced superior results across digicam market segments.
Thanks for your series!
Very good episode! I've always wanted to see your take on imaging technology, and you've presented a good history on it.
I would love to see this continued into subjects like:
- Sony's history with imaging technology
- How cinema camera makers (e.g. ARRI, Red, Blackmagic) source their imaging sensors
-- IIRC, Blackmagic seems to source their sensor from ON Semi, which you mentioned in this episode
-- Furthermore, their popular Pocket Cinema Camera is so small that their circuitry overlaps, introducing noise. They counter by implementing noise cancellation features
When I think of bubble memory, I think of the non-volatile magnetic memory promoted by TI and others in the 1970s. The development of Winchester disks and CMOS memory put bubble memory to bed. I remember that at TI we were also attempting to develop CCD imaging. The problem was that the metal layer on top of the device made it difficult to make the collection area for each pixel as large as it could be. So a technique was developed of sandblasting the back of the chip so thin that light could come through the backside of the chip. Today we talk mostly about the successful ideas, but there was a lot of effort put into ideas that never made it.
10/10 on the Albert Einstein joke reference. 🤣
I could swear Kodak was involved in the early days of CCDs on satellites.
Yes, Kodak was.
Kodak was there for the first stay on the Moon; the guys had a 640x480 digital camera. This should be common knowledge. I don't know why Kodak didn't use that bragging right during the digital camera wars.
I learned about image sensors in radiology school. You did an exceptional job describing the technologies!
Just a minor issue, but when talking about the shutter of a camera in front of the sensor you are actually showing the aperture blades of a lens which are not the same thing.
Thanks, was going to note same. 😎✌️
10:14
Turn transistors 2 and 5 off, and turn transistor 7 on. Electrons flow in to charge the capacitor (3) negatively.
Now turn transistor 7 off, and transistor 2 on. Light hitting the photodiode (1) will allow some electrons to go to ground... reducing the charge.
Now turn transistor 2 off and transistor 5 on. Assuming transistor 4 is properly configured, electrons will flow through it in proportion to the voltage at the gate (which is the voltage in the capacitor). That current is what's being read by the ADC and converted to a subpixel.
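A toy numerical model of that reset / integrate / read-out cycle, in Python. It is not a circuit simulation; the voltages, sensitivity and ADC depth are arbitrary illustrative values, and the source follower and row select are collapsed into a single read_out step.

```python
# Toy model of the pixel cycle above: reset the storage node, let the
# photodiode discharge it during the exposure, then read what is left
# through the source follower into an ADC. Arbitrary units throughout.
V_RESET = 3.0        # node voltage right after reset
FULL_WELL = 3.0      # voltage swing of a saturated pixel

def expose(light_level, exposure_s, sensitivity=1.0):
    """Node voltage after the photodiode has drained charge for exposure_s."""
    drop = light_level * exposure_s * sensitivity
    return max(V_RESET - drop, 0.0)              # clips when fully discharged

def read_out(node_voltage, adc_bits=10):
    """Source follower buffers the node; the ADC turns it into a code."""
    signal = V_RESET - node_voltage              # brighter light -> bigger drop
    return round(signal / FULL_WELL * (2 ** adc_bits - 1))

for light in (0.0, 0.5, 1.5, 10.0):              # the last one saturates
    v = expose(light, exposure_s=1.0)
    print(f"light {light:4.1f} -> node {v:4.2f} V -> ADC code {read_out(v)}")
```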
As a hobby photographer, this is fascinating. Great work as always! Hope you got enough rest
Photodiodes make use of the PHOTOVOLTAIC EFFECT, not the photoelectric effect. Only alkali metals (sodium, potassium, etc.) have an energy threshold low enough to be within the visible light spectrum. The photovoltaic effect doesn't knock electrons out of the material, but promotes them into the conduction band. From there the doping of the silicon works its magic, attracting them toward one pole while they are repelled from the other.
Exactly lol. PE effect is the ejection of electrons above the vacuum level into free space.
In, Ga, Pb...
Good video. The reason CCDs are still around is the fact that they are global shutter by design. CMOS needs another storage node for that, making the pixel even more complex and requiring more space. With BSI technology and deep trench isolation they are able to make such small pixels with good dynamic range and sensitivity.
Literally the same is true for GS CCDs as well, no free lunch.
1:18 Ben Thompson says "hi" 😂
16:40 it also helped that those fabs that used to make CPUs and GPUs for the PS2 and PS3 transitioned into making CMOS sensors.
I remember working in a hybrid electronics lab back around 1985-88. We had to test bare chips that were bonded to ceramic substrate before packaging. We quickly found that when testing a class of CMOS and NMOS circuits, we had to turn the lab lights off and work with a subdued desk lamp otherwise crazy things would happen.
I'm all in on imaging. Thanks for this episode.
Kodak also made CCDs and sourced them for camera manufacturers like Leica and Olympus.
Thank you for this video on image sensors, it is very appreciated.
Greetings,
Anthony
An interesting offshoot of these chips is the impending IR / night vision CMOS revolution. T-Rex Arms did a video showing how much clarity you can get from a typical DSLR camera when you remove the IR filter and film at night. Unless there are export/sales restrictions, every American militiaman will have access to night vision rivaling the army's new IVAS / ENVG-B. Your phone's cameras will take better night-vision photos too.
CMOS sensors are still not sensitive enough to replace night vision devices, and their sensitivity in near IR is not so great either. So, even if you have a modern, monochrome CMOS sensor without any IR filters whatsoever, even a fairly old and cheap night vision tube can still beat it in low light conditions, if you want a real-time image at least, and don't want to use an external IR light source. That's from my experience: I have an IMX462-based astro camera, which is highly sensitive to IR (one of the most IR-sensitive consumer chips today), and an old British night vision tube (P8079HP), and that tube is clearly better.
However, if you use longer exposures (e.g. 1 s), then the camera can produce a much better image, though that's not much use for real-time observation (moving objects would get smeared, etc.).
We're not quite there yet, but not far away. There are some specialised sensors that can have great performance in (very) low light setting (eg. EMCCD), but they are quite expensive and mostly used for serious science. "Normal" NVG, even those high-end, would still probably be much cheaper.
Just had to drop a merci beaucoup for the amazing content once more. I'm always looking forward to the nuts-and-bolts approach this guy takes on these topics. 🙏
Huhuhuhuh baguette 🥖
Kinda interesting that right now CCD sensors are seeing somewhat of a resurgence, the big problem still being the amount of energy needed... while CMOS sensors have hit a wall.
Love your channel and videos; I was hoping for a bit more fundamental explanation of how CCD and CMOS work and how their function improved over generations. Still a great topic.
The photoelectric effect was actually used in camera tubes as far back as the 1920s.
The thing at 7:50 is the aperture mechanism, not the shutter. The shutter on most DSLRs is made of two flaps: one that opens and another that closes. On very old cameras it's a blade with a hole that lets light in for the desired time by matching the hole with the camera's permanent opening, which usually had a small glass lens in front of it. (Old means the first half of the 20th century.)
The picture at 5:30 shows a magnetic domain memory chip. From my understanding, it's a different type of memory based on a different (and unrelated) physical principle of "magnetic bubbles". So that is not a CCD chip and it's based on a different technology than CCD chips. (Incidentally, I'm not sure the inventors of CCDs used the term "bubbles" for anything other than magnetic domains in that other technology. But I may be mistaken.)
I can see the source of confusion, it's confusing for me as well, because I think magnetic bubbles served as the inspiration for CCDs, of sorts, and they share the general idea and the high level structure. It's just that those two are based on very different physical principles.
Thanks for the video as always. I am a complete amateur in digital sensor tech, but I read that the latest jet fighter, the F-35, has an EOTS system that is capable of seeing and targeting something the size of a suitcase from 50 km+ away. I bet that is sensor tech at its extreme (at least by today's standards). I wonder which companies possess this state-of-the-art sensor tech, and how far behind the US's competitors are.
I also read that there are three primary types of chip tech these days: logic chips, memory chips, and sensor chips. I always want to know which type of chip is the most difficult to develop, and in what way. I bet memory chips are the easiest to crack, but I'm unsure whether logic or sensor chips are more difficult to advance in the long run. 🙏
Sadly no mention of BSI and dual gain, two tricks that actually let CMOS wipe out CCD even in low light without active cooling. And the ARRI ALEXA sensor's clever trick of measuring not only the charge, but also the time until the capacitor gets full (oversaturated), which yields even more information and a dynamic range that surpasses even analog cinema film.
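A toy sketch of the time-to-saturation idea described in that comment, with invented numbers and not ARRI's actual implementation: if a pixel fills its well before the exposure ends, the time at which it clipped still encodes the brightness, so highlights far beyond the well capacity can be recovered.

```python
# Toy "time to saturation" model: a clipped pixel's fill time still encodes
# its brightness, extending dynamic range beyond the well capacity.
# Illustrative only, not ARRI's actual circuit.
FULL_WELL = 10_000.0      # electrons
EXPOSURE = 1.0            # seconds

def measure(flux):
    """Return (charge at end of exposure, saturation time or None)."""
    t_sat = FULL_WELL / flux if flux > 0 else None
    if t_sat is not None and t_sat < EXPOSURE:
        return FULL_WELL, t_sat                  # clipped: remember when it filled
    return flux * EXPOSURE, None                 # normal linear reading

def estimate_flux(charge, t_sat):
    return charge / EXPOSURE if t_sat is None else FULL_WELL / t_sat

for flux in (2_000, 9_000, 50_000, 400_000):     # electrons per second
    charge, t_sat = measure(flux)
    print(f"true flux {flux:7d} e/s -> estimated {estimate_flux(charge, t_sat):9.0f} e/s")
```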
Charge injection device 🤔
I remember "bubble" as magnetic memory. The CCD was called a "bucket brigade device" at the time, and could be used as an analogue delay line.
BBD chips are still used to produce echo and especially chorus effects for guitars and synths. While this could be done digitally, analogue synths and effects have become fashionable enough that BBD chips are back in production.
Photomultiplier tubes are still a current technology; there is no modern replacement for them. They're basically the only kind of vacuum tube that's still being produced in large volumes for serious applications (aside from magnetrons).
All TRUE night vision devices use them, as there is no solid-state alternative that works nearly as well as a photomultiplier tube does.
There are still other types of vacuum tubes made in large quantities for today's applications.
Almost EVERY household in America uses a fairly high powered vacuum tube in their kitchens on a daily basis.... The Magnetron in their microwaves.
Edit: Oops, didn't see the Magnetron listed at the end of your post...
Audio tubes are another type that is still produced in large quantities today. MANY musical instrument amplifiers still use tubes today for their pleasing-sounding distortion when pushed beyond their ratings.
Transmitting tubes are still alive today also.
Really tip my hat to the scientists and developers that had strength of stomach and grit to risk their years and fortunes on these gambles..🙏
First a quick note: Caltech does not capitalize the T, even though it seems like it's an abbreviation. Second, I also wanted to mention the obscure sensor technology from Foveon that tried to use multi-layered photodiodes to capture different wavelengths without a filter. Unfortunately it was complicated enough to fabricate that they could not easily ride process scaling to produce better and better chips, though they still struggle on as part of Sigma and promise they'll have their next-generation sensor out any day now.
The sensitivity of the Foveon sensor was also not great - they rely on different wavelengths being able to penetrate different depths into the sensor - red being able to penetrate the deepest. However if the red bit isn't strong enough it doesn't get detected in the red layer and can cause issues...
Basically they just decided it's not a worthwhile tradeoff for most applications, especially when demosaicing has been refined to the point that it's good enough and cheap enough for 99% of colour applications (a rough sketch of the bilinear idea follows below).
Still a fan of mono sensors though!
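Since demosaicing keeps coming up: here is a rough sketch of the simplest possible version, plain bilinear interpolation of an RGGB Bayer mosaic via normalized convolution. Real camera pipelines use much cleverer edge-aware algorithms; this is only meant to show what "filling in the missing two colours per pixel" means.

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear_rggb(raw):
    """Very rough bilinear demosaic of an RGGB Bayer mosaic (2-D array)."""
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    masks = {
        "R": (yy % 2 == 0) & (xx % 2 == 0),
        "G": (yy % 2) != (xx % 2),          # the two green sites in each 2x2 block
        "B": (yy % 2 == 1) & (xx % 2 == 1),
    }
    kernel = np.ones((3, 3))
    rgb = np.zeros((h, w, 3))
    for i, ch in enumerate("RGB"):
        m = masks[ch].astype(float)
        # Normalized convolution: average whatever samples of this colour
        # exist in each pixel's 3x3 neighbourhood.
        num = convolve2d(raw * m, kernel, mode="same", boundary="symm")
        den = convolve2d(m, kernel, mode="same", boundary="symm")
        rgb[..., i] = num / den
    return rgb

# A flat grey scene comes back flat in all three planes.
flat = np.full((4, 4), 100.0)
print(np.allclose(demosaic_bilinear_rggb(flat), 100.0))  # True
```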
A really interesting imaging tech is the Foveon sensor. It's based on how far light of different energies penetrates into silicon.
Sure is, but try managing the crosstalk between the colours. I think that makes this technology way too expensive or unusable.
Ehh... PMTs are not imaging devices; you meant camera tubes like the Vidicon. CID sensors were completely skipped. CDS was originally used with CCDs to get rid of the reset noise. EMCCD, which is basically a solid-state imaging PMT, was also missed. Kodak and their transparent-gate technology, SITe and their early back-illuminated CCDs (which were used on space probes), Foveon and LBCAST sensors were all missed. And is there a CMOS sensor with an ADC in each pixel? There are ones with an ADC for every column, that's all. Many, many interesting technical details and historical events were ignored or incorrectly presented.
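On the CDS point, a tiny simulation of why it works: the kTC reset noise is set once when the pixel is reset, so it appears identically in the reference sample and the signal sample, and subtracting the two cancels it. All noise figures below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                   # pixels

true_signal = 500.0                           # photo-electrons
reset_noise = rng.normal(0.0, 30.0, n)        # kTC noise, frozen at reset time
read_noise_1 = rng.normal(0.0, 3.0, n)        # noise of the reference sampling
read_noise_2 = rng.normal(0.0, 3.0, n)        # noise of the signal sampling

reference = reset_noise + read_noise_1                # sample right after reset
signal = true_signal + reset_noise + read_noise_2     # sample after charge transfer

print(round(np.std(signal), 1))               # ~30: a single sample keeps the reset noise
print(round(np.std(signal - reference), 1))   # ~4: CDS cancels the common reset offset
```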
15:42 You said "own version of Moore's Law" at the same time I got a tweet notification informing me of the passing of Gordon Moore 😢
The odds.
10:13 😁
Why would i need Blinkist, when i got you...
I believe that it's more appropriate to speak of photovoltaic rather than photoelectric effect in this case, as electrons aren't ejected from the material, but promoted from valence to conduction band.
Thank you for the clear information.
Thank you for taking on such a difficult subject. Optics are a rabbit hole.
You are an absolute champion of the people for providing such accurate and useful information in a way that normal working class individuals can understand.
There is a lot more room for advances in this area. Imagine a single-photon detector on top of each column of a 3D memory, so that photons are counted digitally at each pixel as they come in. The next step beyond that is to bin the counts by photon energy level, i.e. colour. Then you have every pixel as sensitive as possible and acting as a spectrometer.
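As a data structure, that idea is basically a small energy histogram per pixel. A toy sketch, with completely made-up bin edges and event format, just to show the shape of it:

```python
import numpy as np

# 4 coarse energy bins per pixel (roughly NIR, R, G, B) -- purely illustrative.
H, W, BINS = 480, 640, 4
bin_edges_ev = np.array([1.1, 1.7, 2.1, 2.6, 3.3])   # photon energies in eV
frame = np.zeros((H, W, BINS), dtype=np.uint32)

def record_photon(y, x, energy_ev):
    """Count one detected photon into the pixel's energy histogram."""
    b = np.searchsorted(bin_edges_ev, energy_ev) - 1
    if 0 <= b < BINS:
        frame[y, x, b] += 1

# Simulate a burst of green-ish (~2.3 eV) photons hitting one pixel.
rng = np.random.default_rng(0)
for _ in range(1000):
    record_photon(240, 320, rng.normal(2.3, 0.05))

print(frame[240, 320])        # almost all counts land in the 2.1-2.6 eV bin
print(frame[240, 320].sum())  # total photons seen by that pixel
```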
Better light traps inspired by photosynthesis.
@@brodriguez11000 The quantum requirement of photosynthesis is 8 photons. The ideal system is superconducting nanowires, which exist now, but the superconducting part would be hard in a consumer device.
There's a limit to how far we can push sensor technology. QE on modern sensors is already very high, and the noise at high gain (high ISO) is already dominated by shot noise, the discrete nature of light. SPAD-based sensors would be a huge step up for dynamic range, though, as we would no longer be limited by each pixel's well capacity. But unfortunately I don't think reading out the energy of an incoming photon is something that can be done with current semiconductor technology? Maybe some 'quantum' sensors can do that? Idk
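The "dominated by shot noise" point in numbers: photon arrivals are Poisson, so even with a perfectly noiseless readout the SNR only grows as the square root of the collected signal. A quick illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

for mean_photons in (10, 100, 1000, 10000):
    samples = rng.poisson(mean_photons, size=100_000)
    snr = samples.mean() / samples.std()
    # For a Poisson process SNR ~= sqrt(N), so each 10x more light only buys
    # ~3.16x better SNR -- even with a perfect, noiseless readout chain.
    print(f"{mean_photons:>6} photons  SNR ~ {snr:6.1f}  (sqrt(N) = {mean_photons**0.5:6.1f})")
```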
That ain't gonna happen lmao
@@HailAzathoth I don't know, the individual parts already exist, why specifically do you think that they can't be integrated in the future?
I would love to see a video about thermal imaging
I understood CCDs were primarily used for astronomy and had the advantage of allowing extended exposure times with less noise; I think the term "light bucket" was used for this. And, as the video says, also for non-visible light. As CMOS improves it has also started being used in astronomy. I wonder how this relates to infrared cameras?
All silicon devices are inherently sensitive to light, especially infrared. Both CCD and CMOS image sensors are very sensitive to infrared. In normal visible-light cameras, we use IR-blocking filters (visible at 7:00, it is the pale blue glass window of the sensor module). In pure infrared cameras, we use visible-light-blocking filters (conceptually similar to the purplish-black infrared windows on many remote controls and the devices they control).
UV is a different matter: Silicon tends not to be particularly sensitive to it, so they have to deliberately design them for it. (Together with the choice of lens materials, since many common types of glass and optical plastic tend to block UV.)
Thank you for an interesting video. I worked on CCDs, using them as analogue data storage in 1972 and found out the hard way that they were light sensitive as my test circuits failed when the sun came out. I haven't had a cause to use them since. Though I did have a Nokia N95 phone for a while. I always wondered why they were being replaced by CMOS.
It's worth noting that CCDs are essential for telescopes and they're vastly preferred in movie cameras as they aren't subject to rolling shutter.
There are global shutter CMOS chips and have been for several years now. CMOS has basically taken over the amateur astrophotography market as well because of superior noise characteristics - all my astronomy cameras have been CMOS. CCDs still have some relevance for scientific purposes (I think?) but CCDs are kind of dinosaur tech at the amateur level.
Thanks so much, that's something I was unaware of.
Perfect timing. I'm currently working on a digital camera for a large telescope.
I was just thinking about this last week....perfect.
Two pieces of wrong information I found:
1. 2:13 This diagram is not related to the photoelectric effect. It shows Hertz's other experiment, the one that proved the existence of electromagnetic waves.
2. You mentioned that PMTs have 5 to 7 dynodes. Actually they may have many more; 10- and 12-dynode PMTs are easily available. We use 12-stage PMTs in our lab.
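For scale, a rough back-of-the-envelope on why the extra dynodes matter: overall PMT gain goes roughly as the secondary-emission ratio per dynode raised to the number of stages (the ratio below is an illustrative value, not a measured one).

```python
# Rough PMT gain estimate: gain ~ delta ** n_dynodes,
# where delta is the secondary-emission ratio per dynode (illustrative value).
delta = 4.0
for n_dynodes in (6, 10, 12):
    print(n_dynodes, f"{delta ** n_dynodes:.1e}")
# 6  4.1e+03
# 10 1.0e+06
# 12 1.7e+07
```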
Seeing how the CCD transfers data, as opposed to crossbar AM arrays, reminded me of racetrack memory. Racetrack memory stores information using electron spins: a correlated electron spin gives you a low-resistance path, and an opposing electron spin gives a high-resistance path. Then you can use spin-orbit coupling, apply current, transfer spins, etc... take a look if interested :P
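The shift-register analogy in code, for what it's worth: a toy full-frame CCD readout where rows are clocked down into a serial register, which then clocks each charge packet to the output one at a time. No noise or transfer losses are modelled; everything here is illustrative.

```python
import numpy as np

def read_out_ccd(frame):
    """Toy full-frame CCD readout: rows are clocked down into a serial
    register, which then clocks each charge packet out to the amplifier."""
    frame = frame.astype(float).copy()
    out = []
    n_rows, _ = frame.shape
    for _ in range(n_rows):
        serial_register = list(frame[-1])        # bottom row drops into the serial register
        frame[1:] = frame[:-1].copy()            # remaining rows shift down by one
        frame[0] = 0.0                           # an empty row enters at the top
        while serial_register:
            out.append(serial_register.pop())    # packets reach the output one at a time
    return np.array(out)

img = np.arange(12, dtype=float).reshape(3, 4)
print(read_out_ccd(img))   # pixels come out in reverse raster order: 11, 10, ..., 0
```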
Thank you! This was fantastic. Even the comment section is awesome and informative; that's when you know it's great.
Wow. Super cool to Learn. I'm old. I remember full size camcorders and even 3 ccd cams.
The image at 2:16 doesn't have anything to do with light. It's an early demonstration of radio waves.
I like how they show a woman using a vacuum cleaner while she listens to their digests! I really want it! That silent vacuum cleaner, I mean.
I appreciate your channel. Thanks!
Please Make an episode about phased array radar and qam modulation
Yeah finally drones "coverage" baby!
CCDs are a current-based technology, whereas CMOS is a voltage-based technology. As resolution increases, the heat from the current required by CCDs increases. This became a real problem for professional video camera makers in the early days of HD: the increased resolution of HD made smaller CCD chips impossible because of the excess heat. While CMOS was a possible solution, the rolling-shutter effect of CMOS was disliked by professional video producers. Because this was the only solution for shooting HD on smaller chips, it took time for the professional video market to accept this limitation.
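To visualize the rolling-shutter objection: each row is exposed and read a little later than the one above, so a vertical edge moving across the scene comes out slanted. A toy simulation with made-up timing numbers:

```python
import numpy as np

rows, cols = 8, 24
row_readout_time = 1.0          # time offset between consecutive rows (arbitrary units)
edge_speed = 2.0                # the bright edge moves this many columns per time unit

frame = np.zeros((rows, cols), dtype=int)
for r in range(rows):
    t = r * row_readout_time            # this row is sampled later than the one above
    edge_x = int(4 + edge_speed * t)    # where the moving edge is at that moment
    frame[r, :edge_x] = 1               # everything left of the edge is bright

for row in frame:
    print("".join("#" if v else "." for v in row))
# A vertical edge in the scene shows up slanted: the rolling-shutter skew.
```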
Photo CCD. Not to be confused with AMD's CCDs which are Core Complex Dies.
I happily, recently acquired a used Phase One IQ 280 digital back & XF body. I chose that over the Fujifilm GFX 100 II. I already have the Sony a7rIV... which is close enough to the GFX camera.
Great explanation, thanks!
I'm sorry, but you are confusing bubble memory, which works by having segments of magnetic fields move around the material, with CCDs, which use charges. CCDs were, BTW, used as storage devices well into the 1980s.
What do you think abut the Dynamic Vision Sensors? (Event cameras)
The g in 'Hughes' is silent. Thanks for the video.
Eric Fossum is still active on the DPReview Photographic Science and Technology discussion group... he has many interesting threads about his research.
Always hoped the Foveon sensor got used more, kind of interesting to see a third player.
Modern CMOS image sensors (CIS) solved the fill-factor issue with Back Side Illumination (BSI) around 2010. That approach then led to stacking (two-layer semiconductors) around 2015, and a triple stack is expected to be the big news of the 2023 iPhone release. This 3D semiconductor approach is specific to CIS for now but has much wider implications, such as memory stacking; that could be another story covered by this wonderful channel.
This is such a great channel. You might drop the Asian-focused byline and just do tech worldwide (which you mostly seem to do). You are really good at it. Thanks!
Have you already done an episode on bubble memory?
What about that Sigma Foveon image sensor?
This makes me think of how, in computer rendering, you need to average many samples per pixel to create a reasonable-quality image that isn't a steaming pile of garbage. If this were done at the hardware level in image sensors, I bet it would be awesome.
I remember Photobit. I applied for a job with them at the end of the 1990s, after I finished my Masters. They had their offices near the University of Oslo, at Blindern in Oslo, Norway (I didn't get the job though...)
I was doing some reading on the Henschel Hs 117 surface-to-air missile and noticed its detonation system was described as "detonated by acoustic and photoelectric proximity fuses". I suspect it may have used an early photomultiplier tube, but I need to do more research.
Can you do a video on the new m3p battery that catl is producing?
I would argue that even today CCDs are far better sensors than CMOS, and their "shutter" can be done electronically as well, because the electrons in a back-side-illuminated CCD have to migrate toward the wells under an applied electric field. The big reason CMOS has "mostly" replaced CCDs is that it fits with the fact that most modern microchips use CMOS technology in their design, and is thus easier to manufacture for most fabs. Almost all modern astronomy cameras still use CCD technology, as they have better dynamic range, higher quantum efficiency and lower noise. In the last few years I used a modern industrial CCD camera that could still outperform almost any CMOS camera in terms of low light and high QE for a particular application.
Please make video on America's recent breakthrough in Quantum computing and ChatGPT!
Lol you really got to do an appropriate “that X?… Albert Einstein” joke 😂
4:26 forbidden brie wedge
I wonder why the manufacturers don't use some form of 3D layering (perhaps like what AMD is using for their 3D V-Cache?) to get the benefit of 100% coverage by the photo layer.
It seems so obvious, and with massive potential benefits, so I have to assume it must be hellishly hard to manufacture.
Sony has been doing it recently, for about two years now. From their press release of December 16, 2021: "Sony Develops World's First Stacked CMOS Image Sensor Technology with 2-Layer Transistor Pixel; Widens Dynamic Range and Reduces Noise by Approximately Doubling Saturation Signal Level."
What about SPAD based image sensor ICs?
the more a global shutter / parallel shift registers / synchronous adding and processing makes sense
If the sensor is basically capturing photons to displace electrons into wells, why doesn't it charge the battery? Does the amplifier drain the charge in excess of the additional charge? Maybe the capture process could power the amplifier before the signal is transmitted to the decoder...
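A rough back-of-the-envelope on the battery question (well depth, pixel count and frame rate below are illustrative): even a fully saturated large sensor collects only microwatts of photo-generated power, orders of magnitude less than its own readout electronics consume.

```python
# Rough order-of-magnitude estimate -- all values illustrative.
photon_energy_j = 3.6e-19      # ~2.25 eV, a green photon
full_well_e = 50_000           # electrons a typical pixel well can hold
pixels = 50e6                  # pixels in a large sensor

# Even if every pixel were fully saturated every frame at 30 fps:
energy_per_frame_j = photon_energy_j * full_well_e * pixels
power_w = energy_per_frame_j * 30

print(f"{energy_per_frame_j:.2e} J per frame")   # ~9e-7 J
print(f"{power_w:.2e} W")                        # ~2.7e-5 W: microwatts, far below what
                                                 # the readout electronics themselves draw
```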
CCDs are still used in scientific instruments, where they are cooled to very low temperatures to mitigate thermal noise.
“And now it is time for me to explain how this accursed thing works”. 😂
Could you do a video on CQDs (Colloidal quantum dots ) and SPAD (single-photon avalanche diode) also how's the potential of such technology to replace CMOS?
Bro did that artfully.
Cool. Thanks for sharing.