The Chips That See: Rise of the Image Sensor

  • Published: 26 Dec 2024

Comments • 261

  • @Asianometry
    @Asianometry  1 year ago +19

    Get 25% off Blinkist premium and enjoy 2 memberships for the price of 1! Start your 7-day free trial by clicking on this link: www.blinkist.com/asianometry

    • @donaldharlan3981
      @donaldharlan3981 1 year ago +2

      The pictures at 2:35 and 3:00 are heavily photoshopped. 📺

    • @annoloki
      @annoloki 1 year ago

      I am the blinkist! **starts blinking really hard** see?? **blinks even harder** aaarrrrrggghhhhh!!!! My eye lid muscles are massive!

    • @Jblow-u2m
      @Jblow-u2m 4 months ago

      Plus, I was tired. Bumbaclot!😅

  • @mcmann7149
    @mcmann7149 1 year ago +205

    Image sensors are some of the most interesting parts of digital technology today and something which most people take for granted, especially with how cheap these sensors have become.

    • @ChrisHarmon1
      @ChrisHarmon1 1 year ago +11

      I fly FPV so I definitely appreciate how good they have become. Transmitting HD video only requires 7-8 grams of hardware which can go 30 miles with the right antennas/conditions. Today it's all about latency, weight and image quality. I can't wait to see what the market looks like in 5 to 10 years.

    • @yash_kambli
      @yash_kambli 1 year ago +4

      No one takes it for granted; in fact, most people don't even know it exists.

    • @OlorinEa
      @OlorinEa 1 year ago

      They are not digital at all 😅

    • @taiwanluthiers
      @taiwanluthiers 1 year ago

      When I was little, and this is in the early 1990s, my textbook said that video cameras used vidicon tubes (look up what that is). When I was in high school they started having DV camcorders, not VHS ones or anything. These were fairly small (by the standards of the day) and took very clear video. In those days a decent camcorder was about 1000 dollars, and the good ones had 3 CCD sensors, one for each color.
      I imagine camcorders today are a much smaller market, for youtubers or whatever, and most people will just be using cell phones. In the early 2000s cell phone cameras were utter shit; they took very grainy images at the best of times. But today their image quality is better than camcorders of the late 1990s.

    • @backgammonbacon
      @backgammonbacon 11 months ago

      @@yash_kambli that's literally what people mean when they say "take it for granted" lol.

  • @douro20
    @douro20 1 year ago +124

    CCDs were once used as high-speed serial memory devices. Tektronix back in the early 1980s produced a digital oscilloscope using CCD memory which could capture up to 500 million samples per second, faster than any other digital oscilloscope at the time.

    • @cogoid
      @cogoid 1 year ago +22

      Cool! Also check out Tektronix 7912 from 1970s. It was based on a special scan converter CRT and was capable of recording 512 points in 5 nanoseconds. Very good for digitizing fast transient processes in nuclear physics and other similar applications.

    • @douro20
      @douro20 1 year ago +14

      @@cogoid Yeah I heard about that. They apparently had a trailer full of them to record data for underground nuclear testing.

    • @cogoid
      @cogoid 1 year ago +15

      @@douro20 Yes, they were using these digitizers to record the actual course of the chain reaction -- how fast the neutrons multiply in the bomb during the explosion. The main engineering difficulty in this is the enormous dynamic range of the signal. To record the entire transient with good accuracy, they used many different sensors to get good resolution at both the low end and the high end of expected values, with a bunch of digitizers working in parallel.

    • @TimPerfetto
      @TimPerfetto 1 year ago

      @@cogoid No I'm not checking out anything you suggest anymore - we both know what happened last time

    • @TimPerfetto
      @TimPerfetto 1 year ago

      @@douro20 Heard from who? Please send me their contact info

  • @drwho9437
    @drwho9437 1 year ago +139

    CCDs do not use the photoelectric effect (5:52). The photoelectric effect is when a photon's energy exceeds the work function of a surface and creates a free electron. CCDs and CMOS image sensors, by contrast, are like solar cells: they create electron-hole pairs in the *bulk* of the crystal. The photon generates an exciton state, a weakly coupled electron and hole; in silicon this exciton is broken up by thermal means, and the electron and hole are separated by diffusion or field transport (classic diode transport).
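
A quick back-of-envelope sketch of the distinction above, using standard physical constants, the silicon bandgap, and an assumed ~4.5 eV work function for a typical metal (illustrative figures, not from the video):

```python
# Back-of-envelope: photon energy vs. the silicon bandgap and a typical
# metal work function (illustrative values only).
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron-volt

SI_BANDGAP_EV = 1.12          # silicon bandgap at room temperature
METAL_WORK_FUNCTION_EV = 4.5  # rough work function of a typical metal surface

for wavelength_nm in (450, 550, 650, 900, 1100):
    energy_ev = H * C / (wavelength_nm * 1e-9) / EV
    print(f"{wavelength_nm:>4} nm photon: {energy_ev:4.2f} eV | "
          f"makes e-h pair in Si bulk: {energy_ev > SI_BANDGAP_EV} | "
          f"ejects photoelectron from metal: {energy_ev > METAL_WORK_FUNCTION_EV}")
```

The same ~1.1 eV threshold is also why silicon sensors stop responding at roughly 1100 nm.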

    • @TimPerfetto
      @TimPerfetto 1 year ago +4

      OmGgg yessss thank you glad I'm not the only one

    • @elizabethwinsor-strumpetqueen
      @elizabethwinsor-strumpetqueen 1 year ago +7

      Jesus ...you really understand this magic ....I am impressed....

    • @aternias
      @aternias 1 year ago +2

      you are a god

    • @defeatSpace
      @defeatSpace 1 year ago +5

      It goes both ways 😉
      So yes, the CIA are probably watching you through your displays. Don't even get me started on recording homes in 3D using wifi.

    • @TimPerfetto
      @TimPerfetto 1 year ago +3

      @@defeatSpace Oh no now they know I eat my cats hair I am going to burn all my devices

  • @rayoflight62
    @rayoflight62 1 year ago +9

    One thing that should be made clear in the video is the difference between the photoelectric effect and the photovoltaic effect.
    The photoelectric effect is when a photon knocks an electron off a metal and that electron can conduct a current. This doesn't require a junction; its explanation earned Einstein the 1921 Nobel Prize. The energy of visible light can knock electrons off alkali metals.
    The photovoltaic effect is when the photon creates an electron-hole pair in a doped PN junction. The junction separates the two and creates an electric current.
    The latter is the effect utilised in CCD and CMOS image sensors.

  • @Xiaotian_Guan
    @Xiaotian_Guan 1 year ago +41

    The consumer market may have been taken over by CMOS, but for specialty applications like astronomy and spacecraft, CCD still dominates. Big telescopes, on Earth or in space, almost all exclusively use CCDs for their superior image quality.
    The smearing issue mentioned at 7:45 is a drawback for consumer products, but is actually a feature for ground-observing satellites. It happens because when the charges in one CCD pixel move to the next pixel, that pixel is still sensitive to light, so the readout process of a CCD is essentially like moving the entire sensor across the image field. But on a satellite, if the readout speed of the sensor matches the speed at which the image of the ground moves across the sensor, you can stabilize the image and get longer integration time for free. The CCD now operates like a scanner, and can churn out an image tens of thousands of pixels long by thousands wide. Many spin-stabilized spacecraft also take advantage of this feature, for example NASA's Juno and ESA's Gaia.

    • @termitreter6545
      @termitreter6545 1 year ago +3

      That's interesting, especially considering CMOS apparently was partially made viable by NASA.
      To be fair though, does that mean that CCDs are inherently better for big telescopes, or that CMOS hasn't reached full viability for big telescope sensors? Most of the push for CMOS seems to come from commercial and military cameras, which usually rely on smaller, lighter and more efficient sensors. Maybe research lags behind at large scale.

    • @aritakalo8011
      @aritakalo8011 1 year ago +10

      @@termitreter6545 Of note is... yeah, they studied it and didn't pick it up, since CCD needs more external hardware and so on, but the quality was better. Especially compared to early CMOS.
      One issue with CMOS is, for example, heat. Those transistors flipping on the pixel? That generates heat. Not much heat, but enough that it matters at the highest end of scientific use. Heat means thermal electrons, and thermal electrons mean noise. We are talking "we are cooling this in a liquid-nitrogen cryogenic dewar" levels of "we don't want thermal electrons". In that case the "needs external hardware" is a benefit, not a hindrance. Needs external hardware? Fantastic, we were planning anyway to keep as much of the equipment and electronics as far away from the sensor chip as possible to isolate noise.
      Plus those amplifiers on each CMOS pixel? Tiny and packed into a tight space, which again often means more noise. They have gotten better at it, but it is still a game of "trying to stuff in a lot of tiny amplifiers". CCD? Uses one big main amplifier, which can be made as close to perfect as possible with little regard for size.
      That single amplifier brings another benefit: consistency. One wants the scientific sensor to be flat, not just "looks similar" flat but perfectly flat and linear in response across the pixels, since one often uses those pixels for comparative measurements. If one pixel's amplifier responds a little differently from another, that is a problem: the response is not linear and predictable. So again a "problem" of CCD is a benefit in this speciality case, since one can be sure the amplifier and analog-to-digital response is the same for all pixels, there being only one amplifier and ADC circuit.
      This isn't a mere military- or industrial-grade situation; those users are fine with the little noises and quirks of CMOS. "The picture looks fine" and so on. It works.
      This is "we hunt single photons; if we have 2 photons' worth of noise on the pixel, that is a problem". Nobody else would care about 2 photons' worth of noise, but a big telescope hunting for single photons from another galaxy does care.
      However, CMOS is also used in astronomy where its features are a benefit. For example lucky imaging, which is based on taking a lot of images fast and picking the best ones with respect to atmospheric conditions. Obviously then, a fast-reading CMOS is better.
      So it's all about "what are the desired effects", and for scientific cases sometimes CCD's quirks are benefits. Plus the unbeaten raw quantum efficiency, where CMOS might be "as good as CCD", but in scientific work "as good as" is not enough; one wants the best.

    • @pulpufictione
      @pulpufictione 1 year ago +2

      @@aritakalo8011 Counterpoint: all of the IR focal plane instruments flying on the JWST are CMOS based, in principle. The photosensitive absorber layer is read out by a similar structure (if not exactly the same) as the CMOS pixel amplifier found in any modern commercial CMOS sensor.
      The LEO satellites that were launched with scanning arrays are "old" in the grand scheme of things. Sure, they work fantastically well, but it can be readily assumed that if it can be replaced by a 2D focal plane, it will be. Eventually. Or maybe it already has. Less integration time and more frames can make up for the loss in SNR by averaging.
      Generally, any nonlinearities related to the amplifier can be calibrated out with a dark frame (or Correlated Double Sampling, as mentioned in the video) or by sweeping the reset voltage and checking the output of the amplifier. Yes, it's not convenient to have to perform a calibration, but it's generally good for a while, and it can be done as easily as taking an image (no physical access is required).

    • @harrison00xXx
      @harrison00xXx 1 year ago +1

      From what I have seen so far, at least when it comes to the big telescopes, they have CCD and CMOS instruments, depending on the wavelengths and use cases.
      For visible light CMOS is preferred, not only because of bigger sensor sizes but also because of higher resolution/smaller pixels.

  • @100brsta
    @100brsta 1 year ago +98

    You didn't mention Kodak among the CCD producers. They were a long-time leader in the field until Sony took over.

    • @TimPerfetto
      @TimPerfetto 1 year ago +12

      You didn't mention Sony in CCD producers. They were a long time leader in the field until Sony took over.

    • @milantrcka121
      @milantrcka121 1 year ago +13

      Kodak built the first digital camera. It was decided that digital cameras would take away from the extremely lucrative film business...

    • @100brsta
      @100brsta 1 year ago +4

      @@milantrcka121 Yes, that might be true in the consumer field, but Kodak's high-end CCDs were the best-performing devices until the late 2000s. By "leader" I didn't mean in volume, but in performance. I would say that somewhere in the mid-2000s Sony started to make sensors of performance similar to Kodak's CCDs (Kodak's imaging sensor division was later taken over by ON Semiconductor). Kodak sensors at the time were mostly too expensive for consumer cameras.

    • @milantrcka121
      @milantrcka121 1 year ago +1

      @@ti75425 No conspiracy present or implied. Just a fact supported by history.

    • @milantrcka121
      @milantrcka121 1 year ago +4

      @@100brsta Indeed! Kodak engineering was the best, or second to none, in many fields, especially optics (satellite telescope mirrors) and commercial optics (Instamatic). Also special magnetics and materials. And of course CCDs...

  • @pcfdvd1
    @pcfdvd1 1 year ago +33

    As someone who majored in astrophysics, this is awesome! Most of the telescopes we used during our labs were CCD based if I'm remembering correctly...

    • @H0mework
      @H0mework 1 year ago +8

      Still are from what I'm told. Way more sensitive still.

    • @TimPerfetto
      @TimPerfetto 1 year ago

      No

    • @JohnnyWednesday
      @JohnnyWednesday 1 year ago +2

      Still are - CCDs utterly thrash CMOS for sensitivity - something telescopes require. CCDs are preferred in movie cameras too - rolling shutter from CMOS can ruin an action scene.

    • @jackiecs8190
      @jackiecs8190 2 months ago +2

      @@JohnnyWednesday All modern digital movie cameras use CMOS with global shutters. Scientific work, like telescopes, is the last bastion of CCDs

  • @dnbeuf72
    @dnbeuf72 1 year ago +6

    Great video! I worked 20 years of my life on CCDs and saw their rise and decline.
    Very well told!

  • @fredinit
    @fredinit 1 year ago +14

    Jon, another excellent essay. Next in the imaging category... stacked image sensors similar to the Foveon. The initial patents on the technology have run out or are about to. Newer process nodes with a better understanding of how to stack components, along with more sophisticated image-processing software and hardware, portend this to be the next 'big thing'.

  • @michaelmoorrees3585
    @michaelmoorrees3585 1 year ago +13

    There's a simpler cousin to the photomultiplier tube: the simple phototube, which works similarly but does not have the intermediate "dynodes". Their most common use was to sample the audio soundtrack on old movie film and convert it to a voltage. The soundtrack was a strip next to the image fields, which would block more or less light according to the recorded audio level. The light source was fairly bright, so a photomultiplier was not required.
    Photomultipliers were, and still are, used mostly in scientific instruments. There was one consumer app, from the 1950s, where GM had several high-end car models with actively dimming headlights. The photomultiplier was used to know when to momentarily power off the high beams, reducing the light going toward the oncoming car. It was called the Autronic-Eye.

    • @davelowets
      @davelowets 4 months ago

      I remember those complicated Autronic Eye systems....
      I NEVER did understand why the car manufacturers chose to fuck about with a high-voltage vacuum tube for that, instead of a MUCH simpler system using a CdS cell... 🤷🏻

  • @DEtchells
    @DEtchells 1 year ago +39

    Wow, how the *heck* do you manage to do the research your vids require, and still work a day job?! Just incredible content!
    I was in the consumer imaging business for many years (still am, on a semi-retired basis). My understanding is that what really killed off CCDs in consumer cameras was HD video. CCD image quality at the time was markedly superior, but you couldn't clock them fast enough to do HD video on higher-resolution chips. You could clock data off a ~3 megapixel CCD chip fast enough to one way or another end up with VGA video, but as chips got bigger with more pixels CCDs couldn't keep up (a rough pixel-rate sketch follows after this comment).
    It might have been more the power required for such fast readout rather than the absolute maximum clock speed; I just remember from talking with camera company engineers at the time that it was HD video that was the death knell.
    I remember being chagrined at the sharp drop in image quality that happened as a result; as you said, the noise was the problem and became particularly evident at higher ISO (light sensitivity) levels.
    Another fantastic video, thanks!
    (My one criticism: As someone else noted below, it’s not actually the photoelectric effect that’s at work, it’s that photons make hole-electron pairs, the same phenomenon that drives solar cells. Rather than turning the energy into a continuous output current though, sensor pixels simply collect the charge in the potential wells of diode structures. - Hence, photodiodes :-)
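
A rough pixel-rate estimate of the HD-video point above; the blanking overhead factor and the column-parallel comparison are illustrative assumptions, not figures from the comment or the video:

```python
# Rough pixel-rate arithmetic (illustrative numbers, not vendor specs).
width, height, fps = 1920, 1080, 30
overhead = 1.2                       # fudge factor for blanking / overscan

pixel_rate = width * height * fps * overhead
print(f"A single-output CCD would need ~{pixel_rate / 1e6:.0f} Mpixel/s "
      "through one output amplifier")

columns = width                      # column-parallel ADCs on a CMOS sensor
per_adc = pixel_rate / columns
print(f"Column-parallel CMOS needs only ~{per_adc / 1e3:.0f} kpixel/s per ADC")
```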

    • @MithunOnTheNet
      @MithunOnTheNet 1 year ago +1

      He has a team of researchers helping him out.

  • @Soosss
    @Soosss 1 year ago +20

    This channel is an absolute gold mine, thank you for all the work you do!

  • @punditgi
    @punditgi 1 year ago +9

    Many thanks for taking on this topic! 😊

  • @brothergrimaldus3836
    @brothergrimaldus3836 1 year ago +9

    "And because I was tired."
    Understandable, have a nice day.

  • @Smd1731
    @Smd1731 17 days ago

    Preparing for an exam that includes CCD and CMOS image sensors, thank you. This is a life saver!

  • @stefanmarraccini8646
    @stefanmarraccini8646 10 months ago +1

    I suggest an episode focused solely on Canon's CMOS achievements. They committed to the technology early and produced superior results across digicam market segments.
    Thanks for your series!

  • @coolandsmartrr
    @coolandsmartrr 1 year ago +4

    Very good episode! I've always wanted to see your take on imaging technology, and you've presented a good history of it.
    I would love to see this continued into subjects like:
    - Sony's history with imaging technology
    - How cinema camera makers (e.g. ARRI, Red, Blackmagic) source their imaging sensors
    -- IIRC, Blackmagic seems to source their sensors from ON Semi, which you mentioned in this episode
    -- Furthermore, their popular Pocket Cinema Camera is so small that its circuitry overlaps, introducing noise. They counter this by implementing noise-cancellation features

  • @makerspace533
    @makerspace533 1 year ago +3

    When I think of bubble memory, I think of the non-volatile magnetic memory promoted by TI and others in the 1970s. The development of Winchester disks and CMOS memory put bubble memory to bed. I remember at TI we were also attempting to develop CCD imaging. The problem was that the metal layer on top of the device made it difficult to make the collection area for each pixel as large as it could be. So a technique was used of sandblasting the back of the chip so thin that light could come in through the backside of the chip. Today we talk mostly about the successful ideas, but there was a lot of effort put into ideas that never made it.

  • @DimbleWally
    @DimbleWally 1 year ago +7

    10/10 on the Albert Einstein joke reference. 🤣

  • @giszTube
    @giszTube 1 year ago +22

    I could swear Kodak was involved in the early days of CCDs on satellites.

    • @milantrcka121
      @milantrcka121 1 year ago +4

      Yes, Kodak was.

    • @mikereilly2745
      @mikereilly2745 1 month ago

      Kodak was part of the first stay on the Moon; the guys had a 640x480 digital camera. This should be common knowledge. I don't know why Kodak didn't use that bragging right during the digital camera wars.

  • @-gg8342
    @-gg8342 9 months ago

    I learned about image sensors in radiology school. You did an exceptional job describing the technologies!

  • @Sunbro123
    @Sunbro123 1 year ago +7

    Just a minor issue, but when talking about the shutter of a camera in front of the sensor, you are actually showing the aperture blades of a lens, which are not the same thing.

    • @gus473
      @gus473 1 year ago +1

      Thanks, was going to note same. 😎✌️

  • @Falcrist
    @Falcrist 1 year ago

    10:14
    Turn transistors 2 and 5 off, and turn transistor 7 on. Electrons flow in to charge the capacitor (3) negatively.
    Now turn transistor 7 off, and transistor 2 on. Light hitting the photodiode (1) will allow some electrons to go to ground... reducing the charge.
    Now turn transistor 2 off and transistor 5 on. Assuming transistor 4 is properly configured, electrons will flow through it in proportion to the voltage at the gate (which is the voltage in the capacitor). That current is what's being read by the ADC and converted to a subpixel.
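
A toy model of the reset / expose / read sequence described above, with the stored charge represented as a node voltage that is precharged and then discharged by light. The class, method names, and numbers are illustrative, not the actual circuit from the video:

```python
# Toy model of a reset -> expose -> read pixel sequence (illustrative values only).
class PixelModel:
    def __init__(self):
        self.v_node = 0.0                 # voltage on the storage capacitor

    def reset(self, v_reset=3.3):
        """Precharge the sense node through the reset transistor."""
        self.v_node = v_reset

    def expose(self, light_level, seconds, sensitivity=0.5):
        """Photocurrent bleeds charge off the node in proportion to light."""
        self.v_node = max(0.0, self.v_node - light_level * seconds * sensitivity)

    def read(self, gain=1.0):
        """Source follower + row select: the output follows the node voltage."""
        return gain * self.v_node

pixel = PixelModel()
pixel.reset()
pixel.expose(light_level=4.0, seconds=0.5)   # brighter light -> lower node voltage
print(pixel.read())                          # 3.3 - 4.0*0.5*0.5, about 2.3
```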

  • @JoshuaC923
    @JoshuaC923 1 year ago +3

    As a hobby photographer, this is fascinating. Great work as always! Hope you got enough rest

  • @TheJamieRamone
    @TheJamieRamone 1 year ago +5

    Photodiodes make use of the PHOTOVOLTAIC EFFECT, not the photoelectric effect. Only alkali metals (sodium, potassium, etc.) have an energy threshold low enough to fall within the visible light spectrum. The photovoltaic effect doesn't knock out the electrons, but promotes them to the conduction band. From there the doping of the silicon works its magic, attracting them toward one pole while repelling them from the other.

    • @HailAzathoth
      @HailAzathoth 1 year ago +2

      Exactly lol. PE effect is the ejection of electrons above the vacuum level into free space.

    • @Fulcanelli88
      @Fulcanelli88 1 year ago

      In, Ga, Pb...

  • @hobbykip
    @hobbykip 1 year ago +3

    Good video. The reason CCDs are still around is the fact that they are global shutter by design. CMOS needs an extra storage node for that, making the pixel even more complex and requiring more space. With BSI technology and deep trench isolation they are able to make such small pixels with good dynamic range and sensitivity.

    • @-szega
      @-szega 1 year ago

      Literally the same is true for GS CCDs as well, no free lunch.

  • @archie4oz
    @archie4oz 1 year ago +2

    1:18 Ben Thompson says "hi" 😂
    16:40 it also helped that those fabs that used to make CPUs and GPUs for the PS2 and PS3 transitioned into making CMOS sensors.

  • @pretzelogic2689
    @pretzelogic2689 4 months ago

    I remember working in a hybrid electronics lab back around 1985-88. We had to test bare chips that were bonded to ceramic substrates before packaging. We quickly found that when testing a class of CMOS and NMOS circuits, we had to turn the lab lights off and work with a subdued desk lamp, otherwise crazy things would happen.

  • @richleyden6839
    @richleyden6839 1 year ago +1

    I'm all in on imaging. Thanks for this episode.

  • @bborkzilla
    @bborkzilla 1 year ago +2

    Kodak also made CCDs and supplied them to camera manufacturers like Leica and Olympus.

  • @rayoflight62
    @rayoflight62 1 year ago +1

    Thank you for this video on image sensors, it is very appreciated.
    Greetings,
    Anthony

  • @JohnDobak
    @JohnDobak 1 year ago +4

    An interesting offshoot of these chips is the impending IR / night vision CMOS revolution. T-Rex Arms did a video showing how much clarity you can get from a typical DSLR camera when you remove the IR filter and film at night. Unless there are export/sales restrictions, every American militiaman will have access to night vision rivaling the army's new IVAS / ENVG-B. Your phone's cameras will take better night-vision photos too.

    • @Krzysztof_z_Bagien
      @Krzysztof_z_Bagien 1 year ago

      CMOS sensors are still not sensitive enough to replace night vision devices, and their sensitivity in near IR is not so great either. So even if you have a modern, monochrome CMOS sensor without any IR filters whatsoever, a fairly old and cheap night vision tube can still beat it in low-light conditions, at least if you want a real-time image and don't want to use an external IR light source. That's from my experience: I have an IMX462-based astro camera, which is highly sensitive to IR (one of the most IR-sensitive consumer chips today), and an old British night vision tube (P8079HP), and that tube is clearly better.
      However, if you use longer exposures (e.g. 1 s), then the camera can produce a much better image, though that's not very useful for real-time observation (moving objects would get smeared, etc.).
      We're not quite there yet, but not far away. There are some specialised sensors that can have great performance in (very) low-light settings (e.g. EMCCD), but they are quite expensive and mostly used for serious science. "Normal" NVGs, even high-end ones, would still probably be much cheaper.

  • @Luis-qe8el
    @Luis-qe8el 1 year ago +3

    Just had to drop a merci beaucoup for the amazing content once more. I'm always looking forward to the nuts-and-bolts approach this guy takes on these topics. 🙏

  • @LoFiAxolotl
    @LoFiAxolotl 1 year ago +1

    Kinda interesting that right now CCD Sensors are seeing somewhat of a resurgence, big problem still being the amount of energy needed... while CMOS Sensors have hit a wall

  • @squeezedoz
    @squeezedoz 1 year ago +2

    Love your channel and videos; I was hoping for a bit more fundamental explanation of how CCD and CMOS function and the improvements in function over the generations. Still a great topic.

  • @dilipdas5777
    @dilipdas5777 1 year ago +2

    The photoelectric effect was actually used in the Vidicon camera tube in the 1920s.

  • @maxheadrom3088
    @maxheadrom3088 1 year ago

    The thing at 7:50 is the aperture mechanism - not the shutter. The shutter on most DSLRs is made of two flaps - one that opens and another that closes. On very old cameras it's a blade with a hole that lets light in for the desired time by matching the hole with the camera's permanent opening, which usually had a small glass lens in front of it. (Old means the first half of the 20th century.)

  • @tetraphobie
    @tetraphobie 5 months ago

    The picture at 5:30 shows a magnetic domain memory chip. From my understanding, it's a different type of memory based on a different (and unrelated) physical principle of "magnetic bubbles". So that is not a CCD chip and it's based on a different technology than CCD chips. (Incidentally, I'm not sure the inventors of CCDs used the term "bubbles" for anything other than magnetic domains in that other technology. But I may be mistaken.)
    I can see the source of confusion, it's confusing for me as well, because I think magnetic bubbles served as the inspiration for CCDs, of sorts, and they share the general idea and the high level structure. It's just that those two are based on very different physical principles.

  • @naniyo0
    @naniyo0 1 year ago +1

    Thanks for the video as always. I am a complete amateur in digital sensor tech, but I read that the latest jet fighter, the F-35, has an EOTS system that is capable of seeing and targeting something the size of a suitcase from 50 km+ away. I bet that is sensor tech at its extreme (at least by today's standards). I wonder which companies possess this state-of-the-art sensor tech, and how far behind the US's competitors are.
    I also read that there are three primary types of chip tech these days: logic chips, memory chips, and sensor chips. I always want to know which type of chip is the most difficult to develop, and in what way. I bet the memory chip is the easiest one to crack, but I'm unsure whether logic chips or sensor chips are more difficult to advance in the long run. 🙏

  • @AABB-px8lc
    @AABB-px8lc 1 year ago +4

    Sadly no mention of BSI and dual gain, two tricks that actually make CMOS wipe out CCD even in low light without active cooling. And the ARRI ALEXA sensor's clever trick of measuring not only the charge but also the time until the capacitor gets full (oversaturated), which yields even more information and a dynamic range that surpasses even analog cinema film.

    • @gus473
      @gus473 1 year ago

      Charge injection device 🤔

  • @pizzablender
    @pizzablender 1 year ago +1

    I remember "bubble" as magnetic memory. CCD was called a "bucket brigade device" at the time, and could be used as an analogue delay line.
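
A minimal sketch of a bucket-brigade style delay line as mentioned above: on each clock, every stage hands its sample to the next, so the output is simply the input delayed by the number of stages (ignoring the charge losses a real analogue BBD would add):

```python
from collections import deque

def bucket_brigade(samples, n_stages):
    """Toy bucket-brigade delay line: each clock tick, every 'bucket' passes
    its sample along, so the output is the input delayed by n_stages ticks."""
    buckets = deque([0.0] * n_stages, maxlen=n_stages)
    out = []
    for s in samples:
        out.append(buckets[0])    # oldest sample falls out of the last bucket
        buckets.append(s)         # new sample enters the first bucket
    return out

print(bucket_brigade([1, 2, 3, 4, 5, 6], n_stages=3))   # [0.0, 0.0, 0.0, 1, 2, 3]
```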

    • @tapewolf
      @tapewolf 1 year ago

      BBD chips are still used to produce echo and especially chorus effects for guitars and synths. While this could be done digitally, analogue synths and effects have become fashionable enough that BBD chips are back in production.

  • @WolfmanDude
    @WolfmanDude 1 year ago +3

    Photomultiplier tubes are still a current technology; there is no modern replacement for them. It's basically the only kind of vacuum tube that's still being produced in large volumes for serious applications (aside from magnetrons).

    • @davelowets
      @davelowets 4 months ago

      All TRUE night vision devices use them, as there is no solid-state alternative that works nearly as well as a photomultiplier tube does.
      There are still other types of vacuum tubes made in large quantities for today's applications.
      Almost EVERY household in America uses a fairly high-powered vacuum tube in its kitchen on a daily basis.... the magnetron in the microwave.
      Edit: Oops, didn't see the magnetron listed at the end of your post...
      Audio tubes are another type still produced in large quantities today. MANY musical instrument amplifiers still use tubes for their pleasing-sounding distortion when pushed beyond their ratings.
      Transmitting tubes are still alive today also.

  • @tigertiger1699
    @tigertiger1699 1 year ago +1

    Really tip my hat to the scientists and developers that had strength of stomach and grit to risk their years and fortunes on these gambles..🙏

  • @DerekWoolverton
    @DerekWoolverton 1 year ago +11

    First a quick note: Caltech does not capitalize the T, even though it seems like it's an abbreviation. Second, I also wanted to mention the obscure sensor technology from Foveon that tried to use multi-layered image diodes to capture different wavelengths without a filter. Unfortunately it was complicated enough to fabricate that they could not easily ride process scaling to produce better and better chips, though they still struggle on as part of Sigma and promise they'll have their next-generation sensor out any day now.

    • @willow_1
      @willow_1 1 year ago +3

      The sensitivity of the Foveon sensor was also not great - they rely on different wavelengths being able to penetrate to different depths in the sensor, with red penetrating the deepest. However, if the red component isn't strong enough it doesn't get detected in the red layer, which can cause issues...
      Basically they just decided it's not a worthwhile tradeoff for most applications, especially when demosaicing has been refined to the point that it's good enough and cheap enough for 99% of colour applications.
      Still a fan of mono sensors though!

  • @SprocketHoles
    @SprocketHoles 1 year ago +2

    A really interesting imaging tech is the Foveon sensor. It's based on how far light of different energies penetrates into silicon.

    • @hobbykip
      @hobbykip 1 year ago

      Sure is but try to manage crosstalk between the colors. I think that makes this technology way too expensive or unusable.

  • @ferencvalenta2005
    @ferencvalenta2005 1 year ago +2

    Ehh... PMTs are not imaging devices; you meant camera tubes like the Vidicon. CID sensors were completely skipped. CDS was originally used on CCDs to get rid of the reset noise. EMCCD, which is basically a solid-state imaging PMT, was also missed. Kodak and their transparent gate technology, SITe and their early back-illuminated CCDs (which were used on space probes), Foveon and LBCAST sensors were all missed. And is there a CMOS sensor with an ADC in each pixel? There are ones with an ADC in every column, is all. Many, many interesting technical details and historical events were ignored or incorrectly presented.

  • @mailson
    @mailson 1 year ago

    15:42 You said “own version of Moore’s Law” at the same time I got a tweet notification informing the passing of Gordon Moore 😢
    The odds.

  • @EdPin_
    @EdPin_ 1 year ago +1

    10:13 😁
    Why would I need Blinkist, when I've got you...

  • @valeriopreite7573
    @valeriopreite7573 1 year ago

    I believe that it's more appropriate to speak of photovoltaic rather than photoelectric effect in this case, as electrons aren't ejected from the material, but promoted from valence to conduction band.

  • @michaelproust7891
    @michaelproust7891 1 year ago

    Thank you for the clear information.

  • @samnieves8158
    @samnieves8158 1 year ago

    Thank you for taking on such a difficult subject. Optics are a rabbit hole.

  • @alphar9539
    @alphar9539 1 year ago

    You are an absolute champion of the people for providing such accurate and useful information in a way that normal working class individuals can understand.

  • @DanielSMatthews
    @DanielSMatthews 1 year ago +3

    There is a lot more room for advances in this area, imagine a single photon detector on top of each column of a 3D memory so that the photons were counted digitally at each pixel as they came in. The next step beyond that is to bin the counts by photon energy level, colour. Then you have every single pixel as sensitive as possible and acting as a spectrometer.

    • @brodriguez11000
      @brodriguez11000 1 year ago

      Better light traps inspired by photosynthesis.

    • @DanielSMatthews
      @DanielSMatthews 1 year ago +1

      @@brodriguez11000 The quantum of light required in photosynthesis is 8 photons. The ideal system is superconducting nanowires, which exist now, but the superconducting part would be hard in a consumer device.

    • @Xiaotian_Guan
      @Xiaotian_Guan 1 year ago

      There's a limit to how far we can push sensor technologies. QE on modern sensors is already very high, and the noise at high gain (high ISO) is already dominated by shot noise, the discrete nature of light. But SPAD-based sensors would be a huge step up for dynamic range, as we would no longer be limited by each pixel's well capacity. Unfortunately I don't think reading out the energy of the incoming photon is something that can be done with current semiconductor technologies? Maybe some 'quantum' sensors can do that? Idk
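
A small demo of the shot-noise point above: for an ideal pixel with no read noise, photon arrivals follow Poisson statistics, so the SNR grows only as the square root of the collected photon count. The photon counts and sample size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
for mean_photons in (10, 100, 1000, 10000):
    counts = rng.poisson(mean_photons, size=100_000)   # ideal photon-counting pixel
    snr = counts.mean() / counts.std()
    print(f"{mean_photons:>6} photons: measured SNR ~ {snr:6.1f} "
          f"(sqrt(N) = {np.sqrt(mean_photons):6.1f})")
```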

    • @HailAzathoth
      @HailAzathoth 1 year ago

      That ain't gonna happen lmao

    • @DanielSMatthews
      @DanielSMatthews 1 year ago

      @@HailAzathoth I don't know, the individual parts already exist, why specifically do you think that they can't be integrated in the future?

  • @emmerad
    @emmerad 1 year ago +1

    I would love to see a video about thermal imaging

  • @AnthonyMuscio
    @AnthonyMuscio 1 year ago +2

    I understood CCDs were primarily used for astronomy and have the advantage of allowing extended exposure times with less noise; I think the term "light bucket" was used for this. And, as the video says, also for non-visible light. As CMOS improves it has also started being used in astronomy. I wonder how this relates to infrared cameras?

    • @tookitogo
      @tookitogo 1 year ago +1

      All silicon devices are inherently sensitive to light, especially infrared. Both CCD and CMOS image sensors are very sensitive to infrared. In normal visible-light cameras, we use IR-blocking filters (visible at 7:00, it is the pale blue glass window of the sensor module). In pure infrared cameras, we use visible-light-blocking filters (conceptually similar to the purplish-black infrared windows on many remote controls and the devices they control).
      UV is a different matter: Silicon tends not to be particularly sensitive to it, so they have to deliberately design them for it. (Together with the choice of lens materials, since many common types of glass and optical plastic tend to block UV.)

  • @kevinroche7433
    @kevinroche7433 1 year ago

    Thank you for an interesting video. I worked on CCDs, using them as analogue data storage in 1972 and found out the hard way that they were light sensitive as my test circuits failed when the sun came out. I haven't had a cause to use them since. Though I did have a Nokia N95 phone for a while. I always wondered why they were being replaced by CMOS.

  • @JohnnyWednesday
    @JohnnyWednesday 1 year ago

    It's worth noting that CCDs are essential for telescopes and they're vastly preferred in movie cameras as they aren't subject to rolling shutter.

    • @matthewsmith5104
      @matthewsmith5104 10 months ago +1

      There are global shutter CMOS chips and have been for several years now. CMOS has basically taken over the amateur astrophotography market as well because of superior noise characteristics - all my astronomy cameras have been CMOS. CCDs still have some relevance for scientific purposes (I think?) but CCDs are kind of dinosaur tech at the amateur level.

  • @bluegecko6770
    @bluegecko6770 1 year ago

    Thanks so much; that was something I was unaware of.

  • @zachjones6944
    @zachjones6944 1 year ago

    Perfect timing. I'm currently working on a digital camera for a large telescope.

  • @DelfinoGarza77
    @DelfinoGarza77 1 year ago

    I was just thinking about this last week....perfect.

  • @PrajjalakChattopadhyay
    @PrajjalakChattopadhyay 1 year ago

    Two pieces of wrong information I found:
    1. 2:13 - this diagram is not related to the photoelectric effect. This is Hertz's other experiment, the one that proved the existence of electromagnetic waves.
    2. You mentioned that PMTs have 5 to 7 dynodes. Actually they may have many more; 10- and 12-dynode PMTs are easily available. We use 12-stage PMTs in our lab.

  • @timng9104
    @timng9104 1 year ago

    Seeing how the CCD transfers data, as opposed to crossbar AM arrays, reminded me of racetrack memory. Racetrack memory stores information using electron spins: a correlated electron spin gives you a low-resistance path, and an opposing electron spin gives a high-resistance path. Then you can use spin-orbit coupling, apply current, transfer spin, etc... take a look if interested :P

  • @mikereilly2745
    @mikereilly2745 1 month ago

    Thank you! This was fantastic. Even the comment section is awesome and informative; that's when you know it's great.

  • @onkcuf
    @onkcuf 1 year ago

    Wow. Super cool to learn. I'm old. I remember full-size camcorders and even 3-CCD cams.

  • @rickhobson3211
    @rickhobson3211 3 days ago

    The image at 2:16 doesn't have anything to do with light. It's an early demonstration of radio waves.

  • @maxheadrom3088
    @maxheadrom3088 1 year ago

    I like how they show a woman using a vacuum cleaner while she listens to their digests! I really want it! That silent vacuum cleaner, I mean.

  • @GriftyMcPants
    @GriftyMcPants 1 year ago

    I appreciate your channel. Thanks!

  • @code-inc
    @code-inc 1 year ago +2

    Please make an episode about phased-array radar and QAM modulation.

  • @haberdasherrykr8886
    @haberdasherrykr8886 1 year ago +1

    Yeah finally drones "coverage" baby!

  • @BobDiaz123
    @BobDiaz123 1 year ago

    CCDs are a current-based technology, whereas CMOS is a voltage-based technology. As the resolution increases, the heat from the current required by CCDs increases. This became a real problem for professional video camera makers in the early days of HD. The increased resolution of HD made smaller CCD chips impractical because of the excess heat. While CMOS was a possible solution, the rolling-shutter effect of CMOS was disliked by professional video producers. Because this was the only solution to shooting HD on smaller chips, it took time for the professional video market to accept the limitation.

  • @andersjjensen
    @andersjjensen 1 year ago +2

    Photo CCD. Not to be confused with AMD's CCDs which are Core Complex Dies.

  • @CEEPMDEE
    @CEEPMDEE 1 year ago

    I happily, recently acquired a used Phase One IQ 280 digital back & XF body. I chose that over the Fujifilm GFX 100 II. I already have the Sony a7rIV... which is close enough to the GFX camera.

  • @RobSchofield
    @RobSchofield 1 year ago

    Great explanation, thanks!

  • @altebander2767
    @altebander2767 1 year ago +2

    I'm sorry, but you are confusing bubble memory, which works by having segments of magnetic fields on the material, with CCDs, which use charges. CCDs were, BTW, used well into the 1980s as storage devices.

  • @value8035
    @value8035 1 year ago +1

    What do you think about Dynamic Vision Sensors (event cameras)?

  • @mercster
    @mercster 1 year ago

    The g in 'Hughes' is silent. Thanks for the video.

  • @GetOffMyyLawn
    @GetOffMyyLawn 1 year ago

    Eric Fossum is still active on the DPReview Photographic Science and Technology discussion group... he has many interesting threads about his research.

  • @liaminwales
    @liaminwales 1 year ago

    Always hoped the Foveon sensor got used more, kind of interesting to see a third player.

  • @pierrecambou3228
    @pierrecambou3228 1 year ago

    Modern CMOS image sensors (CIS) solved the fill-factor issue by using Back Side Illumination (BSI) circa 2010. That approach then led to stacking (a 2-layer semiconductor) circa 2015, and now a triple stack is expected as the big news of the 2023 iPhone release. This 3D semiconductor approach is specific to CIS but has much wider implications, such as memory stacking. That could be another story covered by this wonderful channel.

  • @fensoxx
    @fensoxx 1 year ago +1

    This is such a great channel. You might drop the Asia-focused byline and just do tech worldwide (which you mostly seem to do already). You are really good at it. Thanks!

  • @bmobert
    @bmobert 1 month ago

    Have you already done an episode on bubble memory?

  • @OccultDemonCassette
    @OccultDemonCassette 1 year ago

    What about that SIGMA Foveon image sensor?

  • @Quazgaa
    @Quazgaa 1 year ago

    This makes me think of how in computer rendering you need to average many samples together per pixel to create a reasonable-quality image that isn't a steamy pile of garbage. If this was done at the hardware level in image sensors, I bet that would be awesome.

  • @kbejustervesenet7261
    @kbejustervesenet7261 1 year ago

    I remember Photobit. I applied for a job with them at the end of the 1990s after I finished my Masters. They had their offices near the University of Oslo, at Blindern in Oslo, Norway. (I didn't get the job though...)

  • @murdercom998
    @murdercom998 1 year ago

    I was doing some reading on the Henschel Hs 117 surface-to-air missile and noticed that its detonation system was described as "detonated by acoustic and photoelectric proximity fuses". I suspect they may have used an early photomultiplier tube, but I need to do more research.

  • @olavisau
    @olavisau 1 year ago

    Can you do a video on the new M3P battery that CATL is producing?

  • @scienceandmathHandle
    @scienceandmathHandle 1 year ago

    I would argue that even today CCDs are far better sensors than CMOS, and their "shutter" can be done electronically as well, because the electrons on a back-side illuminated CCD have to migrate toward the wells under an applied electric field. The big reason why CMOS has "mostly" replaced CCDs is that it fits better with the fact that most modern microchips use CMOS technology in their design, and thus it is easier for most fabs to manufacture. Almost all modern astronomy cameras still use CCD technology, as they have better dynamic range, higher quantum efficiency and lower noise. In the last few years I used a modern industrial CCD camera that could still outperform almost any other CMOS camera in terms of low light and high QE for a particular application.

  • @bruceli9094
    @bruceli9094 1 year ago +1

    Please make a video on America's recent breakthroughs in quantum computing and ChatGPT!

  • @billykorando
    @billykorando 1 year ago

    Lol you really got to do an appropriate “that X?… Albert Einstein” joke 😂

  • @turke765
    @turke765 1 year ago

    4:26 forbidden brie wedge

  • @LordPecka
    @LordPecka 1 year ago +1

    I wonder why the manufacturers do not use some form of 3D layering (perhaps like AMD is using for their 3D V-Cache?) to get the benefit of 100% coverage by the photo layer.
    Seems so obvious - and with massive potential benefits - so I have to assume it must be hellishly hard to manufacture.

    • @H0mework
      @H0mework 1 year ago +1

      Sony has been doing it recently, for about 2 years. From their December 16, 2021 announcement:
      "Sony Develops World's First Stacked CMOS Image Sensor Technology with 2-Layer Transistor Pixel - Widens Dynamic Range and Reduces Noise by Approximately Doubling Saturation Signal Level"

  • @responsiblestudent7938
    @responsiblestudent7938 1 year ago

    What about SPAD based image sensor ICs?

  • @soufmaro502
    @soufmaro502 1 year ago

    The more a global shutter / parallel shift registers / synchronous adding and progressing makes sense.

  • @jamesjensen5000
    @jamesjensen5000 1 year ago

    If the sensor is basically capturing photons to displace electrons into wells, why doesn't it charge the battery? Does the amplifier drain the charge in excess of the additional charge? Maybe the capture process could power the amplifier before the signal is transmitted to the decoder…
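
A back-of-envelope answer to the question above: even a fully saturated frame holds only nanojoules of photo-generated charge. The full-well capacity, node voltage, and pixel count below are assumed round numbers, not measured values:

```python
# Back-of-envelope: energy held as photo-generated charge in one frame.
ELECTRON_CHARGE = 1.602e-19          # coulombs
full_well_electrons = 10_000         # assumed per-pixel full-well capacity
node_voltage = 1.0                   # assumed volts on the pixel's storage node
n_pixels = 12_000_000                # assumed 12 MP sensor
fps = 30

energy_per_frame = n_pixels * full_well_electrons * ELECTRON_CHARGE * node_voltage
print(f"~{energy_per_frame * 1e9:.0f} nJ per fully saturated frame")
print(f"~{energy_per_frame * fps * 1e9:.0f} nW of 'harvestable' power at 30 fps, "
      "orders of magnitude below what the readout electronics themselves consume")
```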

  • @devrim-oguz
    @devrim-oguz 1 year ago

    CCDs are still used in scientific machines, where they are cooled down to near absolute zero to mitigate thermal noise.

  • @SarahBoyd1
    @SarahBoyd1 1 year ago

    “And now it is time for me to explain how this accursed thing works”. 😂

  • @sachinz4199
    @sachinz4199 1 year ago

    Could you do a video on CQDs (colloidal quantum dots) and SPADs (single-photon avalanche diodes)? Also, what is the potential of such technologies to replace CMOS?

  • @steveg2277
    @steveg2277 2 months ago

    Bro did that artfully.

  • @ruperterskin2117
    @ruperterskin2117 1 year ago

    Cool. Thanks for sharing.