Why Space Probes Still Use Black & White Cameras in the 21st Century

  • Published: Jan 9, 2025

Comments • 1.1K

  • @kasuha 3 years ago +787

    I'd expect the main reason why Perseverance sent black and white images first even though they were taken with color cameras was because they need less bandwidth and there was a ton of other things they needed Perseverance to transmit after landing.

    • @winkcla 3 years ago +51

      That's what I was thinking as well. That particular camera could well have been color, but converted for transmission?

    • @johndododoe1411 3 years ago +82

      They had so little bandwidth that a single image would use most of it, because the best radio equipment (transmitter and antenna) was still packed away in its landing position. Once unpacked, they could upload the full resolution from disk or SSD. Moments earlier they had used the ultra-slow "tone" transmitter to basically send a progress bar for the landing sequence, with some guy/gal in the room looking at it and saying out loud what was happening, all the way to "touchdown, we are safe on Mars".
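
The bandwidth squeeze is easy to put rough numbers on. A back-of-envelope sketch (all figures below are illustrative assumptions, not actual mission data rates):

```python
# Rough, illustrative numbers (not actual mission figures): how long would
# one uncompressed image take over a slow link vs. an orbiter relay pass?
image_pixels = 1024 * 1024          # hypothetical 1-megapixel navcam frame
bits_per_pixel = 8                  # 8-bit monochrome, uncompressed
image_bits = image_pixels * bits_per_pixel

slow_link_bps = 10_000              # assumed low-gain, direct-to-Earth rate
relay_link_bps = 2_000_000          # assumed UHF relay rate via an orbiter

slow_minutes = image_bits / slow_link_bps / 60
relay_seconds = image_bits / relay_link_bps

print(f"uncompressed frame: {image_bits / 8 / 1e6:.1f} MB")
print(f"slow link: {slow_minutes:.0f} min, relay: {relay_seconds:.1f} s")
```

With these assumed rates, one plain frame monopolizes the slow link for a quarter of an hour but clears a relay pass in seconds, which is why thumbnails went first.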

    • @twistedyogert 3 years ago +39

      Also, the first photographs that Perseverance took were using the navigation cameras. A robot wouldn't necessarily need to see in color to avoid obstacles.

    • @jeremiefaucher-goulet3365 3 years ago +22

      Right after landing, all they had was the low-gain antenna relayed via MRO. So yes, bandwidth was at a premium in those moments.

    • @Steph.98114 3 years ago +9

      @@twistedyogert the navigation cameras are actually in colour as stated in the vid

  • @enda320 3 years ago +313

    Any questions?
    ...*hands go up
    any questions not about my shirt?
    ...*hands go down...

    • @firecrow7973 3 years ago +9

      was that the sequence for apollo 13 to "hack" the lunar module?

    • @Usstampcollectersatkiwistamps 3 years ago +7

      I think it's CCD data in 2x2 grid raw form: luminosity/red/blue/green with 7 elements in a horizontal register and 2 in a vertical register. Still don't know what it means though.

    • @dickJohnsonpeter 3 years ago +6

      Well I'm keeping my hand raised because I want to know.

    • @5Andysalive 3 years ago +2

      @@firecrow7973 That was Apollo 14. On 13 they were very, VERY careful in what they were doing to their Lunar Module.
      Well apart from dropping it into the deepest part of the ocean available because of this Plutonium thingy.

  • @matterwiz1689 3 years ago +1402

    "It's very easy to get a rock to sit still" -Scott Manley, ca. 2021

    • @haydentravis3348 3 years ago +33

      You need a planet to sit the rock on.

    • @Formula1st 3 years ago +20

      @@haydentravis3348 and planets tend to move pretty quick

    • @Damien.D 3 years ago +9

      Obviously he never encountered trolls.

    • @PauxloE 3 years ago +10

      @@Formula1st If you are on the same planet, that's okay.

    • @vpheonix 3 years ago +18

      Sit, stay ..... good boy.

  • @t65bx25 3 years ago +498

    Scott Manley: Answering the Spaceflight Questions I Always Wondered but Never Asked since 2011!

    • @haydentravis3348 3 years ago +7

      Answering questions you never knew to ask.

    • @akizeta 3 years ago +4

      Seriously, it's Scott's tenth YouTube anniversary this year? We should have a party!

    • @t65bx25 3 years ago +4

      @@akizeta He started doing KSP around that time. His channel is 2 or 3 years older than that though.

  • @benjaminsmith4058 3 years ago +238

    The death of CCD really came with the advent of backthinned CMOS, where the light enters the back of the chip, rather than the front where all the circuitry is. This, paired with microlens arrays to focus the light into the center of each photodiode, has given backthinned CMOS the same quantum efficiency as CCD with MUCH higher frame rates. EMCCD is the only remaining CCD sensor without a superior CMOS equivalent, but it serves fairly specialized applications.

    • @aspzx 3 years ago +6

      In what applications do we see CMOS sensors vs CCD? Are pretty much all smartphones made now using CMOS? What about DSLRs? When did the switch happen?

    • @benjaminsmith4058 3 years ago +30

      @@aspzx Pretty much all smartphones are CMOS, which brought the cost of CMOS architecture down greatly. Then, once backthinning became more reliable, most scientific camera vendors have shifted to CMOS, as there are no longer any real advantages to CCD. Along these same lines, the main manufacturers have said they will stop producing CCD sensors:
      www.stemmer-imaging.com/en/news/2015-03-the-future-of-ccd-image-sensors-are-we-seeing-the-end-of-an-era/
      www.alliedvision.com/en/news/detail/news/interview-sony-announces-end-of-ccd-sensor-production.html#:~:text=Sony%20announced%20that%20they%20will,life%20cycle%20before%20that%20anyway
      There is still a large stockpile of CCD cameras and sensors, so they won't immediately disappear when the production stops, but it is now pretty much considered an older/obsolete technology. That said, certain high end CCDs and specialized EMCCDs are still being produced.
      I'm not as familiar with the photography market, but from this article it sounds like camera manufacturers are also switching to backthinned (backside) CMOS sensors: en.wikipedia.org/wiki/Back-illuminated_sensor

    • @alainmaury5941 3 years ago +17

      I have both EMCCD and thinned CMOS, and get about the same results, so now only using CMOS. There was a difference a few years back when the EMCCD had a sensitivity advantage over classical CCD or non thinned CMOS. Now it's gone.

    • @peteranderson037 3 years ago +20

      @@aspzx If I remember correctly it started to happen about 10 years ago in the consumer/prosumer market. The advent of smart phones dumped a lot of money into CMOS sensor research and development. The power and space efficiency of CMOS over CCD also meant that it was feasible to fit all of the electronics for video capture and storage on a DSLR camera. Before that, pro and prosumer CCD digital video cameras, even SD ones, weren't much smaller than their VHS camcorder predecessors.

    • @ilyapopov823 3 years ago +19

      ​@@aspzx All smartphones and photo cameras (DSLR and compacts) are using CMOS nowadays. In mainstream DSLRs switch happened around 2007-2008 (except Canon, who used CMOS pretty much from the beginning, 2000)

  • @simonh 3 years ago +54

    I'm a long-time lurker and don't often comment, but I'm inspired to drop in a thank you for the videos you create and the depth of information you have and impart. I'm a photographer and today's video is kinda my thing and has been for a long time. And still, you taught me a couple of things today that I hadn't put together myself before. Kudos, Scott. You genuinely are one of the greatest educators of our time, and you deserve recognition.

  • @jzero4813 3 years ago +186

    Not just space probes - science, generally. Optical filters give you way more control over the spectrum you're sampling, so a monochrome sensor with a calibrated QE curve is pretty much all you need to do whatever spectral imaging you need.

    • @EcceJack 3 years ago

      Yup!

    • @zeldaoot23 3 years ago +4

      Yeah, every one of the dozen or so fluorescence microscopes I’ve used or built has had a monochrome CCD or sCMOS sensor. Just stick the right filters or dichroic mirrors in front and you’re good to go!

    • @forloop7713 3 years ago

      Why do they then have the color wheels on the spacecraft to calibrate the camera?

    • @tma2001 3 years ago +1

      @@forloop7713 I'd have to rewatch, but NASA's latest Perseverance Q&A was about the images and calibration targets - IIRC a German scientist in charge of that aspect mentioned the effect of the extreme temperature swings and radiation degrading the electronics over time (not sure if that included the sensor chip itself).

    • @rickkwitkoski1976 3 years ago +1

      Yes. Many of the questions here are from people who don't understand how images are created. They think that what their eyes see is what a camera gets. NOT! There is a lot of complicated digital processing going on inside ALL digital cameras to give you the final image.
      ALL light sensors only see light and dark and NO colour at all. Everything is combined into one image later. Even if that "later" is a tenth of a second.

  • @mumblbeebee6546 3 years ago +406

    “..that’s 65% more sensor per sensor! Scott Manley, we’re done here!”
    Seriously though, thank you for another great explainer, Scott!

    • @666Tomato666 3 years ago +19

      I understood that reference.jpg

    • @maxedout1046 3 years ago +4

      I understand that reference.

    • @MichaelOnines 3 years ago +10

      @Chandan Sinha it's like a Cave Johnson quote from the ad leadup to Portal 2.

  • @MCWaffles2003-1 3 years ago +142

    I bet Scott didn't even have to research much for this since he already does astrophotography. You learn a lot about sensors from the start diving into that hobby

    • @wierdalien1 3 years ago +4

      He did as a job for a while

    • @Formula1st 3 years ago +21

      @@wierdalien1 Scott has had way too many awesome jobs

    • @MCWaffles2003-1 3 years ago

      @@wierdalien1 how do you get a job as an astrophotographer? Was it for his degree?

    • @shazmosushi 3 years ago +10

      He also works as a software engineer at Apple on video codecs if I recall correctly. So it makes sense he has a great understanding of the pipeline from camera sensors to digital images, and the algorithms used to do the processing.

    • @macht4turbo 3 years ago +7

      Every photographer, if they ever cared to research the technology, would have come up with the same points as Scott. These people on Twitter were just clueless. Which is totally fine, not everyone needs to like technology, but then just don't moan on Twitter :)

  • @markc2643 3 years ago +26

    "That's how old cameras used to operate, and they were really hard to calibrate." I repaired camcorders back in the 80's and 90's. The old tube cameras weren't hard to calibrate, they were a load of fun... that lasted for an hour or so, lol. Each color had 9 adjustments to align that were all interconnected, so after you aligned all 9, you had to go back and do it again, and again, until it zeroed in on the right position.

    • @kaitlyn__L 3 years ago

      Ah, but the trails left behind by strong light sources looked so cool! Older footage involving candles or point sources in darker environments especially left interesting patterns - as the dark room resulted in far less new light coming in to overwrite the point source, and they took so long to drain passively.

  • @Xaerorazor0 3 years ago +16

    Happy user of monochrome sensors at work. Helped build the hardware behind some back-illuminated CCDs used in research. And they love being cold, very cold. Our warmest camera runs at -110C, the coldest at -170C.

    • @666Tomato666 3 years ago +2

      IR imaging?

    • @chrisdejonge611 3 years ago +4

      TSSS, amateur. I've worked in far-infrared / mm-wave imaging where the (MKID) detectors were below 100 mK. Yes, that's 0.1 K above absolute zero.

    • @Xaerorazor0 3 years ago +1

      @@666Tomato666 Our cold camera does only NIR images, warm cameras do UBVRI bands

  • @jamiedenton2321 3 years ago +1

    Video and audio might be one of the most underrated modern technologies. Most people do not have the slightest idea how complex they are and how they actually work.

  • @jeffpkamp 3 years ago +64

    I remember when I was first getting into astrophotography that there were a few other differences that made CCDs a little more desirable. One was that CMOS pixels, as Scott noted, are active pixels, which means they have current flowing through them during exposure. This leads to the chip heating up as you do an exposure (or take a video). For astrophotography, that means if you're not actively cooling your chip when you take pictures, thermal noise increases until your chip reaches thermal equilibrium. CCDs on the other hand are only active when you're scanning the chip at the end of the exposure, and don't heat up during long exposures. Also, CMOS pixels are read individually and have slight differences in sensitivity between pixels, and need to be recalibrated after construction, which can make them problematic for photometry. On the other hand, CCDs do suffer from blooming, where a saturated "well" spills its electrons into the neighboring wells. This is why you see super bright objects (like planets) having horizontal lines coming off of them in images from probes like SOHO.
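
The blooming effect described above, where a saturated well spills electrons into its neighbors, can be sketched with a toy model (a 1-D column spilling in one direction only; real sensors spill in both directions and often have anti-blooming drains):

```python
# Toy model of CCD "blooming": charge above a well's full-well capacity
# spills into the next well along the readout column. Purely illustrative.
FULL_WELL = 100

def bloom(column):
    """Push charge above FULL_WELL into the following well, in order."""
    col = list(column)
    for i in range(len(col) - 1):
        excess = col[i] - FULL_WELL
        if excess > 0:
            col[i] = FULL_WELL
            col[i + 1] += excess   # spilled charge lands in the neighbor
    # the last well just clips: charge spilled off the column end is lost
    col[-1] = min(col[-1], FULL_WELL)
    return col

# A very bright star on one pixel streaks down the column:
print(bloom([5, 350, 10, 10, 10]))   # -> [5, 100, 100, 100, 80]
```

The single saturated pixel turns into a bright streak along the column, which is exactly the trail seen next to planets in SOHO frames.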

    • @KaRtHiK19002 3 years ago +2

      Wow that's really interesting! Thanks for sharing!

    • @wb6anp 3 years ago

      Aren't they usually still black and white, with filters? The ones I have seen on YouTube still do it that way, then stack the shots.

    • @antoniomaglione4101 3 years ago +3

      The heat problem of CMOS image sensors shows up only when you use them for filming. And the camera must use an optical DSLR-style system for pointing.

    • @jeffpkamp 3 years ago +4

      @@antoniomaglione4101 I can personally confirm that they do heat up on long exposures. It's a major pain in the butt when you're doing astrophotography in ambient temperatures > 50F. Many DSLRs limit filming and exposure time to 30 minutes to head off the heat issues. If you have a canon you can watch this in the EXIF temperature data. Take pictures of varying length with a long cool down time between pics. The longer exposures will have higher temperatures logged for the sensor.

    • @danielvanced5526 3 years ago +6

      @@jeffpkamp the 29 minute 59 second time limit isn't to do with chip temperatures, it's entirely for tax reasons. If it films 30 minutes or longer it is classed as a video camera and attracts higher taxes.

  • @SkylersRants 3 years ago +10

    If I ever want to think I am not very smart, I will listen to Scott Manley describe how cameras work and I will forever be amazed at what engineers have done to design cameras.

  • @alexrossouw7702 3 years ago +37

    Internet: "I want to see Mars in colour!"
    NASA: "Pff you can barely see in colour on Earth"

  • @dave_dennis 3 years ago +1

    I head up the test division for a company that makes CMOS color sensors. I know more about human perception of color than I ever dreamed I would. Very well explained. I’m impressed by the broad extent of your knowledge.

  • @alt8791 3 years ago +393

    I will file this away under my folder of “Scott Manley Videos to Throw at Conspiracy Theorists”

    • @rimka11 3 years ago +72

      Don't bother, conspiracy theorists are too dumb to understand what he is talking about.

    • @alpha007org 3 years ago +14

      I was thinking the same: Conspiracy Sites/People must be going crazy with that first B&W image from Mars and then we got colored ones.

    • @Bbonno 3 years ago +22

      Aren't conspiracies mostly about the grand unified thing that simplifies their world, where someone has a plan they can fit into?
      This doesn't help with that :P

    • @Marinealver 3 years ago +5

      Mars Rovers are fake, they were filmed in Hollywood.
      /s

    • @bigdogstatus4528 3 years ago

      Link it

  • @cipher4213 3 years ago +71

    “It’s very easy to get a rock to sit still.” This is the scientific breakdown we came here for lol! Love you Scott!

    • @MaverickBlue42 3 years ago +3

      Tell that to the sailing stones in Nevada....

  • @cdl0 3 years ago +23

    Let's not forget the two Viking Mars-landers, which had a pair of rather unusual facsimile cameras, and Pioneers 10 & 11 which exploited the spin of the spacecraft to construct images of Jupiter and Saturn with two-channel imaging photopolarimeters.

    • @mcbethjb 3 years ago

      Something that had been perfected on airborne and earth orbiting satellites

  • @rebeccarivers4797 3 years ago +43

    CCD cameras can also have the problem of "flooding" a full column of pixels if some of the pixels get a huge number of photons (pointing at the sun, or if you overexpose the pixels).

    • @tybofborg 3 years ago +6

      Oh so that's what it was! I always wondered about that. I have a digital camera that's like 10+ years old that did that.

    • @alainmaury5941 3 years ago +5

      There are two aspects to this. One is blooming, which you get on certain chips, i.e. you take a picture of a bright star and it leaves a bright trail after the star; the other is that when you have a bad pixel, it kills the rest of the column of the CCD. After several years at altitude (observatories are often at altitude), the CCD chips are full of dead columns. Anyway, it's a moot point; nobody manufactures consumer CCDs anymore.

    • @VaguelyAmused 3 years ago +1

      Anti-blooming is your friend. Sony CCD sensors have a very good implementation

    • @YuffX 3 years ago

      @@VaguelyAmused Where are they used? They haven't used them in their cameras for a while.

    • @VaguelyAmused 3 years ago

      @@YuffX Honestly I am not sure if they are still used, everything has pretty much moved over to CMOS. Nikon cameras used to use Sony sensors (I know because I've removed 2 from D40's to make a DIY astronomy camera)

  • @johnladuke6475 3 years ago +482

    Obviously they take B&W photos because that's how a true artiste expresses their work. Duh.

    • @nicosmind3 3 years ago +18

      Ahh, so NASA is full of emos. Good to know

    • @666Tomato666 3 years ago +8

      It's Aht Dahling! Look it up!

    • @Marinealver 3 years ago +2

      Honestly, I find that the benefit of humanity is becoming less and less of a reason for conducting science.

    • @NimbleBard48 3 years ago +1

      "artiste" - if that's a typo, it really fits in this context :D

    • @johnladuke6475 3 years ago +7

      @@NimbleBard48 No, "artiste" is no typo, it's a pretentious dbag. They're the one telling you how you clearly just don't *get* their work, but it's okay, nobody will until they're dead.

  • @Aubstract 3 years ago +2

    Hey Scott, another form of sensor that is very interesting is MKIDs. They’re very new, and are very well suited for research-level astronomy on large telescopes. They’re cooled to millikelvin temperatures, and each pixel can count individual photons, and determine the energy of the photon simultaneously. So it’s kinda like an array of tiny spectrophotometers. I don’t fully understand the technology, but I thought I’d share.

  • @AlexanderBatyr 3 years ago +22

    I really appreciate that you mentioned Sigma's Foveon sensor!
    Back in the day I recommended the Sigma DP1 to my father, and even though it had only a 5 Mpx sensor, his photos had outstanding color rendering, with an especially pleasant blue sky that none of the Bayer cameras could match.

    • @MomentousGaming 3 years ago +2

      Yep, and very little colour artifacting, which can still be an issue even on BSI CMOS.

  • @martinhughes2549 3 years ago +1

    My understanding is that Bayer filters reduce actual resolution to 50% of the notional megapixels on your camera (50% of the Bayer filter sites are green, the colour your eye is most sensitive to).
    The digital CCD imagers in spacecraft have to be space-rated, and they have to reduce data size to save on memory and transmission time (bandwidth), so the New Horizons probe's camera was just 1 MP or so. To maximise image quality, a filter-wheel setup makes more sense.
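
The "half the sites are green" point can be checked with a tiny sketch of the RGGB mosaic (a minimal illustration, not any camera's actual readout code):

```python
# Minimal Bayer (RGGB) sketch: build the color-filter mask for a small
# sensor and check the share of green photosites. Pure Python.
def bayer_mask(rows, cols):
    """Return an RGGB filter letter for each photosite."""
    pattern = [['R', 'G'],
               ['G', 'B']]
    return [[pattern[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

mask = bayer_mask(4, 4)
flat = [f for row in mask for f in row]
green_fraction = flat.count('G') / len(flat)
print(green_fraction)   # -> 0.5  (half of all photosites sample green)
```

So each full-color pixel has to be interpolated from neighbors that sampled only one channel, which is where the resolution loss the comment describes comes from.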

  • @AlexForencich 3 years ago +3

    Couple of things to note: it's also possible to take pictures with no filter selected. And in this case, it's not called 'black and white' or 'monochromatic', it's called 'panchromatic', as it represents all of the colors. If you're taking separate images with red, green, and blue filters, you can also take a fourth (panchromatic) image with no filter and then combine the brightness data from the panchromatic image with the color data from the other three images. This potentially provides better signal to noise ratio, although the image may also include non-visible light (IR and UV). In some cases, the panchromatic image is captured at a higher resolution (and possibly with a completely different sensor or camera) than the filtered images.
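
The pan-plus-RGB combination described here is often done with a simple ratio ("Brovey-style") pan-sharpen. A minimal sketch, assuming NumPy and co-registered float images (the function name and test values are made up for illustration):

```python
import numpy as np

# Simple Brovey-style pan-sharpening sketch: scale each color band so that
# the band mean matches the (higher-SNR) panchromatic image. Illustrative.
def pan_sharpen(rgb, pan, eps=1e-6):
    """rgb: (H, W, 3) float array; pan: (H, W) float array, same grid."""
    intensity = rgb.mean(axis=2)
    ratio = pan / (intensity + eps)       # per-pixel brightness correction
    return rgb * ratio[..., None]         # keep the hue, take pan's brightness

rgb = np.full((2, 2, 3), 0.2)
pan = np.full((2, 2), 0.4)               # pan frame is twice as bright
out = pan_sharpen(rgb, pan)
print(out[0, 0])                          # each band scaled toward pan
```

Because the ratio only rescales brightness, the color balance from the filtered frames survives while the detail and noise level come from the panchromatic frame.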

  • @yasnac7576 3 years ago

    Scott, this has got to be your best video ever! It's very technical and very revealing at the same time. I do astrophotography, and I knew about this process with CCD and CMOS. It's very telling when the public doesn't understand. Thank you very much, keep up the good work.

  • @Ittiz 3 years ago +3

    When the company I work for was moving away from 3CCD cameras (which I'm surprised you didn't mention), I advocated for switching to the Foveon technology, although there were too many drawbacks, as you pointed out. We ended up going with a combination of Bayer and filter wheel depending on the application. Recently we have also started using multicolored lamp houses with RGB LEDs, which wouldn't work for most space applications because you're using ambient light.

    • @Mythricia1988 3 years ago

      Oh, I'm gonna guess this was for industrial application? Using a monochrome sensor and then providing the light using colored light in the absence of broadband ambient light is actually pretty clever.

    • @markc2643 3 years ago

      @@Mythricia1988 That's how Fluorescence Microscopy works when needing different color channels. It makes some pretty pictures.

  • @enjibkk6850 3 years ago +2

    For CMOS cameras, one drawback I read about is that since each pixel has its own amplification circuit, each pixel is amplified differently (not the exact same gain for every pixel).

    • @jpdemer5 3 years ago +1

      I think the final array is calibrated to compensate for that. I don't know if the calibration goes out of whack as the device ages.

    • @chrisdejonge611 3 years ago +1

      @@jpdemer5 You typically 'calibrate' a camera every night, using flat-field and dark images, and then against reference objects (stars of known magnitude). The nice thing is that the late-afternoon sky (dusk, just after sunset) provides very uniform illumination, so you typically begin a night of observation by taking the flat-fields at dusk! You can also do it in the early morning after the observations.
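
Flat-fielding and dark subtraction reduce to simple per-pixel arithmetic. A minimal sketch (synthetic 2x2 frames, NumPy assumed; real pipelines average many dark and flat frames first):

```python
import numpy as np

# Standard frame calibration sketch: subtract the dark frame, then divide
# by a unit-mean flat-field to undo per-pixel gain differences.
def calibrate(raw, dark, flat):
    flat_corr = flat - dark
    flat_norm = flat_corr / flat_corr.mean()   # normalize to mean gain 1
    return (raw - dark) / flat_norm

dark = np.full((2, 2), 10.0)                          # bias + dark current
flat = dark + np.array([[90., 110.], [100., 100.]])   # uneven pixel gains
raw  = dark + np.array([[45., 55.], [50., 50.]])      # really a flat scene
cal = calibrate(raw, dark, flat)
print(cal)    # all pixels equal after calibration
```

The pixel-to-pixel gain spread visible in the raw frame divides out exactly, which is why per-pixel CMOS gain variation is a manageable problem in practice.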

  • @karlkastor 3 years ago +11

    6:02 You can also see these shifted color channels on Google Maps/Earth when there is an airplane flying between the camera and the ground.
    Also, if people wanna know how analog TV cameras work, Veritasium has a great video on that.

    • @livethefuture2492 3 years ago +1

      Ah, so that's what that was. I always wondered why there was this red/green outline on some of the images.

    • @mcbethjb 3 years ago +1

      WorldView uses a separate sensor for each color channel (with different aim points for each) instead of a color wheel. Same problem, different root cause

    • @karlkastor 3 years ago

      @@mcbethjb That's very interesting! Good to know, thanks.

  • @bobblum5973 3 years ago +1

    Scott, I loved it when you were talking about the differences between CCDs and CMOS image sensors. I actually worked with memory boards for microprocessor-based systems that used Fairchild CCDs for RAM and Texas Instruments magnetic bubble memory for non-volatile storage. Both gave us quality headaches, the technologies were too new. We ended up switching to DRAM because they were getting cheaper and higher density. But as you said, those CCDs still have a use, the right tool for the job.

  • @pulesjet 3 years ago +108

    If you could address every diode junction on your solar panel you would have a giant photo chip.

    • @jokerace8227 3 years ago +37

      Yes, it's actually quite fascinating how photovoltaics, light-emitting diodes, and charge-coupled devices are essentially intertwined technologies.

    • @pulesjet 3 years ago +1

      @@jokerace8227 A CCD requires being scanned and decoded; an LED type simply produces an output.

    • @jan237 3 years ago +10

      @LueLou they produce IR light, not UV

    • @allangibson2408 3 years ago +14

      @@pulesjet Actually LED’s (like all semiconductor diodes) are sensitive to light. The output circuits are however different. On the flip side all diodes emit light when conducting - you can actually use this to test solar panels by pushing current through the diode junctions (they glow in infrared). A CCD is simply a dynamic RAM chip with a clear cover.

    • @ryukisai99 3 years ago +3

      … Without any focusing device.

  • @tomoakhill8825 3 years ago

    Very well said Scott. This topic is conceptually complex and Scott did a good job of giving enough of the complexity to convey why black and white is still a good choice for a camera on Mars. They give the most detail and capture all of the light available. Remember that at 2:00 he shows the grid of color filters used to produce a color image. That means each photodetector gets only a part of the available light, and that each resultant color pixel was produced from three detectors which were in different positions. Not a lot different, but enough to blur the picture. On Mars this small blur, and the discarding of light, is not acceptable.

  • @raykewin3608 3 years ago +21

    Most underrated channel on YouTube.

  • @w6wdh 3 years ago

    Scott, this is an awesome video! You got the details correct about CCD and CMOS sensors, even throwing in the details of the Foveon sensors.
    One development in recent years is the commercial deployment of Back Side Illuminated (BSI) CMOS sensors, such as the Sony sensor in the Nikon D850 camera. Full pixel size photodiodes are fabricated under the CMOS readout circuitry, then the silicon IC is flipped over and milled or polished thinner, so light can get into the photodiodes from the back side (which actually faces the lens). Cameras with BSI CMOS sensors are remarkably sensitive to light.

  • @xodarap37 3 years ago +15

    A polarizer would make an interesting filter choice, in combination with the other filters...

    • @NoHandleToSpeakOf 3 years ago +6

      as well as those laser etched party glasses that make everything appear as love hearts or something

    • @patreekotime4578 3 years ago

      With very little atmospheric scattering and the sun pretty far away, I would think that most of the light is already pretty naturally polarized.

  • @picksalot1 3 years ago +1

    Thanks for explaining the camera technologies. I found it absolutely fascinating!

  • @jsomhorst 3 years ago +9

    Quote of the day: "It's very easy to get a rock to sit still for its portrait"

  • @Mr.Nichan 3 years ago +6

    I was just wondering why I watch videos explaining things I already know, and then this video answered my question: They sometimes include plenty of things I don't know.

    • @adamrak7560 3 years ago

      Same for me:
      I did not know that CCDs can help by tracking the movement.
      I did know that you can use CCDs really creatively, but that one never occurred to me.
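
The "tracking the movement" trick is usually time-delay integration (drift scanning): the charge packets are clocked along the column at the same rate the image drifts across the sensor, so a moving scene integrates without smearing. A toy 1-D sketch (illustrative only, wrap-around scene):

```python
# Toy time-delay-integration (TDI) sketch: the scene drifts one pixel per
# step, and the accumulated charge is shifted in sync, so a moving target
# piles up in a single charge packet instead of smearing.
def tdi_scan(scene, steps):
    """Integrate a scene drifting right 1 px/step, shifting charge along."""
    n = len(scene)
    charge = [0.0] * n
    for t in range(steps):
        if t:
            charge = [0.0] + charge[:-1]        # clock charge 1 px right
        for x in range(n):
            charge[x] += scene[(x - t) % n]     # scene has drifted t pixels
    return charge

def stare(scene, steps):
    """Same drifting scene, but the charge never moves: the source smears."""
    n = len(scene)
    charge = [0.0] * n
    for t in range(steps):
        for x in range(n):
            charge[x] += scene[(x - t) % n]
    return charge

print(tdi_scan([0, 9, 0, 0, 0], 3))   # -> [0.0, 0.0, 0.0, 27.0, 0.0]
print(stare([0, 9, 0, 0, 0], 3))      # -> [0.0, 9.0, 9.0, 9.0, 0.0]
```

The TDI readout collects the full signal in one pixel, while the staring sensor spreads the same light over three.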

  • @randbarrett8706 3 years ago +30

    I very much appreciate Scott saying “iPhone eks” instead of “ten” and even more appreciate that he doesn’t feel the need to wave around the absolute latest phone model

    • @DilainMedia 3 years ago +11

      But it's a little weird when you realize that he works for Apple.

    • @travelsofmunch1476 3 years ago +1

      @@DilainMedia he does?

    • @travelsofmunch1476 3 years ago +3

      @@DilainMedia I’ll be damned, software developer at Apple

    • @docpossum2460 3 years ago +12

      @@travelsofmunch1476 Possibly one of the few people with more than 1M subs who has a day job.

    • @AstroStrongBox 3 years ago

      He probably has the last 4 iPhones :) I wonder why he likes "X" more than "Ten"?

  • @Ynhockey 3 years ago

    I think this is your best video yet: a great explanation of a space engineering concept but with a strong connection to everyday technology as a point of reference. Sometimes explanations about technologies only used for space/industrial purposes are hard to understand because there's nothing to relate to. Thanks and hopefully there are more of these in the future!

  • @jeyycie3656 3 years ago +6

    A small disclaimer: when you say a "pixel" (in a typical Bayer matrix) it's actually composed of four photosites, and when you talk about only one of those four, you just call it a photosite.
    Also, English makes this a bit of a mouthful, because when you talk about the number of pixels a sensor has, that isn't resolution, but definition.
    Resolution is the ability to resolve fine details, and is linked to the size of the photosites relative to the actual size of your sensor, not the number of them. Definition, on the other hand, is related to the dimensions, or the number of rows of pixels displayed in the final image; some sensors or imaging techniques use more (or fewer) than one pixel from the sensor to make one pixel displayed on screen in the final image.
    Finally, a lot of CCDs are much tougher than CMOS and more reliable, are less sensitive to radiation, and some specialized CCDs still have an edge in signal-to-noise ratio (especially at extreme temperatures), which is why they're still used in industry and space-related activities.
    PS: no digital sensor can replicate human color perception; some film stocks can get close, but our brain is just too messy. Color rendition isn't even fixed: when you see a bright color, or a monochromatic color, your brain adapts to it and tries to make it white-ish.
    (I've done the experiment: I was facing a powerful, really deep red light, and 30 minutes later I was seeing pale orange. It took a bathroom break away from the shoot, and actually coming back to the studio, to realize it was not pale orange but deep red again.)
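The photosite-versus-pixel distinction above can be sketched in a few lines of NumPy. This is a toy example with a made-up 4x4 "sensor" and a flat mid-grey scene: the mosaic step keeps one colour per photosite (RGGB), and the simplest possible demosaic bins each 2x2 group of photosites into a single RGB output pixel, so 16 photosites yield only 4 full-colour pixels:

```python
import numpy as np

def mosaic_rggb(rgb):
    """Simulate a Bayer sensor: keep only one colour sample per photosite (RGGB)."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w))
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red photosites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green photosites
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green photosites
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue photosites
    return raw

def demosaic_binned(raw):
    """Combine each 2x2 group of photosites into one RGB pixel
    (quarter the definition, but no interpolation guesswork)."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # average the two greens
    b = raw[1::2, 1::2]
    return np.dstack([r, g, b])

flat_grey = np.full((4, 4, 3), 0.5)  # uniform mid-grey scene
raw = mosaic_rggb(flat_grey)         # 16 photosites, one colour each
img = demosaic_binned(raw)           # only 4 full-colour pixels out
```

Real demosaicing interpolates instead of binning, which is exactly where the artifacts Scott mentions come from.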

    • @Case_
      @Case_ 3 года назад

      Regarding the oddness of human color perception, a nice example of this - as Technology Connections said in a video on the subject, "brown is just orange with context".

  • @richwaight
    @richwaight 3 года назад

    That was super complex but loved the way you described it! Thanks for posting :) awesome

  • @wheetcracker
    @wheetcracker 3 года назад +20

    13:50 Thats exactly how high efficiency solar panels work, too. They have multiple layers to try and soak up as many different wavelengths of the sunlight at the same time as possible.

    • @Mythricia1988
      @Mythricia1988 3 года назад +1

      That, I did not know. That's very cool!

    • @vladimirdyuzhev
      @vladimirdyuzhev 3 года назад

      I guess, efficient by area, but inefficient by weight?

    • @wheetcracker
      @wheetcracker 3 года назад +1

      @@vladimirdyuzhev the weight doesn't really change, because the stack is printed in layers onto the same die.

  • @NASA-Shill
    @NASA-Shill 3 года назад

    Great video. As an astrophotographer, this is why cameras for astrophotography are often monochromatic, because you capture more detail and your noise goes down. Just as with the cameras on Mars, on Earth us astrophotographers use color filters to take 3 separate images (RGB) and then combine them in post-production.
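The filtered-monochrome workflow that comment describes (and the video demonstrates) boils down to stacking three exposures into one colour image. A minimal sketch with NumPy, using tiny made-up 2x2 "exposures" and an optional white-balance scale per channel, since real filters rarely pass equal energy:

```python
import numpy as np

def combine_rgb(r_frame, g_frame, b_frame, white_balance=(1.0, 1.0, 1.0)):
    """Stack three filtered monochrome exposures into one RGB image.
    white_balance scales each channel before stacking."""
    wr, wg, wb = white_balance
    rgb = np.dstack([r_frame * wr, g_frame * wg, b_frame * wb])
    return np.clip(rgb, 0.0, 1.0)

# three fake 2x2 exposures taken through R, G and B filters
r = np.array([[0.8, 0.2], [0.1, 0.9]])
g = np.array([[0.4, 0.4], [0.4, 0.4]])
b = np.array([[0.1, 0.7], [0.9, 0.2]])
color = combine_rgb(r, g, b)
```

Astrophotography software does the same thing with registration and stretching on top, but the core operation really is just this channel stack.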

  • @alexlandherr
    @alexlandherr 3 года назад +40

    I wonder if there’s a setup for the Raspberry Pi that uses a B&W sensor and a color filter so that one could mimic the spacecraft cameras in operation? I would enjoy programming that setup...

    • @the_retag
      @the_retag 3 года назад +4

      Build one?

    • @xureality
      @xureality 3 года назад

      Arducam has several monochrome sensors available; they should be plug and play.
      The easiest way to do the filters would be a hobby servo motor with a 3D-printed filter wheel (because I doubt anyone would have one off the shelf; obviously you'd still have to buy the filters)

    • @AlRoderick
      @AlRoderick 3 года назад +7

      A YouTuber called Matt Gray (friend of Tom Scott) did a video a few years ago where he captured full color images using the black and white Game Boy camera by taking three images with handheld color filters.

    • @lightningvini
      @lightningvini 3 года назад

      NASA has a GitHub page, might be something useful there

    • @rpavlik1
      @rpavlik1 3 года назад +2

      Interestingly, I recently bought a camera for a Pi that had a minimal filter wheel on it: a coil-driven infrared cut filter connected to a light sensor, so it only takes IR-containing photos when there is little or no visible light, to avoid weird colors. But yes, you definitely could build a color wheel (maybe with theatrical lighting gels as the filters?) and stick it in front of a mono camera. For another interesting challenge, try tracking something very precisely on a "rolling shutter" camera (most web cameras and other cameras in video mode): each row is captured at a different time, which can be used to your advantage if you have a very sophisticated camera. (For "production" work we try to stick with "global shutter" sensors, which are overwhelmingly mono)
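The DIY filter-wheel idea in this thread mostly comes down to sequencing: rotate, settle, expose, repeat per filter. Here is a hardware-agnostic sketch of that loop. The filter angles are made up, and the servo and camera are passed in as callables, so on a real Pi you could hand it gpiozero and picamera functions while the stubs below just record what would happen:

```python
import time

# Made-up wheel geometry: which servo angle lines up each filter.
FILTER_ANGLES = {"red": -45, "green": 0, "blue": 45}

def capture_color_set(set_angle, capture_mono, settle_s=0.0):
    """Rotate to each filter, let the wheel settle, grab one mono frame.
    set_angle(deg) and capture_mono() are supplied by the caller, so the
    sequencing logic is testable without any hardware attached."""
    frames = {}
    for name, angle in FILTER_ANGLES.items():
        set_angle(angle)
        time.sleep(settle_s)   # let the servo stop wobbling
        frames[name] = capture_mono()
    return frames

# Stubs standing in for the real servo and camera:
angles_seen = []
frames = capture_color_set(angles_seen.append, lambda: "frame")
```

The three returned frames would then be stacked into colour exactly as the spacecraft do it.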

  • @owensmith7530
    @owensmith7530 3 года назад

    I knew most of this, but the push broom sensor explanation is new to me. What an ingenious way of doing things! Whoever first came up with that must be really pleased with themselves.
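The push-broom idea is simple enough to sketch in a few lines: a single sensor line reads out repeatedly while the scene sweeps past, and stacking the successive line readouts reconstructs a 2-D image. This toy version uses a made-up 4x3 "landscape" and an ideal, noise-free readout function:

```python
import numpy as np

def push_broom_scan(scene, read_row):
    """Build a 2-D image from a 1-D sensor line as the scene moves past it.
    `scene` is the ground truth; `read_row(scene, i)` models one line readout
    at scan position i."""
    return np.stack([read_row(scene, i) for i in range(scene.shape[0])])

scene = np.arange(12).reshape(4, 3)                # a tiny 4x3 "landscape"
image = push_broom_scan(scene, lambda s, i: s[i])  # ideal readout: image == scene
```

Real push-broom instruments add complications (timing the readout to the orbital ground speed, TDI charge shifting), but the geometry is exactly this row-stacking.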

  • @FatHeadDave
    @FatHeadDave 3 года назад +38

    Alone in the bath, listening to Scott...every single man's Saturday night dream 🙃

    • @DoctorJuergen
      @DoctorJuergen 3 года назад +7

      Add scotch ;)

    • @aXYZGaming
      @aXYZGaming 3 года назад +10

      @@DoctorJuergen bathing in... scotch?

    • @FatHeadDave
      @FatHeadDave 3 года назад +4

      @@DoctorJuergen sir, I like your style (however I don't drink so I'll just have a cuppa, that cool too?)

    • @bozo5632
      @bozo5632 3 года назад +4

      Maybe you been in quarantine too long.

    • @Mtlmshr
      @Mtlmshr 3 года назад +2

      Real men don’t take baths they “spit bathe”🤪

  • @rjmunro
    @rjmunro 3 года назад +2

    You didn't mention 3CCD/3CMOS type sensors, usually found in pro-video type cameras which have a prism arrangement to split the red/green/blue colours to 3 separate monochrome sensors.

    • @scottmanley
      @scottmanley  3 года назад +2

      True, I have a 3-CCD camera somewhere, but 3-sensor systems rarely made it to space.

  • @unnatural_log6472
    @unnatural_log6472 3 года назад +3

    For the three layer color sensors discussed at the end, wouldn't it make sense to also do the checkerboard pattern for each layer, so you can better guess at the intensities of each color, and you also get direct measurements of the color one pixel over? Idk if that makes sense to anyone...

    • @trimeta
      @trimeta 3 года назад

      My understanding is that which wavelength is picked up by each layer of the Foveon sensor is a physical property of the silicon used to build the sensor. That is, they didn't choose "OK, we'll have the top layer detect blue, the middle layer green, and the bottom layer red," but rather the fundamental nature of the material made those colors get detected at different strengths at different layers of the sensor. So without radically redesigning the material, they can't make it detect in a different order.
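Because each layer of a stacked (Foveon-style) sensor responds to a broad mix of wavelengths rather than one clean colour, recovering RGB means inverting a crosstalk matrix. A sketch with entirely made-up layer sensitivities (the real spectral responses are a property of the silicon, as the reply above notes):

```python
import numpy as np

# Illustrative (made-up) sensitivities: rows are the three stacked layers,
# columns are their response to pure R, G, B light.
M = np.array([
    [0.1, 0.3, 0.9],   # top layer: responds mostly to blue
    [0.2, 0.8, 0.3],   # middle layer: mostly green
    [0.9, 0.3, 0.1],   # bottom layer: mostly red
])

def layers_to_rgb(layer_signals):
    """Invert the crosstalk matrix to recover an RGB estimate."""
    return np.linalg.solve(M, layer_signals)

true_rgb = np.array([0.6, 0.2, 0.8])
measured = M @ true_rgb            # what the three layers actually record
recovered = layers_to_rgb(measured)
```

The matrix inversion also amplifies noise when the layer responses overlap heavily, which is one practical drawback of this sensor design.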

  • @youerny
    @youerny 3 года назад +1

    Really interesting! I’d like to watch a part II about image compression algorithms (if any) and the radio transmission of pictures, videos and scientific data in general from deep space. It is in fact not very well explained elsewhere on YouTube

  • @KinreeveNaku
    @KinreeveNaku 3 года назад +3

    This all reminds me of that old spacecraft camera that had a green lens cylinder, red lens cylinder, and blue lens cylinder next to each other and it would just take the three images and overlay them like Scott mentioned. It might’ve been Hubble

  • @KokkiePiet
    @KokkiePiet 3 года назад

    Thanks Scott for this 15 minute crash course in sensor technology and what you can do with it! Very informative! !

  • @jokerace8227
    @jokerace8227 3 года назад +4

    Anyone who has done serious astrophotography understands the overall advantage of wideband monochrome CCD cameras with filters, over the RGB CCDs. Many of the monochrome CCD can "see" a ways into both the infrared and ultraviolet, well beyond human perception. Some filters are designed to exploit that.

    • @nashsok
      @nashsok 3 года назад +2

      I have an old A7s that I sent off to have the hot mirror replaced with a piece of fused quartz so I can do infrared and ultraviolet terrestrial photography - it is really amazing to see everyday objects in ways that you've never seen them before!

    • @Mythricia1988
      @Mythricia1988 3 года назад

      @@nashsok Oof, now I'm thinking about getting into film IR photography again... Or digital, but that requires some tinkering. Either way, I never actually got around to doing IR photography but I really want to try it one day. I have the filters and everything!

  • @cwmaguire
    @cwmaguire 3 года назад +1

    Fascinating. Thank you. Some of this I've heard before but it was great to hear it all again in the context of space craft.

  • @flamingmohmohawesome4953
    @flamingmohmohawesome4953 3 года назад +179

    Sorry, but imma have to watch this a second time. The first time I wasn't listening because I was trying to decipher your shirt.

    • @rocketsometimeslaunches8902
      @rocketsometimeslaunches8902 3 года назад +21

      Oh good. It wasn’t just me

    • @benjaminhanke79
      @benjaminhanke79 3 года назад +3

      Is it machine language?
      I think he gave some hints about it in a previous video.

    • @michaelhaney9432
      @michaelhaney9432 3 года назад +5

      Did you get there? My first thought was that it was the code used to correct the error on Apollo 11, but I think that's wrong.

    • @michaelhaney9432
      @michaelhaney9432 3 года назад +5

      @@benjaminhanke79 I don't think so as there are letters greater than f.

    • @HiddenWindshield
      @HiddenWindshield 3 года назад +1

      It doesn't help that his lapel mic folds his shirt so you can't see some of the characters.

  • @markmitchenall5948
    @markmitchenall5948 3 года назад +1

    While I knew the reason, I could never have explained it so well or with quite so much additional detail. Thanks!

  • @chris-hayes
    @chris-hayes 3 года назад +3

    Fascinating! Clicked to learn about Perseverance, stayed to learn about image processing.

  • @technoadmin
    @technoadmin 3 года назад

    You never disappoint Mr Manley. Loved all the research you did for this.

  • @jonnoMoto
    @jonnoMoto 3 года назад +5

    Forgot about one technology: the 3CCD sensor, which used a prism to split light to separate R, G & B sensors.

  • @wafflesnfalafel1
    @wafflesnfalafel1 3 года назад

    dude - absolutely love it. I had no idea there is so much going on with the various digital image sensors.

  • @julese7790
    @julese7790 3 года назад +3

    I've been into astrophotography for a year (because of your $5 scope video, btw) so I understand quite a bit of this video :p Anyway, I'm going to share this with my friends on my favorite social media to create some "mise en abyme" (as we also say in France, between two baguettes and two white flags)
    Anyway, TY Mr Manley

  • @menachemgold7677
    @menachemgold7677 3 года назад

    One of the most interesting, informative videos you've made in a while. For a space nerd like me it was amazing to realize how complex a whole subject I hadn't even noticed is.

  • @alexlandherr
    @alexlandherr 3 года назад +3

    I know from reading that NOAA APT weather satellites use nIR to approximate the green components of the image data.
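When a satellite downlinks only a visible and a near-IR channel, a green band can be synthesized as a weighted mix of what was actually measured. The weights below are purely illustrative placeholders, not NOAA's actual coefficients:

```python
def synthetic_green(red, nir, a=0.9, b=0.1):
    """Approximate a missing green channel as a weighted mix of the red and
    near-IR channels. a and b here are made-up illustrative weights."""
    return a * red + b * nir

# a pixel that measured 0.5 in red and 0.3 in near-IR
g = synthetic_green(0.5, 0.3)
```

Vegetation is very bright in near-IR, so mixes like this tend to render plants plausibly green even though no green filter was ever involved.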

  • @AeroLowdown
    @AeroLowdown 3 года назад +1

    The depth of your knowledge never fails to amaze me Scott!

  • @josefkrakel9136
    @josefkrakel9136 3 года назад +3

    I had always assumed that black and white was more accurate and more amenable to image processing.

    • @un-nerdyneko
      @un-nerdyneko 3 года назад +1

      same

    • @scottmanley
      @scottmanley  3 года назад +3

      It is when you don’t have a bayer mask messing you up

  • @GoWstingray
    @GoWstingray 3 года назад

    Bet you're glad to get through that one Scott, the smile at the end gave it away. Great video, thanks.

  • @guilhem3739
    @guilhem3739 3 года назад +4

    I am obsessed by the message on your t-shirt ...

  • @michaelwoodhams7866
    @michaelwoodhams7866 3 года назад

    I was very peripherally* involved in the early days of the Sloan Digital Sky Survey, c. 1990, which also used the 'push broom imaging', except we called it 'drift scan'. As I recall, the main camera had 30 squarish CCDs (2k x 2k I think?) and some of them (the UV sensitive ones) cost around $100k each. (There were also some long rectangular CCDs at the edge of the field, I think they were primarily to aid tracking, but also help with the bright stars which overwhelm the main CCDs.)
    * I.e. they let graduate students attend planning meetings. We could say stuff, but I can't recall any instance of a grad student idea being useful.

  • @macdjord
    @macdjord 3 года назад +12

    Scott, what's that shirt referencing?

    • @KinreeveNaku
      @KinreeveNaku 3 года назад

      Someone else decoded it as VLADIMIR THE IMPALER

    • @tricorderandrewjw
      @tricorderandrewjw 3 года назад

      @@KinreeveNaku that's not it

    • @junholee4961
      @junholee4961 3 года назад

      @@tricorderandrewjw why?

    • @cut--
      @cut-- 3 года назад +1

      it's not in qwerty that's for sure !

    • @macdjord
      @macdjord 3 года назад +1

      @@KinreeveNaku Yes, I saw. But they have given no explanation for how.

  • @Crlarl
    @Crlarl 3 года назад +1

    Colour/filter wheels are literally the oldest trick in "the book." Photographers have been doing this to take colour images for (approximately) as long as photography has existed.

  • @arkitect5692
    @arkitect5692 3 года назад +40

    The new mars rover, scoot manley

    • @certifiedkerbal9717
      @certifiedkerbal9717 3 года назад +2

      yes

    • @protonjinx
      @protonjinx 3 года назад +4

      Elon Musks first colony ship to Mars will carry a rover called Scotty McManlyface.

    • @tarmaque
      @tarmaque 3 года назад +4

      NASA is really conservative when naming its vehicles and probes, which to me is very irritating. I mean, they named the Crew Dragon Demo spacecraft "Endeavour" when they could just as easily have named it "Puff" or "Smaug" or "Ramoth." (Who _wouldn't_ want to ride to orbit on "Puff the NASA Dragon?") Likewise I think a future rover should be named after Randall Munroe, and a SpaceX Starship should be named "Boca Chica Gal." Elon Musk is a fan of Iain Banks novels, so I fully expect some future spacecraft to be christened the "Very Little Gravitas Indeed."

    • @Marinealver
      @Marinealver 3 года назад

      Winner 🏆🏆🏆🏆

    • @patreekotime4578
      @patreekotime4578 3 года назад

      @@tarmaque Well, they DO have taxpayer dollars to worry about. A senator with an agenda to kill a budget item would have a much easier time killing something with a silly name than the umpteenth synonym for "exploration".

  • @hobbykip
    @hobbykip 3 года назад +1

    I would be interested in what other filters they use in their main science cameras. CMOS or CCD sensors generally have bad quantum efficiency (efficiency of converting photons -> electrons) at longer wavelengths like IR. EMCCD could improve this but does not improve the wavelength range. InGaAs sensors would increase the spectral bandwidth for filters, but I guess too low resolution for now?

  • @ryanmcgowan3061
    @ryanmcgowan3061 3 года назад +17

    I thought for sure you'd mention that high-quality amateur telescope cameras are monochromatic, and that you can buy narrow-band filters to create false-color imagery. I would be inclined to think you own one, being the kind of guy you are.

  • @charleselmi1568
    @charleselmi1568 3 года назад

    Love coming here and learning actual real stuff, from someone with real knowledge. Thank you for sharing all this!

  • @-Kerstin
    @-Kerstin 3 года назад +8

    This is like reverse click-bait. The title makes the video sound boring as hell but the actual video is great

  • @1xBublex1
    @1xBublex1 3 года назад

    Coming from microscopy I have to affirm everything you said! We are also using (s)CMOS cameras to increase readout rates (up to 100-200 FPS) for single-molecule localization microscopy or single-molecule tracking, where you (try to) localize single fluorescent molecules, and you need high signal to noise and have very low signal.
    Very nice explanation! I also had to explain the reason why our images are taken monochrome and not in color to many students :D

  • @choccychewer8386
    @choccychewer8386 3 года назад +23

    Whats that shirt?

    • @Paul_Ch52
      @Paul_Ch52 3 года назад +7

      Well, if NASA can do it why not Scott?

  • @redtails
    @redtails 3 года назад

    For your interest: the old tech lives on in photomultiplier tubes, for situations where a low signal needs to be captured at scientific accuracy. You see them in high-end fluorescent laser-scanning microscopes. The "color" filtering is done by exotic dichroic color wheels. The same sort of tricks need to be applied to get color images

  • @mastershooter64
    @mastershooter64 3 года назад +4

    I hate it when people say "wow, we've spent billions of dollars of taxpayers' money but we've only seen black and white images". People really need to know that just because you don't know what something is doesn't mean it doesn't exist

  • @sebbes333
    @sebbes333 3 года назад +1

    So, for the animation at 12:02, would it have been better if ALL those image sensors at 11:11 were aligned one after another? That way we could have gotten a much longer animation?

  • @omsi-fanmark
    @omsi-fanmark 3 года назад +3

    Either Apple is no longer paying well or YouTube urgently needs more money: YouTube tried to run ads 4 times during this video! (and failed)

  • @korban8971
    @korban8971 3 года назад

    Thanks Scott. Enjoyed this a lot. Interesting to follow as I recently started astrophotography and there were clearly a lot of common threads to the spacecraft cameras!

  • @dipakahir4688
    @dipakahir4688 3 года назад +15

    Ironically there is a meme like:
    Full HD cameras on Mars rovers but crap quality surveillance cameras in banks/ATMs

    • @Am_Yeff
      @Am_Yeff 3 года назад +6

      It goes like this:
      A rover on mars: *Jezero crater*
      CCTV cameras: *Unrecognisable mess of pixels*

    • @dipakahir4688
      @dipakahir4688 3 года назад

      @@Am_Yeff exactly 😂😂

    • @randomnickify
      @randomnickify 3 года назад +3

      FILE....SIZE...MATTERS - banks and shops need to keep days' if not weeks' worth of 24-hour/day footage from multiple cameras; with 4K cameras you would need a disk drive farm to keep it.

  • @wictimovgovonca320
    @wictimovgovonca320 3 года назад +1

    The FOVEON sensor sounds like old school color film. It used multiple layers of emulsion with subtractive color formation using cyan, magenta and yellow dyes instead of the additive red, green, and blue colors.

    • @alainmaury5941
      @alainmaury5941 3 года назад +1

      You should look into Lippmann photography then... I mean way back then...

    • @wictimovgovonca320
      @wictimovgovonca320 3 года назад

      @@alainmaury5941 very interesting, I learned something new today. I have seen interference patterns used to test telescope mirrors using an Interferometer, and I believe something similar is used for generating holographic images. Note that multiple exposures through color filters on separate plates predates Lippmann by 40 years or so, but as far as I know required multiple projectors to recombine the color image.

  • @vvvvvvvvvwvvvvw
    @vvvvvvvvvwvvvvw 3 года назад +6

    That shirt looks familiar. Can anyone tell me what thats about?

    • @NavySeal2k
      @NavySeal2k 3 года назад

      About impaling people; more precisely, people who had impaled people earlier.

  • @venturestar
    @venturestar 3 года назад +1

    With Scott you always learn something new!

  • @scott_meyer
    @scott_meyer 3 года назад +14

    To be "more" correct. Monochrome cameras... One color at a time.

  • @OriginalUnknown2
    @OriginalUnknown2 3 года назад

    Thank you, Scott! I have been google-ing this question for so long now and I could never find out (concretely) why they need to "process" color for images, rather than just take normal pictures like any old terrestrial camera :D

  • @jasonstevens8834
    @jasonstevens8834 3 года назад +8

    My boi Scott works for Apple and pronounces the X in iPhone X as “ex” and not 10🧐

  • @ConsitentlyInconsitent
    @ConsitentlyInconsitent 3 года назад

    Thanks for the Image of Iridian Spectral Technologies Optical Filters Scott!

  • @mitchelldean5397
    @mitchelldean5397 3 года назад +3

    Wow this is the soonest I ever got here

  • @IanZainea1990
    @IanZainea1990 3 года назад

    @Scott 2:45 ... you can absolutely do a comparison on YouTube of how our eyes perceive color differently than brightness. It's done all the time. You take an image, convert it to something like YCbCr, and then blur just the Y channel (luminance); the image turns into a mess of color sections that we can't tell anything about. If you instead blur the Cb and Cr channels, you get an image with crisp, clear lines in the luminance with splotches of color on top, but you can still tell exactly what the image is. FLIR thermal cameras sort of use this.
    Example: ianzainea.com/examples/blurred-images.php
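That chroma-blur experiment is easy to reproduce without any imaging library. The sketch below uses the BT.601 RGB-to-YCbCr matrix and a crude 3x3 box blur (with edge wrap) on a random test image; only the implementation details here, like the box blur and the 8x8 test size, are arbitrary choices:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """BT.601 full-range RGB -> YCbCr (inputs 0..1, chroma centred on 0.5)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.dstack([y, cb, cr])

def box_blur(channel):
    """Crude 3x3 box blur with edge wrap; enough to make the point."""
    acc = sum(np.roll(np.roll(channel, dy, 0), dx, 1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return acc / 9.0

img = np.random.default_rng(0).random((8, 8, 3))
ycc = rgb_to_ycbcr(img)
chroma_blurred = ycc.copy()
chroma_blurred[..., 1] = box_blur(ycc[..., 1])  # smear Cb
chroma_blurred[..., 2] = box_blur(ycc[..., 2])  # smear Cr
# luminance left untouched: edges stay sharp even though colour is smeared
```

This asymmetry is also why JPEG and most video codecs subsample chroma but never luma.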

  • @flags5765
    @flags5765 3 года назад +3

    Come on people, let Perseverance clear his eyes, he's got dust all over them

  • @johnhjic2
    @johnhjic2 3 года назад

    Hello Scott, Thanks for a great video. Keep well, keep safe and be happy.

  • @marcinmarcin2506
    @marcinmarcin2506 3 года назад +15

    Quickest fan lol, just kidding

  • @dgthall
    @dgthall 3 года назад

    I think that was the best explanation of that topic I've ever heard... fascinating!

  • @luma8212
    @luma8212 3 года назад +7

    Pog
    First

    • @un-nerdyneko
      @un-nerdyneko 3 года назад +1

      joe bidome meme

    • @luma8212
      @luma8212 3 года назад +3

      @@un-nerdyneko yes vote for me 2024

    • @americankid7782
      @americankid7782 3 года назад +3

      This comment deserves to be top comment.

    • @luma8212
      @luma8212 3 года назад +2

      @@americankid7782 maybe

    • @Am_Yeff
      @Am_Yeff 3 года назад +2

      Joe wheres the obamium

  • @flamencoprof
    @flamencoprof 3 года назад

    I thought I was pretty well aware of the tech in CCDs, filters, etc. Turns out there is a reason I watch this channel. Thanks.

  • @Spedley_2142
    @Spedley_2142 3 года назад

    Scott's stutter at 11:28 was probably because sensors often don't use RGB but some form of CYV. The Value pixel captures all light and provides white balance, the Cyan captures green AND blue, and the Yellow captures red AND green. From those you can calculate RGB values, but with 78% of incoming light used rather than 33%.
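The recovery that comment describes is simple additive arithmetic. In the ideal, noise-free case with cyan = G+B, yellow = R+G, and a "value"/white sample = R+G+B, the RGB channels fall straight out (the example pixel values below are made up):

```python
def cyv_to_rgb(c, y, v):
    """Recover RGB from complementary-mosaic samples, assuming the ideal
    noise-free model: cyan = G+B, yellow = R+G, value = R+G+B."""
    r = v - c          # (R+G+B) - (G+B)
    b = v - y          # (R+G+B) - (R+G)
    g = c + y - v      # (G+B) + (R+G) - (R+G+B)
    return r, g, b

# a pixel whose true colour is R=0.2, G=0.5, B=0.3
r, g, b = cyv_to_rgb(c=0.8, y=0.7, v=1.0)
```

The catch, and the reason such mosaics faded away, is that each output channel is a difference of noisy measurements, so colour noise is amplified even though more light is collected.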

  • @ShukenFlash
    @ShukenFlash 3 года назад +2

    I imagine there's also a bandwidth saving when transmitting a monochrome image, since all you'd need is the luminance data. I wonder if that holds true when using the filters: is it easier to send 3+ monochrome images and translate them into color than to send a color image?
    Well, looks like I'm going down a Wikipedia/Google rabbit hole

    • @patreekotime4578
      @patreekotime4578 3 года назад

      RAW images are a data dump almost straight from the sensor, so they contain all of the data from all of the photosites together, which means a RAW roughly resembles a "fuzzy" B&W image. The RAW files are HUGE compared to even a decoded, compressed color image. They also tend to contain data outside the dynamic range of human perception. So some of the delay in the NASA image postings is the time it takes to receive the RAW files, and some of it is the time it takes to decode those files into color images. The cameras may be doing a rough-and-dirty decoding that generates a very small B&W file for the initial downlink.

    • @Mythricia1988
      @Mythricia1988 3 года назад

      @@patreekotime4578 Apparently some of the cameras do JPEG compression before transmitting, which I find kind of fascinating. I assume they use JPEG to send thumbnails or preview images, and then based on that they re-transmit select images in full raw glory. But yeah, hardware advances have allowed them to do on-board JPEG compression on Percy (and I believe Curiosity as well, same computer hardware) without issue which is cool.
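The bandwidth question in this thread can be made concrete with a back-of-the-envelope size calculation for uncompressed images. The 1600x1200 resolution and bit depths below are illustrative stand-ins, not the specs of any actual rover camera:

```python
def image_bytes(width, height, bits_per_sample, channels=1):
    """Uncompressed size of an image in bytes."""
    return width * height * channels * bits_per_sample // 8

mono_8bit = image_bytes(1600, 1200, 8)               # one quick-look mono frame
raw_12bit = image_bytes(1600, 1200, 12)              # Bayer raw, 12-bit samples
rgb_24bit = image_bytes(1600, 1200, 8, channels=3)   # demosaiced 8-bit colour
```

So even before compression, an 8-bit monochrome quick-look is a third the size of the demosaiced colour product, which is one reason low-bandwidth first images arrive as B&W.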