Your determination in the face of multiple failures with trial and error is inspirational. 11:50 "The thing I've learned over the years is that there's a finite number of issues. And eventually... it works."
So, you basically put a camera on a stick that's long enough to look behind a small barrier, the same way you'd put a mirror on a stick to look around edges.
yeah, the camera is peeking around different corners and then creating a flat image by combining different sides of the subject of the photos into 1. I love his projects but this is just not what he says it does
@@neb_setabed His sensor is in thousands of places. It would take you a very long time to put a normal camera at the same place as his sensor. Especially accurately.
I don’t know if it’s been done before, but that ortho/parallel projection image is SO cool. I’ve done it in 3D software, but seeing it actually done is literally mind-warping.
14:00 - MIND. BLOWN. Never seen anything like this! Never even imagined it would be possible. This is cool! The funny thing is, the "seeing around objects" stuff at 16:30 is somehow less impressive, as you could really just move any old camera around the object without needing to spin it. But the no-perspective images are so unlike anything our eyes could ever see. I'm seriously impressed!
As a mech eng. student, I can't stress how cool this is. We just had our 2nd year class on sensors and instrumentation, and our prof. talked our heads off about how impedance amplifiers and active filters work, and when you showed a use case for it, it finally clicked for me (haha, no offence, he was a great prof). I find that eng. students like me often face the struggle of finding a project like this to work on. I think the most inspiring thing about this is your ability to take ideas like this and turn them into a crazy-creative project. Definitely kickstarted my brain to try and find awesome project ideas like this!
Something cool to try would be to use the full sensor of the camera, instead of just a point, save all of the images that it takes, and reconstruct the photo afterwards to use whatever lens / FOV you want. Awesome video as always!!
11:42 I think you mean "hydra whack-a-mole" the hydra was the mythological creature that grew 2 new heads every time you cut one off, Medusa was the gorgon that had snakes for hair and turned you to stone if you looked at her.
I could be wrong, I have not bothered fact checking, but don't Medusa's snakes have the same property of "kill one, two grow in its place"? I did think the same thing at first tho, which is how I found this comment
@@yoitszaitz That is something I have never heard before, and if it is true it'd be so obscure as to not be a reference anyone would understand as "correct", when the Greek monster with the more obvious cutting-off-heads-and-multiplying schtick is the hydra.
as a filmmaker and camera assistant, I find this mind blowing, and a fantastic demonstration of the capabilities and functionality of what I work with. Gives me a better perspective, thank you.
There actually are cameras that see around corners. They measure the time it takes photons to bounce off a wall, hit the object, off the wall again, and then hit the detector. Then with tons of math you can get the general shape of an object that’s occluded. The Smithsonian Magazine has an article about it from 2012 if anyone is interested.
I thought that this was kinda what he was going to do, cuz what he did doesn't really see "behind" the wall, the sensor just looks above/around it. I mean, it's still cool, but I wouldn't call it "seeing behind a wall".
While my amped up engineering senses think this is awesome, this is more like a fancy way of using a wall as a temporal mirror. I mean still freaking awesome, but kind of still cheating in a way :) We need really small black holes to really "see around corners". Oh, and also really strong walls.
@@Mismatch-it's better than seeing behind it. It's seeing _through_ it, because the perspective is from a point that is directly blocked by the obstruction
Great video, but a small correction for 17:40: most cameras only have a single sensor element per pixel, and on top of the sensor there is an RGB Bayer pattern applied, meaning some pixels only see red, some green and some blue. The final image is an interpolation of the data from each filter.
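To make that interpolation step concrete, here's a minimal sketch (my own illustration, assuming an RGGB layout and plain bilinear interpolation; real camera pipelines use much fancier demosaicing):

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw):
    """Bilinear demosaic of a single-channel RGGB Bayer mosaic (H x W floats)."""
    h, w = raw.shape
    # Which sensor sites saw which colour (RGGB tiling).
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    # Standard bilinear interpolation kernels; green sites are twice as dense.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

    rgb = np.zeros((h, w, 3))
    for ch, (mask, k) in enumerate([(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]):
        rgb[..., ch] = convolve(raw * mask, k, mode='mirror')
    return rgb
```

Here `raw` would be the single-channel readout straight off the sensor, and the result is an H x W x 3 RGB estimate.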
Everything apart from the looking round objects bit was stunning. Camera on a stick can look round stuff regardless. The reverse and Orthographic stuff is mind bending, being able to create custom lenses is freaking nuts!
Camera on a stick can look around things, but this is the only camera that can look around things with a perspective that looks like you're looking directly _through_ the obstruction. Notice how he's centered in the image, and you're not looking at him from the side. No other camera can do that.
Love the camera :) Your optics are diffraction-limited, so if acting as a conventional lens, a touch of super-resolution should be trivial. This is essentially accomplished by deconvolving the image with the point spread function. But... for the Orthographic Photo, the point spread function depends on distance. So wherever you have contrast the point spread function can be detected and the distance can be calculated. So in short, you have a depth camera that can detect the distance of edges. That image would be awesome. To detect the point spread is relatively easy - it's a Gauss function with height proportional to the Laplacian of the image.
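For anyone who wants to try that on the published raw data, here's a minimal sketch of the deconvolution step (assuming a single known Gaussian PSF and simple Wiener regularisation; estimating the distance-dependent PSF that would give you the depth trick is the hard part and isn't shown):

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered 2D Gaussian point spread function, normalised to sum to 1."""
    h, w = shape
    y, x = np.mgrid[:h, :w]
    g = np.exp(-(((x - w // 2) ** 2 + (y - h // 2) ** 2) / (2.0 * sigma ** 2)))
    return g / g.sum()

def wiener_deconvolve(image, psf, k=1e-3):
    """Frequency-domain Wiener deconvolution of `image` blurred by `psf`.

    k is a noise-to-signal regularisation constant: larger k means less
    ringing but also less sharpening.
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))   # PSF with its peak moved to (0, 0)
    G = np.fft.fft2(image)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

# e.g. sharpened = wiener_deconvolve(photo, gaussian_psf(photo.shape, sigma=2.0))
```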
If you’re interested in messing around with data from the camera I’ll be posting some of the raw data for these pictures on my Patreon: patreon.com/stuffmadehere. Also, THANK YOU to all the patrons who support these projects!
Ohhh new video, I am down! x3
He made stuff
What are the key benefits of Patreon membership again? 🤔
Who needs drugs when you can have engineering???? I DO.... It's called coffee and red bull.....I'd also go for something stronger and less legal, but I've already blown my budget on the parts of the engineering project 😂
i am so grateful that you continue to defy the incentives to make "lightning vs laser" videos or whatever. gonna finally sign up to your patreon now to hopefully help keep it that way 🤞
When you made the real life Orthographic Photo I was so happy for you. That's a huge deal.
Awesome video!
❤
Agreed, this one is HUGE! Very impressive result, congrats!
There is an amazing video by Applied Science where he plays around with an actual physical lens that does the same!:
watch?v=iJ4yL6kaV1A
Incredible and has some pretty interesting implications for how the Nth iteration of this camera could be utilized. So cool!
Yo, Destin, shouldn't you be blowing something up? Why are you wasting time on YT? Just kidding, love your stuff.
I know I'm probably the odd one but I liked it more when you didn't gloss over the details and the debugging process this much. It made me learn stuff so it wasn't just entertainment. Impressive project nonetheless!
Absolutely
Yeah this one was a bit rushed it felt
Yeah same
I could totally watch the 2 hour version of this video with all the blind alleys, code, math, everything.
Big agree! You're not the only weird one... I find Stuff's troubleshooting / trial and error inspiring and interesting; it was slightly missed in this video, but still an amazing video!
All you need to see around walls is a dense enough object to perform gravitational lensing. Can your CNC work on singularities?
Don't give him ideas, next thing we'll know is there's an apocalyptic event at a warehouse
Bending light is a solution
Or just use mirrors and make a periscope...
or a window
Brick is dense enough to hit the wall
Fantastic work! This must have some practical applications. I was disappointed not to see a perspectiveless picture of a cube. Could it be used to create clothing patterns on a mannikin? I suppose that existing 3D CAD can do that another way, but it might be a way to reverse engineer a pattern from clothing. Could you create a funky animated sequence by scanning the same thing many times with different lenses and then viewing the pictures in sequence?
Yeah, I was expecting a cube, and a lot of other pictures. Just one picture per simulated lens type is really not enough.
Oooh! It's Lindy! I'm allowed to be a pedant! A mannikin/manikin is an anatomical model, a mannequin is for fashion, though when I was in art class it was always a mannequin here in the US.
Yes! I came here to say the same thing. Really wanted to see a perspectiveless photo of a cube, and also a reverse-perspective photo of a cube. Please do a short, StuffMadeHere :P --- Great video tho, what a brilliant idea :D
It's Lindy Beige!
if anyone else was curious, the transimpedance amplifier he talks about at 10:25 is listed at $537.91. I can see why he wanted to build his own
A cheap source of these is old supermarket checkout scanners. I used to get lots of these from the scrapyard to harvest the lasers and driver modules, but they're probably on the auction sites. The photodiode sits right on the amp board inside a shield. Great for making laserbounce listeners with just a simple lm386 audio amp added.
I'd argue that's a pretty good deal considering the cost of his time.
Haha I even paused to check if there was a price
...hang on there's a machine to impede trans people?
Is that a real word?
I wonder if your camera could be used to simulate what animals with different shaped irises would see such as horses, goats or cats, or octopuses.
Interesting idea!
Top idea!
Solid idea
You had me at nerf herder
This is a GREAT idea!
Awesome as always!!
Yoooooo sup Electro
Engineers of RUclips, assemble
Hello :D
missed opportunity
"Brilliant as always!!" :D
On my way to watch your new video father electron
I wrote off the idea of an orthographic camera as impossible after years of playing in Blender and CAD, and now my feeling of awe is intense! This was a joy to see, please keep taking pictures with that thing.
I'd also enjoy a separate video going over the image building code and component selection in moderate detail. Like the kind of broad strokes that apply across languages and software.
IKR? i was super amazed by the orthographic camera
There do exist lenses for regular cameras that give an orthographic perspective.
I know it takes a REALLY long time to create your projects and edit them. But I'm going to need you to take stimulants, stay up 72 hours per day and pump out one of these every 3 or 4 days
He already does
Crazy comment. Hilarious too
He owes it to humanity
Definitely
@@MrCandyPants He owes nothing. You need to contribute instead, he's already doing it by being himself.
Fun fact: all of these wacky lenses simulated in this video actually exist, and are widely used in manufacturing and inspection! Telecentric (no perspective) lenses are the most common and very popular in metrology, but there are also pericentric (seeing more than one side of the object) and hypercentric (seeing behind and/or underneath the object) lenses, and even wacky combinations with mirrors. This allows you, as a common example, to inspect the cap on a plastic bottle from all sides, at the same time, using the same camera. Above, around the corners, from the side, AND from below!
Shane used a telecentric lens on the puzzle robot!
These lenses are so cool. If only they could easily (and cheaply) be adapted to digital cameras.
i need an example
@@evanbarnes9984 it's almost as if you watched the video
Yep, it's just hard to apply on normal-life-sized objects because this would require the aperture to be meters in diameter.
There’s absolutely no other RUclips channel that I look forward to as much as this one! The content here is simply outstanding, the quality is unmatched, and the way complex topics are presented in such an engaging and fascinating way is truly remarkable. Every new video feels like a gift, and I can’t wait for the next one to drop. This channel isn’t just entertaining; it’s a genuine contribution to society. Incredible work, keep it up!
I agree!
Mate! There's the outback edition @ididathing 😂
No
this is the first video I watch from this channel and I'm just WOW
This, MyMechanic and Primitive Technology
As a filmmaker and camera lover, this was seriously amazing. This kind of camera could make some amazing art. I can't help but wonder what video would look like from some of these configurations.
Speaking of art, you said this camera feels a lot more like creating art. Art is all about making *choices*. And here you're making a LOT of choices about the way you're photographing your subject.
We definitely need a StuffMadeHere2 video about how you even came up with the idea and how you went from nonsense to actual images @11:56
There are a lot of questions that, as you said, went unanswered
Maybe it could be called StuffMadeThere?
@@stuartallen2001 Good one.
My guess is there was some bug in the code mapping angle sensor output to real world position. As for the idea, I'm betting he was pondering one day what it would take to make a real life ortho camera. I sure have pondered the same but quickly thought it must be impossible. It still baffles me that it seemed to work so well. How far can you see with the thing? What if you point it a mile away? What if you point it at the moon (assuming there was some super accurate rig to keep it pointed at the same spot on the moon's surface)?
@@teemukarjalainen5361 I'd think the primary limitation is the angular resolution, determined by the length of your tube - keeping in mind also that a longer tube is a narrower field of view, hence less light and therefore noisier. [And also, at least if you actually look at the moon, atmospheric distortions.]
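Back-of-the-envelope numbers for that limit (my own rough sketch, not from the video): a sensor tube of length \(L\) with an aperture of diameter \(d\) accepts light over a half-angle of roughly

\[
\theta \approx \frac{d}{L},
\]

so each sample averages over a spot of size roughly \(R\,\theta\) at distance \(R\). With, say, \(d = 5\,\mathrm{mm}\) and \(L = 500\,\mathrm{mm}\), \(\theta \approx 0.01\,\mathrm{rad}\): about a 10 m blur spot at 1 km, and on the order of 4000 km on the Moon. Shrinking \(d\) to sharpen that also cuts the collected light by \(d^2\), which is the light/noise trade-off mentioned above.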
Seems like it was just warped and the equations didn't line up.
If he'd used a still image pattern wall for debugging instead of an actual wall, he could've probably debugged it more easily.
"Medusa whack-a-mole, where you fix a problem and two more appear"
You got your mythological creatures mixed up. You were thinking of a hydra ;)
I immediately paused the video and came to the comments to see if anyone caught that! 🤣
@@BarrytheSuperScot Same xD
'Like playing Medusa whack-a-mole, where you fix one problem and it promptly turns some part of your project into stone, and eventually all the problems in your project are heavy weights you drag over the finish line.' Is what I think he meant.
@@Josh-qm7fl Also same lol
Hail Hydra
14:07 You blew my mind with that image. I just paused the video and thought for a long time before resuming. Incredible work.
I think this image made me realize what perspective really is. I’ve had my share of decades in this life, so it’s a big deal to have a smart person explain complex things in a simple way!
He krilled styro pyro
Styro pyro has been reduced to mannequin
Had the same experience, just sitting and thinking and then looking around in the office and trying to visualise how it would look without perspective
I concur. It only gets crazier!
Same.
And then the one at 14:45 blew each previous mind-particles again.
I knew looking around stuff was impossible, but the way you sold it, I'm still disappointed by the (overall pretty cool) result lol. It's just.. the camera looks "around" stuff basically the same way somebody with a selfie stick would look around whatever you are hiding behind: by moving the camera such that it has direct line of sight. In a way, it doesn't look around anything at any point in time. It's just that the stationary center of the camera has no direct line of sight.. but the camera does. I think you could have sold this camera differently to make it less disappointing.
I thought it was going to calculate the view behind a wall by inferring from indirect lighting and shadows, like if you could see a shadow poking out from behind a wall, you might know something is there, and with careful calculations and assumptions you could get more detail. Pretty sure I saw someone working on something like this for "looking" around a corner in a corridor.
I’ve always loved your videos and appreciate the effort you put into them! Lately, though, I’ve noticed a shift, they seem to gloss over some of the details that used to make them so engaging. I understand you might be catering to a younger audience, but for many of us who watched for the in-depth problem-solving, that was the real draw. Have you considered creating a second channel or series for those of us who enjoy the deeper dive? It could be a great way to cater to both audiences!
I’m with you on this!
He does in fact have a second channel!
Stuff Made Here 2, it was made for that very purpose (deep dives), but now sees way less frequent uploads than the main channel
@@poolkrooni Woah! Thank you lol-I had no idea it existed! I just checked it out, and it looks like a great start. However, it feels like there could be more content related to what’s posted on the main channel. I was hoping to see more.
I would pay for extended cuts. This content has real value. Just saying.
Him dropping a video is almost like Christmas. It immediately makes my day at least a bit better
Same
JCS Criminal Psychology dropped a video on the same day too. Today is insane.
stop the glaze poof
I also find Xmas to be rather unsatisfying
15:08 sick album cover
ya
that was my first thought for the first color photo
That's exactly what I thought, lol
v a p o r w a v e
I was coming to comments to say that exact same thing.
This project cannot stop with this video. Don't start the next one yet; instead, have some fun with the camera and do a second vid. The pictures turned out too nice to put it aside now.
11:39 wait, isn't the hydra the creature that grows back their heads?
That's what i wanted to say!
Yup, Medusa was the mortal Gorgon (a creature that can turn people who look at them to stone) who was killed by Perseus
not just grow back - grow back more than it lost
Dammit an hour too slow...
I had decided not to point this out. 😂
I only just started watching the video but I already love how there's this precise optical device and it's clamped to a piece of lumber. Makes me feel right at home!
You may want to use a first surface mirror if you're not already doing so. I used to make machine vision applications, and if the camera angle needed to be changed, using a regular mirror would result in the primary image coming from the reflective surface on the back of the glass, plus a second ghost image reflected from the front of the glass.
what would it be like, just a highly polished metal sheet without glass?
@@moos5221 - Can be, or sometimes silvered glass but you put the silvered surface on the front (i.e., use the glass just as a substrate, so you only need a very thin layer of deposited metal).
@@moos5221 Glass behind metal instead of metal behind glass.
The reason it isn't usually done: the glass protects the metal, for one; also, it's much easier to make flat glass surfaces (and deposit metal onto them, making the metal that contacts them flat) than to make flat metal surfaces.
@@RFC3514 @bluerendar2194 ok, understood, makes sense.
Those are common in laser applications too, like laser cutters, since they prevent heat buildup in the glass.
This has probably, surprisingly, been my favorite of your videos to watch so far. Obviously being able to control perspective to a degree in 3D software is one thing. But to see the images you've created is almost like discovering a new colour or smell. The theory makes sense and seems obvious as to what will happen but seeing it for real (watching a video) is pretty exciting.
"Who needs drugs when you have engineering" lol I absolutely love it xD
Shh... they'll ban creativity next. 🤫
I felt that one. Some of the stuff I've worked on has felt like teetering on the edge of madness.
And Stuff Made Here works on even crazier stuff than me.
“Why not both?”
I need coffee for my engineering
Merch!
So fun fact, most images you have seen of stars from telescopes looked around an obstacle. Newtonian telescopes have two mirrors, with the secondary mirror sitting in the middle of the scope's FOV, but when well focused it doesn't appear in the final image.
You can also just buy catadioptric telephoto lenses for normal digital cameras that work on a similar principle. A quirk of them is that out of focus highlights form rings instead of discs because of the ring-shaped aperture.
Another fun fact: If you cover the bottom half of a regular camera lens, it will still capture the full field of view. This is because even though the obstruction is blocking 50% of the light going to the lens, all of the light rays are still able to find paths to the detector through the uncovered half of the lens.
This is in principle the same thing that happens with the secondary mirror in telescopes. The secondary mirror blocks a portion of the incoming light, but there are a bunch of other paths for the light to take that will hit the primary mirror.
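A tiny paraxial ray trace shows the same thing numerically (a sketch with made-up numbers, using only the thin-lens approximation):

```python
import numpy as np

f, s_o, y_o = 50.0, 200.0, 10.0        # focal length, object distance, object height (mm)
s_i = 1.0 / (1.0 / f - 1.0 / s_o)      # image distance from 1/s_o + 1/s_i = 1/f

# Trace only rays through the TOP half of the lens, as if the bottom were covered.
for h in np.linspace(0.5, 20.0, 5):    # height where the ray meets the lens (mm)
    u_in  = (h - y_o) / s_o            # slope of the incoming ray
    u_out = u_in - h / f               # paraxial thin-lens refraction
    y_img = h + u_out * s_i            # where the ray lands on the image plane
    print(f"lens height {h:5.1f} mm -> image height {y_img:8.3f} mm")

# All five rays land at the same height (-y_o * s_i / s_o, about -3.333 mm), so the
# covered half only costs light (and some depth of field), not field of view.
```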
Oh this should be fun, and semi-safe for once.
A giant steel weight spinning around on an arm with exposed electrical wiring touching. Yeah, super safe.
@@Runefrag safer than several of his last projects though.
engage your safety squints as prudence the safety goat is on the case.
@@Runefrag now describe operating a car in the same style.
You're right. It needs something sharp or explosive.
Powder actuated spinny cam?
This is so fuggin awesome.
What an engineering marvel and an honor to see it working.
Feels unreal
You should use this camera to film some basic shapes like cubes, cylinders and cones to make a education video for teaching perspective. It could be super useful for art classes and drafting classes.
Basic shapes are easy to draw (or 3D render) with varying perspective anyway. The point of making a camera that can do this is so you can play with perspective in "real" (complex) scenes.
Anyway, this isn't new. Search for "Hypercentric optics: A camera lens that can see behind objects" for a video published by Applied Science a few years ago (smaller lens, but it's the same principle and works in real time).
I was hoping to see 5 sides of a cube all at once
16:26 HAVE YOU SEEN THIS MAN IN YOUR DREAMS?
Comment that in every other RUclips video would have a different perspective.
Adam Driver you mean?
That's my runescape character
it always interests me how long these incredible projects take. have you thought about maybe including some sort of date or time stamps throughout the video? its just something i would personally like to see, thats all. but i love getting to see almost every single little step of the process. great video!
Yeah I'd love if the clips had timestamps
This is his 3rd video in 2024, not counting the one released on January 1st because it was made last year. So, probably 4 months to make one video, on average.
this is actually amazing i subbed and liked.
12:01
Somebody needs to make a super cut of all of the times Shane has a breakthrough and finally gets his builds to work.
Truly inspiring stuff, keeps you going. Love these moments.
The best part is that without any context it seems very weird to be this happy about a picture like that.
22:44 - We might never see a video from him again.
The cut was ominous.
Insert "totally worth it" meme here
Ya it’s not looking around a wall he is moving a camera around the wall? Not cool
@@Bob-o-h4kThat better be sarcasm
@@BroTeachesU He ain't wrong tho. It's a really cool contraption, but it is basically peeking around the object.
To actually see something from behind you'd need a force capable of bending light rays. Like that demonstration of moving near light speed.
10:30 I'm an EE at Thorlabs and couldn't agree more - even when prototyping a new product, I'll end up copying a lot of the analog design from our dark magic wizard analog design engineers, or slapping one of those amplifiers in-line to verify everything works before trying a design myself. The analog stuff gets real funky real fast and it's always best to use something you know will work at first.
Does Thorlabs still send the goodie boxes? That was the best part of my PhD
@williamcox8491 Yes! Still doing the labsnacks with all the orders :)
@@nicknack125 My coworker is always getting labsnacks. Extras when they ship one order in multiple packages.
A very common question in my past with photo diodes: Do I want to go through the pain to try to build an amplifier for this, or buy a good, working one? After one attempt, the latter answer is the default now 😅
@@alexandermarsteller7848 Sometimes the available TIAs don't give you the needed amplification-bandwidth ratio. Then you have to design your own haha
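For anyone curious why that gain-bandwidth trade-off bites so hard: the usual single-op-amp transimpedance stage follows (textbook rules of thumb, nothing specific to the parts in the video)

\[
V_{\text{out}} \approx I_{\text{pd}}\, R_f, \qquad
f_{-3\,\mathrm{dB}} \approx \sqrt{\frac{\mathrm{GBW}}{2\pi R_f C_{\text{in}}}},
\]

so each 10x increase in gain (bigger \(R_f\)) costs roughly a 3x drop in bandwidth for a given op-amp gain-bandwidth product and total input capacitance \(C_{\text{in}}\), which is exactly why fast, high-gain photodiode amplifiers get expensive.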
Dude, you are flying at such a high level. Your problem solving abilities combined with your relentless determination make you inspiring to watch. I'm 20, a solution architect in the software engineering space, and I restore motorcycles in my spare time, and you make me feel like I'm not tapping my full potential, in the most motivating way. Kudos to you. Cheers from Australia 😊
16:52 unironically would go super hard as an album cover
I was thinking that a lot of those images could look quite good as disc art!
He was definitely onto something when he said these images "felt like art" it's basically an album cover machine
18:32 too
Either that or as CCTV footage of a serial killer in a Walmart.
Would be cool as 12” label art, made me think of some of the labels for AFX - Analord
Never clicked so fast
Same
Back from work, sofa, open RUclips, click first suggested video.
dicked down
That’s absolutely insane. I work in the film industry and engineering, and this is probably one of the most exciting things in camera tech I’ve ever seen. I didn’t know it was even possible to simulate lenses like this. It feels truly like a 3-D program, and even there I’ve only seen an orthographic view, not all the other crazy creative ideas. I can’t imagine how absolutely insane it would be to have this in high resolution with proper RGB readout, and then a proper photo campaign shot on it. Congratulations on this absolutely insane achievement.
The only other absolutely insane piece of engineering I’ve seen, a few years ago, was a camera that basically reverse ray-traces the light rays, and therefore can really look around corners and into other rooms without even being there. But seeing you engineer everything yourself, from the software to the hardware and everything else, is absolutely insane.
After seeing your concept, I wonder if it’s possible to use the same idea with multiple images/photographs: undistort them, account for their sensor size and focal length, and triangulate them so you get their positions in space, and then later, on the software side, fill in the blanks in a kind of reprojection setup to achieve basically the same thing you did, but without having to have a motion control rig. That would probably leave some missing pixels, but it could probably still work even though they’re not all on the same depth axis.
As a side note, it would be super interesting to know more about the hardware you used. I saw an Arduino, but more insights about your tech stack and the math that’s making it possible would be super interesting.
It reminds me somehow of distributed aperture telescopes like the VLA Very Large Array. Maybe I'm off base but thats what I thought of.
The idea of reverse ray-tracing light rays makes my head hurt
all these lenses exist in a way that could be applied to a film camera, unlike this
was the reverse raytracing another youtube video? I'd like to check that one out
Things like this exist for applications like inspecting equipment or hard to reach places.
Every project you do is just insanely great. I love everything about it. Keep it up !!
This is awesome. You should make an animation where you take a series of pictures with a progressively narrower field of view until it becomes orthographic, then continue with an increasingly negative field of view. So you would start with being able to just see the cardboard, then the cardboard would "shrink", eventually shrinking until it reveals the thing behind it.
I must really say I extremely appreciate that you let your projects be the star of the show. No extra hype or fancy showing off.
Just the project, how it's made and the engineering behind it.
12:36 idk why, but i really like this one specifically. it feels like one of those captchas where they ask you to type the numbers but they're all squiggly
20:54 "I don't feel like I am making art on a digital camera". All your videos are art to me :D
When you started describing this it immediately reminded me of things like mechanical TVs or the scanning cameras on satellites. There is a great video from Applied Science where a similar effect is made with a big Fresnel lens. Making a scanner just rebuilds the lens one pixel at a time, and that is so neat.
5:24 - Pi 5: spotted!
I thought you had finally lost it looking at an Arduino Mega, then I spied the Pi 5 hidden behind.
He's got the whole US supply of them on that one camera :P
@@Jeff121456 Haha I was like "there *has* to be some more grunt besides just an Arduino on there..."
Most of us don't have mentors to show us what lies past persistent effort... except for here. You and all of the educators / makers on RUclips and other platforms turn the fear of failing into just part of the process. You are an inspiration, man. Nothing but love towards you and those that support you. Keep it up.
At 12:00 I actually out loud said "WOW!" that is amazing. Well done!
Very cool. This isn't the first time you've built something complex enough that viewers might not be as impressed as they should be ;). You did a fine job of explaining the concept in 20 minutes, but you've obviously had to gloss over many of the image-building difficulties.
This is an incredible camera and you've only shown us a few of the things it can do; it's a virtual lens on a human-sized sensor. In a way, this reverses much of what digital sensors have been doing for decades; their pixels get smaller and their lenses focus light onto smaller areas to fit into more devices, while your camera is a humongous sensor with large virtual pixels. Both strategies require software correction, but for different reasons.
shut up mate stop yapping
11:20 was rough. I remember checking in a one-line code change after a week of debugging, only to have the boss ask me what I was doing all week. My soul was already broken and her question felt like a boot to the neck. Hang in there champ! You are doing good.
Nah. You should just find a new boss. Some bugs take a long time (weeks to months) to track down and are one-line fixes.
@@privacyvalued4134 Debugging is a skill you can improve. And one that should be taken into account for compensation…
@@privacyvalued4134 100%. There's a phrase usually used for repair people that's also relevant here. Don't remember the exact phrasing but it's basically "You don't pay $1000 to fix it, you pay me $100 to fix it and $900 for knowing how".
Most people could do the same thing (literally just changing a character in a file) but most people also would have no idea how to find out what to change.
If your bugs take weeks to find, find another profession
@@SystemsPlanet You're the guy writing all the bugs
We need a spin off series of shorts called Stuff Made Here: Failure Diary.
Around 11:35 you mention 'fixing all the issues', it'd be cool to get a look at a few of the problems you solved going over "What is supposed to happen", "What IS happening", "What you suspect is causing the issue", "What IS causing the issue", and what decisions or misunderstandings led you to having the issue in the first place.
yeah! wish he used 'stuff made here 2' more and for these things!
Yeah, I'm missing the old videos. Like, how is he blending the pixels/light readings? Is he taking more readings on the outside? How is he supplying everything with electricity? Why this focal length for his lens? How is the balancing weight moving past the sensor when he's in the middle? Is the gearshaft not fixed to the rotating pole? How is he unswirling the patterns, or is he measuring where the readings were taken? Why is there a bulge in the photo, and why is he not fixing it? Is he taking a reading on fixed time intervals so he is moving faster inwards because there is less area to scan...... I want all his thoughts and prayers
Honestly, as a hobbyist programmer, debugging is just not very fun or interesting. Usually it's something really dumb and simple that you overlooked (like letting the electrical contacts touch by accident), and you go through countless iterations of getting the computer to spit out its data at intermediary steps to follow where it's going wrong. It's already confusing enough as the programmer to debug, and even more difficult to explain your thought process behind it. Most of it will just be fruitless "I spent 5 hours thinking it was a problem with X, when it was a problem with Y."
Why do I feel like this is the kind of thing that will have the DoD knocking on your door in a few weeks? Unspeakably cool stuff. I would have written this off as "too many variables" really early on. Great work.
12:09 "Who needs Drugs when you have engineering?" needs to be a motivational poster
MERCH!
An Adam Savage T-shirt.
The level of complexity in this project is out of this world! Genius!
The concept of looking "around" something is actually the least interesting here. Any camera can do this if placed on a long stick like this machine. Everything else is really cool though, especially the elimination of perspective!
The looking around is interesting because the photo looks like it's taken from a stationary point behind the cover.
Sure, you could put the camera over the top of the cover, but it would be a picture from above.
@@tomadams7553this. I don't think he emphasized enough why this was such a big deal. It's not about what you're taking a picture of, it's about the impossible perspective you produce with the final image, which no other camera will be able to do
My phone camera can see behind walls too, I don't even use a rotating arm, I just use my own arm and move the whole thing to peek behind it. Other than the data stitching, it's almost the same.
I thought the same.. what's all the fuss about? This project borders on over-engineering unless that was the point. Bit confused by this vid.
One thing that would be interesting to see from this is a photo with a different fov at each angle. Like it could be normal at the vertical but go through orthographic and end up at reverse perspective at the horizontal. Probably would just look like a mess, but it’s an interesting idea
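A rough sketch of one way to define that, purely for illustration (the names and the blend rule below are made up; the idea is just to interpolate each sample's viewing ray between perspective and orthographic as a function of the polar angle):

```python
import numpy as np

def ray_for_sample(x, y, focal=500.0):
    """Viewing ray (origin, unit direction) for the sample at image position (x, y).

    blend = 0 -> perspective (every ray passes through one shared point);
    blend = 1 -> orthographic (parallel rays, origins spread over the plane).
    Here blend runs 0 -> 1 as the polar angle goes from vertical to horizontal.
    """
    phi = np.arctan2(abs(x), abs(y))   # 0 on the vertical axis, pi/2 on the horizontal
    blend = phi / (np.pi / 2)

    origin = np.array([x * blend, y * blend, 0.0])
    direction = np.array([x * (1.0 - blend), y * (1.0 - blend), focal])
    return origin, direction / np.linalg.norm(direction)

# The rig would aim its mirror along ray_for_sample(x, y) for every position it
# scans instead of using one fixed rule; pushing blend past 1 would tip the
# horizontal axis into reverse perspective.
```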
Have it take a picture of itself in the mirror
What would it look like??
The same, or you would always see the lens
@@EternalPending yeah I think you’d see the camera blurred everywhere
@@milest9754 You wouldn't see it blurred, but you would see the camera arm in each position overlaid all on top of each other. Basically imagine the camera has ten thousand identical arms all around the circle, and it's not moving. That's what you'd see.
I love that you share your passion with your family. Best content ever.❤
It is always worth the wait!! When a Stuff Made Here video drops it is just a joy and this did not disappoint. Fantastic project, a great idea! Now you need to make a really big one!!
This is absolutely amazing! Could you make a photo of a globe with the last “lens” you used? 🌎
~11:45 Hydra, not Medusa.
0:43 WIFE!!!
I find it poetic that the first cameras required the photographed person to sit still for the medium to expose and you also need this here.
I think moving as many components as possible off the giant rotating thing could yield a better photographing time.
Although the max "peeking" radius is directly tied to the radius of the rotating structure. Still, I think that can be done with some clever mirror setups and mechanical linkages.
"It's a very weird camera" I'd be very disappointed on this channel if it wasn't...... 😂
The way you explain things and the visualizations you use are otherworldly. You make extremely complex topics that I otherwise would assume I could never understand, very very digestible and easy to comprehend.
So on one monitor I'm watching this video, on the other monitor python is rendering almost the same graphic that appears at 2:14 (not quite). Weirdo cameras are awesome. Looking forward to the rest of the video =)
Edit: this was awesome! Is there going to be a sequel with more wacky lens emulation? I love this. How on earth did this last 23 minutes - felt like 3. I totally feel the "this is art now" sentiment going along with the difficulty of running the rig - Congrats!
My other favorite creator that explains complex science topics!
The perfect example of engineering and creativity combined!
"Looks like I've seen some stuff." Yes, ma'am, you have seen some stuff. 22:27
"Who needs drugs when you have engineering" - 12:10
I’ve been chasing that high for twenty years in my career. 😂
Yup I loved this line! It accurately captures the *euphoria of triumph* over an extremely challenging problem.
The 'nonplussed wife' gag on this channel is what keeps me coming back. That and the stuff made here
Amazing. I really appreciate that you discuss the problems, and time and effort you take to deal with them.
I wonder... If you changed out the photodetector for a very, very directional radio receiver and maybe spun it more slowly... if you could take a radio-spectrum "photo" of a city? You've got the movement and the math.... all you need is a different type of signal receiver... Also, you'd have to lose the mirror and point the detector straight ahead. But... still... that could be pretty amazing.
5:48 is gonna be the next Firefox Logo 🔥
No way people find this video through the search bar
Yeah, I didn't either
I did. I saw a snippet on TikTok, then looked it up on YT using the search bar.
Actually searched for a guy that got split in half by speeding F1 race car.
Got this guy instead.
I really enjoy content like this. It's both entertaining and useful. Thanks for sharing your knowledge in such an enjoyable way!
Just a quick note: at 14:18, a true orthographic projection is AWESOME in that you can measure anything in the picture and get accurate measurements off of it.
To put it more clearly: once you know the measurement of one thing, you can measure everything else, because everything is at the same scale (see the sketch after this thread).
Side note: only on the x and y axes, not the z axis.
It's called a telecentric lens; they're commonly used on manufacturing lines as part of quality control
@@ZPdrumer Yeah, but how many telecentric lenses are this big? Genuine question because I really don't know, but I doubt there are many
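To make that measuring trick concrete, here's a minimal sketch of the idea; it only assumes you can read one reference dimension off the photo in pixels, and all the numbers below are made up:

```python
# Minimal sketch: measuring real-world sizes from a true orthographic photo
# once a single reference dimension is known. All numbers are made up.

def scale_from_reference(ref_size_mm: float, ref_size_px: float) -> float:
    """Returns mm per pixel, valid anywhere in the image plane (x/y only)."""
    return ref_size_mm / ref_size_px

# Suppose a ruler in the photo is 100 mm long and spans 250 pixels:
mm_per_px = scale_from_reference(100.0, 250.0)  # 0.4 mm/px

# Any other in-plane distance can now be measured directly:
mystery_object_px = 412.0
print(f"{mystery_object_px * mm_per_px:.1f} mm")  # ~164.8 mm
```

In a perspective photo this only works at one specific depth; in a true orthographic image the same scale factor applies across the whole frame (in x and y).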
Made some of the hardest song cover art I've ever seen.
In the future, I would choose Shane with a bottle of Adderall as humanity's champion in defense against the machines.
same
Your determination in the face of multiple failures with trial and error is inspirational.
11:50 "The thing I've learned over the years is that there's a finite number of issues. And eventually... it works."
So, you basically put a camera on a stick that's long enough to look behind a small barrier, the same way you'd put a mirror on a stick to look around edges.
It's kinda like a digital periscope.
He did, feels like cheating.
Was thinking the same thing: if the camera sticks past the edge, then it can see him; that's not seeing behind walls
Yeah, had the same feeling
"This camera can see behind things when I put it where the thing isn't obstructing what's behind"
the camera isn’t seeing behind the wall though, you are just reaching the camera around the wall
Yeah it's cool but not really a camera that can look around walls...
yeah, the camera is peeking around different corners and then creating a flat image by combining different sides of the subject of the photos into 1.
I love his projects but this is just not what he says it does
Like if I was to take a normal camera and put it at the same place as his sensor I could take a better picture
Thanks for saving me 20 minutes.
The spinning arm was enough for me to doubt the claim.
@@neb_setabed His sensor is in thousands of places. It would take you a very long time to put a normal camera at the same place as his sensor. Especially accurately.
I don’t know if it’s been done before, but that ortho/parallel projection image is SO cool. I’ve done it in 3D software, but seeing it actually done is literally mind-warping
@3blue1brown *creates an entire library to articulate physics on video*
@StuffMadeHere proceeds to scribble on an iPad while recording
Manim might be more visually pleasing, but doing it with an iPad is way faster and most of the time good enough to understand the concept.
@@multiarray2320 StuffMadeHere actually builds the thing. That helps. If it were only an iPad drawing, you probably wouldn't watch it :)
love them both
I bet by the end of next year we get to see a homemade MRI machine 🫡
"I asked my wife to see what my interior looks like"
"Looks like a crab."
14:00 - MIND. BLOWN. Never seen anything like this! Never even imagined it would be possible. This is cool!
The funny thing is, the "seeing around objects" stuff at 16:30 is somehow less impressive, as you could really just move any old camera around the object without needing to spin it. But the no-perspective images are so unlike anything our eyes could ever see. I'm seriously impressed!
As a mech eng student, I can't stress how cool this is. We just had our 2nd-year class on sensors and instrumentation, and our prof talked our heads off about how impedance amplifiers and active filters work; when you showed a use case for it, it finally clicked for me (haha, no offence, he was a great prof). I find that eng students like me often struggle to find a project like this to work on. I think the most inspiring thing is your ability to take ideas like this and turn them into a crazy-creative project. Definitely kickstarted my brain to try and find awesome project ideas like this!
Something cool to try would be to use the full sensor of the camera, instead of just a point, save all of the images that it takes, and reconstruct the photo afterwards to use whatever lens / FOV you want (rough sketch of the idea after this thread).
Awesome video as always!!
I think that could work.
Light Field Cameras can do this already, but it would be interesting as none of them are anywhere near this big.
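A rough sketch of what that reconstruction could look like, assuming each full-frame capture is stored together with the arm position that produced it. The class and function names here are made up for illustration, not anything from the video:

```python
import numpy as np

class Capture:
    """One full-frame exposure plus where the arm had placed the camera."""
    def __init__(self, image: np.ndarray, position_xy: np.ndarray):
        self.image = image            # H x W frame from the real sensor
        self.position = position_xy   # camera position in the scan plane, meters

def orthographic_sample(captures: list, target_xy: np.ndarray) -> float:
    """For a virtual orthographic lens every ray is parallel, so the best
    sample for a given output pixel is the center pixel of whichever
    capture was taken closest to that point in the scan plane."""
    best = min(captures, key=lambda c: float(np.linalg.norm(c.position - target_xy)))
    h, w = best.image.shape[:2]
    return best.image[h // 2, w // 2]
```

Other virtual lenses would then just change which pixel (i.e. which ray direction) gets picked from each stored frame, instead of always taking the center one.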
11:42 I think you mean "hydra whack-a-mole": the Hydra was the mythological creature that grew two new heads every time you cut one off; Medusa was the gorgon that had snakes for hair and turned you to stone if you looked at her.
Exactly what I thought when I heard that
I could be wrong, I haven't bothered fact-checking, but don't Medusa's snakes have the same property of "kill one, two grow in its place"? I did think the same thing at first though, which is how I found this comment
@@yoitszaitz That is something I have never heard before, and if it is true it'd be so obscure as to not be a reference anyone would understand as "correct", when the more obvious cutting-off-heads-and-multiplying schtick Greek monster is the Hydra.
@CthulhusDream Yeah, I just fact-checked myself and found nothing; idk where I heard it before
“You kinda look like Thomas the Tank.” The rizz is strong in this one.
as a filmmaker and camera assistant, I find this mind blowing, and a fantastic demonstration of the capabilities and functionality of what I work with. Gives me a better perspective, thank you.
There actually are cameras that see around corners. They measure the time it takes photons to bounce off a wall, hit the object, bounce off the wall again, and then hit the detector. Then with tons of math you can get the general shape of an object that's occluded. The Smithsonian Magazine has an article about it from 2012 if anyone is interested. (A back-of-envelope sketch of the timing math is after this thread.)
I was going to bring this up - it's an interesting ability that emerged out of the research methodology referred to as "femtophotography".
I thought that this was kinda what he was going to do, cuz what he did doesn't really see "behind" the wall, the sensor just looks above/around it. I mean, it's still cool, but I wouldn't call it "seeing behind a wall".
While my amped up engineering senses think this is awesome, this is more like a fancy way of using a wall as a temporal mirror. I mean still freaking awesome, but kind of still cheating in a way :)
We need really small black holes to really "see around corners". Oh, and also really strong walls.
@@Mismatch-it's better than seeing behind it. It's seeing _through_ it, because the perspective is from a point that is directly blocked by the obstruction
Great video, but a small correction for 17:40: most cameras only have a single photosite per pixel, and on top of the sensor there is an RGB Bayer pattern, meaning some pixels see red, some see green, and some see blue. The final image is an interpolation of the data from each filter.
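To illustrate that interpolation step, here's a toy demosaic of an assumed RGGB layout. It's just a crude neighbor average, not any real camera's pipeline:

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Crude demosaic of an RGGB Bayer mosaic (float HxW image): each output
    channel is the average of that color's photosites in a 3x3 neighborhood."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True   # red photosites
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True   # blue photosites
    g_mask = ~(r_mask | b_mask)                                  # green photosites
    kernel = np.ones((3, 3))
    for ch, mask in enumerate((r_mask, g_mask, b_mask)):
        samples = convolve2d(np.where(mask, raw, 0.0), kernel, mode="same")
        counts = convolve2d(mask.astype(float), kernel, mode="same")
        rgb[..., ch] = samples / np.maximum(counts, 1.0)
    return rgb
```

Real pipelines use smarter edge-aware interpolation, but the principle is the same: two thirds of each pixel's color data is reconstructed, not measured.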
21:11 that is an alpaca
using my minecraft knowledge, that is a camel
Your results are insanely good. I am in awe of your drive every time you post a video.
11:39 Did you mean Hydra? Medusa turns you to stone; she doesn't grow more heads.
Good way to bait insufferable people to get them to comment and boost the algorithm. He got himself a good haul looking through the replies.
i was gonna say that 😭
Everything apart from the looking round objects bit was stunning. Camera on a stick can look round stuff regardless.
The reverse and orthographic stuff is mind-bending; being able to create custom lenses is freaking nuts!
Camera on a stick can look around things, but this is the only camera that can look around things with a perspective that looks like you're looking directly _through_ the obstruction. Notice how he's centered in the image, and you're not looking at him from the side. No other camera can do that.
Can't even imagine how huge of a headache this was to make 😂
Love the camera :)
Your optics are diffraction-limited, so when acting as a conventional lens, a touch of super-resolution should be trivial. This is essentially accomplished by deconvolving the image with the point spread function.
But... for the orthographic photo, the point spread function depends on distance. So wherever you have contrast, the point spread function can be detected and the distance calculated. In short, you have a depth camera that can detect the distance of edges. That image would be awesome.
Detecting the point spread is relatively easy: it's approximately a Gaussian whose height is proportional to the Laplacian of the image.
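A minimal sketch of that deconvolution step, assuming the PSF really is well-approximated by a Gaussian of known width; the sigma and noise term below are placeholder guesses, not measured values:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Normalized 2D Gaussian point spread function, centered in the array."""
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, noise_power=1e-3):
    """Divide out the PSF in the Fourier domain; the noise_power term keeps
    near-zero frequencies of the PSF from amplifying noise."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft2(W * G))

# e.g. sharper = wiener_deconvolve(ortho_image, gaussian_psf(ortho_image.shape, sigma=2.5))
```

The depth-camera idea would then amount to estimating the best-fitting sigma locally around each edge and mapping it back to distance.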