Because of the pursuit of the best objective image, we lose some of the "character" of older technology that had to deal with a lot of limitations. That's one of the reasons Black Pro-Mist filters are so popular with filmmakers: they add some of that character back.
Awesome Wren!! So good! I shoot vids with my phone and it does the job but would actually love to shoot one with the original iPhone now! Interested to see it! Also, Battlefield Mobile alpha is available? You guys should try that and make another long-awaited Battlefield video
I kind of hate how iPhone photos look; I feel like I can't get an accurate gauge of how I actually look when they do stuff like smoothing out skin imperfections in selfies. It makes me look much better than it should, and if I switch to the front-facing camera I look completely different. It's quite frustrating. I want to look accurate.
You can never have accurate. With a digital camera there is unavoidable processing, like the white balance, distortion fixes and color tweaking, but even on the analog level things will be distorted/formed by how wide the lens angle is, and how you've framed the photo. I'm not trying to mansplain how a camera works to you, rather your idea of an "accurate" picture got me thinking. What is an "accurate" picture? Even without the camera distortion, there are so many things that change and influence your look.
I'm with you. There's way too much image processing. It's not natural. Oversaturated. Smoothed where it shouldn't be. Extra texture where it's not needed. Yet for whatever reason it's always rated the top phone camera. Maybe it's just the minority of us, but I like the muted, more accurate photos, not the auto-Photoshop effects.
Another thing, the average camera has pretty shit dynamic range if you only take a single shot, especially compared to what the human eye sees. You *have* to have some processing, like multiple composited shots, in order to get close to an "accurate image" of what another human would see in the same position.
@@Appletank8 Part of the awful low-light response is the high f-stop. Given that the apertures on the old phones, and on the standard lens of new phones, are only about 2-3mm across, there's only so much light that can get to the sensor through such a small hole. For the ultra-wide and low-light modes, modern phones use extra lenses up to about 6-7mm in diameter.
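Since a couple of comments here mention multi-shot compositing, here's a minimal Python sketch of the idea using OpenCV's Mertens exposure fusion — the file names are placeholders, and this is just one simple way to fuse brackets, not what any phone actually runs:

```python
import cv2

# A bracketed series of the same scene: under-, normally- and over-exposed.
# The file names here are hypothetical placeholders.
paths = ["under.jpg", "normal.jpg", "over.jpg"]
images = [cv2.imread(p) for p in paths]

# Mertens fusion blends the brackets using per-pixel contrast, saturation
# and well-exposedness weights -- no camera response curve needed.
merge = cv2.createMergeMertens()
fused = merge.process(images)  # float32 result, roughly in [0, 1]

# Clip and convert back to 8-bit for saving.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

The nice part of Mertens fusion for a quick demo is that it never builds a true HDR radiance map; it just weights and blends the brackets, which is close in spirit to what's described above.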
I remember having a 40mp Nokia lol I wish new phones had that with modern processing power. The sensor in that thing was amazing but it had the windows os which totally sucked for everything lol
Some Android phone cameras are better than the iPhone's, that's just a fact. However, I believe apps have been optimized for iPhone cameras, which is, for example, why selfies on Snapchat or Instagram look so much better on iPhone than on Android
Good news! Pixel has been doing this since they launched. They've always been focused on image processing, and that's why even the 4 and 5 were able to keep up with the iPhones with a 3 and 4 year old sensor.
@@Frankie-xu6sr yeah pretty much. Android has such a wide range, and the software for this computational photography is usually locked to the phone's camera app. The only exception that I know of is when the Pixel line had the "Pixel Neural/Visual Core" in the Pixel 2, 3 and 4. Then other apps were able to utilize that to take higher quality images.
PUBG Mobile ► Download PUBG Mobile on iOS and Android today here uqp6.adj.st/?adjust_t=cjm8yy_2rencx&adjust_campaign=CorridorCrew&adjust_deeplink=igame1320%3a%2f%2f
Nah, dawg. Just..... nah.
Kewl
Fortnite
Pubg fell off
Thank you SponserSkip
Forget 90s VHS nostalgia look, I'm all about the DVD 2000s phone camera nostalgia
That Laserdisc aesthetic
Please stop, you're making me feel old
Man, the early 2010s is where 0.5-megapixel selfie cameras were considered good on an Android lol
How about 16mm film look
No thanks. Didn't wait 20 years for photo/video quality to stagnate.
I wanted to see how the iPhone 13 photos would look without any AI processing compared to the original iPhone, to see how much the actual sensor hardware has changed.
agreed, I thought this was where they were heading with it.
Thanks for saving me fifteen minutes
Like he said in the last 15 seconds: a lot. The original iPhone had 2 MP, f/2.8 or slower, with no AF. New iPhones have AF, 12 MP, and a large f/1.6 sensor with OIS. Yes, the difference in software processing is huge, but so is the difference in hardware.
@@extracoolboy Unnecessary comment. We want to see the image, not the specs. If you can provide them, do so.
@@extracoolboy hardly an answer tbh
“I haven’t checked 3 in a while”
What an understatement lmao
@@azkya8540 what the f***?
@@CyberShink thot bots have been roaming earth for a while *dramatic music plays
@@CyberShink Bots have ramped up a good bit recently
This is corridor not valve
@@CyberShink just report it as spam
Bonus: it deletes it from your view
From a photographic perspective, most of the dynamic range and resolution woes on the original iPhone are due to low-resolution glass; it reeks of poor optics and looks like a vintage lens slapped on a sensor. I'd like to see someone retrofit some modern glass in front of it; there should be a pretty significant improvement across the board.
What’s interesting to me is the sensors of the iPhone 1 camera give it a filmic look, like you were shooting on 16mm film. Even though the newer camera’s better, there’s a certain nostalgic quality to the photos taken on the 1 that isn’t there on the 12.
Yeah exactly!!!
it is the combination of lo-fi and the "classic" 35mm fov.
especially the shot under the overpass with the huge light blow up just feels somewhat more right than the new one
Actual film is really high resolution, but that was lost back in the day when transferred to VHS and whatnot. Go watch the restoration process of Jaws.
Modern cameras are overprocessed. A lot of the things he talks about here are really bad for proper photography.
Anytime there's a random shot of Sam in the background he should be out of focus and walking with a Bigfoot pose
And with that mysterious bigfoot music too 😂
I WILL invest in this idea
And when you zoom into the blurry photos... it's just a Sandhill Crane.
@@movementvisuals_ the Sam filter lol
Next time on VFX artists debunk
Another interesting comparison would be to do the Google Pixel 3, 4, and 5. They all use the EXACT same Sony Exmor IMX363 12.2 MP sensor just with improved optics and more and more image processing. Going back further to the Pixel 1 (Sony Exmor IMX378) and Pixel 2 (Sony Exmor IMX362), the sensor really hasn't changed much in 5 years, yet the processing power has made vast improvements with the same available resources.
Aren't there even big differences between the normal and A models? (Or do they use a different sensor??) But yes, that would be cool, albeit maybe a bit similar.
@@TestarossaF110 Nope, same camera; the difference is price and other hardware
I remember when the pixel came out
Yea and he was talking about how it takes a bunch of pictures at once, puts them together to make a better image, and also the dark mode thing and I'm like... Wait.. Didn't the pixels do this first?
@@jhall7798 Yes
I work with CCTV cameras for my job. One thing I always find on older cameras that were installed years ago is a haze on the lens and sensor. The plastics inside the unit will outgas over time, putting a thin layer of gunk on the sensor and the inside of the lens. This is probably the reason why the original photos seem to have a hazy and unfocused quality to them. I think if you scour the internet for original iPhone photos from 2007, you'll probably find sharper images.
Great point. Some of those outdoor CCTV optics have probably seen many thousands of hours at high temperature. Do you see the same problems on the indoor CCTV cameras?
I have tons of OG iPhone photos in my photo library. They do have a certain vibe, but an $80 Canon PowerShot camera from the same year was way better. Nowadays small cameras are fairly terrible in terms of processing.
@@Hamachingo Because they're generally not doing any processing. Small point and shoots do extremely minimal work on in-camera JPEGs compared to newer, higher end cameras. And then if you're shooting RAW no processing is being done at all on those cameras, whereas smartphones will actually still denoise and use bracketing for HDR even when shooting RAW.
I don't think it would matter all that much. Pictures from that time have a very distinct quality to them, and the blurriness is exactly how I remember it being back in the day.
Also, the front glass of most phone cameras is always smeared with human gunk and skin oils. Whenever I use my phone I compulsively clean the camera glass, because I hate that haze that is ubiquitous in photos taken with phones.
Shoutout to Austin for repping that Logic merch
The no pressure album shirt lol
He’s so bad
Rattpack baby
legit
No way I love your content
The “Hey Guys, this is Austin”… was too funny 😂
Literally came here to laugh😂😂😂
If Austin won't say it, Wren will
XD
Looool
The photos from the iPhone 1 look like something from a video game. Sometimes you just want an old and not so great digital camera because it takes photos in a unique way. Working with limitations can be fun.
FMV cutscenes in ps1 games be like
Buy a $15 children camera
If you want limitations - add a filter
I strongly agree
Not gonna lie, the old iPhone’s camera was much better than I expected and honestly isn’t *that* much worse than the new ones. I’m not saying it’s good, but it did impress me.
No, the first iPhone camera was utter crap... At the same time as you could get an iPhone 3G (basically the same camera as the iPhone 1, with video on the back camera and 3G capability), if you wanted a great small camera with a phone there was the LG Viewty, which took ASTONISHING pictures for the time, or even the Nokia N95, one of the kings of photophones of that era. Comparing them to the iPhone 3G camera would mostly turn out like comparing it to an iPhone 5-6 camera...
Remember that the LG Viewty could record video in VGA at 30 FPS easily and could also do 120fps slow-mo... in 2008 x)
Same reaction. I actually thought that I would prefer the old one for some style or specific picture. And funny enough, the first iPhone camera is good enough for my needs. OK, my needs are super low, but still.
I actually prefer the old iPhone photos, the photos from the new one, although having a higher resolution and dynamic range, look too artificial.
honestly for 2007 it was indeed way better than I expected
My first chinese off-brand smartphone in like 2013 had wayyyyy worse photos lmao
15:30 Those aren't AI artefacts. Those are JPEG compression artefacts. Whenever you see those 8x8 pixel squares with wavy patterns aligned vertically or horizontally, they come from the type of compression that JPEG, MP4 and many other photo and video formats use.
The way it works is by compressing each 8x8 pixel square separately: first representing it as a sum of wavy patterns of different frequencies, then throwing away the higher-frequency patterns to save memory. So it is most noticeable around sharp edges, which need the higher frequencies to be represented accurately, but the compression discards those. That's why you get bright and dark spots a few pixels away from the edges.
But doesn’t the iPhone use HEVC instead of JPEG?
@@CatskillOne HEVC is a video encoder. AKA H265. But video encoders do the same thing to save space.
@@Arcona I meant to say HEIC. I just checked and the pictures on my phone are HEIC
@@CatskillOne Ah ok. Though HEIC will still have compression artefacts. I know it can achieve half the filesize of jpeg with similar compression levels.
Thank you! I was just thinking the same!
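If anyone wants to see that ringing appear from scratch, here's a toy Python sketch of the DCT step described above. It hard-zeroes the high-frequency coefficients instead of using real quantization tables, so treat it as an illustration of the mechanism rather than actual JPEG:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    # 2D type-II DCT: transform columns, then rows
    return dct(dct(block.T, norm="ortho").T, norm="ortho")

def idct2(block):
    return idct(idct(block.T, norm="ortho").T, norm="ortho")

# Toy 8x8 block: a hard vertical edge, the worst case for this compression.
img = np.zeros((8, 8))
img[:, 4:] = 255.0

coeffs = dct2(img)

# "Compress" by keeping only the low-frequency quarter of the coefficients.
mask = np.zeros_like(coeffs)
mask[:4, :4] = 1.0
reconstructed = idct2(coeffs * mask)

# The over- and undershoot next to the edge is the ringing seen in photos.
print(np.round(reconstructed).astype(int))
```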
I was hoping you'd try taking 9 photos (from the same position) with the iphone 1, and use all of them to enhance the output image 😌 Perhaps it would end up even better than the enhanced ones you got so far.
Nowadays, you buy a camera that you can also call people with...
Seems like every commercial for every cell phone is just about the camera, and of course about ALL the megapixels.
Not necessarily, some phones are focused on the performance. Multi-tasking or gaming while having just an OK camera.
because thats all the smooth brain cons00mers who buy the newest model every year or so want out of their phone
“PUBG simulates what it’s like to be in a real war, it has features such as guns, and bullets.”
😂
Said no one ever.
@@handle__
What
I really like how older cameras look; it gives me a lot of nostalgia
Older real cameras are very nice, but older phones were trash
hipster culture has come full circle, I think
that's not old cameras lol... film cameras are old cameras and they look great. Just put your phone on 5 megapixels if you want the 2000s camera look
@@myname-mz3lo I'm talking about old digital cameras. When I was a kid I had a shitty camcorder, and the first iPhone has very similar quality; changing my phone to 5 megapixels doesn't recreate that. It reminds me of early YouTube content
@@morffi1123 that's the whole point, working with the limitations is fun, it makes you think more creatively when shooting
The truly amazing part is how fast that processing happens. When you take a picture, it's ready to be viewed in less than a second. Just crazy
Modern computers are fucking monsters. Just as a comparison: if you have a good graphics card, you can do in real time what only 10 or 15 years ago required a massive server farm to pull off
@@carso1500 They're monsters for some things. Things that can be parallelized almost infinitely, like image processing and AI, have been enhanced a lot; things that can be parallelized but not infinitely have been enhanced a good amount; but things that cannot be parallelized, or that parallelize a bit but still wait on a main thread, have barely advanced at all. If you compare CPU single-thread operations from a top CPU of 10 years ago to the top CPU of today, that has improved only 20% in 10 years.
@@diablo.the.cheater That is completely untrue. Intel released the i7-3770k 10 years ago, at stock speeds an Intel i9-12900k is anywhere from 75-125% faster than the 3770k in single threaded tasks (depending on the workload, instruction set, etc). You can literally Google this stuff in 30 seconds, don't spread misinformation.
@@TechnoBabble As a i7 3770 non-k user, I approve this message.
Haven't watched much Corridor Crew but I INSTANTLY recognized 8:41 as the set of the Aimbot video over on the Rocket Jump channel. Such a classic
I actually wish we could disable the AI denoising. It looks super weird sometimes, especially when you know what a denoised render with not enough samples looks like.
Just let us choose the image before processing too.
“Where are the apps??”
On-phone apps were not needed because web apps can be run in the browser. Then on-phone apps were allowed… and they became shells that ran web apps
It came full circle. :)
Majority of apps are not running web applications. Or what are you trying to say?
I had one, and I had apps on it like Tiny Wings
@@GuitarListen that there wasn’t an App Store that you could add things from. The idea was that developers would make browser apps instead.
@@rickyspanish5316 how did you install tiny wings without the App Store?
@@GuitarListen A bunch of apps out there are built as web applications. Meaning they're using HTML/CSS/JS so they are able to run on different devices. Making it cheaper to produce since all you're doing is changing the look depending on the screen it's running on. Oversimplified a bit for clarity.
What’s even crazier is that they weren’t able to get the raw data from the iPhone 1. They just used the processed data. Imagine getting raw data and multiple exposures and stuff
That's true. Imagine how compressed the JPEGs from the iPhone 1 are.
There wasn't much "processing" in phone cameras before around 2011-2012, only some basic tint, grain and very light sharpening. The first real post-process was HDR (the basic multi-exposure method); on iOS it arrived with iOS 4.1, around September 2010.
They were already struggling to take 1 picture, so it was fair not to ask them to take 3 pictures to make an HDR shot x).
@@antoinepersonnel6509 They mean processed as in demosaiced, compressed into JPEG, maybe as you say a little sharpening and NR etc.
RAW would still be better as even this basic processing has gotten better since then.
@@antoinepersonnel6509 Compression has been a thing since the very early point-and-shoot cameras. An uncompressed image will have more information, but most of that information can be plausibly reconstructed using modern AI, so I don't think it would've been much different from the processed image in the video.
@@ritwikreddy5670 Yeah, mostly what I was saying: there was not that much to recover; instead of a blurry JPG mess it would be a mosaic of raw bitmap mess.
But they could have taken RAW, unprocessed images from the iPhone 12 with the app Yamera; that would have been a great comparison, especially for night mode.
Sam and Niko need to quit playing around and give Wren an educational VFX channel under the Corridor umbrella. Wrenducation at its finest.
Keep these videos coming!
He’s ready to take the Bill Nye mantle. His physics vfx breakdowns are great and also Wren’s got an ME degree like Bill lol
i second this idea
0:52
Wow they are legitimately afraid of people re-creating the key. Interesting. I guess that makes sense you can do it off of an image/video.
Yes, I had an idea to steal something that way. Doesn't mean I intended to, I just had a crime idea.
Well, any photographer could tell you this: iPhone cameras still use tiny, tiny sensors (obviously), which forces very high pixel density to keep the resolution but creates extreme noise and a lack of sharpness (not to mention a lack of light fidelity). They also have short little lenses with fixed apertures that, due to their actual size, not their f-stop, have horrible low-light performance. And of course no optical zoom, which means they rely on multiple lenses with different focal lengths to achieve the effect, which makes them more complex and expensive. Additionally, they digitize mechanical processes in cameras; phone cameras have no mechanical shutter and rely on a digital pre-capture technique. And lastly, they rely *heavily* on post-processing of the image. They use fake depth of field because the aperture and sensor can't create it, they add fake sharpness and vibrance, and so much AI BS that just makes pictures look fakely stunning instead of realistically beautiful. And believe me, anyone can tell the difference. Overall, image processing compresses the colorspace and the image size itself, sacrificing clarity, focus, dynamic range, detail, and resolution. When AI is involved to regenerate the lost data and *attempt* to improve the image, it ends up looking completely fake, as exemplified by the 3rd-party AI programs shown in this video. This is what separates phone cameras from dedicated ones: the software vs. hardware dichotomy.
You can take stunning personal-use photos with phones now, but you could never use a phone for professional use. Everything they are could never EVER stack up to any dedicated DSLR or mirrorless camera. The simple fact is phone cameras are so incredibly small, and real cameras are much larger, with more features dedicated to making pictures naturally more perfect. In this sense, size really does matter.
thank you for your essay recapping this video very necessary wow
great break down, thanks !
@DonDoesRoblox plenty of pro photographers use phones when they're more convenient and well suited to the task. There are obviously things the phone camera won't work for but saying they have no use beyond social photos is elitist bs
@@stephenchurch1784 “well suited” oh fucking hell i fucking hope to lord, satan and god you mean when they arent carrying their main camera
@@stephenchurch1784 agreed, this reeks of gatekeeping lol
again, gear is secondary. always. you won't be a professional if you can't make a small sensor work.
Wren: *falls off his one wheel breaking his collar bone, going through incredible pain, and is inconvenienced by a cast for months
Also Wren: *Decides it's a good idea to ride his one wheel while holding two phones and wearing novelty glasses that impair his vision
🤦♂️🤦♂️
For the sake of content
Eh falling is just a learning experience
Did he also break his collar bone? I thought it was the guy who frontflipped against airsoft guns
Wren also did a one wheel offroad endurance race. I think he can handle his stomping grounds around the office.
*in the same place
Man those og iPhone photos have such a nostalgic look to them. It's not "good" looking but it takes me back.
Its like an impressionist painting; it doesnt have to be accurate to portray a feeling
@@swanurine It just marks a certain time period that I look fondly on so well
@@azkya8540 No, I don't think I will
That phone clearly wasn’t Russian.. it showed up a day late! 😅😂
emoji cringe
@Fun With Minerals lol they were making a joke.
Then it's definitely Indian. 😆😂😎
@@seductiveseaweed an accurate factual joke
@@amc4t Reddit cringe
It’s crazy how we used to laugh at CSI ENHANCE and now it’s becoming a real thing
lol it's not nearly as crazy
I’m glad I watched this, I had kind of surmised just from my own knowledge of photography and computing that they were stacking multiple shots and dynamically upping the shutter speed if things moved.
Then you showed Phil literally explaining that very thing. Kind of forgot they did that.
Seeing this makes me annoyed how little the cameras themselves have actually improved across 14 years, the iPhone 1 looks so much better than I was expecting
I think there's some explanation of why technology isn't progressing as quickly as we think. It's not exponential at this point, but actually marginal. You can see it with iPhones, especially. The leaps were much greater with the early models, now they're barely noticeable. It's just marketing BS.
S21 ultra
These guys are exaggerating so much to fill content. In a RAW vs RAW comparison, the technology in today's phones is leaps and bounds above the 2007 iPhone's.
@Ba Goai 👍🏾
It's a massive improvement, but when you consider it's nearly 15 years of evolution, not so much... I mean, let's just compare the iPhone 1 to something from 1993 lol
It's like Wren wanted to break an arm again the way he was riding the 1-wheel and taking photos.
Wren is so good at conveying information in an approachable way
1:12 Wren talking without moving his mouth, amazing
now i can't unsee it, lmao
I’m so glad I watched that Bollywood 7 CGI video before it got taken down. I need to watch the full movie now ahaha
I had it saved to watch later. I'm fuming 🤣🤣
shit, i didnt watch it :(
yeah me too it was hilarious
mate can you tell the link for that video, it'll be in your history.. pls
@@narasimhamkl8156 nah man i thought that too but it was removed
Perfectly timed, I was about to buy a new phone specifically for the camera. Corridor is always one step ahead 😂
If you're not attached to iOS, look into the Vivo X70 Pro+ or Mi 11 Ultra, or wait for the Pixel 6
@@prcvl you have any suggestions for a phone that can handle call of duty mobile properly.... I don't really care for the camera
@@Buangbuang iPhone 13, it has the best processor in a smartphone
@@Buangbuang any phone with Snapdragon 800 +
At that point just buy a damn professional camera... Some amazing cameras out there are even cheaper than a shitty iPhone.
That was the most unexpected crossover ever
Pro tip: pause at 06:52 and cross your eyes, find the right distance and focus from the screen and let your eyes relax.
If you succeed you should see 3 images. The one in the middle has an impressive 3D depth and parallax effect when you move the screen or your head.
Kinda cool to see a blend of the different phone cameras in crisp 3D depth.
it kinda works, but it's definitely far from acceptable by stereogram standards
@@Cookid621 it looks pretty good to me
Oh i literally do this all the time lol😂😂 its so fun
@@Cookid621 maybe it depends on the viewer's eyesight. I wear glasses and find it harder to view stereograms now than when my vision was 20/20. But I managed to see this one OK the 1st time, with a bit of struggle
Edit: I tried it again putting my face near my screen. That worked better for me
A new way of entertainment. I love it
Wow, this was another super interesting topic by Wren! He really knows how to bring an idea like this to life and demonstrate it in such an eye opening way! 😊
3:19 wren seal impression
Austin Evans and Corridor Crew. The crossover i never knew i needed
never seen him before. Just assumed he is some minor CC employee.
“We don’t have an iPhone. How are we going to do this video?!”
**everyone continues to play Smash**
Great stuff!
Having a background in image processing and high end photo retouching, a lot of things come to mind to make the iPhone 1 photos even better.
For example using multiple shots stacked in Photoshop for superresolution, similar to what deep fusion does.
I would love to see things like that in a future video!
Stupid question but, wouldn't that require additional installation in the old iPhone?
@@_khooman_chronicles_ what he meant is: set up both phones on a tripod and manually do what the new iPhone does (stack 10 pictures at different exposure values) with the iPhone 1 to compare the difference. I just don't remember if the first iPhone was capable of manual exposure, due to processing power; I'm not sure if it's possible
Edit: there might be a hack in Cydia tho
@@_khooman_chronicles_ Like @Victor Trasvina said taking multiple shots manually and then stacking them afterwards in PS.
It wouldn't necessarily have to be bracketed exposures. Just tapping to expose for the highlights would work.
The computational photography in phones until very recently wasn't capable of bracketing multiple exposures either. They just take a couple of underexposed shots and combine them to lower the noise floor. Then the image can be brightened in post and tonemapped to retain the highlights.
@ Exactly! Although the shots could be taken handheld and aligned as well, if the exposure time is fast enough. That way you'd ensure subpixel movement.
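A tiny numpy sketch of the noise-floor trick described a couple of replies up, using synthetic data — average several noisy, underexposed frames and the random noise drops by roughly the square root of the frame count (this is the principle only, obviously not Apple's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat, underexposed "scene" (level 20 out of 255) captured 9 times,
# each frame with its own random sensor noise.
signal = 20.0
frames = [signal + rng.normal(0, 10, size=(100, 100)) for _ in range(9)]

single = frames[0]
stacked = np.mean(frames, axis=0)

print("noise of a single frame:", round(single.std(), 2))    # ~10
print("noise of the 9-frame avg:", round(stacked.std(), 2))  # ~10 / 3

# Brightening the clean stack now lifts far less noise than
# brightening a single frame would.
brightened = np.clip(stacked * 6, 0, 255)
```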
Love that y’all play smash! My favorite game of all time - would love to see you talk about video game fx
Image stacking would've made a big difference combined with these other upscaling techniques. It's not a fair comparison when the modern iPhone gets to use a series of images taken at different exposures.
After many months of Austin NOT saying it, Wren had to come in and say “HEY GUYS, THIS IS AUSTIN.” Ahhhhhh
Yes so nice to hear it again lmao
This left me curious how much better it would've looked with 9 reference photographs per scene from the iPhone 1 to feed the AI
Wren, this is one of those videos where I wish travel and stuff was easier, because I feel like teaming up with someone like Destin from SmarterEveryDay would have made this absolutely insane.
14:50 - 'Heh... I look like a crazy METH scientist'
LMAO
Feels like this video is a good example for having access to the raw data so you can have the option to CSI enhance the image like you want it to ;D
My OnePlus 9 Pro has that Hasselblad camera, which can literally see better in the realtime preview at night than my own eyes do. I can take a picture of what appears to be pitch black darkness to me and it will pull out an image, albeit slightly yellowed.
I can also zoom in so far that I can see into the window 8 houses down the street, whereas I can barely even see that house with my eyes.
Cameras have gotten so insane, and they're finally starting to really make a lot of progress in those extremes, which you don't use often but also make the more normal situations look way better.
As someone who owned a flip phone in the 2000s, I can't complain at all with the new iPhone cameras
My first few phones were flip phones too. You can't complain, but I can. I hate iPhones; I find they suck
Shout-out to the people at Topaz Labs for making those AI programs so good.
4:29 when he said "Hey guys this is Austin" it was such a simple but good reference to Austin's channel
I would really love to see you guys make a time travel animation for back to the future.
One as close to the original as possible. One completely overboard. And one how you would imagine it
6:20 OK, the new iPhone 13 Pro sensors are amazing, but Austin needs to check his facts: it is nowhere near a 1-inch sensor. In fact, plenty of other phones have bigger sensors. However, Apple's imaging software is indeed incredible. Still, it's not in any way near a 1-inch sensor, and that's not a bad thing; a bigger sensor requires bigger and better glass, so I don't think phones will get to a 1-inch sensor without having a big, I mean hugeee ass camera bump.
Right! Most people don't understand that Apple's "massive" sensors are just slightly larger than a pen tip. I recently bought a full-frame a7III, and the sensor alone is the size of the bottom half of an iPhone, nearly 2 inches diagonally
@@joshgeurink2131 That's not even a large sensor either. With medium format (real medium format, not the digital slightly-larger-than-35mm "medium format"), the sensor is so large you can make postcard prints the size of the sensor without any enlargement.
Don't worry iPhone got that pure sapphire glass ......
Yeah, nowhere close to 1". The iPhone 13 wide-angle sensor has a surface area of 35.2mm² while the Sony RX100 VII has a sensor surface area of 116.2mm². On top of that, the 1"-type sensor of the Sony has physical dimensions of 13.2mm x 8.8mm with a diagonal of 15.86mm, which puts its largest dimension just over 0.62in. So that sensor is not even close to 1" in size. "1"-type", from what I can tell, must be marketing jargon, since it's nowhere close to 1"; and by comparison, a sensor with a true 1" diagonal, which is somewhere in the APS-C range, makes the iPhone sensor look tiny. The iPhone 13, while it's got a great camera, is nowhere remotely close to 1" in size...
@@TheMWGriffin A 1"-type sensor is nowhere near APS-C sensor range. It's much smaller than even an MFT sensor.
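The numbers are easy to sanity-check; here's a quick Python check using the commonly quoted 13.2mm x 8.8mm dimensions for a 1"-type sensor and the iPhone area figure from the comment above (treat both as approximate):

```python
import math

# A "1-inch type" sensor is commonly quoted as 13.2 mm x 8.8 mm.
w_mm, h_mm = 13.2, 8.8
diagonal_mm = math.hypot(w_mm, h_mm)   # ~15.86 mm
diagonal_in = diagonal_mm / 25.4       # ~0.62 in -- nowhere near 1 inch

one_inch_type_area = w_mm * h_mm       # ~116 mm^2
iphone_13_wide_area = 35.2             # mm^2, figure quoted above

print(f'1"-type diagonal: {diagonal_mm:.2f} mm ({diagonal_in:.2f} in)')
print(f'area advantage over the iPhone 13 wide: '
      f'{one_inch_type_area / iphone_13_wide_area:.1f}x')
```

The "1 inch" in the name is a holdover from old vidicon camera-tube sizing, which is why no dimension of the sensor is actually an inch.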
Film Riot did a great showcase of the 13 Pro's video capture which is definitely a whole lot better than my own DSLR's video. That being said, I'll still reach to my DSLR for shooting RAW photography.
If it's better then you don't know how to use your camera. Or your dslr is absolutely shit
Wren could not have done a better introduction to Austin 😅
Even Austin can’t do it better anymore
I actually kind of like the Polaroid look of the original iPhone way better than I expected to. It weirdly makes the photo seem less "perfect" or clinical. It gives the photo a bit of soul.
it's just grain, jpeg artifacts and a low dynamic range.
3:40 gotta love that beck album
You could have also tried putting it on a tripod and taking multiple photos focused on different parts of the image, and then stacking them together with a program like Lightroom to emulate "Deep Fusion".
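For anyone without Lightroom, here's a bare-bones focus-stacking sketch in Python/OpenCV — it assumes the shots were taken on a tripod and are already aligned, and the file names are placeholders:

```python
import cv2
import numpy as np

# Shots of the same scene focused at different depths (hypothetical files).
paths = ["focus_near.jpg", "focus_mid.jpg", "focus_far.jpg"]
images = [cv2.imread(p) for p in paths]

def sharpness(img):
    # Per-pixel sharpness: absolute Laplacian, smoothed so whole regions
    # (not single pixels) get picked from one frame.
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    lap = cv2.Laplacian(gray, cv2.CV_64F, ksize=5)
    return cv2.GaussianBlur(np.abs(lap), (31, 31), 0)

maps = np.stack([sharpness(img) for img in images])  # (n, H, W)

# For every pixel, keep the frame in which that pixel is sharpest.
best = np.argmax(maps, axis=0)
stack = np.stack(images)                             # (n, H, W, 3)
rows, cols = np.indices(best.shape)
result = stack[best, rows, cols]

cv2.imwrite("focus_stacked.jpg", result)
```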
If you really want to do this right, you should take a burst while being extremely still, upscale the frames individually, then mix them; that will give you way better results.
Do it then!
Burst photos didn’t exist on the first iPhone
@@stillakilla I'm sure he means image stacking/super-resolution, it's not new stuff, try to search the video on youtube, you'll find plenty of them.
@@eyespy3001 A burst is just a bunch of pictures taken one after the other; even if it doesn't have a specific button for it, I'm sure you can do it.
@@YOEL_44 I suppose you could do that, yeah.
When Wren said "Hey guys" I instinctively said "This is Austin" and then there he was lol
5:16 NOICE!
The insertion of that sponsored segment was so sneaky and smooth. It was beautiful
Google Pixels are the perfect example of this. The camera hardware is kinda old and mediocre, but the post-processing lets it compete with the latest iPhones
Try to take a video, though, and the quality goes back to what you'd expect from the Pixel hardware
that's why so many are excited for the Pixel 6
Yeah, the Pixel was for a while the unchallenged leader in software post-processing. Night Sight was a groundbreaking feature, and it was made available on the two generations before the one it was introduced with. Great to see that other manufacturers have also had great success with it, and I can't wait to see what post-processing can do in the future.
@@TherapyGel Though I think Night Sight on my Pixel 4a 5G is still amazing; it's much brighter and more detailed than what I can see with my own eyes. So I guess night mode on the Pixel might even still be better than on the iPhone 13.
@@lal12 it's definitely still the best of them all, even on my 2 XL. Can't wait to see what the 6 is capable of.
Yeah the iPhone blows all others away in video for sure. Always has.
The collab I never knew I needed lmfao I love Austin
The iPhone doesn’t use the NPU when opening images it’s already taken. It uses the NPU only on capture time. What you’re seeing when opening the photo is the transition from a thumbnail/preview to the full res image.
It's also possible that the thumbnail is taken from the live feed, which makes it a fair assumption that the preview shows an unprocessed photo.
Otherwise, additional image-processing steps are surely being done in the background so that it is ready for the next shot as fast as possible. It can make better use of the NPU as you open the gallery, as a way of preserving resources. The app is smart enough to determine how to schedule the processing.
Of course, this is mostly assumption on my part, because there's not much documentation on how Apple designed the software behind image processing. But the assumption that it only uses the NPU at capture time (assuming you mean as the shot is being taken) might be wrong.
That said, the A15 is pretty fast, and the model is probably lightweight enough that the processing is near-instantaneous.
@@秋保です i think youre right. This transition only happens the first time you open the photo, so its not just thumbnail to actual image transition.
those images with the old iPhone made me feel incredibly emotional for some reason. Just like pure nostalgia slapping me in the face
You guys are fast becoming one of my favorite channels on YouTube. Really interesting topics, and super informative on the tech behind the scenes without making me feel like an idiot. 👏
Except when I watch them outside of the Corridor app, I see ads like this. PUBG Mobile doesn't have controller support, and it's only in its 3rd/4th year while he talks about the game getting better over 7 years? There are still only about 3 more maps than at launch, unless you count the small 4v4/8v8 maps and the fun game modes they keep getting rid of (also small maps, or ones built on existing maps), so it's not like there are new maps all the time. Also, back in PUBG Mobile's first year you could use an app to get controller support, and later you got banned for using third-party apps like that. So the ad was almost all a lie? I do enjoy the content, and since I have YouTube Premium I figured I'd watch on YouTube, but this is just disappointing
I've had a number of times where friends questioned why it took so long to get a shot just right with my actual camera, and I explained I have to change my settings to get what I want. They'd say, "Well, my phone can take a picture quicker," and I'm like, yeah, it does a million things at once. And then they love how good the photo looks when I'm done 😂 Phones are great for capturing fun times and light-duty stuff, but even with all they do now, a traditional camera will still blow them out of the water.
That's true, but probably only if a person knows how to work with a camera properly lmao 😂
I REMEMBER in 2010 having to manually combine like 3 photos at different exposures to create an HDR photo!
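That hand-made HDR workflow is easy to reproduce today. Here's a rough Python sketch using OpenCV's exposure-fusion tools (the bracket filenames are hypothetical); Mertens fusion blends the best-exposed parts of each frame without needing the camera's response curve.

import cv2

# Hypothetical under-, mid-, and over-exposed bracket of the same scene.
shots = [cv2.imread(p) for p in ("under.jpg", "mid.jpg", "over.jpg")]

# Rough handheld alignment, then Mertens exposure fusion.
cv2.createAlignMTB().process(shots, shots)
fused = cv2.createMergeMertens().process(shots)  # float32 output in [0, 1]
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))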
I'm curious about a comparison between the old iPhone and the new one limited to 2 MP.
5:30
The collab we didn’t ask for
But the collab we needed
To me, improving the software side of things is all great, but what I'm more interested in is actually better pictures, not something that isn't a true representation of the moment.
Apple's objective is probably to offer very ordinary hardware with software enhancement; that way they save money on the hardware components.
So was Arnold's CPU actually an NPU? I mean, his neural net processor was a learning computer...
My favorite videos are where Wren talks at us for 20 minutes.
The original version of iOS did support third-party apps in the form of web apps, which could be saved to the home screen
I just LOVE seeing my city in the background!!! Thanks guys..... West West y'all.....
I liked the video when I noticed that you had a time-remaining indicator on the sponsored segment. I watched the whole video, but I definitely appreciated you letting me know how much time was left on the ad. 🤠
Can’t get over how at 18:06 , when Wren introduces Topaz Labs, the first sentence he speaks is right on beat with the background music.
It’s called editing
@@alexbarnet6982 I get that they edited it so that it starts at the next segment of the video, I meant that the words are on beat with the first bar of the music like he rapped to it lol
I would LOVE to see another "Following Bob Ross in real time" video done with BEEPLE!!!
I appreciate the fact that Nick also put on his lab coat.
When Wren said “Hey Guys This Is Austin” I felt weird since I am used to Austin saying it.
If you have Photoshop, you can take multiple photos and then use File > Scripts > Load Files into Stack. From there you can have it align the photos, and then layer the images to average out the noise/grain to make these "super resolution" photos.
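The same stack-and-average trick works outside Photoshop too. A tiny numpy sketch, assuming tripod shots of a static scene so no alignment pass is needed (filenames hypothetical):

import cv2
import numpy as np

shots = [cv2.imread(f"shot{i}.jpg").astype(np.float32) for i in range(8)]

# Random sensor noise averages toward zero across frames while real detail
# stays put, so the mean of N frames cuts noise by roughly a factor of sqrt(N).
mean = np.mean(shots, axis=0)
cv2.imwrite("denoised.jpg", mean.clip(0, 255).astype(np.uint8))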
When he said "Hey guys" I mentally prepared myself to add "...this is Austin" as a joke. Didn't expect Austin to actually be there
Me too!!!!!
Blackmagic: "Let's do a global shutter on our 4K!" "Nope. Turns out there's always room for Jell-O."
In the pursuit of the best objective image, we lose some of the "character" of older technology that had to work within a lot of limitations.
That's one of the reasons Black Pro-Mist filters are so popular with filmmakers; they add some of that character back
Awesome Wren!! So good! I shoot vids with my phone and it does the job but would actually love to shoot one with the original iPhone now! Interested to see it! Also, Battlefield Mobile alpha is available? You guys should try that and make another long-awaited Battlefield video
You guys should each try creating your own Bond title sequence! Would be fun to see
8:27 that Harry Potter moment
"this is for Jimmy... Jimmy has not lived here in.. 10 years!!" *Dramatic horror music*
The "hey guys, this is Austin" gave me goosebumps!!
I get y'all are talking about the phone, but the Jimmy cameo with those classic VGHS headphones... I loved that show so much. Holy crap, epic nostalgia
These videos are so great. They turn things I don't care about whatsoever into watchable content that makes me laugh and learn, so thank you
I kind of hate how iPhone photos look. I feel like I can't get an accurate gauge of how I actually look when they do stuff like smoothing out skin imperfections in selfies. It makes me look much better than it should, and if I switch to the front-facing camera I look completely different. It's quite frustrating. I want to look accurate.
You can never have accurate. With a digital camera there is unavoidable processing, like white balance, distortion fixes, and color tweaking, but even at the analog level things will be distorted/shaped by how wide the lens angle is and how you've framed the photo.
I'm not trying to mansplain how a camera works to you, rather your idea of an "accurate" picture got me thinking. What is an "accurate" picture? Even without the camera distortion, there are so many things that change and influence your look.
I'm with you. There's wayyyy too much image processing. It's not natural: oversaturated, smoothed where it shouldn't be, extra texture where it's not needed. Yet for whatever reason it's always rated the top phone camera. Maybe it's just a minority of us, but I like muted, more accurate photos, not the auto-Photoshop effects.
Another thing, the average camera has pretty shit dynamic range if you only take a single shot, especially compared to what the human eye sees. You *have* to have some processing, like multiple composited shots, in order to get close to an "accurate image" of what another human would see in the same position.
@@Appletank8 Part of the awful low-light response is the tiny physical aperture. The openings on the old phones, and on the standard lens of new phones, are only about 2-3 mm across, so there's only so much light that can get to the sensor through such a small hole. For the ultra-wide and low-light modes, modern phones use the extra lenses, up to about 6-7 mm in diameter (rough arithmetic in the sketch after this thread).
@@jakobvanklinken True but there are phones (like the Xperia 1 III) with more accurate/true to life processing
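To put rough numbers on the aperture point above: light gathering scales with aperture area, so a 6.5 mm opening versus a 2.5 mm one (illustrative midpoints of the diameters quoted, not measured values) collects nearly 7x the light.

import math

def aperture_area(diameter_mm: float) -> float:
    # Area of a circular aperture in mm^2.
    return math.pi * (diameter_mm / 2) ** 2

small, large = 2.5, 6.5
print(f"{aperture_area(large) / aperture_area(small):.1f}x the light")  # ~6.8x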
I remember having a 40 MP Nokia lol. I wish new phones had that with modern processing power. The sensor in that thing was amazing, but it ran Windows OS, which totally sucked for everything lol
Now can we get one of these done for Android devices? I'm curious if they're doing similar work to Apple
Yeah, but some of their phones have legitimately better cameras
Some Android phone cameras are better than the iPhone's, that's just a fact. However, I believe apps have been optimized for iPhone cameras, which is, for example, why selfies on Snapchat or Instagram look so much better on iPhone than on Android
Good news! Pixel has been doing this since they launched. They've always been focused on image processing, and that's why even the 4 and 5 were able to keep up with the iPhones using 3- and 4-year-old sensors.
@@Frankie-xu6sr Yeah, pretty much. Android covers such a wide range, and the software for this computational photography is usually locked to the phone's own camera app. The only exception I know of is when the Pixel line had the Pixel Visual/Neural Core in the Pixel 2, 3, and 4; then other apps were able to use it to take higher-quality images.
12:17 That Lemmino music. I love it