I know this may be something obvious I'm pointing out, especially for professionals or folks coming from cinema camera rigs, but people approaching this from the phone-camera world may not know that this type of rig negates the auto-focus capabilities of the iPhone...since the iPhone sensor/focus is always trained on the glass in the DOF adapter. So, if you are using a DOF adapter and associated lenses, you will need to acquire some extra gear and learn how to pull focus for your shots. This is something that professionals take for granted, but it might be a new skill for someone moving up to this type of shooting.
GREAT POINT! I hope people realize that. In this episode we focused on the camera and lens mount story directly, but in Episode 24 we will focus on the camera accessories and that will include some info about how we used the Tilta Nucleus M. Thanks for the comment!
Pulling focus is not rocket science. Anybody acquiring all that gear will do well to learn a few extra skills.
@@lanolinlight You're absolutely right, but it is a skill that requires practice and rehearsal. It all depends on how you approach this type of rig. If you're coming down from cinema rigs, it's a no-brainer. If, however, you are moving up from a basic camera-phone rig, it's something else that has to be learned. That's the point I was trying to make.
Focus pullers are the best.
Agreed. Always important to have a 1st AC that knows how to pull focus
Cinema camera needs: 9:01
1. 4k resolution and above.
2. Intraframe encoding [RAW & RGB].
3. High dynamic range. [Must record logarithmic.]
4. Wide color gamut. From 10bit [1 Billion colors].
5. Removable lenses.
💯 When he listed these I was like YES! Finally, someone is speaking my language! 😊
A large-frame sensor is basically all you need; it's all about light diffusion and T-stops. Seriously, all that processing power in a phone and yet camera manufacturers STILL haven't cottoned on to the idea of selling us nothing but a USB lens mount and a large-frame sensor (JUST SELL US THAT!). We don't even need lenses, we don't need processors, we just need a lens mount and a damn awesome sensor. But hey, if they did that, the entire world would have a cheap camera solution and the market would crumble in years...
by your definition, the Alexa Mini or any camera with an ALEV IV sensor isn’t a cinema camera
No, the first thing you need is great color science! A removable lens, OK; over 10-bit, OK; high dynamic range, OK; and intraframe recording, OK - but 4K isn't important.
I understand that for them it was an experiment, and that they are aware of the importance of sensor size, which the iPhone cannot satisfy. So you might as well invest in a Sony FX30 (I exclude the FX3 for its price, though it's an excellent candidate) or a Panasonic GH6, GH7, or S5 II (some of the above cameras have prices similar to Apple's), all of which have larger sensors than the iPhone.
In conclusion, where do you want to show the final product (the film)? In a movie theater, on a TV, on a computer monitor, on a smartphone? As you can understand, how that film reads depends on the size at which it is projected. It is obvious that as the viewing size increases, artifacts begin to appear.
If the purpose is to show the movie on a smartphone, a tablet, or at most the 17" monitor of a PC, then OK, go for the iPhone as your shooting tool.
Of course I'm not saying you need a 70 mm IMAX camera, but even a 24x36 (full-frame) camera already produces exceptional results.
Finally someone did a DOF/anamorphic lens rig here on YT, and may I say, nailed it. I can finally buy myself a Blazar Remus and film on a budget!
You can save nearly a thousand dollars by using a used mirrorless camera. Don't use a phone, and don't overpay to use a phone as a camera, like come on.
It’s been years since camera tech has made me this excited, it’s got me thinking about the possibilities of what may be coming in the near future!
That's the right attitude. While many people see this as either pointless or a threat, every time a new technology surges it creates its own weather, which gains momentum as other tools start developing within its ecosystem. I suspect that shooting with an Apple product could be fairly common practice by the end of the decade.
I love that phones can finally produce usable footage for "creative" stuff thanks to the Blackmagic Camera app and log. Just makes it so much easier to practice shooting, learn, and try stuff when you don't have to be bogged down and fiddling with stuff.
Once Apple makes all the cameras the same as the main camera in the iPhone 15 Pro, it'll be pretty incredible. Obviously it's not going to replace cinema cameras or rigs, or shoot a whole movie or anything, but it'll be such a great tool for "enthusiasts" like me trying to learn... without spending $5,000+ on camera gear.
The last missing piece is successfully improving and mimicking depth of field and the depth mapping. I think they're actually getting pretty close to being able to mimic it surprisingly well. It's significantly less of a gimmick now than it used to be.
Also...imagine if they put a built in ND filter or eND in phones? Man that would be the day.
Spot on.
Idk about the Pro Max but the Ultra does now have an ND filter option built in. Can't say how good it is though.
@@jz6367 The "ND" filter in the S24U is NOT a physical ND. It's SOFTWARE merging multiple images to make a long-exposure look, which means Samsung's "ND filter" isn't for light control, it's for fake long-shutter control.
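A minimal sketch of how such a software "ND" could work, assuming simple frame averaging (an illustration only, not Samsung's actual pipeline): averaging a burst of short exposures mimics the motion blur of a slow shutter, but it does not let the sensor gather more light per frame the way a physical ND would.

import numpy as np

def simulate_long_exposure(frames):
    # Average a burst of short exposures to mimic one long exposure.
    # Motion blurs as it would with a slow shutter, but unlike a physical ND
    # the sensor never gathers more light per frame: clipped highlights stay
    # clipped, and noise is merely averaged down.
    stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames])
    return stack.mean(axis=0)

# Hypothetical usage: eight short exposures standing in for one longer one.
burst = [np.random.rand(1080, 1920, 3) for _ in range(8)]
long_exposure_look = simulate_long_exposure(burst)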
I rarely comment on a video, but damn, I love your content!
It's fast, it's concise, it's excellent. I'm a total noob in this domain, but I'd still rather watch the video a second time than have someone talk slowly for 30 minutes. And it's so well explained I almost got it all!
Thank you, a YT channel with this quality will get at least 200k subscribers for sure.
Wow! Thank you! So glad you enjoy it! We're new to YouTube and happy you find value on our channel. Let me know what else you'd like us to cover!
Absolutely loving this series! Your first video inspired me to get an iPhone and shoot some short films with it. A few thoughts here:
1) The anamorphic looks soft in focus throughout, even on the actor's face, as you chatted about towards 14:30. FWIW, there are shots you've taken here where I actually prefer the iPhone over the anamorphic. It certainly has a look, for better or worse! A shoot where you get images from both the anamorphic and the stock iPhone lens could work, although then you have to make sure it works stylistically.
2) There is an inherent difference between the two lenses because of the focal length, so you're not getting background compression on the 24mm like you are on the anamorphic. There is a workaround to this that most people don't seem to take advantage of - because of the large-megapixel sensor behind the 24mm on the iPhone, you can zoom in on your app of choice (I prefer Blackmagic) up to 3x without any degradation of quality and retain a perfectly sharp 4K image. So, zooming 2x on the 24mm functionally gives you a roughly 48mm lens, with background compression that generally matches that focal length. If you're skeptical that those results can get you sharp images, give it a try! Record at 4K 2x with the 24mm on a dedicated app, and then record at 4K 1x. Punch in on the 4K 1x image in post, and you'll notice a large drop-off in quality vs. the image recorded at 2x! Because of the large-megapixel sensor, you are going to get a sharp image until you start punching in past about 3.5x on the 24mm. There are a lot of focal lengths available that were previously thought to be out of reach on iPhone because of this! I think if you were to use that trick here and "match" the focal lengths, then your results would have been much, much closer!
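A rough sketch of the crop arithmetic behind that workaround (the sensor width below is an assumption for illustration, and in practice the iPhone's processing makes the practical cutoff a judgment call rather than a hard number): the effective focal length scales linearly with the crop, and you can see how much sensor width is left to feed a 4K frame at each punch-in.

# Rough crop math for a 48 MP main camera, assuming an ~8064-pixel-wide
# photosite grid (an assumption for illustration; exact figures vary).
SENSOR_WIDTH_PX = 8064
UHD_WIDTH_PX = 3840
BASE_FOCAL_MM = 24  # full-frame-equivalent focal length of the main lens

def effective_focal_mm(zoom: float) -> float:
    # A digital crop narrows the field of view exactly like a longer lens.
    return BASE_FOCAL_MM * zoom

for zoom in (1.0, 2.0, 3.0, 3.5):
    cropped_width = SENSOR_WIDTH_PX / zoom
    print(f"{zoom}x crop -> ~{effective_focal_mm(zoom):.0f}mm equivalent, "
          f"~{cropped_width:.0f} px of sensor width feeding a {UHD_WIDTH_PX} px UHD frame")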
Thanks for the awesome content. Can’t wait for more!
Can we just take a moment and appreciate the hard work they put in to make this video? This is one of the clearest and most informative videos on making the most of the camera on hand, which for most people is an iPhone, that I've ever seen.
Exceptional content! Thank you for sharing such a thorough understanding of the image production process. Your channel's content explores various aspects deeply, providing abundant opportunities for learning and growth.
I can't believe how detailed it is. It deserves a lot more views.
...and #4, some people are only using their iPhones because they don't have cinema or mirrorless cameras. Not sure why people hate on smartphone filmmaking so much. To me, this video just shows where filmmaking can go, or even is going. It gives the people who don't have dedicated cameras a chance and a bit of hope. I love watching these. Thanks!
What a quality and information-rich video, great work 👏
Home run!! you guys killed it with this episode!!! Keep them coming and please make your videos LONGER!!!!!!
Thank you for breaking it down further
it's truly a wonder that these are the images coming out of an iphone. it would be interesting to see you guys come up with solutions for run and gun creators / vloggers / or even iphone shooters in the live events space. thanks for your hard work michael and team!
Love this idea.
I love the innovative approach to this iPhone setup! Honestly it’s probably the best one I’ve seen yet!
I would almost argue that Phones could eventually become a sort of “pocket cinema camera” that does other things like any phone can and is already amazing for things like set BTS!
At this point though, when the Sigma FP, Fuji cameras, the BMPCC 4K at $1,200, the FX30 at $1,500-2,100, and the BMPCC 6K G2 at just over $1,560 offer 4/3 or S35 sensors (and the FP a full-frame sensor), then unless you already have the iPhone 15 Pro and no camera, one of those plus a budget compact anamorphic cinema kit is probably still cheaper to buy than this build, and the size isn't much larger. You'd only need a PL adapter, since you wouldn't need the DOF ground-glass adapter and the other elements that compensate for exposure, sensor size, or lens limitations.
I loved this video - epic ideas! Trying new things, despite their current irrelevance to the mainstream industry, doesn't negate the need to experiment, learn, and try!
Love stuff like this and no one has done it as in depth as this.
I love these shoots & ideas.
The anamorphic shot reminded me of Matt Reeves’ Batman, and how he wanted to ‘dirty the frame’. Such a BEAUTIFUL look you achieved there 🥰😍
Very interested to see VR & focus pulling - I’ve been thinking about that for a while!
Sick vid!
Thanks! Targeting April 2nd for the Vision Pro story!
Very interesting video! Thank you for pushing forward smartphone filming innovation and for all the very informative details!
Great Video and great project!
Love what you guys are doing and the way you're doing it! Keep going!
It would be interesting to see a breakdown of the rig, both the expensive version and the cheaper one, and where to get the parts.
This was really well made, concise, easy to follow, and entertaining. You could put this video in a paid course focused on iPhone film making and it would be of tremendous value.
This is amazing. Thanks everyone!
These actors have something, I'd watch that series!
My LUMIX S5 has great image quality. My 15 Pro is pretty damn close. Insane. Using anamorphic looks so good.
BRILLIANT VIDEO. Thank you guys for some serious hard work putting this episode together. There is no doubt that the iPhone is more than up to the job. Just did a product photoshoot, and the little Apple knocked it out of the park.
Thanks for the compliment!
Woo!!! Beastgrip Nation FTW!
Danny Boyle is supposedly directing 28 Years Later on an iPhone 15 Pro Max, with a setup like this.
This is really cool, and I appreciate the work that has gone into this. However, as somebody who primarily shoots ads, I prefer the look of the plain iPhone.
The 16 pro max is amazing! I bought one! Love the video tho
I got the 16 too and I agree, but I did not get the Pro Max, only the Pro, because the Max is too big for my hands.
I would love to know which custom EF-to-PL mount you use.
Strada, it's just the people complaining; we are actually enjoying this series on phones, we need more.
Love your work, guys
thank you so much:)
This is great and produces amazing results... However, that is still a camera-oriented rig and lens, not iPhone-native accessorizing... For the next one, use off-the-shelf consumer rigging like a Neewer iPhone cage with a 17mm Ulanzi/adapted anamorphic lens, the little square one... (please don't use Moment or Beastgrip if you really wanna go on a budget). I think production design and post-production are the real kings...
I have learned that my films (so far) are NOT interesting to people. People will watch anything...if it's interesting. If you want to engage your audience, start with a good story. Present that story to them, using the absolute best resources and skills you can muster. If it happens that you can only afford an iPhone, then go for it! ...Buuuut, if you happen to have access to an ARRI Alexa Mini, then... The higher the quality, the more the audience will feel respected and therefore, more engaged. 👍
I will use my iPhone to shoot in the future due to its size and convenience, and to challenge and push myself as an artist.
To shoot a great film on a phone is a filmmaker's challenge, with bonus bragging points to showcase your ability.
@@TheDreInDaHouse Sure! I was only generalizing. My only point was that I personally use some pretty nice equipment, but I (unfortunately) lack the talent to create fresh, interesting content. Maybe I would do better stuff if I did as you say and focused more on challenging myself instead of the equipment I'm using.
Your content is all next-level.
This is incredible!
epic video. keep it coming
Congrats guys u freed many minds today!!
Incredible point of view, such clear explanations! Your team is fantastic; this is perfectly adapted to the new tech avant-garde. Best regards from Romania.
🔥This is awesome 😎
Nice work Team 😊
thank you! This one was really fun to push the boundaries!
You're crazy, guys. I love your content!
This will change filmmaking forever
Great content! You people are real professionals with real information to share. I love it! Subbed!!!!
One small additional note:
Tilta MB-T12 retails for 690€ VAT excl. in Europe. Not quite a bargain everywhere :D
Good to know! Thanks:)
iPhone is a brilliant smartphone and camera. But when you have such a production before and after the filming, you could also use a photocopier as a camera and have a beautiful result
You could pull off the same stuff with any top-of-the-line Android phone; it's not reserved only for the iPhone Pro Max.
Okay, gonna have to start trying some things with my Pixel
Incredible video, you guys are insanely good! The Osmo Pocket 3 is a best seller right now; I wonder if you could obtain the same result since it has a 1-inch sensor.
Dr Frankenstein would be proud of you ;)
loved it
I would love to see the difference between the $8,000 anamorphic lens and the Beastgrip anamorphic lens. 🤔
We actually tried that in our preliminary tests and got some interesting results. The reason we didn't use it is because the Beastgrip anamorphic lenses do not have a lot of focal length options and tend to favor WIDE focal lengths. I needed telephoto, and that's one of the limitations of some of the lenses that adapt directly to the phone. For the best framing, I want control over longer lenses, and that just isn't an option (yet) without a professional prime like the Mercurys we used here.
The ground glass needs to spin or oscillate to keep its texture invisible. Try pairing the iPhone with an old Letus35 or Redrock DoF adapter, or modify the Beastgrip to have oscillating glass.
Beautifully made scenes and well made video about the subject! 👏🙏💪
Thank you 🙌
My guy. Nice episode. But these figures you're throwing out there as 'affordable' and 'off the shelf' aren't really that. I guess for professionals and those in this field they are. However, some of us, like me, are just enthusiasts, so we're not going to get into that... 😅😁 That said, the results do seem better than what you achieved before. I did notice that overall the images were much softer and lacked some detail, even with subjects in focus. Is this because of the ground glass? Apart from that, this is absolutely amazing work. You are absolutely right about the iPhone becoming a more capable device for cinematography soon; everything related to the image sensor size is correct. As for me, though, I will use my gimbal, my ShiftCam case, and the 60mm LensUltra lens and shoot in the 16:9 ratio. I will cheer you on wholeheartedly from all the way in Trinidad and be a part of the discussion and the innovation, but wait until the iPhone tech has matured much more before I invest in that kind of 'off the shelf' item. 😅😊
Look at the price of Red cameras and suddenly it's VERY affordable
Coffeezilla of filmmaking 🤔
Good content.
I feel like all of this is down to the Blackmagic Camera app and the sensor, so I also feel a similar thing could be achieved on something like a Sony phone with a better sensor and the app.
We need a comparison with the Moment anamorphic lenses :)
Is there a way to get a copy of the actual scene that was being filmed? The "Agent" at the filing cabinet is a good friend of my wife and me, and it would be very cool to see the complete scene.
🤯 love it!
What happens if you shoot anamorphic with the iPhone plus Cinematic mode (with a small anamorphic lens for iPhone), skipping the big lens setup? Do you get a sharper scene with a blurred background?
Wow, this is great, I love this channel!
Wow. Vision Pro with Follow Focus is Wild. Like a lion ripping time space wide open wild. Looking forward to it
At first we thought it was ridiculous. Then we were surprised!
@@Michael_Cioni 😭😭😭 Incredible.
Well done!
This is a cool experiment, but the video was disingenuous about the overall price of that setup for sure.
Other costs not mentioned:
EF to PL Adapter - $100 (Your numbers came up $100 short of that rig, so this must have been left out)
Diopter - $500+
4x5 diffusion filter - $100-$500
Wireless follow focus system + rigging - $500+
Tripod - $1500+ for a barely decent fluid head setup
Z-flex tilt tripod head - $60
Slider - $500+
And of course the real heavy lifter here is the tens of thousands in lighting and studio equipment, not to mention audio. No one in the world would spend all of that money on equipment to shoot cinema-style footage on a 12mm sensor. Again, it's a cool experiment, but you made it so much about affordability and how the iPhone meets all the qualifications of a cinema camera. I'd argue having at least a Super 35 sensor is the sixth qualifier, because footage from a tiny sensor just doesn't look the same, 4K 10-bit RAW or not.
It's possible I made a mistake and failed to add that in. That's totally an accident (definitely not trying to be disingenuous here). We really worked hard to share the cost of all the main camera gear. Now, the actual PL lenses we used (Mercury) had a rear element that was too wide for the Fotodiox, so we had to make a custom widened mount to fit the lens, but if you use an EF lens you don't need that piece, and if you use a PL lens, Fotodiox provides a list of compatible lenses.
All this is to say we always try to be genuine with the information. Plus, I expect there will be direct smartphone to PL in 2025 so some of these adaptors will become moot pretty quick. Thanks for the comment:)
@@Michael_Cioni Thank you for the reply. I learned a lot from this video so I hope my criticism wasn't overboard. Your skills and knowledge are appreciated.
Was the chromatic aberration introduced during post-production, or is it the result of the glass and adaptors used?
Good eye:)
This is mostly due to the ground glass adaptor. But there are some new adaptors coming out that won't have this artifact. So I expect to see major improvement!
great video, instant subscription and like. keep it up
This is a neat experiment, but far from the cheapest true anamorphic you can get with an iPhone. Brands like Moment have been putting out iPhone-compatible anamorphic lenses for a couple years now priced at around $120 new. A full rig with these lenses could easily come in at a fourth of what was spent here.
I noticed you guys used the Atlas 1.6x lens extender adapter to connect the Mercury lenses, but did you use the PL-to-PL or the PL-to-LPL version, and how did you connect it to the Beastgrip DOF MK3 when that is an EF mount?
Good eye:)
You are correct, the Beastgrip DOF is EF. When we did this there wasn't a commercially available adaptor to make this work. We had to use the Atlas PL adaptor and modify the rear element so it would fit the flange of the Mercury, and then adapt that to the Beastgrip. It worked fine, but was definitely not off-the-shelf. I've spoken with a lot of people about making more PL-to-PL options for this, and I expect in 2025 you'll see more hitting the market. Thanks for the comment!
All that stuff added to an iPhone feels like: ruclips.net/video/aSG_23DTvKs/видео.html&ab_channel=seedyrom 😂😂
All about focus
impressive 👌👌.. Thanks.
What about adapters like Moondog Labs?
Please tell us more about the PL adapter. I have been digging all over the place for a full frame version.
Does anyone know what the color grading technique is that makes the image look darker? When he shows the color correction at 0:55, everything goes a bit darker, the exposure looks so good, and I can't seem to find a way to search for and learn about it.
That's called a "thick negative." You can watch Episode 24 to learn bit more about the lighting and Episode 26 to learn bit more about Apple log. But there are tons of RUclips videos about log color and DaVinci Resolve. I would start there. We used Apple Log + a LUT + a Grade (Resolve) and combined those together to get this to look very controlled. Thanks for the comment:)
What is the custom EF-to-PL mount that you used? And what is the cost?
We were working with Atlas Lens Co, and because the flange depth and rear image circle diameter vary from lens to lens, not all PL lenses are compatible with converters. The Mercurys we used cover full frame, so they differ from a Cooke S4 (for example). So this was a custom adaptor that allowed for a wider rear diameter. I expect someone out there makes it, and the Fotodiox works on a lot of lenses, but not the new Mercurys, so going custom was the easiest for this. But you can fit a lot of lenses on the Fotodiox and other EF-to-PL adaptors out there. I hope someone makes one to support more lenses, tho. I think people would buy it.
@@Michael_Cioni Thanks Michael. I wanted to adapt the Blazar Remus lenses that I have, so I will test the Fotodiox first.
@@GabrielScindian Did it work perfectly?
The Fotodiox and most PL adapters work with very FEW lenses. None cover full frame.
How much did it cost to build the custom adapter? You should factor that cost in.
damn so good 🎉❤
I have a few questions. What is the difference between the LUT and the later colour correction pass? And which programme can I use to do this? And where can I get this LUT?
Great question! The LUT is a creative conversion from the camera log into a viewing gamma space (in this case, Apple Log into Rec709).
Some people refer to this as a "base grade" because it gets you "normalized images" coming from a log source. Then you use a color correction pass to fine-tune the look. Ironically, the fine-tune color should happen before (or beneath) the LUT in order to get the best control and range out of the adjustments. I work with colorist Nick Lareau, who grades in DaVinci Resolve and uses the base LUT and about 20 different layered effects to achieve the creative color pass. That's the foundation of color grading, and thanks to YouTube, there are about 5,000 DaVinci Resolve training videos to start learning how to do it yourself. Plus, Blackmagic makes a free version of Resolve to try! Good luck:)
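A minimal sketch of that ordering point, with made-up transfer functions rather than Apple's actual log curve or anyone's real node tree: adjustments applied to the log signal before the display LUT have far more range to work with than the same adjustments applied after it, where the values are already compressed and clipped for display.

import numpy as np

def toy_log_to_rec709(x):
    # Stand-in for a log-to-Rec709 display LUT (a toy curve, not Apple Log).
    return np.clip((x - 0.1) / 0.8, 0.0, 1.0) ** (1 / 2.2)

def fine_tune(x, exposure=1.15, offset=-0.02):
    # A tiny "fine-tune" pass: an exposure scale plus a small offset.
    return x * exposure + offset

log_clip = np.linspace(0.0, 1.0, 11)  # pretend code values from a log recording

# Recommended order: adjust the log signal first, then apply the LUT.
graded_under_lut = toy_log_to_rec709(fine_tune(log_clip))

# Reversed order: grading after the LUT pushes around already-compressed,
# clipped display values, so highlights and shadows have less room to move.
graded_over_lut = np.clip(fine_tune(toy_log_to_rec709(log_clip)), 0.0, 1.0)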
When using full-frame anamorphic lenses,
what lens size avoids vignetting on the iPhone? Do you see vignetting at 25 meters?
The vignetting on the Mercury was very minimal - but we used the Atlas LF rear expander adaptor (from EF to PL) so we had as much coverage as possible. The vignette was minimal and aesthetically appropriate for this project.
Must be nice to have out-of-stock accessories, unlike us out here in the public.
Give us your LUT or coloring technique for this vid, man.
An iPhone is a cinema camera actually lol
Please tell me what hub you are using to get from the iPhone to the camera monitor and also the production monitor. I know you're using a TX/RX system, but the iPhone only has one port. That would be a great help, as the one I am using doesn't seem to send video to both, or there is not enough power. Thank you.
Great stuff, I wonder how this footage would hold up in heavy post, such as VFX?
In all our tests, if you use the 24mm lens, that's the "hero" of the iPhone and has the best signal-to-noise ratio. Since the ProRes is legit 10-bit HQ (440 Mbps), it handles fairly well. The ground glass used to adapt the PL lenses has some artifacting (the ground glass fresnel can sometimes be visible), but I QC this on an 85" LG OLED (C2) and it really does look impressive. Our colorist, Nick Lareau, reported zero issues coloring it.
Is there any chance you can show your coloring process?
Great idea! I think we'll do an episode about that in the future.
I just wish I could see this equipment
Cinegear Expo is this weekend in Los Angeles - we'll have the equipment in the Strada booth if you're in town!
Hello, do you have a link where I can find this?🤔
ANY flagship phone's footage can be edited to look like this 😂
Trust me, when it comes to video recording, Apple's ProRes Log is so much better compared to other flagship devices.
Hello, thank you for your very comprehensive video.
I'm preparing the launch of my YouTube channel with multicam anamorphic footage. I have a GH5 II with a 35mm SLR Magic 2x lens, and two iPhone XS 512GB with a Sirui 1.33x lens. I noticed that recently, the Blackmagic Camera app's anamorphic desqueeze feature compresses the image instead of expanding it. Have you noticed this issue on your end?
Looks good on a phone, but when it comes to watching on anything bigger, the flaws of the DoF adapter setup become very apparent.
Good point. I personally found it depends on the shot. Sometimes it's really hard to see any negative artifacts, other times it's clear. We shoot and finish all our work in UHD, and I personally QC it on a calibrated 85" LG OLED (model C2). Lenses like these are supposed to be highly stylized, and I have to say it holds up very well. But if you think about phones and DOF adaptors 1, 2, and 3 years ago, then reverse that to 1, 2, and 3 years from now, and that's where this gets interesting... Thanks for the comment!
And now, the upcoming 28 Years Later movie will be shot on iPhone 15 Pro Max. Crazy how far we've come.
What is the AI to do a depth shield
What kind of slider was used on this production?
That is an edelkrone Slider Plus. Really nice and elegant solution for small uses. The weight of this rig was pushing it, but edelkrone makes a slightly larger version that better supports larger cameras. We'll talk more about this in our next episode, Episode 24! Thanks for the comment!
Hey, I added the app. But no custom LUT. :(
I’ll stick to photography in natural lighting 😂
🐐
You sound so much like a moto youtuber Born A Goon! Or are you the one behind that channel? 😅
Why didn't you use the Beastgrip lenses?