@@alpinion323 100%. I remember video game designers telling me that their best tech would replace movies within two to three years. And that was 14 years ago.
@@user-lz5vh9bb5w Maybe. But even if it could, would you want it to? What's more impressive: seeing an acrobat do a backflip into a glass of water, or seeing an animation? There used to be a sense of "wow, how did they do that?" But even if you're right, that will be replaced with "AI, make all my eyeball food. Nom nom nom."
Just came from a website where they said there are upcoming games & animated projects in the works for Ghostbusters, so this could very well be what they will be like.
Because it was made with software called Softimage. That software was so good that it almost destroyed the 3D software industry. That's why Autodesk bought them and killed it.
A lot of it was a mix of practical and CGI. What still makes the effects believable to this day is that they knew how to mix the two together, playing to each one's strengths.
@@hopperhelp1 100%. The pure CG elements, like the wide Brachiosaurus shot and the herds of running dinos, look kind of dated, but the hybrid approach was best!
Jurassic Park did everything right. They also made use of the dinos in the dark, mixed with practical effects. Hope Hollywood soon realizes they need to go back to doing both.
This isn't about this footage being the "final product." This is about a filmmaker being on set, looking through the camera, and being able to adjust everything based on the real-time feedback from the monitors. This, combined with the StageCraft technology recently used on The Mandalorian and other shows, is pretty huge. We're getting to the point where directors can direct scenes organically and in person, the way they would on location... but this time it's virtual.
@@maymayman0 What you see in the video is entirely rendered in the computer, and the director directed it by standing in a warehouse-sized room with lots of equipment, computers, etc. He had a sort of tablet device, a "camera equivalent," but it wasn't a camera. On the screen of that device, and on other monitors around the room, he and other crew members could see what the "3D camera" was pointing at. The director could move this tablet around and see what the computer was showing in real time. However, no actual cameras were used to make this short film. Everything you see was created without a video camera recording any sort of live picture. That's why it's called cameraless.
@@andysmith1996 I don't know, dude. I'm not someone who worked on this. My guess is that this was a test, and the final renders for an actual movie would be "full pixel quality."
Oh come on, people comparing this with Jurassic Park, seriously? Understand this is not a finished product. This is a kind of sketch of what can be done, still to be worked on and developed; it's the foundation for future great and amazing things in the industry. In my opinion this looks and feels AMAZING: the render concept, the immersive setting (NYC streets), the clearly remastered soundtrack, the timeline setting with the rusted Ecto-1 from Ghostbusters 3 fiercely facing the Marshmallow Man, the amazing sound effects. I mean, this is actually very enjoyable. Thank you Ghost Corps, Pixomondo, PlayStation Studios, and Epic Games for this incredible project, which you can be sure will be rewarded in the future with so much support from the Ghostbusters fans, whatever your final product(s) may be.
I worked in the VFX industry for 5 years and I totally agree (I actually briefly worked on Ghostbusters Afterlife). Practical effects mixed with today's digital comping abilities is a match made in heaven, yet the default is still always CG. It's a sad world we live in today.
Heaven forbid storytelling ever become accessible to the common person. Filmmaking should cost millions of dollars and only be accessible to the elite.
@@heckensteiner4713 The point of this test was to help VFX artists and actors get an idea of what they'll be doing, so they can limit the amount of reshooting, because that's expensive.
Okay, I think everyone is missing the point here. The important part of this is "real-time." They're able to get high-quality effects in camera on the sound stage without waiting for a render, which can take days or weeks depending on the length of the shot and can still require compositing.

That said, I don't think they're going to use these real-time effects in the movie. First of all, I don't think you're seeing the real-time effects here: if you look at the screens on the sound stage, they seem just a little rougher than what we were shown at the start. I suspect that was put through a little render pass to sweeten the textures, effects, and lighting. And even that isn't a finished scene yet. They'd need to put real people in the scene; those digital stick figures just ain't gonna fly on the big screen. There are very few particle effects; no dust and paper and debris flying. It's just not a movie-quality scene.

That said, real-time graphics have appeared on screen more than a few times, because real-time rendering is used in the Volume, which was created for The Mandalorian and used in other shows as well. But those are for more or less static backgrounds, and they didn't take up the whole scene, as there was usually a human actor and physical set pieces in front of them.

Is it possible that real-time graphics could get real enough to make up a final scene? One day, maybe. But it would have to be a carefully constructed scene, one that doesn't betray the unreal-ness of what you're seeing. And I mean "unreal" both in the sense of the technology and in the uncanniness.
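For anyone curious what "real-time" means concretely: the engine has a hard time budget per frame instead of an open-ended render. A quick sketch of that budget (the frame rates are just standard playback rates; nothing here is from the video):

```python
# Per-frame time budget at common playback rates.
# A real-time engine must finish every frame inside this window;
# an offline renderer has no such ceiling, which is why a single
# film-quality frame can take hours.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at a given rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

So at cinema's 24 fps, everything in the shot has to be simulated, lit, and drawn in roughly 42 milliseconds, which is the whole constraint the comments above are arguing about.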
So this is basically some form of HD layout? Or test-running an effects package in a realtime environment? I checked the link and it's in German; for whatever reason I don't have translation options available. There's a video embedded that explains the Stay Puft effect in detail, but I'm still a bit confused about what else is real time. Maybe some of the camera effects, and I guess the vehicle rig was moved around in real time? The video's name is "SCC - Visual effects using real time technology." I'm still a bit puzzled, because realtime rig setups to preview mocap have been used many times before. But I'm guessing the "tech demo" stuff here is the traffic sim, lighting, and fluid FX in real time? In the sizzle reel there's still an artist on scene making adjustments to animation, just like on any regular mograph stage, so that stuff still takes time even if it's closer to realtime. I'm not really sure how that would be used; I guess you'd use all this as some kind of springboard for the final product? In the demo it... doesn't look very dynamic, especially the camera angles, so I'm just surprised they're trying to pass it off as some big interesting innovation. BASICALLY, any mocap you see on screen is highly, HIGHLY adjusted by animators, to the point of the mocap basically becoming a base for the final product after many, many iterations. Realtime performance capture is currently a thing, so I still don't get why they did all this just to have the demo have... nice lighting? And some fluid? That doesn't really look too great?
This is the thrill-a-minute, action-packed, special effects extravaganza I expect from Ghostbusters, without all the boring dialogue and outdated attempts at comedy. You've done it again, Sony.
3D animated series have been made for the last 25 years at least; after Pixar's Toy Story, everyone went into 3D animation. I have been using Unreal and Unity for the last 10 years, and I started with 3D in 1990. The question is what the production is aiming for. Today I just work in real time; I am not rendering anymore. The possibilities that RT engines like Unreal provide are huge, because you can work almost like in the real world, but in a virtual one, without waiting hours for some usable results. But what makes me think here is the description: is it about VFX, or a "new" production workflow for whole films and series?
Why is everyone using the same Unreal Matrix demo city scene? I appreciate the work and art direction, but this wouldn't hold up as a sequence cut into a live-action film. It still has video game quality. Just because you can render the whole sequence in a day doesn't mean you should. All thoughts aside, still a fun short sequence though :)
Wow. I remember a few papers about the realtime rendering concept 15 years ago. They thought the technology for doing it this well was a couple of decades away. We got it a few years early. Nice job.
So, I should note that a) this wasn’t for a new Ghostbusters movie at all… it was a stand-alone proof of concept. And b) the output of this process is never what would end up on screen. This is just a way to get a faster, on-set VFX capture that can be fully visualized as it is shot. Once all that was done, it would still go to post-production for performance, animation and timing tweaks, as well as some additional compositing, then would get a full render with ray-tracing (which Unreal’s live render engine is currently incapable of). The shots would then be mixed with (and matched to) actual shot footage, not ever delivered as a full, humanless scene like this. In short, there’s no fear that movies will ever look like this. It’s just a method for making the VFX pipeline faster and more connected to principal photography.
Yeah, but they're not pitching this as previz or volume capture material. As a proof of concept it's clearly intended to say: this is what we can do, final pixel in real time. I work with a lot of volume work, and they don't put this much effort into the output lol
This is a proof-of-concept rendering for Jason and the creators to work from. This is not the finished, final edited cut of the scene. I think there will be so much more to film than just chases through city streets.
Still a long way to go with the effects, but getting there... For me it was Stay Puft: in the original film, when he walked you had reactive effects, like fire hydrants exploding or crack lines along the road, giving weight to the character. That seemed missing in this sequence.
Nothing to do with a new movie. It's clearly stated that it was designed and created to push real-time graphics tech, i.e. in-game graphics. If it's anything more than that, it's related to an unannounced game, not a movie.
Apparently, it was just a fun little project Jason did to let fans know that Ghostbusters is alive and well, and to help tide us over till the next movie is released. There is talk of some games coming out in the future as well, though.
As a PoC shot it works, but it's far from ready to be accepted as a final product. Not a single person in the shots, no people in the Ecto-1, and where is Stay Puft's collar? I can see this as a great tool for directors to use for visualization/guidance, but not as a finished product.
THIS IS WHAT WE WANT!!! A fully realized, sandbox style game where we can drive around NYC, NJ, maybe even upstate and PA as the Ghostbusters. Taking jobs in businesses (jewelry stores), subways, restaurants, haunted mansions, Central Park, amusement parks (Coney Island), and sports arenas/stadiums would be wild. Even constructing and eventually piloting Ecto-2 would be epic.
Excellent work by all involved. I wonder if human characters were avoided to prevent the uncanny valley effect. Some parts appeared AI-generated, but in real time it's all amazing. I wonder what the result would be if they put all the settings on Epic and rendered non-realtime to a file in Unreal.
This is what is called a pre-vis. It is intended only for the director to see, so they can line up the shots they will require. Pre-vis is the modern-day version of storyboards. This was never meant to be enjoyed as an actual finished product.
@@FablestoneSeries Wow, IDK. I know what pre-vis is, of course, although none of the productions I've ever worked on had a budget for it. What I didn't know is that they did so much work and spent so much just for pre-vis alone. The budget for the movie must have been limitless. But this explains leaving out the humans for now. Still curious about the Epic setting and using Unreal to do non-realtime rendering. Because hey, why not.
Incredible that it is real time. CGI takes are now possible. The title is confusing though, as you still need to get the shots with a camera, even if it is virtual.
I thought there had been a mix-up, as it looked like a computer game. The Stay Puft Marshmallow Man looked great; probably the only thing to be "cinema quality." But it is only a "proof of concept."
I think the implications of this are pretty clear. Sony released this while they are in a labor dispute with actors and writers, to show them what they are capable of using the Unreal video game engine. This sends a clear message: get off the picket lines and back to work, or next time we're going to figure out how to do photorealistic people using the Unreal engine.
No. Give the industry employees a raise and screw your stockholders. The strike must continue, and fans should not watch any new productions until fair wages are achieved.
@grimacres It's not about the shareholders. It's about what was and will be no more.

For actors, they want to renegotiate contracts they signed that they are no longer happy with, and it generally revolves around streaming. When they made a TV show in, say, 2002, they were paid for the work and offered a residual for syndication. These contracts existed before Hulu, Disney+, Netflix, etc. Should they get paid for something not spelled out in their contracts? Most competent people would say no. If you said yes, then how far back would you go, and who do you go after? Every show made over the past 60 or 70 years? Everybody knows the big streaming companies, but what about the little ones? The Roku channel store has dozens of little streamers that charge very little or are completely ad-supported, streaming niche content such as old westerns. Should actors go after these guys too for not paying a residual when they stream Gunsmoke or Bonanza or even Zorro?

Another actor complaint is the number of episodes they make and the breaks in between seasons. Network shows and established actors get their 20+ episode seasons on shows like Law & Order: SVU, Blue Bloods, The Rookie, etc. When you act on a streaming show, you only get six, maybe eight episodes, and you're never guaranteed the role will come around again; if it does, it might be two years later. Should they get paid more for working less than, say, an established star who worked hard and earned that 20-episode role, which might have taken them 5 to 10 years to get?

The writers' complaint is similar. Established writers get longer episode runs and make more money. New writers working on streaming projects get paid, but their job may only last months, whereas established writers on shows like Law & Order or Blue Bloods work pretty much year-round to make those shows. Writers on these short-term assignments are calling these gigs mini-rooms instead of writers' rooms.

Are the actors and writers right or wrong?
I don't know. I do know they took the wrong course of action. Striking never makes any sense, no matter the industry. It just emboldens the other side to dig their heels in and be less cooperative. Hollywood can sit and wait for the desperation of not getting paid to sink in.
@hokemoseley2934 That's not what they were trying to do. They were trying to show you that the Unreal Engine technology has advanced to the point of photo realistic environments. Essentially they could do a car chase without having to wreck any cars or endanger human life. I believe the next step would be to create a believable digital actor that can get out of the car and fool the audience by moving around and delivering lines.
If people find this compelling, OK, but it's just slick bullshit for those without nuance. Human-first movies will always be better; these are just special effects, vacuous and corny. It's like a really dramatic Skip Intro.
They didn't get Ecto-1's handling correct. The rear end slides more in a skid than that. They need to stop trying to make it look like it handles like a super car and love it for the beast it is.
I think the effects in the original look better 😅 This could get more convincing over time, but there are some existential hurdles as well, such as the jobs and the missing "human element." I understand having an artistic vision and wanting it shaped perfectly, but I don't think any human is capable of micromanaging every detail. AI will be able to fill in those details, but will they make sense and feel real? And couldn't the details needed for realism just be added by image capture of real objects and actors, rather than generating everything? If you want to represent elements of the real world in a movie, you may as well use the real world when you can. In many cases, actors being filmed add character or expression that wasn't originally intended but works best for the story. This tech is really cool, but I don't think it should always be a replacement for real "filming." In the same way cartoons don't replace live action and vice versa, they should co-exist. These media companies are so greedy, though; I think they want to avoid paying anyone they cannot completely control. It seems they want to separate IPs from real people, so that the IPs can live on while the people who make it all possible can be replaced at any time. It's very concerning.
If it's realtime rendering... it could be fun if they found a way to make some games where a few daily boss fights had a guy in a mo-cap suit reacting to players. I know there are a lot of issues with that for scaling, but I think it would be funny if someone managed it. They'd need to hire a lot of people to be on standby in mo-cap suits for instancing.
It looks like an interesting concept; the city reminded me of the Matrix Unreal Engine demo... and then they mention they use UE. Still, the lighting, textures, and motion need to be more realistic in the final product.
A lot of the people saying "it doesn't look real" don't get it. This is a new technology that allows a director to see the post-production CGI product, in a lower quality version than the final, but in real time, on set, during filming. This could have major effects on the use of these technologies when you can have them integrated in real time with a person's performance. Not to mention an actor being able to really see where their animated or CGI characters are and what they're doing, making for a better performance from all actors, and a better ability for a director to see what, up till now, wouldn't be seen till wrap and months of post-production work afterwards. That delay is often the actual cause of some of the bad CGI, I think: just trying to make subpar shots "just work."
I thought I was crazy reading these comments. It's like no one even watched the video or got the point of the presentation. These aren't actually the effects for the movie; they're just creating moving concepts of what it could look like, without taking up the time and resources needed when initiating the production of the film.
It's literally a step in the WRONG direction. There's nothing good about this.
@@wilmcl9209 Nah, you're right, let's go back to filming the whole movie, taking that footage, and adding CGI raw; then when any issues arise they can just "make it work" with lots of sloppy CGI to force footage to fit a new concept for the scene, spending potentially millions on reshoots if they can't use existing footage. Or adding to scenes after they're shot, then learning in post-production that it just doesn't work like they imagined and ending up with scrapped script sequences.

Not all technological advancements are negative. Maybe, just maybe, seeing a snapshot of what a scene would look like DURING the initial filming process, and not afterwards in post-production CGI, could not only prevent costly reshoots but also help marry the practical and the CGI like never before, giving any on-camera actors a direct visual representation of what they are attempting. CGI isn't something that should be wholly removed from movies, because without it we wouldn't get believable flying, or style-crossing movies like Cool World or Who Framed Roger Rabbit, or even a movie that uses hand-drawn animation and computer software to insert a cartoon character, like the cartoon cat cop from Last Action Hero. It would allow an on-set actor to "see" their digital co-star reacting with them in "real time," instead of having them added afterwards while the actor talked to a ball on a stick for hours and hours of filming with no real idea of how it would look.

Too much CGI is an issue, but to deny it has any place in movies is as idiotic as someone from the late 1920s complaining about too much long-winded dialogue in movies. "Movies don't need all that talking, just pop a dialogue text card up and get back to the action" is probably something your great-grandfather would have said.
These are the people who think CGI means you press a button and the computer does all the work for you
@@whosmansisthis6624 Where was that stated? When specifically did they say any of that?
Man, this really makes me want another game like the 2009 one. I love Spirits Unleashed, but man, I miss the solo story, and those graphics would be incredible.
Agreed, I want a REAL remaster of the game
There are also scenes that were recorded but not added to the game due to a lack of budget, time, and technology, which would be amazing to have added to the game
@@Musicalmane True, plus MULTIPLAYER! For God's sake.
I miss Pong
We need an open-world Ghostbusters game!
What is it about the music, the siren, the car, and characters that just works so goddamn well? Always puts a smile on my face.
This is called a proof of concept. It’s meant to showcase their ability to display camera angles and VFX capabilities. This isn’t a finished sequence.
True enough; however, people want to see a bit more polished, production-style quality in the result when the title says "movie" in it.
Seeing this, it's no better than any standard video game. Motion, physics, volumetric lighting: all of it screams video game, and very little to none of it screams cinema.
Pretty sure they're using a customized Unreal Engine 5 to shoot this.
@@cagneybillingsley2165 Exactly. It was part of a PlayStation Showcase. Has nothing to do with movies. Clickbait headline/title.
the title also says "test" lol
@@cagneybillingsley2165 They legit show UE5 being used, so it's not "pretty sure," they are.
It has The Real Ghostbusters cartoon vibe to it.
People trying to compare this to regular CGI are missing the point. Most CGI takes hours per frame; this is playing out in real time, like a real set would during a take. For actual final shots, they would move this into a rendering pipeline that renders it at far better quality, but the idea is that they can now plan out scenes this way very economically. They can iterate faster, save more on physical shots that might get cut, and deliver movies faster this way.
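To put "hours a frame" in perspective, here's a back-of-the-envelope comparison. The per-frame render cost is an illustrative assumption, not a figure from the video:

```python
# Rough machine-time comparison: offline rendering vs. real-time
# playback for a one-minute sequence. All inputs are assumptions.

FPS = 24                      # standard cinema frame rate
SHOT_SECONDS = 60             # a hypothetical one-minute sequence
HOURS_PER_FRAME = 4           # hypothetical offline render cost per frame

frames = FPS * SHOT_SECONDS               # total frames in the shot
offline_hours = frames * HOURS_PER_FRAME  # machine-hours to render offline

print(f"{frames} frames")
print(f"Offline: {offline_hours} machine-hours "
      f"(~{offline_hours // 24} days on one machine)")
print(f"Real-time: {SHOT_SECONDS} seconds, by definition")
```

Render farms parallelize across frames, so the wall-clock wait is shorter than this, but the total compute (and the wait before anyone sees the shot at all) is what the real-time approach sidesteps while planning scenes.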
Where was that stated? Never once was that said. They are literally talking about real-time rendering. BTW, they already do what you're talking about; that isn't new. You're clearly missing the point here.
@captaincaptain2128 If you look closely, the video at the beginning has text at the bottom of the frame saying that this is a real-time visual effects proof-of-concept video. It is a game changer.
@@MrGreenAKAguci00 Not really. The technology isn't quite powerful enough to make real time rendering look as good as traditional CGI. Give it a decade.
@captaincaptain2128 It doesn't have to be that good if it's only used during production, to let the crew work more efficiently and be more informed about what the end result will look like. A single She-Hulk episode, after all the CGI changes they had to make due to the director's indecision and constant script changes, ended up costing around 25 million USD. 25 million per episode, mostly because of VFX. Now, that show had many more issues, with the writers' room and other things, but what Sony shows here makes integrating VFX way easier than before, because changes can be implemented and verified in real time. Then, after everyone is happy, they will render it for the big screen with all the bells and whistles. What you saw here was not the final release.
@@MrGreenAKAguci00 what you're describing already exists and has existed for nearly two decades. This isn't that, though. I suggest you do a little more research on this technology instead of just watching one video; there are dozens of articles and such about it. You clearly misunderstand the purpose of this tech. It's meant to replace post-production CGI work by essentially using digital puppets and rendering it all in real time. It's not meant to be a reference for the director, because that technology already exists and allows them to do just that. This tech is meant to replace post-production CGI and allow directors to edit the final shots that day, such as editing lighting. I understand not everyone knows how filmmaking works, but you must know that the amount of time and effort that went into this test alone is in the millions of dollars, right? The assets need to be made prior to filming and can take hundreds of hours and millions of dollars, and for what? To be a reference for the director and then get scrapped so the CGI team can spend hundreds more hours and millions more dollars recreating the same stuff at a slightly higher quality? You understand how crazy that is, right!? It's called real-time rendering for a reason: it's rendering in real time. Ultimately it's going to replace traditional CGI eventually, but the tech isn't there yet. I suggest you read up on it more, as it's very fascinating tech, but you clearly don't know how it works or its purpose.
I love it. They should use this as a preview for the new movie, shown before the other Ghostbusters previews, to get fans excited for the new movie coming out. I know I can't wait to see it.
This is proof of why in camera practical effects are still the way to go.
For now. Give it a few more years and an AAA asset budget and you'll be wrong.
@@user-lz5vh9bb5w People have been saying that for decades. Nothing beats the weight, the tactile nature, the REALITY of practical effects. Even the best digital approximation is still only an approximation. Diminishing returns means that getting it to perfect will be prohibitively costly.
@@alpinion323 100%. I remember video game designers telling me that their best tech would replace movies within two to three years. And that was 14 years ago.
@@alpinion323 Yes, but we haven't had such good AI prospects for decades. Things are about to change exponentially fast.
@@user-lz5vh9bb5w maybe. But even if it could, would you want it to? What's more impressive? Seeing an acrobat do a backflip into a glass of water, or seeing an animation? There used to be a sense of "wow, how did they do that?" But even if you're right, that will be replaced with "AI make all my eyeball food. Nom nom nom"
Just came from a website where they said there are upcoming games & animated projects in the works for Ghostbusters, so this could very well be what they will be like.
Looks like a great video game. Doesn't look like a movie though.
Nope, it sure doesn't.
This was rendered in a day. Imagine what they could do in a month…
Ok boomer
Yup... is Sony only concept-testing 2018 technology now?
This is real-time directed - it’s not so much about the cg.
The CGI looks great. That being said, Jurassic Park came out 30 years ago and still feels more convincing.
Because it was made with a software called Softimage.
That software was so good that it almost destroyed the 3D software industry.
That is why Autodesk bought them and killed it.
A lot of it was a mix of practical and CGI. What still makes the effects believable to this day is because they knew how to mix the two together to their strengths.
@@hopperhelp1 100%. The pure CG elements like the wide Brachiosaurus shot and herds of running dinos looks kinda of dated, but the hybrid approach was best!
Jurassic Park did everything right.
Also, they made use of the dinos in the dark, and the mix with practical.
Hope Hollywood soon realizes they need to go back to doing both.
JP had a 1:1 hydraulic monster T Rex. And the cgi was driven by a stop motion puppet operated by a stop motion master.
the remastered Ray Parker Jr. song sounds really good, they should release it
This is awesome. We need a follow up to the 2009 game but co-op this time around
It was co-op on the Wii 👍
🙌🏾
@@sideskroll Four-man story mode and open world is what we want
@@trealsteve Aren't you tired of "open world" though? I can't play one single "open world" game more...
@@sideskroll No, I’m not. Too bad for you.
This isn't about this footage being the "final product." This is about a filmmaker being on set, looking through the camera, and being able to adjust everything based on the real-time feedback from the monitor/s. This, combined with the recent Screen Craft technology used on The Mandalorian and other shows is pretty huge. We're getting to the point where directors can direct scenes organically and in person the way they would on location...but this time it's virtual.
Why is it called cameraless if the director will be using a camera?? Genuine question, not being facetious.
@@maymayman0 What you see in the video is entirely rendered in the computer, and the director directed it by standing in a warehouse-sized room with lots of equipment, computers, etc. He had a tablet-sort of device, sort of a "camera equivalent" but it wasn't a camera. On the screen of that device and on other monitors around the room, he and other crew members could see what the "3D camera" was pointing at. The director could move this tablet/camera-equivalent around and see what the computer was showcasing in real time. However, no actual cameras were used to make this short film. Everything you see was created without an actual video camera that recorded any sort of live picture. That's why it's called cameraless.
But the clip states that this shows you can have "final pixel quality in a fully synthetic environment".
@@andysmith1996 I don't know, dude. I'm not someone who worked on this. My guess is that this was a test and the final renders for an actual movie would be "final pixel quality."
Oh come on, people comparing this with Jurassic Park, seriously? Understand this is not a finished product; this is a kind of sketch of what can be done, still to be worked on and developed. It's the foundation for future great and amazing things in the industry. In my opinion this looks and feels AMAZING: the render concept, the immersive setting (NYC streets), the clearly remastered soundtrack, the timeline setting with the rusted Ecto-1 from Ghostbusters 3 fiercely facing the Marshmallow Man, the amazing sound effects. I mean, this is actually very enjoyable. Thank you Ghost Corps, Pixomondo, PlayStation Studios and Epic Games for this incredible project, which you can be sure will be rewarded in the future with so much support from the Ghostbusters fans, whatever your final product(s) may be.
Nothing will ever replace real character actors for a film that is set in a human world. The original set the bar way back; we miss those days! 💙
I worked in the VFX industry for 5 years and I totally agree (I actually briefly worked on Ghostbusters Afterlife). Practical effects mixed with today's digital comping abilities is a match made in heaven, yet the default is still always CG. It's a sad world we live in today.
Heaven forbid storytelling ever become accessible to the common person. Filmmaking should cost millions of dollars and only be accessible to the elite.
Calm down boomer
@@A_Distant_Life
I am sorry, but what?
Storytelling is already accessible to everyone. Don't mix up storytelling with million-dollar CGI parties...
@@heckensteiner4713 the point of this test was to help VFX artists and actors get an idea of what they'll be doing, so they can limit the amount of reshooting, because that's expensive
Okay, I think everyone is missing the point here. The important part of this is "real-time". They're able to get high quality effects in camera on the sound stage without waiting for a render, which can take days or weeks depending on the length of the shot and can still require compositing.
That said, I don't think they're going to use those real-time effects in the movie. First of all, I don't think you're seeing the real-time effects here: if you look at the screens on the sound stage, they seem just a little rougher than what we were shown at the start. I suspect that was put through a little render pass to sweeten the textures, effects and lighting. However, even that isn't a finished scene yet. They'd need to put real people in the scene; those digital stick figures just ain't gonna fly on the big screen. There are very few particle effects, with dust and paper and debris flying. It's just not a movie-quality scene.
That said, real-time graphics have appeared on screen more than a few times, because real-time rendering is used in the Volume, which was created for The Mandalorian and used in other shows as well. But those are for more or less static backgrounds, and they didn't take up the whole scene, as there was usually a human actor and physical set pieces in front of them. Is it possible that real-time graphics could get real enough to make up a final scene? One day, maybe. But it would have to be a carefully constructed scene, and one that doesn't betray the unreal-ness of what you're seeing. And I mean "unreal" both in the sense of the technology and in the uncanniness.
So this is basically some form of HD layout? Or test-running an effects package in a real-time environment? I checked the link and it's in German; for whatever reason I don't have translation options available. There's a video embedded that explains the Stay Puft effect in detail, but I'm still a bit confused about what else is real time. Maybe some of the camera effects, and I guess the vehicle rig was moved around in real time? The vid name is "SCC - Visual effects using real time technology"
I'm still a bit puzzled, because realtime rig setups to preview mocap have been used many times before. But I'm guessing the 'tech demo' stuff here is the traffic sim, lighting, and fluid FX in real time? In the sizzle reel, there's still an artist on scene making adjustments to animation just like on any regular mograph stage, so that stuff still takes 'time' even if it's closer to realtime.
I'm not really sure how that would be used-- I guess you'd use all this as some kind of springboard for the final product? In the demo it... doesn't look very dynamic, especially the camera angles so I'm just surprised they're trying to pass it off as some big interesting innovation.
BASICALLY any mocap you see on screen is highly, HIGHLY adjusted by animators. Like, to the point of the mocap basically becoming a base for the final product after many, many, many iterations. Real-time performance capture is currently a thing, so I still don't get why they did all this just to have the demo have... nice lighting? And some fluid that doesn't really look too great?
This has got to be the next Ghostbusters video game.🤞🙏🤞🙏🤞
Ray or somebody’s been keeping the Ecto-1’s engine in great condition!
Ha ha, glad I'm not the only one who said out loud that Ecto would not drive like that.
Yep lol😂
Pretty sure that's Winston. ;)
Certainly we're not supposed to accept this as looking real...
its bloody test footage mate
looks more like a game to me
Looks like a superbowl ad
Because it was made in Unreal engine which is usually used for video games
lol the marshmallow man said "hmmm nah I'll get that better car"
bro bouta play with it like a toy car😭🙏
This is the thrill-a-minute, action-packed, special effects extravaganza I expect from Ghostbusters, without all the boring dialogue and outdated attempts at comedy. You've done it again, Sony.
I like how puff just looks around and drops the car like oh sorry 😂 1:13
"If there's something weird and it don't look good...." (Face Palm)
This is literally why there are strikes going on right now.
Because of video games?! Mk.
3D animated series have been made for at least the last 25 years. After Pixar's Toy Story, everyone went into 3D animation. I have been using Unreal and Unity for the last 10 years, and I started with 3D in 1990. The question is what the production is aiming for. Today I just work in real time; I am not rendering anymore. The possibilities that RT engines like Unreal provide are huge, because you can work almost like in the real world, but in a virtual one, without waiting hours for some usable results. But what makes me think here is the description. Is it about VFX, or a "new" production workflow for whole films and series?
They missed a detail on the Ecto-1: there's no light effect from the turn signals on the engine grille.
Why is everyone using the same Unreal Matrix demo city scene. I appreciate the work and art direction, but this wouldn't hold up as a sequence cut into a live action film. It still has video game quality. Just because you can render the whole sequence in a day, doesn't mean you should. All thoughts aside, still a fun short sequence though :)
I thought the exact same thing
Wow. I remember a few papers about the realtime rendering concept 15 years ago. They thought the technology for doing it this well was a couple of decades away. Got it in a few years early. Nice job.
you could say it was indeed a decade away
If you want to be a cameraman, AI took that too as far as Sony is concerned
I guess one would shoot against "draft mode" in real-time - then follow that up by "quality mode" re-rendering (and some consequent re-editing)
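The "draft mode now, quality mode later" idea above can be sketched in a few lines of plain Python. Everything here is hypothetical for illustration (the class, settings values, and `render_shot` are made up, not any real engine's API): the same shot description is rendered once with cheap real-time settings for the on-set monitor, and again with heavy offline settings for the final frame.

```python
from dataclasses import dataclass

@dataclass
class RenderSettings:
    resolution: tuple       # (width, height) in pixels
    samples_per_pixel: int  # 1 spp for interactive preview, many for final
    ray_tracing: bool       # off for draft speed, on for final quality

# Two quality tiers for the same shot (values are illustrative).
DRAFT = RenderSettings(resolution=(1280, 720), samples_per_pixel=1, ray_tracing=False)
FINAL = RenderSettings(resolution=(3840, 2160), samples_per_pixel=64, ray_tracing=True)

def render_shot(shot_name: str, settings: RenderSettings) -> str:
    # A real pipeline would submit this to an engine or render farm;
    # here we only describe which pass would run.
    mode = "real-time draft" if settings.samples_per_pixel <= 2 else "offline final"
    w, h = settings.resolution
    rt = "RT on" if settings.ray_tracing else "RT off"
    return f"{shot_name}: {mode}, {w}x{h}, {settings.samples_per_pixel} spp, {rt}"

# Shoot against the draft on set, then re-render the same shot for delivery.
print(render_shot("ecto1_chase_010", DRAFT))
print(render_shot("ecto1_chase_010", FINAL))
```

The point of the sketch is just that the shot description stays identical between passes; only the settings object changes, which is why re-rendering (and some consequent re-editing) can happen after the shoot.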
So, I should note that a) this wasn’t for a new Ghostbusters movie at all… it was a stand-alone proof of concept. And b) the output of this process is never what would end up on screen. This is just a way to get a faster, on-set VFX capture that can be fully visualized as it is shot. Once all that was done, it would still go to post-production for performance, animation and timing tweaks, as well as some additional compositing, then would get a full render with ray-tracing (which Unreal’s live render engine is currently incapable of). The shots would then be mixed with (and matched to) actual shot footage, not ever delivered as a full, humanless scene like this.
In short, there’s no fear that movies will ever look like this. It’s just a method for making the VFX pipeline faster and more connected to principal photography.
Yeah people will believe anything they see on YT without researching it whatsoever!
Yeah, but they're not pitching this as previz or volume capture material. As a proof of concept it's clearly intended to say: this is what we can do, final pixel in real time. I work with a lot of volume work and they don't put this much effort into the output lol
Isn't "real-time raytracing" something they advertise? I see the term used all over their marketing material.
This misses the spirit of Ghostbusters. It feels like a Fast and Furious movie.
Ecto 1 is family
This is a proof of concept rendering for Jason and the creators to work from. This is not the finished final edited cut of the scene. I think there will be so much more to film than just chases thru city streets.
@@neo529Damn straight.
ecto1: speed i'm speed
ecto1: clear the path i have to hurry
My siblings and I lived and breathed Ghost Busters in the 1980's. Kids were drinking Ecto-Cooler.
Still a long way to go with the effects; getting there, though. For me it was Stay Puft: in the original film, when he walked you had reactive effects, like fire hydrants exploding or crack lines along the road as he walked, giving weight to the character. That seemed missing in this sequence.
Please use this technology for something other than a movie
They are. This was a tech demo for Playstation.
@@GeeksmithingI hope so. Then we’d finally get an open world GB game.
@@trealsteve that would be a ton of fun. GTA but Ghostbusters!
@@Geeksmithing 💯
Nothing to do with a new movie. It's clearly stated that it was designed and created to push real-time graphics tech, i.e. in-game graphics. If it's anything more than that, it's related to an unannounced game, not a movie.
Apparently, it was just a fun little project that Jason did to let fans know that Ghostbusters is alive and well, and to help tide us over till the next movie is released. There is talk of some games coming out in the future as well, however.
This looks awesome
Is this better than what Jim did for Avatar?
It looked a lot like the UE 5 demo, and then they told us that is what it is.... with a twist!
It looks almost real. There are things about colour, contrast and physics that the thing just doesn't get right but it's bloody good!
Wow, a videogame cutscene. The future is now!!
As a PoC shot it works, but it's far from ready to be accepted as a final product. Not a single person in the shots, no people in the Ecto-1, and where is Stay Puft's collar? I can see this as a great tool for directors to use for visualization/guidance, but not as a finished product.
It seems like this is an "in your face" to the people striking.
I know this is more-so a tool for filmmakers but this looks simultaneously terrifying and amazing.
I wonder what the next Ghostbusters movie will be called?!
THIS IS WHAT WE WANT!!! A fully realized, sandbox style game where we can drive around NYC, NJ, maybe even upstate and PA as the Ghostbusters. Taking jobs in businesses (jewelry stores), subways, restaurants, haunted mansions, Central Park, amusement parks (Coney Island), and sports arenas/stadiums would be wild. Even constructing and eventually piloting Ecto-2 would be epic.
You know this isn’t a game trailer and a clip from an actual movie that will come out… right?
@@charliekill88 Shaddup!
Looks very good.
Excellent work by all involved. I wonder if human characters were avoided to prevent the uncanny valley effect. Some parts appeared AI-generated, but in real time it's all amazing. I wonder what the result would be if they put all settings on Epic and rendered non-real-time to a file in Unreal.
this is what is called a pre-vis. it is intended only for the director to see, so they can line up the shots they will require. pre-vis are the modern day version of storyboards. this was never meant to be enjoyed as an actual finished product.
@@FablestoneSeries Wow, IDK. I know what pre-vis is, of course, although none of the productions I've ever worked on had a budget for it. But what I didn't know is that they did so much work and spent so much just for pre-vis alone. The budget for the movie must have been limitless. But this explains leaving out the humans for now.
Still curious about the epic setting and using unreal to do non realtime rendering. Because hey, why not.
@@MaraldBes it is getting so fast and cheap to do this level of previs in unreal engines. most previs don't even add music.
Can someone tell me what movie would benefit from this? Or is it just video games.
This is amazing
Me in GTA when I find the fast car:
The tires of the taxi that got hit look like they're floating over the road, instead of carrying a heavy taxi.
I want to hear this version of the theme song. Sounds soooo good
How is this Cameraless again? Seems like they are using Virtual Production Cameras to me.
Clickbait. No physical cameras used. Plenty of Unreal Engine cameras in this scene though!
Incredible that it is real time. CGI takes are now possible. The title is confusing though, as you still need to get the shots with a camera, even if it is virtual.
I feel like I'm watching a car commercial
We're getting closer. But we still need cameras and real people for movies.
Ecto-1 driving to the scene gives me a Christine vibe. Anybody who saw the movie Christine, based on the Stephen King novel, knows what I mean.
Yeah, I was thinking myself that Ecto-1 looked like it was doing this itself, especially as you don't see a driver.
RIP mini big puft
Might be fine for shorts. Reminds me of many of the Love Death and Robots clips, some of them go for this type of realism
would be cool if this was an early 90s continuation of The Real Ghostbusters cartoon
I was expecting it to say, "I am Groot!"
calling it "cameraless" is ignorant... You know how many cameras are involved with motion capture???
none
You need 0 cameras for spatial tracking
How much motion capture do you believe they did for this car? Haha.
Is Avatar real-time rendered like this too? I guess not, right?
I thought there had been a mix-up, as it looked like a computer game. The Stay Puft Marshmallow Man looked great, probably the only thing to be "cinema quality". But it is only a "proof of concept".
This SHOULD be the opening cutscene to a 4 player co-op Ghostbusters game…
*Shows a trailer void of emotional impact* "Our goal at Sony is to fill the world with emotion..."
This ecto-1 is such a fast car for how old it is
No, have they not learnt anything.......? Practical effects are the way coupled with cgi to tart it up.
you really hit the uncanny valley when you see humans
I think the implications of this are pretty clear. Sony released this while they are in labor dispute with actors & writers to show them what they were capable of using the Unreal video game engine.
This sends a clear message. Get off the picket lines and back to work, or next time, we're going to figure out how to do photo realistic people using the Unreal engine.
No, Give the industry employees a raise and screw your stock holders, the strike must continue and fans should not be watching any new productions until fair wages are achieved.
@grimacres It's not about the shareholders. It's about what was and will be no more.
For actors, they want to renegotiate contracts they signed that they are no longer happy with, and it generally revolves around streaming. When they made a TV show in, say, 2002, they were paid for the work and offered a residual for syndication. These contracts existed before Hulu, Disney+, Netflix, etc. Should they get paid for something not spelled out in their contracts? Most competent people would say no.
If you said yes, then how far would you go back, and who do you go after? Every show made over the past 60 or 70 years? Everybody knows the big streaming companies, but what about the little ones? The Roku channel store has dozens of little streamers that charge very little or are completely ad-supported, streaming niche content such as old westerns. Should actors go after these guys too for not paying a residual when they stream Gunsmoke or Bonanza or even Zorro?
Another actor complaint is the number of episodes they make and the breaks in between seasons. Network shows and established actors get their 20+ season shows like Law & Order SVU, Blue Bloods, The Rookie, etc. When you act on a streaming show, you only get six, maybe eight episodes, and you're never guaranteed the role will come around again. If it does, it might be two years later. Should they get paid more for working less than, say, an established star who worked hard and earned that 20-episode role, which might have taken them 5 to 10 years to get?
The writers complaint is similar. Established writers get longer episodes runs and make more money. New writers working on streaming projects get paid, but their job may only last months, where established writers on shows like Law & Order or Blue Bloods have to work pretty much year round to make those shows. These writers on these short-term assignments are calling these gigs mini rooms instead of writers' rooms.
Are the actors & writers right or wrong? I don't know. I do know they took the wrong course of action. Striking never makes sense, no matter the industry; it just emboldens the other side to dig in their heels and be less cooperative.
Hollywood can sit and wait for the desperation of not getting paid to sink in.
I'm not going to pay money to watch a 2 hour video game cutscene in the theater
@hokemoseley2934 That's not what they were trying to do. They were trying to show you that the Unreal Engine technology has advanced to the point of photo realistic environments. Essentially they could do a car chase without having to wreck any cars or endanger human life.
I believe the next step would be to create a believable digital actor that can get out of the car and fool the audience by moving around and delivering lines.
If people find this compelling, OK; it's just slick bullshit for those without nuance. Human-first movies will always be better. These are just special effects, vacuous and corny; it's like a really dramatic Skip Intro.
Rendering humans 100% realistically will always be the real litmus test
The CGI looks OK, except that the Stay Puft Marshmallow Man moves a little too fast in some areas (it feels unnatural).
Ghostbusters:The Video Game on PS5 remake?
It's tough to film in NY. This is a good solution for that. It shouldn't be used for everything, but this is a pretty good idea.
This ought to be a video game instead, perhaps one with brand-new, movie-quality writing & cast.
Perhaps a midquel between 2 & Afterlife.
would be great to see one next year so it ties in with the 40th anniversary
This uses the same assets as that one Spider-Man 2 trailer we had a while back
They didn't get Ecto-1's handling correct. The rear end slides more in a skid than that. They need to stop trying to make it look like it handles like a super car and love it for the beast it is.
This looks more like a really good pre-vis
I think the effects in the original look better 😅 This could get more convincing over time, but there are some existential hurdles as well, such as the jobs and the lacking "human element." I understand having an artistic vision and wanting it shaped perfectly, but I don't think any human is capable of micromanaging every detail. AI will be able to fill in those details, but will they make sense and feel real? And couldn't the missing details for realism just be added by image capture of real objects and actors, rather than generating everything? If you want to represent elements of the real world in a movie, you may as well use the real world when you can. In many cases, actors being filmed add character or expression that wasn't originally intended but works best for the story. This tech is really cool, but I don't think it should always be a replacement for real "filming." In the same way cartoons don't replace live action and vice versa, they should co-exist. These media companies are so greedy, though; I think they want to avoid paying anyone they cannot completely control. It seems they want to separate IPs from real people, so the IPs can live on while the people who make it all possible can be replaced at any time. It's very concerning.
You don't know sh*t.
this is what is called a pre-vis. it is intended only for the director to see, so they can line up the shots they will require. pre-vis are the modern day version of storyboards. this was never meant to be enjoyed as an actual finished product.
Oh my mistake, I thought this was a BTS of the production for a new Ghostbusters *video game*, my bad...
did you miss the part that this is for a VIDEO GAME and not a film?
Good, but it still needs some work. For example, the Marshmallow Man moves too fast and is too agile for its size.
Looks like a video game cutscene
Dear Sony,
here some topic recommendations for a stunning proof of concept:
* Gaussian splatting
* NeRF
* Photogrammetry
* Nanite
* Lumen
If it's realtime rendering... could be fun if they found a way to have some games where a few daily boss fights had a guy in a mo-cap suit reacting to players. I know there's a lot of issues with that for scaling, but I think it would be funny if someone managed it. Would need to hire a lot of people to be on standby in mo-cap suits for instancing.
Looks like a great sequel to the Ghostbusters Playstation 3 game!
It looks like an interesting concept; the city reminded me of the Matrix Unreal Engine demo... and then they mention they use UE. Still, the lighting, textures and motion need to be more realistic in the final product.
Me when I play with my toys
Hey look, it's a Michelin dough boy 😂
I know it's just at POC stage, but this footage looks really soulless; to me this looks just like an in-game event scene, not a real motion picture.
Wow, 95% of the comments don't even get what this is... That's exactly why the world gets dumber and dumber every day...
My only critique is that it looks really plasticky; maybe some more PBR and RTX?
it is pbr and ray traced bro
@@FabioGopfert I know it is, Im just suggesting giving it MORE
PBR?
Professional Bull Riding?
@@maverickhuntermeta4954 no, "Physically based rendering"
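For anyone curious what "physically based rendering" changes in practice, here is a deliberately tiny, toy shading function (my own simplification, nothing like a production BRDF or any engine's code): the material is described by physical-style parameters like albedo and roughness, and the highlight width falls out of roughness instead of being hand-tweaked per shot.

```python
def shade(albedo, roughness, n_dot_l, n_dot_h):
    """Toy single-light shading: Lambert diffuse plus a Blinn-Phong-style
    specular lobe whose sharpness is derived from roughness.
    albedo: RGB list in [0, 1]; n_dot_l / n_dot_h: cosine terms in [0, 1]."""
    diffuse = [c * max(n_dot_l, 0.0) for c in albedo]
    # Rougher surfaces get a wider highlight (lower shininess exponent).
    # Note: this toy model is not energy-conserving.
    shininess = 2.0 / max(roughness * roughness, 1e-4)
    spec = max(n_dot_h, 0.0) ** shininess
    return [min(d + spec, 1.0) for d in diffuse]

# A marshmallow-ish material: bright albedo, fairly rough surface.
print(shade(albedo=[0.9, 0.9, 0.85], roughness=0.6, n_dot_l=0.8, n_dot_h=0.95))
```

Even this crude version shows the PBR idea: change one physical parameter (roughness) and the highlight behaves plausibly, without repainting the material.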
This is an animated movie. If that's what you want, then great. But this is in no way going to replace live action filmmaking. Very videogamey.
It's literally from a video game engine. This clickbait title is clickbait.
Best way to kill the Marshmallow Man is to get him on the computer.
I hope Sony doesn't try to insult us and pass that off as a movie. I've seen better CGI graphics on a Nintendo.