I wanted to make a smart arse sarcastic comment about how handy this will be to all of us who have full size fighter cockpits in our spare rooms… but I love this content and what you’ve done way too much for that 😂 thanks for sharing so the rest of us can drool anyway.
*laughs nervously in F-15C* 🤣
I, too, was going to post something regrettably youtube-comment-ish, but this is just too cool to rag on.
This is truly incredible, and I would rather have this than an actual supercar or A-10.
"It drives you to alcoholism but you'll get it to work eventuelly" that pretty much describes the DCS VR experience😂
Nope, that describes the feeling after spending years with DCS, getting it to work finally, learning the planes and their systems ... only to realize there is nothing in the sterile world of DCS to achieve with all the systems and buttons you just learned.
@@BlackbirdDrozd Knowing all the systems and buttons and being able to use them is the achievement. What's there in War Thunder to achieve?
@@thelespauldude3283 Use them for what exactly? DCS's experience pattern is learning a plane for 15-20 hours, then spending 90 minutes flying the pitiful campaigns/scenarios it offers. Like don't get me wrong, I was playing DCS for 20+ years, ever since the "Lock On" version, and even ran an MP server for a year, but I got burned by ED's incompetence at actually delivering content beyond just those buttons (e.g. the dynamic campaign, now "in progress" for 13 years), and whoever tries to produce a bigger war scenario in that engine either hits a performance roadblock (since even today the AI and simulation run on a single CPU thread) or gets driven crazy by ED breaking compatibility in updates (as an ex-DCS server admin, for 2 years I spent on average a full weekend every month fixing our server missions because ED had decided to break the mission scripting logic yet again in an update).
So no, I now play BMS for complex simulation, because there is actually a competent war to play in instead of DCS's empty maps, and I play War Thunder since there the average player can actually dogfight using energy fighting and you can get into good fights in 30-60 minutes of play time once you are a busy adult with a career/kids. In DCS the majority of players only know their buttons and have their eyes glued to the radar screen, but don't understand shait about energy maneuvering, and to get a better experience you have to dedicate a bazillion hours keeping up with a virtual squadron.
PS: And yes, War Thunder is also shait, given its neglect of the simulation game mode and the bugs there, the toxic Russian Z-ombie developers, and the company being owned by a Russian oligarch. So both DCS and War Thunder are shait, just different kinds of it.
@@thelespauldude3283 access to classified/NOFORN documentation? 🤣🤣🤣
@@thelespauldude3283 Use them on what? DCS AI is stupid and multiplayer is dead.
Truly incredible work man
If you're doing Virtual Desktop over Wi-Fi, you should be able to use the Android Debug Bridge to route the connection over the USB cable instead.
I can't remember the exact steps, but you get the port and IP that Virtual Desktop is using, then in the debug bridge you set the headset to forward that port over USB.
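For anyone trying it, these are the generic adb port-forwarding commands that the steps above are describing; which direction you need and the actual port depend on what the Virtual Desktop Streamer reports, so treat this as a rough sketch rather than a confirmed recipe:

adb devices                         # confirm the headset is detected over the USB cable
adb reverse tcp:<port> tcp:<port>   # traffic the headset sends to localhost:<port> is carried over USB to the PC
adb forward tcp:<port> tcp:<port>   # the opposite direction: the PC's localhost:<port> is carried over USB to the headset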
Awesome! All that effort paid off. That's like the holy grail of flight simming right there.
That thing is beyond wicked. Soo much work, massive respect for sharing this! There is a speaker for producers called a subpac, you probably know but they are supposed to be real nice.
This is where DCS and the like will really shine. Give it a few more gens and it will be awesome.
I kinda wanted to say something along the lines of step one be rich, but honestly this shit is sick bro, good job 👍
God I'd love to have this for my old school space combat games. Imagine the Death Star Trench Run or playing Descent in AR.
Thanks heaps for the video. Can’t wait to give this a try. 😊
I am trying to convince my parents to let me buy a HGU-55/P for more immersion, and they won’t let me (for obvious reasons) and then I see this, and am blown away by how chill your wife must be. I love seeing the updates by the way, this is the coolest project I have ever seen!
3:30 ... a totally understandable understatement in sim building.
Very happy to see the VR_MIRROR lua setting! I had been hoping someone would share a solution to using transparent MFDs.
Great video, well done! Very engaging and informative. Will have to set up openkneeboard for my Spitfire in DCS. Thanks for the pointers🙏🍻
incredible! like it is insane how far you can get with VR! i am delighted to see that you have gotten this to work. One can only wish to get halfway as far as you :)
Wags has ascended to deity, that reverb is hilarious lol
Excellent!! Can't wait to do mine!
Love your videos man thank you for everything you do to help the community out
What an amazing setup. Thanks for sharing!
Fantastically done, mate!
man what a stud! awesome build that i hope to replicate one day(for the Hornet, it's my first love). Thanks for the info on how to set all this up!
THAT is one bad ass setup! So when can we expect the 1:1 scale of the hornet? 🤣🤣🤣
What an amazing time to do this - epic! Great share mate - thank you.
Just wanted to say, I'm on my way to building my own pit. Thanks for helping me see ways of doing it.
Love your work mate.
The Q3 and BoboVR setup was the best investment I’ve made in a long time.
@Narcosis71 Yep. I actually bought it just for room-scale shooters like Pavlov; I didn't even consider that I'd end up using the Quest for DCS rather than the G2.
i'm getting back into simming and thinking about upgrading from my quest 2. been looking at the quest 3 or potentially higher end options such as the pimax crystal light. any thoughts? you happy with the resolution of the quest 3?
If I can achieve this, I’d be crying instead of playing.
Great video, and hats off to your hard work and cleverness man! Personally, I'm waiting for the not-too-distant time when the whole passthrough capability is built natively into DCS. The boundary creation software already exists in the Q3, where you identify a custom area in the room as the guardian. That couldn't be too difficult to change into creating a passthrough boundary. I reckon it will be there in a year or two as pit building grows, pit-building product prices decrease with market growth, and DCS responds to that growing market with built-in solutions, as they always have done before. Think native VR support itself. Exciting (and much easier!!) times ahead.
Hah, open kneeboard to display the passthrough colour is a good idea. I like reality mixer myself but I can see open kneeboard being useful. I think I will try it and see how it works.
You realize I now have no excuse NOT to build a sim pit. You REALIZE how much of my money you just spent. YOU REALIZE how expensive divorce lawyers are, right? 🤣
3:25 Any chance of a separate video eventually detailing how to get VR to work (Quest 3) with DCS?
Excellent work man. It won't be too much longer before I'm able to leverage this method in my F-15C...
Very cool!! Thanks for sharing :)
Still an amazing accomplishment after all this time!
This is the video that I have been waiting for someone to make. Thank you. The problem that I have is that my sim was set up fully to run on BMS. I am working on updating everything for DCS, but RL gets in the way.
Nice video. I use a Leap Motion for hand tracking, and I set up my virtual hands with the X, Y and Z axes we get on the VR tab in game.
I make them match my real-life setup (Hornet copy), so the virtual hands can basically guide my real hand and give me some AR.
Eagle Dynamics made the three axes apply to both hands, which is a bit annoying, because three axes for each individual hand would be much preferable (because of the differences between the cockpit and the real-life measurements).
Currently I therefore work around it by setting up the right hand in VR relative to the real pit. Works great.
Keep up the great content buddy! 🫡
This is nuts. Love it.
I'm following your videos for a very long time... ❤😊
but what I like most in *this* video are the flashing position lights....
😅🤣😂
It looks so real you need nomex gloves not to break the illusion
"It Drive you to Alcoholism..." LOL. yea, getting my Quest 3 to work wirelessly in DCS without stutters and issues had me drinkin as well.
What would be a great addon is to somehow keep a low res panorama projection or some kind of array of LED floodlights that can cast “sunlight” dynamically into the real cockpit.
Is there a way from DCS-BIOS to get the XYZ rotation of the aircraft and the time of day in game? That could be sent to some mini computer feeding the projectors, to project ambient light into the cockpit live.
i have been using hue lights for that, ruclips.net/video/N6Uz7fpqK24/видео.html
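On the DCS-BIOS side of that question: outputs are generated from the control reference the same way inputs are, but whether attitude or time-of-day values are actually exported depends on what your airframe's module defines, so check the reference first. As a rough, untested illustration of the pattern only, with the address/mask/shift constants as placeholders for whatever the reference generates, here is a value callback driving a PWM "sunlight" channel:

// Sketch of the DCS-BIOS Arduino output pattern; the IntegerBuffer constants
// below are placeholders, paste the generated line from the control reference.
#define DCSBIOS_IRQ_SERIAL
#include "DcsBios.h"

const int SUN_LED_PIN = 9;  // PWM pin driving an LED floodlight driver

void onLightValueChange(unsigned int newValue) {
  analogWrite(SUN_LED_PIN, newValue >> 8);  // scale the 16-bit exported value to 8-bit PWM
}
DcsBios::IntegerBuffer lightValueBuffer(0xAAAA, 0xFFFF, 0, onLightValueChange);

void setup() {
  pinMode(SUN_LED_PIN, OUTPUT);
  DcsBios::setup();
}

void loop() {
  DcsBios::loop();
}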
This is so cool!
Nice video, thank you
Great work thanks much
You've convinced me to upgrade to the bobovr.
Dude, that's so cool
For VR you should consider something like a Valve Index or a Pimax; the large FOV is a game changer in VR... makes you feel like you're in the plane instead of in the plane looking through binoculars.
I have a Varjo Aero and I hate having to run Steam; I believe it interferes with the visual operation of the Aero. I will try this Virtual Desktop. I'd like to get some feedback if there are any Varjo Aero users out there!!
amazing!
Nice! Can you please show how you came up with the pass through shape?
@R.I.D.E lots of trial and error. It's a bit weird because the 'kneeboard' is laid at about a 45 degree angle from where I'm sitting.
Mint 👌
Man, playing that game in VR is such a struggle. I had an Oculus Quest 2 on while flying the F-16 Viper. Finding all the buttons while dogfighting is a nightmare; I mean, you can't set up everything on the joystick. If you're cruising, then that's a different thing.
THE ULTIMATE!!
This is pretty cool. Can you actually read the MFDs? I struggle to read my phone in the Quest 3, let alone trying to find a small tank on the MFD. Either way, you have knocked it out of the park with this content.
@justinallen2285 it's not easy. I can find the hot spot of targets, but I sometimes need to raise the headset up to be able to identify what they are.
Now you need some rotating lamp above you to simulate the sunlight inside the cockpit ;).
Respect.
DREAMMMMM
Dammit that's cool. I bought a Reverb G2 a few months ago for VR and gave up trying to get it to work because SteamVR is so terrible. I should really try and tackle it again.
Damn u legend
Brilliant! Ever considered putting it all on a motion rig? Don't know the weight of it all but together with XR I would assume that would be final step to reach Nirvana. 🐗👼🥃
Turns out that motion doesn't do much for fighter sims, for things other than basically straight-and-level flight, carrier approaches, or tanking. Any of the dynamic maneuvering (which is why you want to be in a fighter sim anyway, right?) doesn't benefit all that much - actually the opposite.
(BTW, I'm a 34-year veteran of flight simulator testing, with experience in both large and small aircraft sims, and have written the US Navy's flight test manual for piloted simulator test and evaluation. I've spent probably a thousand hours in motion base and seat motion sims, and far more in fixed-base fighter sims.)
The real crux of the problem is that no motion system currently available - or even possible with current technology - can replicate (or even hint at) sustained g forces without serious compromises. And those compromises are so distracting that they decrease the sense of realism for anyone with actual experience in an airplane. Things like range of motion, acceleration limitations, cueing washout, and attitude/rate/acceleration confounding are simply not solvable without artificial gravity - and we just haven't figured that out yet.
Even if your motion system can go fully 360 in all axes, you still can't replicate the accelerations. So it tumbles your inner ear gyros in a very unrealistic fashion, and does a wonderful job of giving you motion sickness, but that's about it. Even centrifuge sims, which can reproduce high and sustained g forces, are really horrid for any other part of simulation training - due to the rotation rates required.
Motion cueing works fine for small short-duration changes - like straight and level flight, or mostly steady approaches to an airfield. It can help in formation flight and hover, for sure. But anything resembling air combat - nope. There's a reason that basically no serious military fighter combat sims use motion bases. Some have very simple seat cueing (vibe and pressure points and very small motions), and there are a few centrifuges for specific g-force training, but that's it. "Butt-kicker" subwoofers are actually pretty good at cueing the rumble and vibes of stall buffet or helicopter vibrations etc., and that's usually the limit for many fighter sims.
When NASA wanted to produce a more realistic simulation for the Space Shuttle and other projects, they built the Vertical Motion Simulator (VMS) - look it up if you can. It can produce somewhat sustained g forces - but still not enough for fighter sims, and the thing is 60 feet tall and 40 feet wide: totally unreasonable for a fighter sim. (I've been for a ride in it - it's fascinating.)
So motion cueing systems are essentially only used for FAA Level C and Level D transport-class aircraft that spend most of their time in steady and low-acceleration, low-rate flight - where it really helps and excels.
And even then, plenty of studies have shown that there is actually minimal added training value from motion systems - in many ways, it detracts from the real transfer of skills.
But it is cool.
@@Brandon_SoMD First of all, it sounds like you have a job that makes me jealous. I have seen a number of videos of people using a motion rig in DCS and they get pretty wild about it, so I put it on my list of things to get, after upgrading my jetseat to a buttkicker, which hopefully is back in stock in a couple of weeks. I am unsure about either a 3DOF or a 6DOF system. In DCS I fly the A-10, so I don't really need to feel the catapult or 9 g's in a fast jet, but I understood that it would bring an extra level of immersion. I recently saw a video of a former Hornet pilot trying out a motion rig (on GYGO) and he was quite positive about it. Some time ago I saw a nice video about the SIMONA simulator from Delft University that tricks your mind. Did you watch this by any chance? Very curious about your thoughts.
@@Ready.A-10C Yeah, it's been an incredibly fun job. I literally wrote the manual on US Navy sim testing, and I get to spend a ton of time pretending I'm a pilot in the world's best real milsims.
Here's the thing about motion rigs: people unfamiliar with professional simulation systems generally have the mistaken impression that just tilting the cockpit the same way the simulated aircraft is tilting will produce useful and accurate sensations. That's not true - in fact it couldn't be further from the truth. Motion cueing is incredibly complex to get good enough to be useful.
First, the concept of "washout." In any limited-range motion system (a small one might move +/-30 degrees and maybe a foot of total motion distance), it's impossible to follow the attitude of the simulated airplane. At some point, the motion base has to stop moving, even if the airplane is doing a full 360-deg aileron roll or a vertical loop in pitch. How does the system handle this? It has to slow and stop moving (without an abrupt jerk at the stopping point), and then it has to move back towards the neutral position to prepare for the next motion.
This slow/stop/return produces sensations on the occupant. So it has to be done VERY carefully, without triggering the inner ear (for angular rates) and seat-of-the-pants (for linear accelerations) to notice the change.
Next, reproducing a sensation of sustained accelerations is REALLY tricky. For a long-term forward acceleration, such as on takeoff, you might tilt the cockpit back so that a portion of the 1g gravity is pushing on the pilot's back, making it feel like you're speeding up. But GETTING TO THAT PITCH ANGLE produces an angular rate for a moment, which the inner ear happily notices - while your eyes don't see it. The same happens for sustained side forces, like turning left or right while taxiing. So you have a momentary innate confusion about why your eyes aren't seeing the pitch rate. Your brain is REALLY good at detecting angular rates - it's how you are able to walk. So your brain's balance system rebels at this confounding of rates and accelerations, and the result is almost always motion sickness, or "simulator sickness" as a term of art when it happens in a simulator.
It's potentially bad enough that some military training commands don't let a pilot fly a real airplane for up to 24 hours after a sim session - because it really screws with the inner ear systems.
And, in an open cockpit (where you can see your environment outside the cockpit) it's even worse. Remember that old 1980s arcade game "Afterburner"? You sat in an open cockpit, with a TV screen in front, and the whole thing moved around on its base while you flew an approximation of an F-14 in dogfights. Great, but this adds yet more trouble: your peripheral vision sees the non-moving exterior world, and your brain rejects the motion cues as completely out of sync with the visual display of the outside world. So motion base sims must be very carefully isolated from the outside world. (A virtual reality cockpit does this just fine, fortunately - as does a carefully built augmented reality cockpit like this A-10 setup.)
Now, jerking someone around in an unrealistic manner while they play a video game may be exciting, but it's not helpful to someone interested in having the motion cues add to the realism or help them understand what's going on to help them more accurately control the airplane.
Fortunately, *some* simulators can get away with all these things just fine - sims that don't experience large sustained accelerations, and don't change their pitch or bank angle all that much: airliners and similar classes of airplanes. That's why the FAA has specific requirements for motion bases on Level C and Level D sims in the FAR Part 60 code. If you're interested in flying bizjets or a small civil Cessna, the average small motion base is probably going to actually be helpful and close to realistic.
But fighters? Nope; they maneuver too dynamically for a motion base to be really useful.
What about centrifuge sims, where you can really sling someone around at high and sustained g forces? Okay, but now you're introducing a pretty healthy angular rate into the equation, and the brain CANNOT ignore it. At higher g situations, the person in the centrifuge is experiencing literally hundreds of degrees per second of combined pitch and yaw rates, but their out-the-window display shows them a very different thing. And the brain HATES it. Try this sometime: sit in an office chair with a blindfold on, look straight ahead, have someone spin the chair at about 2 seconds per revolution (180 deg/sec), and turn your head to look up over your shoulder. You'll probably almost fall out of the chair as your brain gets completely confused - if you don't throw up. Turning the head really spins up the inner ear "gyro" horribly; in a dogfight situation, where the pilot has to be constantly turning their head to track the enemy fighter, it's awful. That's why milsims don't use centrifuges except for the specific g-tolerance training.
So to summarize, motion bases are really good for SOME things, but not for others. They have massive limitations you have to know about if you care about accuracy.
As for me, I know too much about what it SHOULD be like, to really appreciate simplistic motion systems. I've worked on everything from 60-inch 6DOF systems, to 1-inch (yes, ONE inch) 3DOF systems, plus motion cueing seats. My job is all about "does this thing we're buying really train people effectively," so I'm deeply interested in realistic and accurate training. As a result, I'm pretty strongly biased against "fun" motion systems just for the sake of feeling SOMETHING. But - I'll also admit that a lot of games don't model reality, anyway.
(For the record, I also have a lot of thoughts about force feedback systems....)
So I'd recommend: do some research into motion cueing, and make sure you are at least cognizant of all these limitations before you get too deep into picking a system. You may be just fine with it, and that's cool. I just want you to be informed.
Just get yourself an Ethernet to USB-C adapter. Use that with your link cable into your router and now you have a much faster connection, you can max all your settings, and your headset is always charged. Another bonus: you don't have to have all those extra battery packs etc. on your headset.
hey thx for explaining, have been trying myself but could not get it to work.
Which MFD screens are you using? Great that there is now a solution for exporting MFD screens in VR.
Fantastic.
My local county fair has an F-15 cockpit on display for people to sit in. I may need to steal that if I want any of this advice to work
Now all you need is thrusters and movement handling and it will be perfect 😂😂😂😂
What is your preferred way to fly? MR or surround projectors?
I have always wanted a VR setup that "hard links" the coordinate system of a physical sim pit to that of a virtual VR cockpit, so that the headset tracking matches the two cockpits perfectly. If only the pilot's body is passed through and everything else is rendered, you should be able to reach for a virtual button and touch a real-world button. Given good enough hand tracking you could even go fully VR.
In the DCS-BIOS guide, it says "All you need to do is use the drop down boxes in the control reference to select your aircraft, panel and the button/switch/light/gauge/anything you want, then copy and paste the code that it gives you into the Arduino IDE and flash it to your Arduino."
Where do I find this "drop down box" for the F-18?
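For context, once you have the control reference page open for your aircraft and pick a control, the code you paste into the Arduino IDE ends up looking roughly like this; it's untested and the identifier and pin below are only examples (the real line is whatever the reference generates for the control you selected):

// Minimal DCS-BIOS Arduino sketch built around one snippet copied from the
// control reference. "MASTER_ARM_SW" and pin 10 are example values only.
#define DCSBIOS_IRQ_SERIAL
#include "DcsBios.h"

// Two-position toggle switch wired between digital pin 10 and GND.
DcsBios::Switch2Pos masterArmSw("MASTER_ARM_SW", 10);

void setup() {
  DcsBios::setup();
}

void loop() {
  DcsBios::loop();
}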
ayayay now i need this....thanks? :)
Have you tried the latest Quest 3 software update? It's supposed to improve passthrough quality.
Wow❤
Rad!
My thing is what do you do if you wanna fly another jet? This is cool tho
I’m surprised you’re not struggling with your mouse. My DCS/Windows will absolutely not keep the mouse in the DCS window as soon as I look in the direction of implementing mixed reality.
i wanna fly in this... ngl... but i'll never get to do it
Is it possible to just set the cockpit monitor to pink and use the virtual displays in DCS? This way you still get the physical buttons to pass through. Also, I wonder if there is any sort of software that could track physical trackers in your cockpit to make the alignment better.
I found the difficulty was actually changing the shape of the colour passthrough from Alt-Tab, i.e. there was no way to see what you were changing. In the end I just went with VD's 'show hands' passthrough, et voilà.
I can't seem to get DCS to output the display of those screens (I have the WinWing ones) when in VR. Outside VR they work, but if I am in VR mode they are blank.
Nice work, as always. Is the lag (the relative motion) on the edges due to screen capturing or can you also experience it during flight? Everywhere I see mixed VR like this, it doesn't seem to be fully locked in.
Thank you very much!
@geerdjacobs6484 it's the screen capture doing that. I've cropped it out as much as I can. In the headset the FOV is much wider and there is no clipping on the edges. But it's not 100% perfect just yet.
Hi. Sorry to go OT here. Did you ever do anything further with the Simped rudder pedals, i.e. pots for the toe brakes? I have the same set of pedals and would really like to be able to use them over USB. Do you have a detailed explanation anywhere of how you replaced the original electronics with a hall sensor? To convert these pedals to USB, for those of us who don't have a cockpit full of a selection of boards, what would be the simplest way to make these show up as a USB game controller with three axes?
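Not the video author's method, but the usual DIY route for a standalone three-axis USB controller is an ATmega32U4 board (Pro Micro or Leonardo) plus the Arduino Joystick library; a rough, untested sketch under those assumptions, with the analog pins and ranges as placeholders for however your pots or analog hall sensors end up wired:

// Pro Micro / Leonardo sketch using the Arduino Joystick library (Heironimus):
// three analog inputs (rudder plus two toe brakes) exposed as a USB game controller.
#include <Joystick.h>

Joystick_ pedals(JOYSTICK_DEFAULT_REPORT_ID, JOYSTICK_TYPE_JOYSTICK,
                 0, 0,                                // no buttons, no hat switches
                 true, true, true,                    // X (rudder), Y (left brake), Z (right brake)
                 false, false, false,                 // no Rx/Ry/Rz
                 false, false, false, false, false);  // no rudder/throttle/accelerator/brake/steering

void setup() {
  pedals.setXAxisRange(0, 1023);  // matches the 10-bit ADC
  pedals.setYAxisRange(0, 1023);
  pedals.setZAxisRange(0, 1023);
  pedals.begin();
}

void loop() {
  pedals.setXAxis(analogRead(A0));  // rudder pot or analog hall sensor
  pedals.setYAxis(analogRead(A1));  // left toe brake
  pedals.setZAxis(analogRead(A2));  // right toe brake
  delay(5);                         // ~200 Hz updates are plenty
}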
is it possible to have the 24:21 view as the small one with the main as cockpit view?
Excellent. Anyone done this for the F-18 Hornet?
I wonder how Apple's Vision Pro would look for this setup? I know it's a different $$ but still - passthrough should be way better?
Step 1: have a VR headset that can do MR, or at least has its cameras available to other programs (aka not my Vive Cosmos).
With a budget of about $3000 CAD, is there a stellar VR headset out there that can do this passthrough? I have the WinWing F18 front control panel. Would love to set this up with a better camera setup.
I'm curious how well you can use the cockpit without the mixed reality while in VR, because if you think about it the distances should still be correct, so even though you can't see your hands, your brain could still understand where all the buttons and switches are.
wow
So what's the plan then, projector and screen wise? Will you tear them down and use VR every time?
It's really cool where technology is taking us. Seriously.
I’m thinking about doing something like this myself. Are there any drawbacks other than the slight degradation to readability?
Can you also import a JPG image into Virtual Desktop?
Sometimes he forger
I have a very basic question.
I notice you have live displays of the MFD’s on the left and right.
I wanted to use my iPad as an extended display to show the bottom MFD in my F18. How do I do that? I already got the software to extend my display onto the iPad as a second screen, but how do I get the MFD onto it?
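For the flat-screen case, the usual approach is a custom monitor-setup file in the game's Config\MonitorSetup folder that defines a viewport at the iPad's position on the extended Windows desktop, then selecting that profile under Settings > System > Monitors with the game resolution spanning both displays. A rough sketch only: the resolutions, coordinates and especially the viewport name for the Hornet's bottom AMPCD are assumptions to verify against a current viewport-name reference for your DCS version:

_  = function(p) return p end
name = _('MainPlusAMPCD')

Viewports = {
  Center = {
    x = 0, y = 0,
    width = 2560, height = 1440,  -- main monitor resolution (example values)
    viewDx = 0, viewDy = 0,
    aspect = 2560 / 1440,
  },
}

-- Assumed name for the Hornet's bottom display viewport; check a current list
-- of exportable viewport names before relying on it.
CENTER_MFCD = {
  x = 2560, y = 0,                -- where the iPad's screen begins on the extended desktop
  width = 800, height = 800,
}

UIMainView = Viewports.Center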
Hello, I am building my own sim project. How can I connect many Arduinos to my PC? I don't have enough USB ports to connect them all via cable.
What is better in your opinion: to have 3x data projectors... or this setup, with mixed reality?
Why not use a real greenscreen which is outside the cockpit and well lit? But yeah you have to build a canopy for this to fully work :D
Plz live stream!
I had a reverb g2. How does it compare to quest 3 in terms of resolution etc?
@fa-ajn9881 it's better than a Q2. But no passthrough.
What router are you using?
Can you record in VR180 so we can see a little more what you see?
Does anyone in the comments section use a 7900 XTX or any 7000-series card to play DCS? If so, do you have any problems with CTDs (crashes to desktop) or poor frame rates? I heard the drivers are poorly optimised for flight sims and that Nvidia cards are a better experience.