Pixel Prof
  • 169 videos
  • 1,068,343 views

Quick Tip: Default Frame Rate for Motion Design Projects in Unreal Engine 5.5
This quick tip video shows where to find and change the default frame rate setting for Motion Design animation in UE 5.5.
164 views

Videos

Fixing "Auto" Retargeting Issues in Unreal Engine 5.5
297 views · 9 hours ago
This video shows a workflow for addressing issues that can be encountered in Unreal Engine when Auto Retargeting doesn't really do the "Auto" part so well. By generating retargeting assets and making logical adjustments to the source and target retargeting poses, what initially looks like a very problematic retarget can be made quite effective.
Creating a Motion Design Project in Unreal Engine 5.5
457 views · 12 hours ago
This video is part of a series for beginners showing how to create a new project in UE 5.5 configured for Motion Design work. In addition to setting up the blank project and plugins, this also shows how to add a free set of material assets that can be used when creating motion graphics elements.
Install Unreal Engine on a PC
100 views · 12 hours ago
Just a quick video showing how to install Unreal Engine. The screen captures are from a Mac, but the install process described is for a PC.
Broadcast Transition Logic in Unreal 5.5 (For Beginners)
345 views · 12 hours ago
This video, intended for beginners, is a step-by-step demonstration of how to set up a simple lower-thirds broadcast graphic template using the Transition Logic features of UE 5.5's Motion Design plugin.
Dynamic Materials in Unreal Engine 5.5
611 views · 21 hours ago
This quick-tip video shows how to use Dynamic Materials in Unreal 5.5 (it should also work in 5.4, and maybe in 5.3?). Dynamic Materials allow a single material instance to have lots of different looks in a single level, without the need to create duplicate instance assets in the Content Browser. This is particularly useful in Motion Design so template levels can be configured to use different ...
Animate Metahumans Using an Audio File in Unreal Engine 5.5
4.7K views · 1 day ago
One of the "under the radar" demos at Unreal Fest 2024 was the new ability to use nothing more than an audio dialog file to animate MetaHuman lip sync animation. This video shows a pre-release walkthrough of how to use this functionality using the 5.5 MetaHuman plugin. This video was recorded using the initial pre-release version of UE5.5, so there are still a few things to be ironed out, and ...
NDI Alpha Channel with Motion Design in Unreal 5.5
781 views · 21 days ago
This video shows how Unreal Engine 5.5's native NDI Output plugin can be used with the Motion Design Rundown Playlist feature to output graphics with alpha channel that can be composited for live broadcasting and live streaming outside of Unreal Engine. This video is recorded using an early, pre-release version of UE5.5 downloaded from Epic's Github source repository, so features are subject to...
Native NDI Output in Unreal Engine 5.5
1.8K views · 21 days ago
Let's take a look at the built-in NDI Media output functionality in a pre-release version of Unreal Engine 5.5! This video is the first in a series taking a look at some new features and updates found in the source code release of UE5.5. Since this is built from source ahead of the official pre-release, anything shown could change between this recording and the official release, but it's always...
Installing Unreal Engine 5 on a Mac
2.6K views · 1 month ago
This video walks through the process of installing Unreal Engine on an Apple Mac (in this case, an M2 MacBook Air). Included in the video is a demonstration of creating a first-time Film, Video & Live Events project, and adjusting some of the Engine's performance settings to improve the frames per second achieved. In this case, the FPS improved from 19-21fps to a more respectable ~30fps...
Triggering Different Sequences in Unreal Engine
330 views · 1 month ago
This video shows how Unreal Engine can be set up to play different Level Sequences in response to different events, such as triggering different collision boxes. This is shared in response to some questions that were posted to a previous tutorial explaining how to use a keystroke to play a single sequence. Hope this helps. Have fun.
Free Movella Mocap Assets: Retargeting in Unreal 5.4
454 views · 2 months ago
This video is a quick look at how to easily retarget free XSens motion capture animation assets downloaded from Movella.com so they can be used on Unreal Engine Manny or MetaHuman characters. The free mocap data can be downloaded from movella.com
Create Custom Metahuman Template Project in Unreal Engine
559 views · 2 months ago
This video shows how you can easily create your own custom template project that can be used when launching Unreal Engine to save time by having all the plugins and rendering settings you need ready to go any time you create a new project.
SIGGRAPH 2024 - Revopoint 3D Handheld Scanners
224 views · 2 months ago
In this video, I chat with Vivian Li, from Revopoint 3D to learn about the MIRACO - Standalone 3D Scanner. Vivian shares an "instant demo" scanning my head and face on the spot as in their SIGGRAPH booth. You can learn more about the Revopoint 3D scanners here: www.revopoint3d.com/collections/3d-scanners
SIGGRAPH 2024 - Move4D Volumetric Capture System
411 views · 2 months ago
In this video, I chat with Matt Bennett at SIGGRAPH 2024 to learn more about the Move4D volumetric motion capture system. You can learn and see more about this system at www.move4d.net/
SIGGRAPH 2024 - Machine Learning & USD Tools in Nuke and Katana with The Foundry's Christy Anzelmo
226 views · 2 months ago
SIGGRAPH 2024 - What's a SIGGRAPH?
125 views · 2 months ago
SIGGRAPH 2024 - ArcEye/Sony Volumetric Capture System
507 views · 2 months ago
SIGGRAPH 2024 - New LookingGlass "Holographic" Displays
420 views · 2 months ago
SIGGRAPH 2024 - Vicon Performance Capture
212 views · 2 months ago
SIGGRAPH 2024 - How Manus Mocap Gloves Work
277 views · 2 months ago
SIGGRAPH 2024 - Kickoff
616 views · 2 months ago
Virtual Production Camera Calibration with Vicon Shogun
651 views · 2 months ago
Quick Tip: Importing older "Incompatible" Assets into new versions of Unreal Engine
414 views · 3 months ago
Automating CSV Data Updates for Unreal Engine Motion Design
690 views · 3 months ago
CSV Driven Motion Graphics with Unreal Engine 5.4 - Video 3 Step-by-Step
686 views · 3 months ago
CSV Spreadsheet Driven Motion Graphics with Unreal Engine 5.4 - Video 2 Technical Overview
532 views · 4 months ago
Reality Capture Intro for Office Hours
301 views · 4 months ago
CSV Spreadsheet Driven Motion Graphics with Unreal Engine 5.4 - Video 1 Introduction
646 views · 4 months ago
UE 5.4 Motion Design Boolean Text Operation
891 views · 4 months ago

Comments

  • @SpinxSage
    @SpinxSage 1 day ago

    When I export it and bring it into my level sequence, my MetaHuman does not move. But when checking the animation sequence, the head does move.

  • @pawelkikta
    @pawelkikta 2 days ago

    Thank you Pixel Prof!

  • @keithlynch9846
    @keithlynch9846 2 days ago

    Hi, thanks for this! I have a similar issue, but mine is happening with a character that I've rigged in Blender using Auto-Rig Pro (so it's rigged to the SK_Mannequin skeleton). So I import it and select that as the skeleton. It looks fine, but when I bring in some animation, it distorts the character like what happens with your MH. I tried making your changes to the SK_Mannequin skeleton, but it didn't do anything (potentially because it's IK rigged). Sorry for the long comment!

  • @vj-baker-88
    @vj-baker-88 3 days ago

    Great tutorial. Do you know if it is possible to trigger play/stop of scenes/pages in the broadcast module with MIDI or OSC messages? Thanks for all your videos.

  • @activemotionpictures
    @activemotionpictures 3 days ago

    Wonderful video. Thanks for sharing. At 6:45 the character's source toe ends are dipped. I've seen this in other mocaps.

    • @PixelProf
      @PixelProf 3 days ago

      Good catch! Yeah, this data is from a "first time using the mocap system" class. I'll have to cover the Vicon process for correcting this toe issue next time the class is in the studio.😁

    • @activemotionpictures
      @activemotionpictures 3 days ago

      @@PixelProf Thanks for covering how to edit the source, though. Really good info. Really useful for these cases: fingers, toe ends. Great video!

  • @z2d3dgraphics38
    @z2d3dgraphics38 4 days ago

    Super cool feature… love it… I can't figure out how to trigger inside a particular overlap not by the overlap alone but by an action key… I can't connect the conditions (is overlapping + action key) to trigger the sequence attached to the overlap box. My head is spinning in BPs… C++ is so much easier to make this work. Any ideas?
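The gate the commenter describes (the sequence should play only when the player is standing inside the trigger volume AND presses the action key) can be sketched engine-agnostically. This is a minimal illustration of the logic only; all names below are hypothetical stand-ins, not Unreal API calls — in Unreal the flag would be set by the begin/end-overlap events and checked in the input-action handler before calling Play() on a Level Sequence Player.

```python
# Minimal sketch of "overlap + action key" gating. Illustrative names only.

class SequenceTrigger:
    def __init__(self):
        self.player_inside = False   # toggled by begin/end overlap events
        self.playing = False         # stand-in for the sequence player state

    def on_begin_overlap(self):
        self.player_inside = True

    def on_end_overlap(self):
        self.player_inside = False

    def on_action_key(self):
        # Fire only when both conditions hold at the moment the key is pressed.
        if self.player_inside and not self.playing:
            self.playing = True      # stand-in for sequence_player.play()
        return self.playing

trigger = SequenceTrigger()
trigger.on_action_key()       # key pressed outside the volume: nothing happens
print(trigger.playing)        # False
trigger.on_begin_overlap()
trigger.on_action_key()       # inside the volume and key pressed: plays
print(trigger.playing)        # True
```

The point is that the overlap event does not trigger anything by itself; it only records state that the key handler checks.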

  • @yonghengshouhu
    @yonghengshouhu 5 days ago

    Boss, can you tell me more about the 5.5 Material Designer? Almost no one talks about it. It's just that the bug problems from 5.4 have not been fixed yet...

    • @PixelProf
      @PixelProf 5 days ago

      Yup… more on that coming soon! 😁 There are some great updates to Material Designer in 5.5 that make some significant changes from the 5.4 version, but there are a couple of glitches in the first 5.5 pre-release. Looking forward to the updates coming in the next week or so.

    • @johansmas453
      @johansmas453 4 days ago

      I tried using this rundown system, but it had too many bugs. Textures randomly disappeared. Animations glitched, showing the wrong frame when taking in a new level. Even the playlist failed and played the wrong thing. Hope this part of Unreal gets fixed asap 🙂

    • @PixelProf
      @PixelProf 4 days ago

      UE5.5 is working much better than the 5.4 version, but the first pre-release still has a couple of glitches in it that should be fixed before full release.

  • @metternich05
    @metternich05 6 days ago

    Could you do a more comprehensive tutorial on MetaHuman lip sync? There's almost none out there.

    • @PixelProf
      @PixelProf 6 days ago

      Sure… are you looking for something on editing animation that already came from performance or audio capture (like this video shows), or on "starting from scratch" using only keys & curves on the control board for animating?

    • @metternich05
      @metternich05 5 days ago

      @@PixelProf Starting from scratch is more accurate :) I'm somewhat familiar with UE, I do environments but haven't even tried MetaHumans. What I actually have in mind is creating an avatar or virtual character that would be the face of a YouTube channel and do all the talking. I'm thinking of something more realistic than the one above, with more complex facial expressions, head movement, and even gestures. I'm not sure how much effort this is. Though with all the AI rage out there, avatars will be on the rise, if they aren't already.

  • @rozgoo
    @rozgoo 6 days ago

    Awesome. How about using transition logic and layers?

  • @SanjeevKumar-be4dd
    @SanjeevKumar-be4dd 7 days ago

    I tried to export to Blender... mesh & armature are poles apart... so frustrating

    • @PixelProf
      @PixelProf 7 days ago

      Erg... I don't use Blender (yet), so I'm not sure what can be done there. Will have to dig into it sometime in the near future.

  • @НиколаГаврић-ш9э
    @НиколаГаврић-ш9э 7 days ago

    You saved me. Thank you!!!

  • @Svi3sa
    @Svi3sa 7 days ago

    Thank you for making such a quick and clear tutorial!!

  • @OwenMcAteer
    @OwenMcAteer 7 days ago

    Amazing video, thank you. Helped me solve this issue in UE5.4 and it worked perfectly!

  • @duchmais7120
    @duchmais7120 9 days ago

    Awesome. I recall trying something like this way back using Blender phonemes/visemes (mouth shapes)... A, O, E, W/R, T/S, L/N, U/Q, M/B/P, F/V... to aid in lip sync, where each phoneme was associated with specific lip and facial movements. Haven't tried it in years. Waiting to try this out once they release Unreal Engine 5.5. The eye movements (blinking) are what's puzzling me.

  • @Felix-iv2ns
    @Felix-iv2ns 9 days ago

    Runtime?

  • @nielslesliepringle3143
    @nielslesliepringle3143 9 days ago

    It's really not bad. But the facial expression still looks like someone who took an overdose of Botox.

  • @olavandreasterrasource8034
    @olavandreasterrasource8034 9 days ago

    Now how can I use the eyes? I can't figure it out.

    • @PixelProf
      @PixelProf 9 days ago

      I'll work on a follow-up video that shows how to apply adjustments and otherwise animate the face and eyes in conjunction with the viseme results from this tool, but the basic idea is to use Sequencer to bake this result onto a face control board and layer on additional animation inputs.

    • @olavandreasterrasource8034
      @olavandreasterrasource8034 8 days ago

      I am looking forward to seeing your next videos, thanks so much

  • @Monoville
    @Monoville 9 days ago

    Great simple tutorial, so glad they've implemented this. Gave it a couple of tests and the results are very impressive, particularly when combined with the facial animations from City Sample to replace the eye movements/blinking. Will still use the previous animator method for accurate facial work, but this is a great quick method.

  • @tbowren
    @tbowren 10 days ago

    For sure this was my favorite thing I saw and demoed at Unreal Fest. Thanks for making a video so quickly!

    • @PixelProf
      @PixelProf 10 days ago

      Hah! Thanks Tony!

  • @renderme2550
    @renderme2550 10 days ago

    Thank you for doing this. I've been using iClone 8's AccuLips to Unreal MetaHumans via LiveLink. Combined with iClone's "Digital Soul" and other powerful animation tools, this is a game changer.

  • @RichardRiegel
    @RichardRiegel 10 days ago

    Yep... then export it via FBX, import to 5.4, and smoke it... works perfectly for now :)

    • @SpinxSage
      @SpinxSage 1 day ago

      I haven't gotten lucky with this, any advice?

  • @olavandreasterrasource8034
    @olavandreasterrasource8034 10 days ago

    Thank you so much, I have been looking for this tutorial for weeks

  • @pondeify
    @pondeify 10 days ago

    Oh man, this is awesome - thanks for sharing!

  • @ke_sahn
    @ke_sahn 10 days ago

    Is it possible to combine this with a performance capture?

    • @PixelProf
      @PixelProf 10 days ago

      Yup. Body motion can be applied independently, and other facial performance can be layered on with Unreal's Sequencer.

  • @tmaintv
    @tmaintv 11 days ago

    Very interesting, thanks. What can you do with the depth input?

    • @PixelProf
      @PixelProf 10 days ago

      Depth input is for processing data captured with the LiveLink Face iOS app when it's set to "MetaHuman Animator" mode.

  • @massinissa8697
    @massinissa8697 11 days ago

    It is already in the NVIDIA Omniverse app, with some expression faces!

  • @jonaltschuler2024
    @jonaltschuler2024 11 days ago

    Hoping to try this for real-time streamed audio, but it looks like it's not there yet. If anyone has suggestions for that, please let me know 🙏

    • @PixelProf
      @PixelProf 11 days ago

      Yup. Fingers crossed that's a near-future thing, but this function (for now) is recorded audio/post-process only.

  • @lordnaps
    @lordnaps 11 days ago

    Pretty cool, but the lack of any emotion on the rest of the face (because it's only tracking audio and not facial structure) makes me wonder what this could be used for practically.

    • @PixelProf
      @PixelProf 11 days ago

      Since Unreal has an extremely capable (and continuously improving) layered animation system, the synchronized viseme animation from this process can be readily combined with other motion capture, hand-keyed, or procedural animation sources. For example, a LiveLink Face performance capture can be quite effective for overall face expressions, but lacks fidelity around the mouth, so something I'm hoping to experiment with soon is using this to process the audio from a LiveLink capture, and then art-direct the layering of results from both performance capture methods. Also, I imagine this is a step in the direction of realtime audio-to-animation, which could facilitate performance based on realtime voice generation systems (just speculation on my part for now).

    • @lordnaps
      @lordnaps 11 days ago

      @@PixelProf Would love to see that idea of mixing the LiveLink and facial cap, but I definitely see it going in that direction.

    • @xavierhatten9011
      @xavierhatten9011 9 days ago

      @@PixelProf OK, I've done this with the LiveLink Face app. You can move the head in real time to make it more human-like, but the eyes need to be able to blink more. You can run the animation and do the head movement at the same time.

    • @Ronaldograxa
      @Ronaldograxa 9 days ago

      I wanted to use MetaHumans to create digital avatars, but I see tools like HeyGen coming up and I really wonder if MetaHuman is actually worth the time... AI is moving too quickly.

    • @ETT-b6q
      @ETT-b6q 6 days ago

      Great tutorial. I'm trying to get additional animation on top of that audio-to-animation result. Normally I would bake mocap to the rig and add an additive layer, but with this facial mocap it breaks or reduces the lip sync to very small movements. How would you go about it?
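The layered-animation approach discussed in the replies above (audio-driven viseme curves combined with other facial animation sources) boils down to additive blending: a base pose per frame plus weighted deltas from each extra layer. A minimal numeric sketch, using hypothetical control names and weights that are not actual MetaHuman controls:

```python
# Minimal sketch of additive animation layering. Control names are hypothetical.

def blend_additive(base, layers):
    """base: {control: value}; layers: list of (weight, {control: delta})."""
    out = dict(base)
    for weight, deltas in layers:
        for control, delta in deltas.items():
            out[control] = out.get(control, 0.0) + weight * delta
    return out

# Audio-derived viseme pose for one frame (drives the mouth)...
viseme = {"jaw_open": 0.6, "lips_pucker": 0.1}
# ...plus an additive "expression" layer (brows, blink) from another source.
expression = {"brow_raise": 0.4, "blink": 1.0, "jaw_open": 0.05}

frame = blend_additive(viseme, [(1.0, expression)])
print(round(frame["jaw_open"], 3))  # 0.65: viseme mouth plus a small additive offset
```

Because the expression layer contributes deltas rather than absolute values, it can nudge the jaw without wiping out the lip sync — which is also why a non-additive (override) layer would flatten the viseme motion, as the commenter observed.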

  • @virtualworldsbyloff
    @virtualworldsbyloff 11 days ago

    Animate = lip sync = CLICKBAIT

  • @ValicsLehel
    @ValicsLehel 11 days ago

    If it can be Blueprinted and fed a WAV to play back as a stream, it would be perfect.

  • @dddharmesh
    @dddharmesh 11 days ago

    Amazing. So quick and easy to follow... Thanks

  • @aerospacenews
    @aerospacenews 11 days ago

    Appreciate the effort that went into this video and that Epic is rolling out the capability @PixelProf

  • @commontimeproductions
    @commontimeproductions 11 days ago

    This is amazing! Thanks so much for sharing that this is now available. After the last "free" audio-to-lip-sync plugin decided to stop letting us use it for free and started charging obscene prices aimed only at large studios, many of us indie devs were completely derailed in using MetaHuman lip sync functionality. Epic is amazing for finally making this part of the engine. No longer will we be stopped by a paywall for our projects. Hooray! I will be sharing your video :)

  • @nsyed3d
    @nsyed3d 12 days ago

    This is great, thank you.

  • @Finalfootagefilms
    @Finalfootagefilms 12 days ago

    Are you on Mac or Windows? Somehow I am not able to activate the BMD (8K Pro) on an MBP (M1 Max with expansion) in UE5 (with the BMD plugin enabled). Every other software works fine with the 8K Pro.

    • @PixelProf
      @PixelProf 12 days ago

      Unreal can only output to Blackmagic Media Capture devices on Windows machines.

    • @Finalfootagefilms
      @Finalfootagefilms 12 days ago

      @@PixelProf Agreed, this Apple vs. Epic beef is making it very hard for content creators. Hope they find a solution soon.

  • @appliedclinicalvr2359
    @appliedclinicalvr2359 12 days ago

    Is there any way at all to package a project as a standalone .exe so that the LiveLink-driven MetaHuman avatar (or one imported from Character Creator) still works? A VTuber was able to do this with v4.26 years ago, but it involved really unstable third-party plugins, and most responders to that tutorial could not get it to work, myself included.

  • @projetsgroupeactif
    @projetsgroupeactif 12 days ago

    Exciting! Yes, my dream is to make a science/history video game based on Unreal Engine 5. Since my computer is a MacBook Pro, I was wondering if I could download Unreal Engine 5 on it. Video fully viewed.

  • @ajbrasil266
    @ajbrasil266 13 days ago

    Thank you so much for figuring this out and sharing a video about it. Do you have a clue how I could make the VR spectator cam (from the UE 5.5 XR template) stream via NDI? As I am missing the media viewport capture (from the Virtual Production Tools), I was not able to get it to work. (What I am also missing is a way to choose between HX3 and "uncompressed" NDI streams, or making it work on a Mac or iOS...) Have you figured out a way to make this work?

  • @neXib
    @neXib 16 days ago

    Really needed this playlist, great stuff, well explained.

  • @Revs_Of_F1
    @Revs_Of_F1 16 days ago

    Interesting, can't wait to get 5.5

  • @YihuaLi-u2n
    @YihuaLi-u2n 19 days ago

    Hi, I have a question: if I have an SVG with animation information, how can I play it in UE5?

    • @PixelProf
      @PixelProf 19 days ago

      UE's SVG import doesn't internalize SVG animation... it's really just intended for still logos and shapes.

  • @thejetshowlive
    @thejetshowlive 21 days ago

    Hi Nick! Tried importing this into Blender; the FBX came in looking like a monster. Boo! I'll try importing the FBX into MoBu. Fingers crossed!

    • @PixelProf
      @PixelProf 19 days ago

      Did MoBu/Maya work better? For me, these seem to import the files well. I'm not sure what the "expectations" of Blender's FBX importer are.

  • @FilipMasiulewicz
    @FilipMasiulewicz 21 days ago

    What about corrective joints? They are not animated properly with this solution, unfortunately.

    • @PixelProf
      @PixelProf 19 days ago

      Those should be driven by Blueprints in Unreal rather than animated by an external tool. External tools should just animate the primary joints and avoid changing anything on the leaf joints.

    • @filipmasiulewicz5036
      @filipmasiulewicz5036 18 days ago

      @@PixelProf If you export animation from Maya or MoBu without animated corrective joints, then in UE those joints won't be animated automatically. That is why MH in Maya are built with two constrained skeletons by default, I presume.

    • @PixelProf
      @PixelProf 18 days ago

      @@filipmasiulewicz5036 I'm at Unreal Fest this week, but will check/test when I'm back next week. I think the idea behind the driver skeleton in Maya is that you can get the proper deformations for cloth and other physics interactions there. As long as the MH Animation Blueprints and Post Process Blueprints haven't been altered or deactivated, those should be driving the corrective joints and shapes in UE. If something external is overriding them, you could try bringing the animation into Sequencer, then baking to Control Rig, then back to a new Animation Sequence to get the appropriate corrections applied.

  • @tomcatCfilo
    @tomcatCfilo 22 days ago

    Thank you

  • @blendragon28
    @blendragon28 25 days ago

    Wow, after I just purchased Offworld Live too

  • @AlanAstle
    @AlanAstle 25 days ago

    I don't know what NDI media output is, to be excited about it.

    • @PixelProf
      @PixelProf 25 days ago

      Being built in, it's natively integrated into the Motion Design broadcast graphics Rundown/Playlist system, making it turnkey to connect with NDI-based broadcast and streaming setups without needing to configure render targets or even install any external plugins. To me, that's awesome.🤓

    • @dwovowb
      @dwovowb 16 days ago

      NDI allows you to send key and fill output over a network card instead of outputting to SDI or HDMI.

    • @AlanAstle
      @AlanAstle 16 days ago

      @@dwovowb Interesting

  • @nicop.exemusic
    @nicop.exemusic 25 days ago

    Can't wait to test it!

  • @violentpixelation5486
    @violentpixelation5486 25 days ago

    Oh wow. 💯👌❤️ #unrealengine

  • @IndoorsShow
    @IndoorsShow 25 days ago

    Alpha channel, working?

    • @PixelProf
      @PixelProf 19 days ago

      Yup… NDI Alpha Channel with Motion Design (an Unreal 5.5 "Sneak Peek"): ruclips.net/video/oxuSC_fVEIE/видео.html

  • @patfish3291
    @patfish3291 29 days ago

    Hi Nicholas! I am currently trying to set up a little virtual production studio with an OptiTrack mocap system (Prime x22), a green screen, and a BMD Ultimatte 4K keyer, in combination with a PC workstation with a BMD 8K capture card, a 4090, and a BMD Ursa Mini Pro G2 camera. 1) I connected my timecode and the BMD genlock generator to the eSync box of the mocap system, but I'm not 100% sure how to get genlock and timecode simultaneously into the camera and the capture card of the PC. Do you have any advice for me? 2) Do you recommend two separate (external) timecode generators, one for the eSync box of the mocap system and one for the camera? Regards, Patrick

    • @PixelProf
      @PixelProf 28 days ago

      You can connect additional outputs from the BMD genlock generator to the Ref inputs on the Ursa, Ultimatte, and DeckLink. That should cover genlock sync across the systems. The camera only "needs" timecode if you'll be recording on it. If the camera is used live-only, you can ignore its timecode. As of UE5.4, UE should be able to get timecode from the mocap system via LiveLink, so you can use that for UE timecode (along with the DeckLink genlock signal for lock). Hope this helps.

    • @patfish3291
      @patfish3291 27 days ago

      @@PixelProf Great! That helps a lot! ...I thought the camera needed both when live streaming to Unreal :) ... thx 🙏