Ordinary Alan
  • Videos: 118
  • Views: 75,979
Good News
Short Julie and Sally skit created in Ordinary Animator.
Sound effect from SoundEffectsFactory (www.youtube.com/@SoundEffectsFactory).
"News Room" (skfb.ly/oMFUV) by Md.Faruk is licensed under Creative Commons Attribution (creativecommons.org/licenses/by/4.0/).
Views: 31

Videos

Self Improvement (Omniverse render)
35 views • 1 month ago
An Ordinary Animator test: creating and exporting a scene as USD so it can be rendered in NVIDIA Omniverse. The characters were created with VRoid Studio (free for desktop and iPad). Music is "Saturday Shopping" by Young Rich Pixels, courtesy of the YouTube Music Library. Animation assembled with ordinaryanimator.com. #nvidiaomniverse #ordinaryanimator #vroidstudio #vroid
Self Improvement
35 views • 1 month ago
This is an early test video of Ordinary Animator. It simplifies creating animations from a screenplay, such as "Julie turns and looks at Sally". The characters were created with VRoid Studio (free for desktop and iPad). Music is "Saturday Shopping" by Young Rich Pixels, courtesy of the YouTube Music Library.
Camera Panning Demo (with subtitles)
78 views • 2 months ago
I am working on a web-based animation project: type in a screenplay and it animates characters created in VRoid Studio. This demo shows camera panning working. Not 100% reliable yet, but making progress. Screenplay: INT. In Sally's Bedroom - morning [2b] Sally is sitting on the coffee table. Julie is sitting on the sofa. Full body shot of Julie. Pan camera slowly to a distant camera shot of Sally. Sa...
Head Turn Test
40 views • 2 months ago
Just when you think you have nailed head turns... A work in progress. This animation was generated from the following screenplay text (camera positions are manually done at present, and no audio yet). Now, is it a bug in head rotation calculations, or AI adding more attitude than I was expecting!!! INT. In Sally's Bedroom - morning [2b] Sally is sitting on the coffee table and Julie is sitting ...
Omniverse and UsdPreviewSurface for portable materials
146 views • 3 months ago
In the last video I hacked an NVIDIA Omniverse material to add in my own texture file. This short video shows how to use the more portable UsdPreviewSurface material to load your own texture file (PNG etc.). There is also a trick in Omniverse where it will use Omniverse's more advanced materials if available, falling back to the less powerful but more portable UsdPreviewSurface on other systems.
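Not the exact files from the video, but a minimal sketch of the same idea using the OpenUSD Python API (the mesh, prim paths, and texture name below are invented for illustration):

```python
# Minimal sketch: bind a UsdPreviewSurface with a PNG texture to a mesh using the
# OpenUSD Python API. Paths and file names are made up for illustration.
from pxr import Usd, UsdGeom, UsdShade, Sdf

stage = Usd.Stage.CreateNew("textured_plane.usda")
mesh = UsdGeom.Mesh.Define(stage, "/World/Plane")

material = UsdShade.Material.Define(stage, "/World/Looks/MyMaterial")
surface = UsdShade.Shader.Define(stage, "/World/Looks/MyMaterial/PreviewSurface")
surface.CreateIdAttr("UsdPreviewSurface")

texture = UsdShade.Shader.Define(stage, "/World/Looks/MyMaterial/DiffuseTexture")
texture.CreateIdAttr("UsdUVTexture")
texture.CreateInput("file", Sdf.ValueTypeNames.Asset).Set("./my_texture.png")
texture.CreateOutput("rgb", Sdf.ValueTypeNames.Float3)

# Feed the texture's RGB output into the preview surface's diffuse color.
surface.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).ConnectToSource(
    texture.ConnectableAPI(), "rgb")
material.CreateSurfaceOutput().ConnectToSource(surface.ConnectableAPI(), "surface")

# Bind the material to the mesh so any UsdPreviewSurface-aware renderer can use it.
UsdShade.MaterialBindingAPI(mesh.GetPrim()).Bind(material)
stage.GetRootLayer().Save()
```

Because UsdPreviewSurface is part of the core USD spec, the same file should render in Omniverse, usdview, and other USD-aware tools.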
Omniverse - lowering viewport resolution and quick material creation
72 views • 3 months ago
Someone on Discord was trying to create their own material in Omniverse with their own PNG file. They were also running low on GPU memory. So here is a quick video showing how to reduce GPU memory usage, and a quick-and-dirty way to create a material with your own PNG file.
Psalm 23: The Lord is My Shepherd
20 views • 4 months ago
A comforting and timeless Psalm expressing trust and faith in the guidance and protection of the Lord. It depicts a journey of spiritual nourishment, peace, and assurance, even in the face of challenges and enemies. The imagery of green pastures, still waters, and divine presence conveys a message of eternal goodness and security in the Lord's care. This video was created using #mootion.com #fa...
Omniverse Stage Recorder for Baking Animations
206 views • 4 months ago
Mat on Discord asked how to bake an Omniverse animation (in a Push Graph) down to standard OpenUSD timeSample data. I recorded this quick video showing how to do it, including using sublayers. #Omniverse #NVIDIA #OpenUSD #USD #Animation #Baking 00:00 Introduction 00:44 Marker 2 02:13 Marker 1
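The video does this through the Stage Recorder UI; purely as an illustration of what the baked result amounts to, here is a rough Python sketch that writes per-frame timeSamples into a separate animation sublayer (file paths, prim path, and values are invented):

```python
# Rough sketch of the end result of baking: per-frame timeSamples written into a
# separate animation sublayer, so the base scene stays untouched.
from pxr import Usd, UsdGeom, Sdf

stage = Usd.Stage.Open("scene.usda")  # assumes this file already exists

# Create a new layer for the baked animation and add it as the strongest sublayer.
anim_layer = Sdf.Layer.CreateNew("baked_animation.usda")
stage.GetRootLayer().subLayerPaths.insert(0, "baked_animation.usda")

# Direct all edits at the animation sublayer.
stage.SetEditTarget(Usd.EditTarget(anim_layer))

xform = UsdGeom.Xformable(stage.GetPrimAtPath("/World/Character"))
translate_op = xform.AddTranslateOp()
for frame in range(1, 101):
    # One timeSample per frame; a real bake would sample the evaluated graph here.
    translate_op.Set((0.0, 0.0, frame * 0.05), time=frame)

anim_layer.Save()
```

Keeping the baked samples in their own sublayer means the original scene file is never modified, which is the same benefit the video gets from sublayers.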
Why Is #OpenUSD Good for Teams?
10 views • 5 months ago
Why is #OpenUSD good for teams? Traditional file formats typically don't support collaborative editing. This has been changing over time with tools like Google Docs, but it is hard to have complex data structures representing 3D scenes modified by multiple people at once. Pixar faced this problem when creating animated movies. An animated movie includes storyboards, character design, rigg...
What are common use cases for #OpenUSD?
12 views • 5 months ago
OpenUSD is a file format for capturing 3D scenes. In this video I run through use cases OpenUSD is being used for, including animation, digital twins, simulations, and robotics. OpenUSD was originally just USD, Pixar’s Universal Scene Description format. It was designed to help manage animated movie production with large teams. Well, it turns out this is also useful in other industries. For exa...
Is #OpenUSD better than FBX?
31 views • 5 months ago
OpenUSD and FBX are both 3D modeling file formats. FBX has been widely used, but OpenUSD is gaining traction. Which is better? #openusd #usd #usda #fbx #nvidia #apple #autodesk #pixar #3d #3danimation #omniverse #shorts
As a Programmer, what should I know about #openusd
25 views • 5 months ago
As a programmer, what should I know about #openusd? Are you a programmer familiar with file formats such as JSON and XML who wants a peek at OpenUSD? OpenUSD originally came from Pixar, where it was used to capture scenes for animated movies. More recently it has been adopted for a wide range of use cases such as robotics, simulation, architecture, and digital twins. It is the main format used by NV...
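For a quick taste (a toy sketch, not from the video), the pxr Python API can build a small scene and print it in the human-readable .usda text form, which feels closer to JSON/XML than to a binary format:

```python
# Toy sketch: build a tiny scene with the OpenUSD Python API and print its .usda text.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateInMemory()
UsdGeom.Xform.Define(stage, "/World")
ball = UsdGeom.Sphere.Define(stage, "/World/Ball")
ball.GetRadiusAttr().Set(2.0)
ball.AddTranslateOp().Set((0.0, 2.0, 0.0))

# Unlike most binary 3D formats, the .usda flavor is human-readable text,
# so it diffs and version-controls a bit like source code.
print(stage.GetRootLayer().ExportToString())
```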
First OpenUSD Render in Unreal Engine
107 views • 8 months ago
Use an Avatar for Voiceovers with Ordinary Animator (for free)
77 views • 8 months ago
in3D (iphone app) test run
126 views • 8 months ago
Generative AI for Background Images and Omniverse - short demo
122 views • 10 months ago
Using AI to speed up animated video creation in Ordinary Animator
82 views • 10 months ago
Happy Halloween 2023
52 views • 11 months ago
Sketchfab + NVIDIA Audio2Face = Animated Mr Pumpkin Head!
308 views • 1 year ago
Wind Animation via Blend Shapes in NVIDIA Omniverse
250 views • 1 year ago
9. Ref files
115 views • 1 year ago
8. Default Prims in USD Files
363 views • 1 year ago
7. References and Payloads in USD
705 views • 1 year ago
NVIDIA Omniverse Path Tracking render test.
104 views • 1 year ago
6. It's a bird! It's a plane! ... No, it's a bird! (GLB and USD)
65 views • 1 year ago
5. Under Cover with USD
93 views • 1 year ago
4. The First Stage
103 views • 1 year ago
3. Prim and Proper(ties) in USD
175 views • 1 year ago
2. Installing Omniverse
522 views • 1 year ago

Comments

  • @gmangman123
    @gmangman123 1 month ago

    Thank you for the tutorial.

  • @redaboucetta1846
    @redaboucetta1846 1 month ago

    Woow precious information,thank u

  • @varispekka
    @varispekka 1 month ago

    Ah, dynamic hair in Omniverse, sweet!!

    • @XOrdinary99
      @XOrdinary99 1 month ago

      Actually, the dynamic hair is done in JavaScript libraries (in the browser), which I then bake and export to USD/Omniverse as bone animation.
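Not the commenter's actual exporter, but a hypothetical sketch of the general idea, assuming the baked result is written as UsdSkel rotation timeSamples (the joint name, frame range, and sway values are invented):

```python
# Hypothetical sketch: baking a per-frame hair-joint rotation from a browser
# simulation into UsdSkel timeSamples. All names and values are made up.
import math
from pxr import Usd, UsdSkel, Gf, Vt

stage = Usd.Stage.CreateNew("hair_bake.usda")
anim = UsdSkel.Animation.Define(stage, "/HairAnim")
anim.CreateJointsAttr(Vt.TokenArray(["hair_root/strand_01"]))

rotations = anim.CreateRotationsAttr()
for frame in range(1, 121):
    # Stand-in for the simulated sway angle sampled from the JavaScript physics.
    angle = math.radians(10.0 * math.sin(frame * 0.1))
    quat = Gf.Quatf(math.cos(angle / 2), 0.0, 0.0, math.sin(angle / 2))
    rotations.Set(Vt.QuatfArray([quat]), time=frame)

stage.GetRootLayer().Save()
```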

  • @werewolfstudiios
    @werewolfstudiios 2 months ago

    Going great friend! Adopting AI was a good move. Those rotations can be done right, it may be AI.

  • @27harishvk
    @27harishvk 3 months ago

    Hello. I want to learn Omniverse... looks like you could help me get a good start. Do you have any paid interactive sessions?

  • @27harishvk
    @27harishvk 3 months ago

    The launcher is no longer available to download. We need licensed portal access to download it now.

  • @TomasSwiftMetcalfe
    @TomasSwiftMetcalfe 3 months ago

    The depth resolution isn't reliable with mediapipe/Blazepose if the full person isn't in frame. What helps is to have all the points in frame and make as flat an image as possible (zoom out, stand square to the camera).

  • @Sodomantis
    @Sodomantis 3 months ago

    Is it even possible to make it less intuitive? Sequences blow.

  • @torusx8564
    @torusx8564 6 months ago

    First (Nice even if the video is old)

  • @biocronic2986
    @biocronic2986 7 months ago

    ehm, aaaaaa, ahhhh , ahhhhh , aaa a a a sorry cannot hear your words :D what a terrible voice

  • @Milo_Toto
    @Milo_Toto 8 months ago

    Is Unity hard to learn, sir? I don't know about this at all; I am only using Blender, ZBrush, and this EasyMotionRecorder. Can I export the motion out? I want to get the key poses out of this and adjust them myself.

  • @savvyshopperr
    @savvyshopperr 10 months ago

    very cool

  • @senselessplays
    @senselessplays 10 months ago

    Can you share the code or a tutorial for this project?

  • @arnosolo2008
    @arnosolo2008 1 year ago

    Thank you.

  • @hypersasquatch1
    @hypersasquatch1 1 year ago

    evaluating the tool for my next project. not overly excited about it so far. Your approach is similar to how i was planning to use it.. hope it works out for you !

  • @stephanied5668
    @stephanied5668 1 year ago

    Hey thank you for this course, it was helpful :)

  • @soetdjuret
    @soetdjuret 1 year ago

    Thanks, this was really helpful. Keep these awesome tutorials coming! By the way your dog is so cute!

  • @werewolfstudiios
    @werewolfstudiios 1 year ago

    Hmm.. seems great! Gotta try out

  • @markgarrett2946
    @markgarrett2946 1 year ago

    promo sm 🙄

  • @talkandedit8714
    @talkandedit8714 1 year ago

    Interesting...I never thought of it that way

  • @Leyverse
    @Leyverse 1 year ago

    Very cool what you are doing. It would be really nice if there was a 3D equivalent of Rhubarb, to create lip sync for 3D avatars. Do you have a Discord where we can talk? I would like to know more about the pipeline if you don't mind.

    • @XOrdinary99
      @XOrdinary99 1 year ago

      Audio2Face from NVIDIA Omniverse looks interesting. Unlike some solutions, it seems to be mesh based rather than blendshape based, so it can cope with a wider range of meshes. I am poking around the NVIDIA Omniverse Discord at the moment learning more (XOrdinary99), and the Unity Discord.

    • @rindtier7287
      @rindtier7287 1 year ago

      @@XOrdinary99 Hey, did you make it work? How fast is the audio2visemes API from NVIDIA? Like, for a 10 s sound, will the user wait a lot? That would kill the UX.

    • @XOrdinary99
      @XOrdinary99 1 year ago

      @@rindtier7287 Try looking for NVIDIA ACE and Omniverse Convai - they are getting it going in real time with NPCs. Pretty cool stuff.

  • @mm.mov3d895
    @mm.mov3d895 1 year ago

    thanks for sharing your experiences 😍

  • @golosbezdoka
    @golosbezdoka 1 year ago

    Good tutorial. I did everything as it was done here, but I'm facing a problem. As soon as the clip ends or a new one starts, my character's position gets reset. Do you know what might be the issue? Thanks :)

    • @extraordinarytv3483
      @extraordinarytv3483 1 year ago

      If there are two clips in a row, what I normally do is right-click on the second clip; there is an option to "align with previous clip".

    • @golosbezdoka
      @golosbezdoka 1 year ago

      @@extraordinarytv3483 Thanks, I tried that, didn't help :( It looks like people have had the same issue again and again for the last 7 years and Unity doesn't bother to fix anything :D There are hundreds of topics on forums and still no clear answer.

    • @extraordinarytv3483
      @extraordinarytv3483 1 year ago

      @@golosbezdoka Have you tried asking on the official Unity Discord? There is an "animation" channel there (I sometimes poke my nose in there too). There is some complexity with root motion, clip types, etc. On Discord it's easier to share screenshots etc.

    • @golosbezdoka
      @golosbezdoka 1 year ago

      @@extraordinarytv3483 oh, thanks! creating a Discord account :)

    • @golosbezdoka
      @golosbezdoka 1 year ago

      @@extraordinarytv3483 I have found an answer. The fix was making my animations Humanoid. I'm not sure if it works for objects with wheels, but I'm using humanoid robots so it's OK. Thanks again :)

  • @yutaro2449
    @yutaro2449 1 year ago

    My avatar does not have a blendshape, what do I do?

    • @XOrdinary99
      @XOrdinary99 1 year ago

      If you rig bones onto your avatar for eyebrows etc, you can animate those bones instead. Either that, or add blendshapes in Blender or similar (Blender calls them "shape keys").

  • @ggwp8618
    @ggwp8618 2 years ago

    Is it possible to somehow change the Animator of an animation track at runtime? I have a game with 4 characters and I want the animator of the character the user selects. (All 4 characters have the same animations.)

    • @XOrdinary99
      @XOrdinary99 2 years ago

      Not 100% sure I understand exactly what you are doing, but I have discovered there is an official Unity Discord server which has a dedicated Animation channel. Lots of smart people there answering questions like this.

    • @ggwp8618
      @ggwp8618 2 years ago

      @@XOrdinary99 Thanks for the reply. I have a timeline at the start of every scene with 1 animator track and 4 characters. At the moment the animator track has the 1st character, which I dragged and dropped onto the animator track. Let's say the user selected the 2nd character. Now I want the animator track to automatically get the 2nd character at the start of the scene via some code or configuration. I will post my thread on Discord as well.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      @@ggwp8618 I am not sure of the "best" way, but maybe you could update the "track bindings", which associate the object to be animated with the animation track? The track asset itself does not know what it is animating. The track bindings are the details that link a game object to the animation track so the animation track knows what to animate.

  • @mackleonard
    @mackleonard 2 years ago

    Helpful tutorial thanks for sharing!

  • @joewannimozzarella
    @joewannimozzarella 2 years ago

    Is it possible to track movement and face directly in Unity? VSeeFace is made in Unity, so there must be something that allows you to do it directly without VSeeFace?

    • @XOrdinary99
      @XOrdinary99 2 years ago

      It might exist, but I have never heard of it myself. There are other libraries of code you need to use to capture the camera, track facial signposts, etc. Gluing the VSeeFace code base directly into Unity would mean it cannot be used with anything else. The VMC protocol greatly simplifies cross-platform compatibility, versioning, etc. (Unity changes their platform, so things change.) So technically it is possible - I just have not heard of anyone going that route (and I can understand why they have not).
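For context on what such a protocol looks like on the wire, here is a small sketch that listens for VMC-style OSC messages using the third-party python-osc library. Port 39539 is assumed here as the commonly used VMC default, and the handler just prints whatever arrives; this is a diagnostic illustration, not part of VSeeFace or Unity:

```python
# Sketch: dump incoming VMC-protocol (OSC) messages so you can see what a tracker
# like VSeeFace is sending. Assumes the python-osc package; port 39539 is an
# assumed default, adjust to whatever your sender uses.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle(address, *args):
    # e.g. bone and blendshape messages arrive as separate OSC addresses.
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(handle)
server = BlockingOSCUDPServer(("127.0.0.1", 39539), dispatcher)
server.serve_forever()
```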

  • @SyanTheBee
    @SyanTheBee 2 years ago

    Is it possible to pose your animation clips as you go? I only ask because I've gotten pretty far in animating in Unity, but when it comes to clips I want to make my own, not always use ones from Mixamo. There are some specific clips I want to do - like I want to pose my models, something like that.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      You can certainly create your own clips - I do that mainly for things like facial expressions though, because posing characters is a bit painful in Unity. Using the Animation window you can edit an animation clip. You can record the rotations per bone, but if you use the Animation Rigging package you can record the IK positions for hands and legs, which is much easier (but gives less control). I used to use UMotion Pro (I think it was called), but you create your clips in its own format and export to an animation clip. It works, but if you want to make a change you have to go back to its format, change it, and export again. The extra step was a bit annoying.

    • @SyanTheBee
      @SyanTheBee 2 years ago

      @@XOrdinary99 Ah, alright then, I thought there was a less painful way XD

  • @JakeLondonRivers
    @JakeLondonRivers 2 years ago

    Looks so good!

  • @MelonyBS
    @MelonyBS 2 years ago

    How do you import custom characters and models? And where can I find them?

    • @XOrdinary99
      @XOrdinary99 2 years ago

      VRoid Studio is a free download from Pixiv for creating 3D models and exporting them in the VRM file format. (Lots of videos on how to use it on YouTube.) VRM files can be imported directly into PlayAnimaker (I forget the exact command, but it was pretty easy).

    • @MelonyBS
      @MelonyBS 2 years ago

      @@XOrdinary99 Do I use SideQuest and then go into the PlayAnimaker files and put the VRM in there? Because I tried that and it didn't show up in game when I clicked import.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      @@MelonyBS If you check the "About" tab for this channel, I just created a Discord server where it's easier to chat than in YouTube comments. No guarantees on response rates though(!)

  • @notafunctionalhuman1335
    @notafunctionalhuman1335 2 years ago

    Hi there! I'm currently working on trying to export a custom model into a VRM and have come into a lot of trouble with the blend shapes. The main issue is that I can't find any proper documentation on how to make a blend shape clip actually modify the blend shapes of a model, nor on what a blend shape clip is, more than just what it does. Given that, I've been quite stuck trying to get the model to actually function outside of VRC, which is what it was initially made for, and where it works without issue. Seeing how you seem to have a good amount of experience with UniVRM, I was hoping you could help me with understanding how to get blend shapes to work with models. If it helps any, my Unity version is 2019.4.34f1, and my UniVRM version is 0.93.0.

    • @alanjameskent
      @alanjameskent 2 years ago

      Simplifying... Blendshapes in Unity (shape keys in Blender) are like a series of mesh distortions - move the points near the corner of the mouth up a bit because I want a smile. A "0" strength means don't make any changes to the default face, a "1" means add the full set of changes (relative movements). So a blendshape is like copying the mesh of, say, the face, then adjusting it for a particular expression (smile, sad, angry, etc). Blendshapes were great for different facial expressions in particular. // ARKit then came along and said "rather than have one blendshape for angry, let's have 52 little independent blendshapes and use several at once to make expressions". So angry = left and right corners of mouth down, eyebrows down, eyebrows closer together, eyes closed a bit more... etc. In a VRoid Studio character, you can look at "Face" and if you look on the Skinned Mesh Renderer, you will find blendshapes there with slider strengths. // A "blendshape clip" is specific to VRM - it is a set of weights for all the blendshapes. E.g. you can define an "angry" blendshape clip (really, more like a pose if you ask me) that says "use these weights for all the blendshapes to make the face look angry". The deltas for each blendshape and the base mesh for the face are added together to achieve the final result. (You can also change materials, but ignore that for now.) Does that help?
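A toy, numbers-only illustration of the blending described above (made-up data, not from any real model): final mesh = base mesh + sum of (weight x delta) over the blendshapes named in the clip.

```python
# Toy illustration of blendshape weighting: a "clip" is just a named set of weights,
# and the weighted deltas are added onto the base mesh. Data is made up.
import numpy as np

base = np.array([[0.0, 0.0, 0.0],     # left mouth corner
                 [1.0, 0.0, 0.0],     # right mouth corner
                 [0.5, 1.0, 0.0]])    # brow point

# Per-blendshape vertex deltas (relative movements), ARKit-style "little" shapes.
deltas = {
    "mouthCornersDown": np.array([[0.0, -0.1, 0.0], [0.0, -0.1, 0.0], [0.0, 0.0, 0.0]]),
    "browDown":         np.array([[0.0,  0.0, 0.0], [0.0,  0.0, 0.0], [0.0, -0.2, 0.0]]),
}

# A "blendshape clip" is essentially a named set of weights over those shapes.
angry_clip = {"mouthCornersDown": 1.0, "browDown": 0.8}

final = base + sum(w * deltas[name] for name, w in angry_clip.items())
print(final)
```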

    • @notafunctionalhuman1335
      @notafunctionalhuman1335 2 years ago

      @@alanjameskent Thanks for the explanation on blendshape clips. Though I do already know what blendshapes are, it does partially help me understand what the clips are, but I still have no idea how to get them to work.

    • @alanjameskent
      @alanjameskent 2 years ago

      @@notafunctionalhuman1335 *preset (not present!)

    • @XOrdinary99
      @XOrdinary99 2 years ago

      It looks like YouTube deletes my responses that contain URLs for some reason (spam protection?), so I created a Discord server to make Q&A easier. See the "About" tab of this channel to find the link. No promises I will see messages immediately!

  • @maxiaohuimaxiao
    @maxiaohuimaxiao 2 years ago

    Keep the updates coming, bro!

  • @dungeondelves
    @dungeondelves 2 years ago

    This is a very cool test! May I ask how you set up the Character Animator side? You said you used transforms instead of draggers - how exactly did you apply that to the limbs? Is it just moving rotations around a center point? Are the MIDI controls just for swapping out the hand assets? I love that the hands change depending on their position with the Vive controllers!

    • @alanjameskent
      @alanjameskent 2 years ago

      I put the Transform behavior on each hand, then the MIDI events were wired up to the position x/y values and the rotation value. So moving the controllers was like adjusting sliders. I also used MIDI events to fire triggers to switch the hand. But Character Animator now has built-in full-body tracking from a webcam, so I would look at that first.

  • @maxiaohuimaxiao
    @maxiaohuimaxiao 2 years ago

    Love the composition at 4:11!

  • @maxiaohuimaxiao
    @maxiaohuimaxiao 2 years ago

    A very helpful tutorial, thanks for sharing. Beyond the tutorial content, I'm even more interested in your animation work. Is this an animated series? Where can I watch it?

    • @XOrdinary99
      @XOrdinary99 2 years ago

      It is early on, but I blog and release episodes at extra-ordinary.tv/ - only one out so far, next coming soon. episodes.extra-ordinary.tv/

    • @maxiaohuimaxiao
      @maxiaohuimaxiao 2 years ago

      @@XOrdinary99 Wow! There's even a mystery guest! I really want to know what happens next. Did you do the whole thing yourself, from script to storyboard to animation? Was it difficult? How long does it take to create a finished piece from scratch?

    • @XOrdinary99
      @XOrdinary99 2 years ago

      @@maxiaohuimaxiao Yes, I wrote the script, designed the characters (VRoid Studio), animated them (Unity), and created the website for publishing. It has taken a long time to learn all the tools. I was planning to blog more on how I do it in case others want to try. Static comics (no motion) would be MUCH quicker. But it has been fun learning it all!

    • @maxiaohuimaxiao
      @maxiaohuimaxiao 2 years ago

      @@XOrdinary99 That's fantastic. I can imagine the joy of doing it all yourself: you create exactly according to your own ideas and can tell exactly the story you want. It's a great approach!

  • @Reanimathor
    @Reanimathor 2 years ago

    I was so frustrated trying to do it with just the Timeline tool. Jesus. Unity seems to have the worst documentation on this topic. The problem I have is that my Unity (Personal) does not show Sequences and I have no idea how to get them. Google seems to have no solution for me. I probably have to download a package, but how?!

    • @XOrdinary99
      @XOrdinary99 2 years ago

      What version of Unity are you using? I just upgraded recently to 2022.1b12 trying to get Animation Rigging and Sequences working together. In newer versions there is a "Feature Set" feature that makes it easier (I don't remember when it was introduced), but I am waiting for a good stable combination.

    • @Reanimathor
      @Reanimathor 2 years ago

      ok, im in 2020 for some reason.... dear lord.

    • @Reanimathor
      @Reanimathor 2 years ago

      @@XOrdinary99 Thanks for the answer. I hadn't seen it before realizing myself that my packages looked different, which was odd. I'm downloading both 2021 and the 2022 beta right now. We'll see :) I will take a look at your video to learn this stuff, as I have no idea about Unity :) (Unreal user here, learning for my students who are using Unity :D)

    • @extraordinarytv3483
      @extraordinarytv3483 2 years ago

      @@Reanimathor I would love to get your opinion on the pros and cons of each! I am using Unity as the VRoid Studio community mainly uses Unity, but UE5 has some cool looking visual capabilities. For me, being able to ask the community for help is probably more important.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      (Hmmm, I left a comment using wrong account but cannot view it. Posting again.) I wrote up a blog on version numbers I am using in case helpful. extra-ordinary.tv/2022/04/23/unity-sequences-and-animation-rigging-versions-april-2022-update/

  • @march9004
    @march9004 2 years ago

    Hi, I've exported a character from VRoid and imported it into Unity, but when I try to preview the blendshapes the avatar doesn't do them. I clicked on the blendshape avatar and selected blink, sad, etc., but nothing happens on my avatar's face. Is there something I did wrong in the export or import? I'm using the new version of VRoid if that makes a difference.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      Strange! What version of VRoid Studio, UniVRM, and Unity are you using? I would drag the character into a scene in Unity, then click to expand the character so you can see the Face child object, then click Face and look in the Inspector panel. Expand the Skinned Mesh Renderer component. In there, expand the BlendShape list. You should see sliders per expression etc. If you drag blink there, does that work? This will help work out whether the blendshape is broken or the blendshape clip that references it. If you want more interactive help, I suggest visiting twitter.com/suvidriel/status/1353650757778735109?s=21 - she has a Discord server with a channel on Unity and VRoid. Easier to share screenshots there.

  • @-._.-._.-._.-._.-._.-
    @-._.-._.-._.-._.-._.- 2 years ago

    Hi, I've been looking for options to send MIDI "triggers" (I don't know the real term) to Adobe Character Animator. It has nothing to do with the HTC Vive, but since you seem pretty experienced with the MIDI capabilities of CH, I was wondering if you could point me down the right path. I will be using Touch Portal as a control panel for my puppet, and I would like to get more than 36 triggers for the whole scene. Thank you. Visual.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      Hmmm. I replied on my phone, but it is not showing up here, so repeating just in case. I suggest posting to the Character Animator forums - it's easier to talk through the issue there (you can paste screenshots, for example). MIDI has "events" like "note down". In Character Animator, in rigging mode, you can tell it to listen for a MIDI event and bind that to the trigger, so it's triggered by a MIDI note press instead of a keyboard key press. (You can do more, like sliders, but if it's just triggers then that is just note presses.) There are a few threads on MIDI there too that might help, like... community.adobe.com/t5/character-animator-discussions/midi-channel-numbers-with-character-animator/td-p/10106393
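If you want to see exactly which MIDI events a controller (or a loopMIDI virtual port) is sending before wiring them to triggers, here is a small diagnostic sketch using the third-party Python mido library (not part of Character Animator; assumes mido plus a backend such as python-rtmidi is installed):

```python
# Diagnostic sketch: print the MIDI events arriving on the default input port.
# Requires the third-party "mido" package and a MIDI backend (e.g. python-rtmidi).
import mido

print("Available inputs:", mido.get_input_names())

with mido.open_input() as port:   # opens the default input port
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            # This is the kind of event you would bind to a trigger.
            print(f"note_on: note={msg.note} velocity={msg.velocity}")
        elif msg.type in ("note_off", "note_on"):
            print(f"note released: note={msg.note}")
```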

    • @-._.-._.-._.-._.-._.-
      @-._.-._.-._.-._.-._.- 2 years ago

      Thank you for the reply. After sending the comment, I checked a lot of different virtual MIDI "drivers" and found loopMIDI. It works flawlessly with Character Animator and Touch Portal. (I just have to use two strings for one trigger on Touch Portal: 1. [MIDI Note ON] - 2. [MIDI Note OFF].) Are the Character Animator forums only for Creative Cloud members?

    • @alanjameskent
      @alanjameskent 2 years ago

      @@-._.-._.-._.-._.-._.- Yes, I used loopMIDI successfully as well. I don't know the forum membership rules, but Character Animator I think is only available with a CC subscription.

    • @-._.-._.-._.-._.-._.-
      @-._.-._.-._.-._.-._.- 2 years ago

      @@alanjameskent Ah cool! Are you using Touch Portal as well? Yes that is correct (+50€ per month!!)

    • @extraordinarytv3483
      @extraordinarytv3483 2 years ago

      @@-._.-._.-._.-._.-._.- No, I was using loopMidi to move signals between machines - e.g. HTC Vive on one machine, Character Animator on another.

  • @lecyber-purgatoire9803
    @lecyber-purgatoire9803 2 years ago

    Hello! I would like to know where you make your face animations. In Unity?

    • @alanjameskent
      @alanjameskent 2 years ago

      There are two parts to it. There is the underlying 3D model of the character, which includes mesh adjustments to move the mouth, cheeks, eyes, etc. - the ARKit library from Apple has 52 such little movements, for example. Or you can do complete facial expressions all at once. Tools like Blender, Maya, etc. can create them. Then there is controlling when they happen. Unity can control when they are used; it does not do a good job of creating them. (There are other tools too that can use them.)

  • @SyanTheBee
    @SyanTheBee 2 years ago

    I really like the 3D environments ^^ Are those from the Unity Asset Store or your own? Either way it's cool.

    • @XOrdinary99
      @XOrdinary99 2 years ago

      I use assets from the asset store then tweak them a bit. I don't have the skill or time to make full blown location scenes like that. That location mainly uses "Town Constructor 3" from the asset store.

    • @SyanTheBee
      @SyanTheBee 2 years ago

      @@XOrdinary99 that's awesome ^^ I'll check that out, thanks ^^ also your animations are awesome as always ^^

  • @hoczbrushhcm4134
    @hoczbrushhcm4134 2 years ago

    nice

  • @АмирМустафаев-и1б
    @АмирМустафаев-и1б 2 years ago

    I think the problem is in the script.

  • @shyamarama
    @shyamarama 2 years ago

    Cool to learn about sequences - thanks!!

  • @rachaelsix975
    @rachaelsix975 3 years ago

    Hello! I have a question. I have all of the packages properly installed, but when I try to start recording, I get an error saying: "Object reference not set to an instance of an object", and it will not allow me to record. Can you help?

    • @XOrdinary99
      @XOrdinary99 3 years ago

      In the Console window there will be a stack trace. It usually gives a hint which component is the problem. Normally for me it means I missed dragging the model to a property. ExternalReceiver has "Model" of "External Receiver" script (1 property) and EasyMotionRecorder has 2 (Motion Data Recorder / Animator is the important one for recording, but there is also Motion Data Player / Animator). You have to drag the model in the scene hierarchy for your character to those 3 fields. Otherwise, can you share the Console Window stack trace? I am after the file name and line number to check which field it is.

    • @rachaelsix975
      @rachaelsix975 3 years ago

      @@XOrdinary99 Thank you! But I still do not think it is working. What is the best way to send you a screenshot?

    • @XOrdinary99
      @XOrdinary99 3 years ago

      @@rachaelsix975 Do you have a Unity forum account? forum.unity.com/threads/evmc4u-and-easymotionrecorder-complete-hack-for-vmc-protocol-live-capture-in-a-window.1181998/ is an example thread you could reply to perhaps?

    • @rachaelsix975
      @rachaelsix975 3 years ago

      @@XOrdinary99 Just did! My name is SpasticRobots

  • @davebaldwin265
    @davebaldwin265 3 years ago

    This is great stuff! Super helpful! Thank you!

  • @dironahills
    @dironahills 3 years ago

    this is so helpful, thanks for the insight

  • @Peak_Stone
    @Peak_Stone 3 years ago

    finally someone made a video about sequences.