

Motion capture for animation in Live2D

My name is James.

I just want to know: is there any way to animate a video using motion capture with Live2D?

Thank you so much!


  • I don't think you can animate a Live2D model with mo-cap data. The movement range of a Live2D model is fairly limited (by the original illustration), so you can't do the dynamic action you can with a traditional 3D model. If it's just facial expressions, you should be able to do it, depending on how you make your Live2D model and link up the parameters.
  • Yes, "if it's just facial expressions, you should be able to do it based on how you make your Live2D model and link up parameters"... that's exactly what I'm asking. Can you please tell me how to do that?
    (It's for animating a video, not live rendering for games.)

    Well, if motion capture isn't an option, the Drag function looks promising, but how do I add custom drag functions?
  • I think you should check this out:

    I think the face tracking was done with FaceShift Studio, and its parameters were linked with the parameters of the Live2D model in Unity.

    And this guy did something similar using Kinect and Unity:

    (Second guy in the session)
  • Wow, thanks, I got the idea!
  • Is it possible to generate the animation file with a script, so that we can load that file in the Animator and render the final video?
  • I don't know. It may be possible if you can somehow convert your facial-capture data into a .mtn motion file... but I don't think there is any easy way.

    Why not just use auto lip-sync and do the facial expressions manually, like this one? It's probably much easier, faster, and better quality.
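The parameter-linking idea discussed in the replies above (mapping face-tracking output onto the model's parameters) can be sketched roughly like this. It's a hypothetical illustration: the tracker field names, value ranges, and the `ParameterLink` helper are assumptions, not part of any Live2D SDK; only the parameter IDs (`ParamAngleX`, `ParamEyeLOpen`, `ParamMouthOpenY`) follow Cubism's common naming convention.

```python
# Hypothetical sketch: mapping face-tracking values onto Live2D-style
# parameters. Tracker field names and ranges are assumptions.

def clamp(value, lo, hi):
    """Keep a value inside the range the illustration allows."""
    return max(lo, min(hi, value))

class ParameterLink:
    """Links one face-tracking field to one Live2D parameter."""

    def __init__(self, param_id, lo, hi, smoothing=0.5):
        self.param_id = param_id
        self.lo, self.hi = lo, hi
        self.smoothing = smoothing  # 0 = raw input, closer to 1 = smoother
        self.value = 0.0

    def update(self, tracked):
        # Exponential smoothing reduces capture jitter between frames.
        target = clamp(tracked, self.lo, self.hi)
        self.value = self.smoothing * self.value + (1 - self.smoothing) * target
        return self.value

# Typical links: head yaw in degrees, eye/mouth openness normalized
# to 0..1 by the (hypothetical) tracker.
links = {
    "head_yaw":   ParameterLink("ParamAngleX", -30.0, 30.0),
    "eye_l_open": ParameterLink("ParamEyeLOpen", 0.0, 1.0),
    "mouth_open": ParameterLink("ParamMouthOpenY", 0.0, 1.0),
}

def apply_frame(tracker_frame):
    """Convert one frame of tracking data into parameter values."""
    return {link.param_id: link.update(tracker_frame.get(field, 0.0))
            for field, link in links.items()}
```

Whatever actually sets the values (Unity, the native SDK, etc.), the shape of the work is the same: clamp each tracked value to the parameter's range, smooth it, and write it to the model every frame.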
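On generating a motion file by script: `.mtn` is the Cubism 2 motion format, while Cubism 3 uses `motion3.json`. Below is a rough sketch of writing captured per-frame parameter samples as linear curves in a `motion3.json`-style structure. The field names are based on the published format, but treat the details as an assumption and verify against the official specification before loading the result in the Editor.

```python
def frames_to_motion3(samples, fps=30.0):
    """Convert captured parameter samples to a motion3.json-style dict.

    `samples` maps a parameter id to a list of per-frame values, e.g.
    {"ParamAngleX": [0.0, 5.0, 10.0]}. All curves use linear segments
    (segment type 0): the first point is [time, value], and each later
    point is written as [0, time, value].
    """
    curves = []
    total_segments = 0
    total_points = 0
    duration = 0.0
    for param_id, values in samples.items():
        segments = [0.0, values[0]]
        for i, v in enumerate(values[1:], start=1):
            segments += [0, i / fps, v]  # 0 = linear segment
        curves.append({"Target": "Parameter", "Id": param_id,
                       "Segments": segments})
        total_segments += len(values) - 1
        total_points += len(values)
        duration = max(duration, (len(values) - 1) / fps)
    return {
        "Version": 3,
        "Meta": {
            "Duration": duration,
            "Fps": fps,
            "Loop": False,
            "CurveCount": len(curves),
            "TotalSegmentCount": total_segments,
            "TotalPointCount": total_points,
        },
        "Curves": curves,
    }
```

The returned dict can be serialized with `json.dumps` and saved as `<name>.motion3.json`. That is the "somehow transform your facial data into a motion file" step: nothing in the tracking side changes, you just record the per-frame parameter values instead of applying them live.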
