 
How to make character respond to touch?

edited December 2023 in Help
How do I make my Live2D character respond to touch? E.g. when I touch or click on the character's body, he reacts the way I want.

Comments

  • edited February 2016
    Hi,

    I take it that your question is about the SDK for Unity. There are basically two things to do, although the first one is optional.

    1) Add special parts to your model that serve as hitboxes. The manual contains a section on that topic (Editor Function -> Cubism Modeler -> Special model -> 02. there is a judgment per model). Although the machine translation is rough, it hopefully gives you an idea.

    2) Implement your own intersection test algorithm on top of the SDK, because the current SDK doesn't provide such functionality. Basically, you use the SDK to get the current positions of the parts you want to check for touches and then test whether those positions overlap with any touches. As this is not that straightforward, I wrote up a small script as a starting point:
  • Seems the script doesn't fit in one post so I have to split it up... :'(
  • edited February 2016
```csharp
using live2d;
using System;
using System.Collections.Generic;
using UnityEngine;

/// <summary>
/// [2016-02-03]
///
/// This behaviour showcases functionality provided by the Live2D SDK for Unity 2.1.x
/// to check whether a parts of a model has been touched.
/// The implementation is brute-force and totally unoptimized,
/// but should be documented enough to serve as a starting point.
/// </summary>
public class ModelTouchTrigger : MonoBehaviour
{
    /// <summary>
    /// The name of the parts to check for touches.
    /// Must exactly match the ID of the Live2D model parts, e.g. 'PARTS_01_BODY_001', etc.
    /// </summary>
    public string PartsID = null;

    /// <summary>
    /// The Live2D model instance the parts is attached to.
    /// You have to (1) set this variable yourself after the behaviour has been instantiated
    /// and (2) make sure that the model you are referencing belongs to a component
    /// attached to the same game object.
    /// </summary>
    [NonSerialized]
    public Live2DModelUnity Live2DModel = null;

    /// <summary>
    /// Update is called once per frame by Unity.
    ///
    /// This function checks whether the parts intersects with any touches
    /// and handles the intersection events, if any.
    /// </summary>
    private void Update()
    {
        // Return early in case no Live2D model is loaded or no parts is specified.
        if (string.IsNullOrEmpty(PartsID) || Live2DModel == null)
        {
            return;
        }

        // Get touches.
        var touchPositions = new List<Vector2>();

        foreach (var touch in Input.touches)
        {
            // Here we pretend to be only interested in new touches,
            // so skip any not-new touches.
            if (touch.phase != TouchPhase.Began)
            {
                continue;
            }

            // Transform the touch position from screen space to model space.
            // This will only work as long as the Live2D model belongs to the same game object.
            var touchWorldPosition = Camera.main.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y));
            var touchLocalPosition = transform.InverseTransformPoint(touchWorldPosition);
            var touchModelPosition = TransformLocalToModelPoint(touchLocalPosition);

            // Register the touch by adding its position.
            touchPositions.Add(touchModelPosition);
        }

        // Treat a left mouse button click as a touch when running in the editor, just for convenience.
        if (Application.isEditor && Input.GetMouseButtonDown(0))
        {
            var touchWorldPosition = Camera.main.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y));
            var touchLocalPosition = transform.InverseTransformPoint(touchWorldPosition);
            var touchModelPosition = TransformLocalToModelPoint(touchLocalPosition);

            touchPositions.Add(touchModelPosition);
        }

        // Return early in case there are no touch events we are interested in.
        if (touchPositions.Count == 0)
        {
            return;
        }

        // Get the index of the parts.
        var partsIndex = Live2DModel.getPartsDataIndex(PartsID);

        // Return early in case the parts index is invalid.
        if (partsIndex == -1)
        {
            return;
        }

        // Get the parts data.
        var parts = Live2DModel.getModelImpl().getPartsDataList()[partsIndex];

        // Loop through the draw data associated with the parts data.
        // Besides other data, the draw data contains the transformed vertices of the parts,
        // i.e. the result of the update() function called on a Live2D model.
        foreach (var drawData in parts.getDrawData())
        {
            // Pretty ugly: to get the transformed vertices, we need to derive the draw index from the data.
            var drawDataID = drawData.getDrawDataID();
            var drawIndex = Live2DModel.getModelContext().getDrawDataIndex(drawDataID);

            // Get the transformed vertices.
            var transformedVertices = Live2DModel.getTransformedPoints(drawIndex);

            // Perform intersection tests and handle intersections.
            foreach (var touchPosition in touchPositions)
            {
                // Perform a simple AABB intersection test.
                var doesAABBContainPoint = DoesAABBContainPoint(transformedVertices, touchPosition);

                // In case you want finer-grained intersection tests,
                // you could implement a triangle-based intersection test here
                // by 'reconstructing' the triangles from the vertices,
                // but we skip that here.
                // You can get the vertex indices you'd need by calling:
                //
                //     var vertexIndices = Live2DModel.getIndexArray(drawIndex);

                // Handle touches.
                var isTouched = doesAABBContainPoint;

                if (isTouched)
                {
                    // Here we simply send a message to the game object itself and return.
                    gameObject.SendMessage("OnModelTouched", PartsID);

                    return;
                }
            }
        }
    }

    /// <summary>
    /// Transforms a point from local space to Live2D model space.
    /// </summary>
    /// <param name="localPoint">The point to transform.</param>
    /// <returns>A point transformed to Live2D model space.</returns>
    private Vector2 TransformLocalToModelPoint(Vector3 localPoint)
    {
        // Assume that the model matrix is the same as in the 'simple' project included in the SDK.
        var modelMatrix = Matrix4x4.Ortho(0, Live2DModel.getCanvasWidth(), Live2DModel.getCanvasWidth(), 0.0f, 0.0f, 0.0f);

        // Inverse-transform the position.
        var modelPointX = (localPoint.x - modelMatrix.m03) / modelMatrix.m00;
        var modelPointY = (localPoint.y - modelMatrix.m13) / modelMatrix.m11;

        return new Vector2(modelPointX, modelPointY);
    }

    /// <summary>
    /// Constructs an AABB from the point cloud and
    /// checks whether the point is contained within the constructed AABB.
    /// </summary>
    /// <param name="pointCloud">The points to construct the AABB from.</param>
    /// <param name="point">The point to check for containment.</param>
    /// <returns>True in case the point is contained within the constructed AABB; false otherwise.</returns>
    private bool DoesAABBContainPoint(float[] pointCloud, Vector2 point)
    {
        // Return early in case the transformed vertices seem invalid.
        if (pointCloud == null || pointCloud.Length == 0)
        {
            return false;
        }

        // Construct the sides of the AABB from the transformed points.
        // (Live2D model space has the y-axis pointing down, so 'top' holds the smaller y value.)
        float left = Live2DModel.getCanvasWidth();
        float right = 0.0f;
        float top = Live2DModel.getCanvasHeight();
        float bottom = 0.0f;

        for (var i = 0; i < pointCloud.Length; i += 2)
        {
            float x = pointCloud[i];
            float y = pointCloud[i + 1];

            if (x < left) { left = x; }
            if (x > right) { right = x; }
            if (y < top) { top = y; }
            if (y > bottom) { bottom = y; }
        }

        // Return false in case the point lies outside of a boundary.
        if (point.x < left) { return false; }
        if (point.x > right) { return false; }
        if (point.y > bottom) { return false; }
        if (point.y < top) { return false; }

        // Arriving here means there is an intersection.
        return true;
    }
}
```
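The per-part hit test in the script above boils down to building an axis-aligned bounding box from a flat `[x0, y0, x1, y1, ...]` vertex array and checking whether the touch point falls inside it. A minimal, Unity-independent sketch of just that logic, written in plain Python for easy checking (the function name is my own, not an SDK API):

```python
def aabb_contains_point(point_cloud, point):
    """Build an AABB from a flat [x0, y0, x1, y1, ...] vertex array
    and test whether point (x, y) lies inside it.

    Taking min/max directly from the data makes this work regardless
    of whether the y-axis points up or down."""
    if not point_cloud or len(point_cloud) < 2:
        return False

    xs = point_cloud[0::2]  # every even index: x coordinates
    ys = point_cloud[1::2]  # every odd index:  y coordinates

    x, y = point
    return min(xs) <= x <= max(xs) and min(ys) <= y <= max(ys)


# Example: a quad spanning (10, 20) to (30, 60), stored as x/y pairs.
quad = [10.0, 20.0, 30.0, 20.0, 30.0, 60.0, 10.0, 60.0]

print(aabb_contains_point(quad, (15.0, 30.0)))  # inside  -> True
print(aabb_contains_point(quad, (35.0, 30.0)))  # outside -> False
```

Note that, just like the C# version, this accepts any point inside the bounding box, not only points inside the actual mesh; a triangle-based test would be needed for pixel-accurate hits.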
  • Hi Andi, I read through the manual (02. there is a judgment per model) and I think I understand it. But after adding hitboxes to my character and labelling them as e.g. D_REF.HEAD & D_REF.BODY, what should I do then? Do I save the character file as .cmox and then open the file in Cubism Animator to add the movements? How do I link D_REF.HEAD & D_REF.BODY to my movements? E.g., when I touch the character's head, she will smile, and when I touch her body she will shake her head?



  • Hi,

    Sorry for answering late. Most of the projects that use the approach introduced in the tutorial use static parts for hit detection, so they don't follow any animations. If you want to make the parts movable, you'd either have to (a) parent the hit-detection deformer to the deformer you want it to follow (or the other way round), or (b) animate the parts for hit detection, as you said.
    However, for the snippet I posted you don't need to prepare special parts for hit detection: you can also perform hit detection with your "normal" parts. This, however, might end up costly. So you could try the following:

    - Try out the snippet without preparing simplified parts for hit detection, and in case performance is acceptable, just go with that.

    - In case performance is not acceptable, try to optimize the snippet.

    - And in case performance is still not acceptable, prepare simplified parts as described in the tutorial and animate them if and as necessary.

    I hope this clears things up a bit.
  • Hi, thanks for your reply. Just to confirm that you understood what I was saying, I would like my character to behave like this when I touch them, for example:
    https://www.youtube.com/watch?v=AhRMi0hQVz0
    https://www.youtube.com/watch?v=aj6ma3sNXbE

    By the way, what is a 'snippet'? Is it referring to the code that you wrote earlier?
  • Yes, the 'snippet' was referring to the code, sorry :) .
    To achieve the effect as in the second clip (touching the head triggers a different reaction than touching the body, I guess), you could use the behavior I posted like this: you attach the behavior twice to your character. In one you set the parts ID to the body part, in the other you set it to the head part. If one of the parts is clicked, the corresponding behavior will send a message to the game object (http://docs.unity3d.com/ScriptReference/GameObject.SendMessage.html). Then you attach a third behavior to handle touches. This behavior must have a method with the following signature:

    void OnModelTouched(string partsID)

    It shouldn't matter whether the method is private or not. In this method you can then handle touches like this:

    {
        if (partsID == BodyID)
        {
            DoThis();
        }
        else if (partsID == FaceID)
        {
            DoThat();
        }
    }
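    The dispatch logic in that handler, sketched outside Unity in plain Python (the parts IDs and reaction names here are placeholders, not SDK identifiers):

```python
# Map parts IDs to reactions, mirroring the if/else chain in OnModelTouched.
# Both the IDs and the reaction names are made-up placeholders.
reactions = {
    "PARTS_01_FACE_001": lambda: "smile",
    "PARTS_01_BODY_001": lambda: "shake_head",
}

def on_model_touched(parts_id):
    """Handle a 'model touched' message by looking up the reaction for the parts ID."""
    reaction = reactions.get(parts_id)
    return reaction() if reaction else None

print(on_model_touched("PARTS_01_FACE_001"))  # -> smile
print(on_model_touched("PARTS_01_UNKNOWN"))   # -> None
```

    A table lookup like this scales a bit better than a growing if/else chain once you have several touchable parts, but either form works.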

    There should be, of course, better ways to do the whole thing, but it should work and is maybe good enough.