Mikoverse Wiki

Motion capture is not necessary to have fun in Mikoverse, but it certainly adds to the experience for you and everyone you're with!

The intent of this guide is to index the different solutions available and provide context on how to use each one in Mikoverse. There is no one-size-fits-all solution; which one fits you will depend on your comfort level with each of them.

If you need help with any of the motion tracking, please swing by the Mikoverse Discord #tracking-discussion channel.

Live-Link Face Capture[]

Live Link is the quickest way to improve your experience: players can see each other talk with much higher-definition facial motion. It's also possible to do face motion capture over VMC Protocol with the other solutions further below.

There is currently an issue with imported avatars and networked LiveLink, but it will work with avatars registered with Mikoverse.

LiveLink Face (iOS only)[]

The minimum iPhone model is the iPhone X.[1] This is the lowest-latency Live Link solution.

  1. Download and install Live Link Face
  2. Make sure your iPhone and the Mikoverse PC are on the same network
  3. On your Mikoverse PC
    1. Find your local IPv4 address by opening Command Prompt (type 'cmd' in the Start menu) and running 'ipconfig'
    2. Look for your 'IPv4 Address'. This should start with 192.168. or 10.0.
    3. Take note of the address
  4. In Live Link:
    1. Start Live Link Face, go to the gear icon on the top left
    2. Look for 'Live Link' under 'Streaming'. Tap into it.
    3. Input target IP as your Mikoverse IP
    4. Head back to the camera screen and make sure the 'LIVE' box is green. If not, tap it.
  5. To re-center where your avatar is looking, press C (with gizmo mode off)
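
If you'd rather script the IP lookup, here is a small Python sketch that asks the OS which local IPv4 address it would use for outbound traffic (the 8.8.8.8 address is only a routing placeholder; no packets are actually sent):

```python
import socket

def local_ipv4() -> str:
    """Return the IPv4 address the OS would use for outbound traffic.

    Connecting a UDP socket sends no packets; it only makes the OS
    pick the local interface and address for that route.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.connect(("8.8.8.8", 80))  # any routable address works as a target
        return s.getsockname()[0]

print(local_ipv4())  # e.g. 192.168.1.23
```

The address printed should match the 'IPv4 Address' line from ipconfig.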

Troubleshooting

  • Live Link Face may require your Mikoverse PC to be set to "Private Network"
  • Live Link will only transmit to the first application that asks for it, meaning that if you have a UE project open using LiveLink, Mikoverse will not be able to use it. Close the other app using LiveLink and Mikoverse should be able to connect.
MeowLL (Android)[]

MeowFace: https://suvidriel.itch.io/meowface

"With this utility you can easily convert iOS and Android MeowFace (iFacialMocap) motion data to LiveLink used by Unreal Engine. This should help those whom use Android with MikoVerse facial and head tracking." -Ultimationzzz

https://github.com/Ultimationzzz/MeowLL

NOTE: If you're doing a side-by-side comparison with LiveLink and MeowLL, MeowLL may not work until you restart.

MediaPipe LiveLink[]

Main Discord Topic: https://discord.com/channels/1171719466732748912/1212624513234112512/1212624513234112512

https://github.com/mogue/MediaPipe-LiveLink-Python

  1. Download windows EXE (202mb) MogueMotionCapture.exe
  2. Run the MogueMotionCapture.exe
  3. If "Windows protected your PC" click "more info" and "Run anyway"
  4. Press the play button to begin tracking
  5. Make sure your face is being captured in the preview window
  6. Smile in MikoVerse!

Press the ESC key to close the application. You can move MogueMotionCapture.exe anywhere on your PC. A MogueMotionCapture.ini file will be saved in the same directory as the exe when you exit; it contains some default configurations that can be changed.

Note: running the application from Python loads faster and is recommended if you are familiar with Python
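
The generated ini can also be read or tweaked programmatically with Python's configparser. The section and key names below are hypothetical stand-ins; open the generated MogueMotionCapture.ini to see the real option names:

```python
import configparser

# NOTE: "capture", "camera_index" and "show_preview" are hypothetical
# keys for illustration; check MogueMotionCapture.ini for the real ones.
config = configparser.ConfigParser()
config.read("MogueMotionCapture.ini")  # silently skipped if the file is missing
camera_index = config.getint("capture", "camera_index", fallback=0)
show_preview = config.getboolean("capture", "show_preview", fallback=True)
print(camera_index, show_preview)
```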

Body Tracking[]

Please also note that if you're using VMC Protocol full body tracking, you will "slide" in Mikoverse while moving with mouse and keyboard. However, it is possible to use a treadmill to simulate walking in Mikoverse.

Webcam-based Tracking[]

  • Most of these solutions use AI tracking. Results will vary and will require tuning, and all are a little jittery
  • Face capture may be possible with these applications, but the system is different from LiveLink.
  • In all cases that use VMC Protocol, VMC Protocol needs to be enabled in Mikoverse. Head to Actions > Search for 'Enable VMC'
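
Under the hood, VMC Protocol is OSC messages sent over UDP (the conventional receiver port is 39539). As background, here is a minimal Python sketch of how a single bone-transform message is encoded; real senders usually batch several messages into an OSC bundle per frame:

```python
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, NUL-terminated, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_pos(name: str, pos, rot) -> bytes:
    """Build a /VMC/Ext/Bone/Pos message: bone name, position (x, y, z)
    and rotation quaternion (x, y, z, w), floats big-endian."""
    return (osc_string("/VMC/Ext/Bone/Pos")
            + osc_string(",sfffffff")   # type tags: 1 string, 7 floats
            + osc_string(name)
            + struct.pack(">7f", *pos, *rot))

# To actually send it to a VMC receiver on this machine:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(vmc_bone_pos("Hips", (0, 1, 0), (0, 0, 0, 1)),
#             ("127.0.0.1", 39539))
```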
| Feature | Webcam Motion Capture | VSeeFace | XR Animator |
|---|---|---|---|
| Price | Paid | Free | Free |
| User Profiles | Yes | Yes, but only when models are named differently | 'Streamer Mode' loads last session's settings |
| Built-in Animations | Idle Animations | No | Yes, 30+ |
| Custom Animations | Idle Animations | No | Drag-and-drop FBX animations, MP4 videos |
| VMC Protocol Receiver/Remixing | One | Up to two | No |
| Face Tracking | Yes | Yes | Yes |
| Hand + Finger Tracking | Yes | With Leap Motion | Yes |
| Upper Body Settings | Yes, with filtering | No body tracking built in | Yes, advanced settings |
| Full Body Settings | Yes, with filtering | No body tracking built in | Yes, advanced settings |
| Mobile App Support | MeowFace, Facemotion3d, iFacialMocap | iFacialMocap, FaceMocap3d, VTube Studio | |
| First-Party Documentation | Yes | | |
Webcam Motion Capture[]

https://webcammotioncapture.info/

Highlights:

  • UI is generally better laid out.
  • Webcam tracking has a good out of the box experience
  • User Profiles for quick switching

This is a paid solution, but has a free 'demo'. Payment options include monthly subscriptions, pre-paid, or lifetime.

'Idle Animations' means that if the app cannot detect a specific limb, it will return that limb to the idle animation. However, if an idle animation has a 'twist' (e.g. a dance animation), upper body tracking should be turned off (Advanced Tracking Settings > Enable/Disable Tracking > uncheck Head + Upper Body).

Main Documentation:

https://webcammotioncapture.info/manual.php

Quick Setup:

  1. Startup, Login
  2. In Webcam Motion Capture window:
    1. Select the camera you wish to use
    2. Place webcam as you need it
    3. Change webcam resolution as needed (lower resolution for better performance)
    4. 'Hide Webcam' if you need to.
  3. In Webcam Motion Receiver window:
    1. On Top left, Click 'Load VRM' and select your VRM (default is generally good enough for bipedal humanoid VRMs)
    2. Turn on 'Send to External App' at the top
    3. On the left, Click 'Start'
  4. In Mikoverse:
    1. Go to Actions > "Enable VMC" ("Enable VMC Pose only" if using LiveLink)
VSeeFace[]

https://www.vseeface.icu

Highlights:

  • Can receive multiple VMC Protocol apps
  • 'VSeeFace Body Parts' settings let you specify which VMC Protocol receiver (or VSeeFace itself) controls each bone
  • Has Leap Motion Support

VSeeFace does not have native body tracking, but it can capture face motion and supports up to two VMC Protocol receivers.

When combining multiple solutions, it's recommended to have VSeeFace act as the main output to Mikoverse. For instance, you can have XR Animator output to a specific port and have VSeeFace receive the VMC Protocol data from XR Animator.

You can also control which body part is driven by which VMC Protocol receiver.

XR Animator[]

https://github.com/ButzYung/SystemAnimatorOnline

Highlights:

  • Drag and Drop FBX animations
  • MP4 video animations
  • Has In-depth settings to adjust almost everything
  • UI is terrible

Has a streamer mode that loads the previous session's settings for quick startup.

FBX animation files mostly work, but some will need to be run through Unreal Editor and re-exported.

MP4 video files can be used once activated in the menu

Lots of tooltips, but the UI is difficult to navigate

Quick Setup

  1. Launch, Drag-and-drop VRM avatar file, Click 'Start'
  2. Webcam/Media
    1. Double-click 📷Webcam icon
    2. Click 'Yes' to enable selfie camera
    3. Select your webcam
      1. In case of using Local Media, you can drag-and-drop MP4 videos and the app will attempt to track the person in the video
  3. Motion Capture
    1. Double-click 🧍‍♂️Person icon
    2. Select your preferred motion capture mode (generally Body + Hands)
  4. Turn on VMC Protocol
    1. Double-click 'VMC' icon
    2. Click "VMC-protocol: OFF". It should switch to "VMC-protocol: ON".
  5. Posing
    1. Double click 🕺 Dancing Icon
    2. Select a default pose
      1. Browsing poses can be a pain. Take some time to familiarize yourself with how to select a pose
      2. When the motion capture can't detect a specific body part, the body part returns to the selected pose.
  6. In Mikoverse:
    1. Go to Actions > "Enable VMC" ("Enable VMC Pose only" if using LiveLink)

To use FBX animations

  1. Make sure Mocap is off.
    1. To turn off Mocap, double-click the icon again and select turn off.
  2. Drag-and-drop the FBX animation directly into XR animator.
    1. Known working FBX animations are available on Mixamo. If the FBX animation doesn't work, the animation may need to be run through Unreal Editor and re-exported.

Luppet

WIP

Waidayo

WIP

VCam

WIP

HANA_Tool

WIP

VR Tracker based[]

With Headset: VR Support

Virtual Motion Capture

https://akira.works/VirtualMotionCapture-en/

Has ability to work with VR

SteamVR

WIP

Meta

WIP

Slime VR

Buy it fully assembled, or buy a kit to DIY: https://www.crowdsupply.com/slimevr/slimevr-full-body-tracker

Full Body Tracking Sensors[]

Xsens[]

Professional grade, but it has native support within Mikoverse

Rokoko

WIP

Other Trackers[]

Ultraleap Leap Motion Controller 2 (Hand Only)[]

Native Leap Motion for Mikoverse is broken.

The workaround is to use VSeeFace and route the tracking data to Mikoverse over VMC Protocol.