Motion capture is not necessary to have fun in Mikoverse, but it certainly adds to your experience, and to that of everyone you're with!
The intent of this guide is to index the different solutions available and provide context on how to use them in Mikoverse. There is no one-size-fits-all solution; which one fits you will depend on your comfort level with the options available.
If you need help with any of the motion tracking, please swing by the Mikoverse Discord #tracking-discussion channel.
Live-Link Face Capture[]
Live Link is a quick way to improve the experience: players can see each other talk in higher definition. It is also possible to use the VMC Protocol for face motion capture with the other solutions further below.
There is currently an issue with imported avatars and networked LiveLink, but it will work with avatars registered with Mikoverse.
LiveLink Face (iOS only)[]
The minimum iPhone model is the iPhone X.[1] This is the lowest-latency Live Link solution.
- Download and install Live Link Face
- Make sure your iPhone and your Mikoverse PC are on the same network
- On your Mikoverse PC
- Find your local IPv4 address: open Command Prompt (type 'cmd' in the Start menu) and run 'ipconfig'
- Look for your 'IPv4 Address'. It should start with 192.168. or 10.0.
- Take note of the address
- In Live Link:
- Start Live Link Face, go to the gear icon on the top left
- Look for 'Live Link' under 'Streaming'. Tap into it.
- Input target IP as your Mikoverse IP
- Head back to the camera screen and make sure the 'LIVE' box is green. If not, tap it.
- To re-center where your avatar is looking, press C (with gizmo mode off)
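If you prefer not to dig through ipconfig output, the same IPv4 address from the setup steps above can be found programmatically. A minimal Python sketch (the 8.8.8.8 address is only used to pick the outbound interface; no traffic is actually sent):

```python
import socket

def local_ipv4() -> str:
    """Return this PC's LAN IPv4 address (the one ipconfig reports)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # connect() on a UDP socket sends nothing; it only selects
        # which local interface/address would be used for this route.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No default route (e.g. offline): fall back to hostname lookup.
        return socket.gethostbyname(socket.gethostname())
    finally:
        s.close()

print(local_ipv4())
```

The printed address is what you enter as the target IP in Live Link Face.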
Troubleshooting
- Live Link Face may require your Mikoverse PC to be set to "Private Network"
- Live Link will only transmit to the first application that asks for it. If you have a UE project open that uses LiveLink, Mikoverse will not be able to use LiveLink; close the app using LiveLink and Mikoverse should be able to connect.
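One way to check whether packets are reaching the PC at all is to listen on Live Link Face's default target port (11111) while Mikoverse and any UE project are closed, since only one application can use the stream at a time. A hedged Python sketch:

```python
import socket

def wait_for_livelink(port: int = 11111, timeout: float = 5.0):
    """Wait for one UDP packet on the given port.

    Returns the sender's (ip, port) if anything arrives, or None on
    timeout (then check the target IP, network, and Windows firewall).
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    sock.settimeout(timeout)
    try:
        _data, addr = sock.recvfrom(2048)
        return addr
    except socket.timeout:
        return None
    finally:
        sock.close()
```

Run `wait_for_livelink()` with the phone's 'LIVE' box green; a non-None result means the iPhone's packets are arriving and the problem is on the Mikoverse side.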
MeowLL (Android)[]
MeowFace: https://suvidriel.itch.io/meowface
"With this utility you can easily convert iOS and Android MeowFace (iFacialMocap) motion data to LiveLink used by Unreal Engine. This should help those who use Android with MikoVerse facial and head tracking." -Ultimationzzz
https://github.com/Ultimationzzz/MeowLL
NOTE: If you're doing a side-by-side comparison with LiveLink and MeowLL, MeowLL may not work until you restart.
MediaPipe LiveLink[]
Main Discord Topic: https://discord.com/channels/1171719466732748912/1212624513234112512/1212624513234112512
https://github.com/mogue/MediaPipe-LiveLink-Python
- Download the Windows EXE (202 MB): MogueMotionCapture.exe
- Run the MogueMotionCapture.exe
- If "Windows protected your PC" click "more info" and "Run anyway"
- Press the play button to begin tracking
- Make sure your face is being captured in the preview window
- Smile in MikoVerse!
Press the ESC key to close the application. You can relocate MogueMotionCapture.exe anywhere on your PC. When you exit, a MogueMotionCapture.ini file is saved in the same directory as the exe; it contains some default configuration values that can be changed.
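Assuming MogueMotionCapture.ini uses a standard INI layout (the section names and keys are whatever the app wrote; none are documented here), you can inspect it with Python's configparser before editing anything:

```python
import configparser

config = configparser.ConfigParser()
# The .ini is written next to MogueMotionCapture.exe when the app exits.
found = config.read("MogueMotionCapture.ini")
if not found:
    print("No .ini yet: run and exit the app once to generate it.")
for section in config.sections():
    for key, value in config[section].items():
        print(f"[{section}] {key} = {value}")
```

This only lists the current values; change them in a text editor while the app is closed so your edits are not overwritten on exit.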
Note: running the application from Python loads faster and is recommended if you are familiar with Python
Body Tracking[]
Please also note that if you're using VMC Protocol full-body tracking, you will "slide" in Mikoverse while moving with mouse and keyboard. It is, however, possible to use a treadmill to simulate walking in Mikoverse.
Webcam-based Tracking[]
- Most of these solutions use AI tracking. Results will vary and will require tuning; all are a little jittery.
- Face capture may be possible with these applications, but the system is different from LiveLink.
- In all cases that use the VMC Protocol, it needs to be enabled in Mikoverse: head to Actions > search for 'Enable VMC'
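Under the hood, the VMC Protocol is just OSC messages over UDP. As a sketch of what the apps below are sending once 'Enable VMC' is on, here is a hand-built /VMC/Ext/Bone/Pos message (bone name, position, rotation quaternion) fired at 39539, the receiving-side port from the VMC Protocol spec; whether Mikoverse listens on exactly that port is an assumption here:

```python
import socket
import struct

def osc_str(s: str) -> bytes:
    """Encode a string the OSC way: NUL-terminated, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_pos(bone, px, py, pz, qx, qy, qz, qw) -> bytes:
    """Build a /VMC/Ext/Bone/Pos OSC message: one bone transform."""
    return (osc_str("/VMC/Ext/Bone/Pos")
            + osc_str(",sfffffff")       # type tags: string + 7 floats
            + osc_str(bone)
            + struct.pack(">7f", px, py, pz, qx, qy, qz, qw))

# Identity-rotation head pose at 1.6 m, sent to a local VMC receiver.
msg = vmc_bone_pos("Head", 0.0, 1.6, 0.0, 0.0, 0.0, 0.0, 1.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 39539))
sock.close()
```

The apps in the comparison table do exactly this, many times per second, for every tracked bone and blendshape.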
| | Webcam Motion Capture | VSeeFace | XR Animator |
| --- | --- | --- | --- |
| Price | Paid | Free | Free |
| User Profiles | Yes | Yes, but only when models are named differently | 'Streamer Mode' loads last session's settings |
| Built-in Animations | Idle animations | No | Yes, 30+ |
| Custom Animations | Idle animations | No | Drag-and-drop FBX animations, MP4 videos |
| VMC Protocol Receiver/Remixing | One | Up to two | No |
| Face Tracking | Yes | Yes | Yes |
| Hand + Finger Tracking | Yes | With Leap Motion | Yes |
| Upper Body Settings | Yes, with filtering | No body tracking built in | Yes, advanced settings |
| Full Body Settings | Yes, with filtering | No body tracking built in | Yes, advanced settings |
| Mobile App Support | MeowFace, Facemotion3d, iFacialMocap | iFacialMocap, FaceMocap3d, VTube Studio | |
| First-Party Documentation | Yes | | |
Webcam Motion Capture[]
https://webcammotioncapture.info/
Highlights:
- UI is generally better laid out.
- Webcam tracking has a good out of the box experience
- User Profiles for quick switching
This is a paid solution, but has a free 'demo'. Payment options include monthly subscriptions, pre-paid, or lifetime.
'Idle Animations' means that if the app cannot detect a specific limb, it returns that limb to the idle animation. However, if an idle animation has a 'twist' (e.g. a dance animation), upper body tracking should be turned off (Advanced Tracking Settings > Enable/Disable Tracking > uncheck Head + Upper Body)
Main Documentation:
https://webcammotioncapture.info/manual.php
Quick Setup:
- Start the app and log in
- In Webcam Motion Capture window:
- Select the camera you wish to use
- Place webcam as you need it
- Change webcam resolution as needed (lower resolution for better performance)
- 'Hide Webcam' if you need to.
- In Webcam Motion Receiver window:
- On Top left, Click 'Load VRM' and select your VRM (default is generally good enough for bipedal humanoid VRMs)
- Turn on 'Send to External App' at the top
- On the left, Click 'Start'
- In Mikoverse:
- Go to Actions > "Enable VMC" ("Enable VMC Pose only" if using LiveLink)
VSeeFace[]
Highlights:
- Can receive data from multiple VMC Protocol apps
- Lets you control which body part is driven by which VMC Protocol receiver, or by VSeeFace itself
- Has Leap Motion support
VSeeFace does not have native body tracking, but it can capture face motion and supports two VMC Protocol receivers.
It is recommended to combine solutions so that VSeeFace acts as the main output to Mikoverse. For instance, you can have XR Animator output to a specific port and have VSeeFace receive the VMC Protocol data from it.
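The chain described above is just UDP packets hopping between ports. If two tools ever disagree on port numbers, a tiny relay can bridge them; a sketch (both port numbers are hypothetical examples, not documented defaults for these apps):

```python
import socket

def relay_vmc(in_port, out_addr, max_packets=None):
    """Forward VMC/OSC UDP packets unchanged from in_port to out_addr.

    Returns the number of packets forwarded; runs forever when
    max_packets is None.
    """
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("0.0.0.0", in_port))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    forwarded = 0
    try:
        while max_packets is None or forwarded < max_packets:
            packet, _sender = rx.recvfrom(4096)
            tx.sendto(packet, out_addr)
            forwarded += 1
    finally:
        rx.close()
        tx.close()
    return forwarded

# Example: XR Animator configured to send to 39540; VSeeFace listening
# on 39539 (hypothetical setup): relay_vmc(39540, ("127.0.0.1", 39539))
```

In practice VSeeFace's own receiver settings usually make this unnecessary; the sketch is only meant to show that "output to a specific port" is all that connects these tools.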
XR Animator[]
https://github.com/ButzYung/SystemAnimatorOnline
Highlights:
- Drag and Drop FBX animations
- MP4 video animations
- Has In-depth settings to adjust almost everything
- UI is terrible
Has a streamer mode that loads the previous session's settings for quick startup.
FBX animation files mostly work, but some will need to be run through Unreal Editor and re-exported
MP4 video files can be used once activated in the menu
Lots of tooltips, but the UI is difficult to navigate
Quick Setup
- Launch, Drag-and-drop VRM avatar file, Click 'Start'
- Webcam/Media
- Double-click 📷Webcam icon
- Click 'Yes' to enable selfie camera
- Select your webcam
- If using local media, you can drag-and-drop MP4 videos and the app will attempt to track the person in the video
- Motion Capture
- Double-click 🧍♂️Person icon
- Select your preferred motion capture mode (generally Body + Hands)
- Turn on VMC Protocol
- Double-click 'VMC' icon
- Click "VMC-protocol: OFF". It should switch to "VMC-protocol: ON"
- Posing
- Double click 🕺 Dancing Icon
- Select a default pose
- Browsing poses can be a pain. Take some time to familiarize yourself with how to select a pose
- When the motion capture can't detect a specific body part, the body part returns to the selected pose.
- In Mikoverse:
- Go to Actions > "Enable VMC" ("Enable VMC Pose only" if using LiveLink)
To use FBX animations
- Make sure Mocap is off.
- To turn off Mocap, double-click the icon again and select turn off.
- Drag-and-drop the FBX animation directly into XR animator.
- Known working FBX animations are available on Mixamo. If the FBX animation doesn't work, the animation may need to be run through Unreal Editor and re-exported.
Luppet
WIP
Waidayo
WIP
VCam
WIP
HANA_Tool
WIP
VR Tracker based[]
With Headset: VR Support
Virtual Motion Capture
https://akira.works/VirtualMotionCapture-en/
Has ability to work with VR
SteamVR
WIP
Meta
WIP
Slime VR
Buy it fully assembled, or buy a kit to DIY: https://www.crowdsupply.com/slimevr/slimevr-full-body-tracker
Full Body Tracking Sensors[]
Xsens[]
Professional grade, but it has native support within Mikoverse
Rokoko
WIP
Other Trackers[]
Ultraleap Leap Motion Controller 2 (Hand Only)[]
Native Leap Motion support in Mikoverse is broken.
The workaround is to use VSeeFace and route the data through VSeeFace via the VMC Protocol.