

How to become a Vtuber with ‘Virtual Motion Capture’

Tuesday March 12th, 2019 (updated Monday March 18th, 2019)

In this article, I introduce an application called VirtualMotionCapture.
VirtualMotionCapture is an application that controls a 3D model in VRM format with VR devices, and I recommend it especially to those who want to become a ‘Vtuber’.
In addition, since it can be launched simultaneously with a VR game application, you can record an MR video in which the character appears to enter the world of the game by combining the game screen with the VirtualMotionCapture screen.
This time, I explain how to become a Vtuber with VirtualMotionCapture and a space generated by STYLY.

The completion image

Introduction and preparation

Introduction

In order to record an MR video that combines the screens of a VR game and VirtualMotionCapture, you need an MR-overlay application such as LIV, or the MR-overlay function enabled via externalcamera.cfg.
I will explain how to use and set up those applications in the next article, so in this article I introduce the basics of using VirtualMotionCapture.

Preparation

VirtualMotionCapture handles 3D models defined in the file format called ‘VRM’, so you need to prepare a 3D model in VRM format.
In addition to purchased or freely distributed models, you can also use an original VRM model created with tools such as VRoid or the ‘Cecil Henshin’ application.

▼This article explains how to use VRoid▼

We also use a live-streaming application called OBS Studio at the final stage, so download this software as well.
If you prefer other live-streaming software, you may use that instead.

What is VirtualMotionCapture?

It is an application that allows you to control a VRM-format 3D model with a VR device (HTC Vive / Oculus Rift / WinMR), so you can become a so-called Vtuber.
Unlike other common applications that control a model with VR devices, it can control the model while you play a VR game.

The most notable feature of VirtualMotionCapture is that you can launch it simultaneously with a VR game application, which is generally impossible with other applications.
Thanks to this feature, the character moves in sync with the motion of the player in the VR game, so you can livestream the game like a Vtuber.
You can also use it on its own, without a VR game, simply to control a 3D model, which makes it useful for a wide range of applications.

*Note by the translator – VirtualMotionCapture supports English UI. You can select the language by the drop-down box at the bottom of the ‘Settings’ window.

Download VirtualMotionCapture

You can download VirtualMotionCapture for free from BOOTH.
If you are a paid subscription member, a pre-release version can be downloaded from pixiv FANBOX.
This time, I download the free version (Ver 0.23) from BOOTH.

Import VRM model

After launching VirtualMotionCapture, import the VRM model you want to control.
Launching the application also brings up the control panel.

Press the ‘Open VRM’ button on the ‘Settings’ tab to open the ‘file-selection’ window.

Select the VRM file by pressing the ‘Open VRM’ button

Select and open the VRM file prepared earlier.
The license information for the avatar is displayed; if you use a purchased model etc., check it closely before pressing the ‘Agree / Import’ button.

The license information is displayed

If you use an original model created with VRoid etc., the avatar information you set is displayed, so just press the ‘Agree / Import’ button.
If everything goes well, the model will be displayed on the screen.

The model has been imported

The setting for Camera, Lip Sync and Trackers

The setting for Camera

You can set up the Camera by selecting the ‘Camera’ tab.
Press ‘Front’ as we want to use the Front Camera.
Click the main screen and then use the mouse to adjust the position of the Camera.

The setting for the Camera

The setting for Lip Sync

By setting up ‘Lip Sync’, you can animate the avatar’s mouth in sync with the voice input from the microphone.
If no microphone has been specified for Lip Sync, the ‘Lip Sync’ tab is shown in red, so you can easily see whether it has been set up.


The ‘Lip Sync’ tab – The microphone has not been specified.

The ‘Lip Sync’ tab – The microphone has been specified

If you don’t use Lip Sync, you don’t need to set it up.
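Lip sync of this kind is typically driven by microphone loudness: the louder the input frame, the wider the mouth blend shape opens. The sketch below is only an illustrative approximation, not VirtualMotionCapture’s actual implementation, and the gain value is an assumed number you would tune by ear.

```python
import numpy as np

def mouth_openness(samples: np.ndarray, gain: float = 8.0) -> float:
    """Map one frame of microphone samples (floats in [-1, 1]) to a
    mouth-open value in [0, 1] using the frame's RMS amplitude."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    return min(1.0, rms * gain)

# A silent frame keeps the mouth closed; a loud frame opens it fully.
silence = np.zeros(512)
speech = 0.5 * np.sin(np.linspace(0.0, 60.0, 512))
print(mouth_openness(silence))  # 0.0
print(mouth_openness(speech))
```

In a real pipeline this value would be smoothed over a few frames before being written to the avatar’s blend shape, otherwise the mouth flickers.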

The setting for Trackers

VirtualMotionCapture also supports ‘full-body tracking’ with Trackers.
To assign the Trackers, press the ‘Setting’ button on the ‘Settings’ tab.
Next, press the ‘Tracker Config’ button to open the ‘Tracker assignment setting’ window, where you can assign the Trackers.
As we don’t use Trackers this time, leave the default settings.

The ‘Setting’ window

The ‘Tracker assignment setting’ window

Calibration

After finishing the settings, let’s move the model straight away.
Press the ‘Calibration’ button on the ‘Settings’ tab.

The settings for Calibration

There are three calibration modes. This time, select the ‘Normal’ mode at the top.
The calibration starts five seconds after you press the ‘Calibration’ button or pull the trigger of the controller.
Spread your arms, keep the controllers upright and turn your palms forwards.
Although the screen instructs you to ‘take the same pose as the model displayed…’, the angle of the wrists will come out wrong if you point the controllers and your palms downwards.
The white spheres displayed in the main screen represent the positions of the HMD and the controllers.
Adjust the position of the HMD and the controllers so that the spheres sit at the head and hands of the model.

The spheres represent the positions of the HMD and the controllers

You can control the model if the calibration is successful.

The calibration is successful
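Conceptually, a calibration like this estimates how your body maps onto the model: a uniform scale factor plus per-point offsets. The sketch below shows only the scale-factor idea under simplified assumptions; it is not VirtualMotionCapture’s actual algorithm, and the heights are made-up example values.

```python
def calibration_scale(player_head_height_m: float, model_head_height_m: float) -> float:
    """Uniform scale that maps the player's head height (from the HMD)
    onto the model's head height, so the avatar matches your proportions."""
    return model_head_height_m / player_head_height_m

def to_model_space(position, scale):
    """Scale a real-world tracker position (x, y, z) into model space."""
    return tuple(c * scale for c in position)

# Example: a 1.70 m player driving a 1.36 m model.
scale = calibration_scale(1.70, 1.36)
print(round(scale, 2))  # 0.8
print(to_model_space((0.5, 1.70, 0.0), scale))
```

This is also why the pose matters during calibration: the measured positions of the HMD and controllers are the only data the estimate has to work with.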

How to become Vtuber with STYLY’s space

Now you can move the model.
Next, create a live-streaming screen for a Vtuber by using a space created with STYLY as the background.

The setting for background colour

On the ‘Background’ tab, you can change the background colour to a blue screen (‘blueback’), a green screen (‘greenback’) or other colours.
If you select one of those background colours, you can combine the character controlled by VirtualMotionCapture with the image or video you want to use as the background by applying ‘Chroma key’ processing.
You can also select ‘transparent’ as the background of VirtualMotionCapture, which makes it possible to overlay the character on the desktop screen.
This time we use the green background and apply ‘Chroma key’ to the VirtualMotionCapture screen to combine the character with the space created with STYLY.
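‘Chroma key’ processing replaces every pixel close to the chosen key colour with the corresponding background pixel. As a rough illustration of what a chroma-key filter does internally (assuming RGB frames as NumPy arrays; the tolerance value is an arbitrary assumption):

```python
import numpy as np

def chroma_key(fg: np.ndarray, bg: np.ndarray, key=(0, 255, 0), tol=60.0) -> np.ndarray:
    """Replace foreground pixels close to the key colour with background pixels."""
    # Per-pixel distance to the key colour in RGB space.
    dist = np.linalg.norm(fg.astype(float) - np.array(key, dtype=float), axis=-1)
    out = fg.copy()
    out[dist < tol] = bg[dist < tol]
    return out

# 2x2 toy frame: the top row is green screen, the bottom row is the character.
fg = np.array([[[0, 255, 0], [0, 250, 5]],
               [[200, 50, 50], [10, 10, 10]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 255, dtype=np.uint8)
keyed = chroma_key(fg, bg)
print(keyed[0, 0])  # [255 255 255]: the green pixel became background
print(keyed[1, 0])  # the character pixel is untouched
```

Real filters (including the one we use in OBS Studio later) add refinements such as edge smoothing and spill suppression, but the core idea is this per-pixel colour test.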

Select the background colour

Prepare a space with STYLY

Create your favourite space with STYLY Editor.
Since the space is only used as the background, you may create a dummy space by modelling only what the camera can see.
As you can record a video of the space with the Camera function in STYLY Editor, you can also make an animated background.
This time, we use what is shown in the screenshot below.

Combine Character and Space

Finally, we overlay the character on the screenshot of the scene created with STYLY Editor.
We use the live-streaming software called OBS Studio this time. Launch VirtualMotionCapture before launching OBS Studio.
Firstly, we add the image to ‘Sources’.
Press the ‘+’ button below the ‘Sources’ box. Click ‘Image’ from the menu and select ‘Create new’.

Create a new ‘Image’

Then, you can add the background image to ‘Sources’ by selecting the image you want to use in the ‘Properties’ window.

Select the image

Now the image has been added

Next, add the window of VirtualMotionCapture to ‘Sources’.
As we did earlier, press the ‘+’ button below the ‘Sources’ box to display the menu.
This time, click ‘Window Capture’ from the menu and select ‘Create new’.

Select ‘Window Capture’

Create a new ‘Window Capture’


Then, select ‘[VirtualMotionCapture.exe]: VirtualMotionCapture v0.23’ in the Property window.
It adds the window of VirtualMotionCapture to ‘Sources’.

Select the window of VirtualMotionCapture

A Source higher in the list is drawn on top of the Sources below it, so reorder the Sources so that the Window Capture is above the Image.
In this state, only the screen of VirtualMotionCapture is visible and the background image cannot be seen at all, so we apply ‘Chroma key’ to the VirtualMotionCapture window to reveal the background.
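The stacking rule can be pictured as painting the source list from the bottom up, so whatever sits higher in ‘Sources’ ends up in front. A toy sketch of that rule (the pixel dictionaries are purely illustrative, not OBS data structures):

```python
def composite(sources):
    """Paint sources bottom-to-top: a source earlier in the list
    (higher in OBS's 'Sources' box) covers the ones below it."""
    canvas = {}
    for layer in reversed(sources):  # start from the bottom of the list
        canvas.update(layer)         # later (higher) layers overwrite
    return canvas

# Window Capture above Image: where both have a pixel, the capture wins.
window_capture = {"p1": "character", "p2": "character"}
background_image = {"p1": "image", "p3": "image"}
frame = composite([window_capture, background_image])
print(frame["p1"])  # character
print(frame["p3"])  # image
```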

How it looks – before applying ‘Chroma Key’

Right-click the Window Capture in ‘Sources’ and select ‘Filters’ from the menu.
Next, press the ‘+’ button below the ‘Effect Filters’ box and select ‘Chroma Key’ from the menu.

Select ‘Chroma Key’

Select ‘Chroma Key’ in the ‘Effect Filters’ box, and set ‘Key Color Type’ to the same colour as the background selected in VirtualMotionCapture.
Then, the ‘Chroma Key’ filter makes the background colour of VirtualMotionCapture transparent so that the character is combined with the background image in the display.

‘Chroma Key’ filter

How it looks after applying ‘Chroma Key’

It’s completed now. Once you set up the recording and the broadcasting, you will be able to create a video and livestream.

Congratulations!
In this article, I introduced the basics of using VirtualMotionCapture.
Even this basic setup lets you livestream gameplay like a Vtuber straight away.
However, you still need to learn more if you want to create an MR video in which the character appears to enter the world of the game.
Next, I will introduce the application required to record an MR video and more advanced ways to use VirtualMotionCapture.

A virtual YouTuber who specializes in making MR videos with VirtualMotionCapture, using VRM models created with VRoid Studio, the Cecil Henshin application and other tools as avatars. Won the VIVE Award at the ‘Miss/Mister Niconi-rittai Contest 2018’ hosted by DWANGO Co., Ltd.



Using VTuber Maker

About this software

VTuber Maker is the #1 free, simple 3D tool for VTuber content production: all you need is a webcam.

The VTuber Maker tool, powered by Live3D, has helped more and more people create, share and have fun!

The core features of our product:
1) High-performance, high-sensitivity facial-capture technology, even in bright or dim environments
2) Simple and convenient operation for VTuber live broadcasts
3) Widget function for drag-and-drop on the desktop
4) Diversified overlay designs for different scenes
5) Stunning video effects when using preset VTuber avatars tuned by specialists

There are more core features waiting for you to explore.

** Important Note
We support importing VRM avatars, which is done through the VTuber Editor.
We also support model customization: https://live3d.io/vtuber_pricing#pricing

The core products are planned as follows:
VTuber Maker — Free, extremely simple and intuitive panel.
VTuber Editor — Flexible, personalized, programmable panel at a reasonable price.

Our team is a passionate startup that loves visual storytelling for VTubers.
Our mission is to build VTuber software that enables people to easily create virtual live shows and videos.

Thanks for your support!

And welcome to join the VTuber Steam family and give us more of your opinions.




How to set up VUP: VTuber, animation, motion capture, 3D, Live2D


«Tools: VUP, Unity3D 2018.4.9
Get VUP: vlivemax.com (or get VUP on Steam)

Description
VUP is a VTuber tool. With just a webcam or motion-capture equipment, you can make your model synchronize with your expressions and actions.

Get Plugins In Videos
Unity 2018.4.x download address: https://unity.cn/releases
Please select the Unity 2018.4.x series to download.
The version used in the tutorial is Unity 2018.4.9.
The addresses of the plugins used in the tutorial are as follows:
1. The plugin required for importing MMD (pmx/pmd) models: https://share.weiyun.com/5vFbLrq
2. The plugin required for importing fbx models, action data and Unity3D backgrounds: https://share.weiyun.com/5EojTP9
3. The plugin required for importing Live2D models and their actions: https://share.weiyun.com/5PZHUHL


Source Of Models
Miku:TDA
izumi_illust:Live2D Official
Rice:Live2D Official
Ukon(ver.1.21):キツネツキ(kitsune_tsuki)

BGM
China-X—徐梦圆
南锣鼓巷—接个吻,开一枪 _ Clare
锦里—HOPE T»


Using VUP: VTuber & Animation & Motion Capture & 3D & Live2D

About this software

VUP is a VTuber tool based on powerful real-time capture technologies such as face capture and motion capture. It supports resource customization to enrich your creative resources.

VUP has the following characteristics:
(1) Accurate capture technology
VUP supports webcam facial capture, Android phone camera facial capture and high-precision sound capture. A variety of motion-capture interfaces are built in to capture the actor’s motion data in real time and synchronize it to the avatar. It supports full-body, half-body, and arm-and-finger motion capture, and works with most motion-capture systems, such as: Kinect full-body capture, Intel RealSense full-body capture, Noitom Perception Neuron full-body capture and Hi5 gloves, Xsens full-body capture, HTC VIVE full-body capture, IKINEMA Orion VR full-body capture, Virdyn full-body capture and its gloves, FOHEART full-body capture, ChingMu optical motion tracking, Nokov optical motion tracking, Leap Motion finger-and-arm capture, HuanTek finger capture, and VRTRIX finger capture.
(2) Resource customization and cloudization
Supports user-defined upload of character models (pmx / vrm / fbx / model3.json of Live2D), props (pmx / fbx), actions (vmd / fbx) and scenes (3D / png / mp4). Parameters can also be edited to achieve resource diversification, cloudization and sharing.
Note: pmx, vmd and fbx resources need to be converted to VUP format with Unity 2018.4.9 before they can be imported into VUP.
(3) VUP supports customizing emojis, shortcut keys, shots and shot-transition methods.
(4) VUP supports wearing props and customizing poses for avatars.
(5) Multiplayer / online
VUP supports creating and joining rooms and customizing room parameters, enabling multi-person live streaming online.
(6) Barrage interaction
Supports barrage (live-chat) interactions across multiple live-streaming platforms. By setting up ‘barrage interactions’, fans’ chat messages and gifts can be turned into fun animation effects.
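A barrage interaction of this kind is essentially a lookup from incoming chat or gift events to avatar effects. The event names and effect names below are hypothetical, chosen only to illustrate the mapping; VUP’s actual configuration is done through its own UI.

```python
from typing import Optional

# Hypothetical mapping from live-chat ("barrage") events to avatar effects.
EFFECTS = {
    ("gift", "rose"): "heart_particles",
    ("gift", "rocket"): "fireworks",
    ("chat", "hello"): "wave_hand",
}

def handle_event(kind: str, payload: str) -> Optional[str]:
    """Return the animation effect to trigger for an incoming event,
    or None if the event is not configured."""
    return EFFECTS.get((kind, payload))

print(handle_event("gift", "rose"))     # heart_particles
print(handle_event("chat", "weather"))  # None
```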

System Requirements

    Minimum:

    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: Intel Core i5-4570 3.20GHz or equivalent
    • Memory: 8 GB RAM
    • Graphics: Nvidia GeForce GTX 1050Ti or equivalent
    • DirectX: Version 11
    • Network: Broadband Internet connection
    • Storage: 4 GB available space
    • Sound card: DirectX 11.0 compatible sound card
    • Additional notes: For face tracking you need a camera (a common computer camera is sufficient), for motion capture you need a set of motion-capture equipment, and for sound tracking you need a microphone. If you use a laptop, please make sure the charger is plugged in and that you run on the best graphics adapter in high-performance mode. We do not recommend low-end integrated graphics adapters for either desktop or mobile computers.
    Recommended:

    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: Intel Core i7-8700 3.20GHz or equivalent
    • Memory: 16 GB RAM
    • Graphics: Nvidia GeForce RTX 2060 or equivalent
    • DirectX: Version 12
    • Network: Broadband Internet connection
    • Storage: 4 GB available space
    • Sound card: DirectX 12.0 compatible sound card
    • Additional notes: For face tracking you need a camera (a common computer camera is sufficient), for motion capture you need a set of motion-capture equipment, and for sound tracking you need a microphone. If you use a laptop, please make sure the charger is plugged in and that you run on the best graphics adapter in high-performance mode. We do not recommend low-end integrated graphics adapters for either desktop or mobile computers.

While using our products, you must at all times abide by our rules:
1. Do not use illegal accounts;
2. Do not steal others’ models;
3. Do not publish offensive resources;
4. Public use of other people’s resources requires consent.
