How VTube Studio’s NVIDIA integration is making VTubing more accessible

VTubing can be easy, but at the higher end it can be an expensive and time-intensive process. VTube Studio and NVIDIA are looking to change that, with new technology that seeks to make the streaming medium more accessible to everyone.

Getting into VTubing can be as simple as getting a couple of PNG images, connecting them to a Discord bot, and streaming them live on Twitch. However, it can also be a complex combination of Live2D or 3D models, with individual body parts animated and manipulated to follow the streamer’s every move.

That upper echelon of VTubing can be quite difficult to access. It requires high-quality equipment, whether it’s an iPhone or iPad for face tracking, or a computer robust enough to track those same expressions from a webcam. Either way, it’s a huge investment in hardware to keep things running smoothly.

VTube Studio is one of the most popular face-tracking programs for VTubing. The PC program lets users load their model, then uses a webcam or smartphone to track their movements and mirror them on the model.

Denchi, the person behind VTube Studio, knows how resource-intensive the whole medium can be. While it is primarily a PC app, many users choose to run the face tracking on their smartphone and send the tracking data to their computer to reduce CPU and GPU usage.
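That phone-to-PC split works because the heavy lifting (the face tracking itself) happens on the phone, and only a small set of per-frame parameters needs to reach the computer rendering the model. As a rough, hypothetical sketch of the idea, the Python snippet below sends blendshape-style parameters over UDP; the host, port, parameter names, and JSON layout are all illustrative assumptions, not VTube Studio’s actual network protocol.

```python
import json
import socket
import time

# Hypothetical example: stream per-frame face-tracking parameters
# (blendshape-style values, mostly in the 0..1 range) from a phone-side
# tracker to the PC that renders the model.
# The host, port, and parameter names below are illustrative assumptions.

PC_HOST = "192.168.1.50"   # address of the PC running the model
PC_PORT = 21412            # arbitrary UDP port chosen for this sketch

def send_tracking_frame(sock: socket.socket, params: dict) -> None:
    """Serialize one frame of tracking parameters and send it as a UDP datagram."""
    payload = json.dumps({"timestamp": time.time(), "params": params}).encode("utf-8")
    sock.sendto(payload, (PC_HOST, PC_PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # In a real tracker these values would come from the phone's face-tracking
    # framework every frame; here they are hard-coded for demonstration.
    example_frame = {
        "mouth_open": 0.35,
        "eye_left_open": 0.9,
        "eye_right_open": 0.88,
        "head_yaw": -4.2,   # degrees
    }
    send_tracking_frame(sock, example_frame)
```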

“Over the past few years, I’ve tried pretty much every face-tracking framework, but they’re often unstable, highly experimental, or prohibitively expensive,” they told Dexerto.

“Most people use webcam-based face tracking or iOS face tracking right now. The existing webcam face tracking in VTube Studio uses an open-source library called OpenSeeFace, which is already really impressive, especially considering that it’s made from scratch by a single person.

“But both webcam-based tracking and iOS tracking have their problems. Webcam tracking is relatively resource intensive and not as accurate as iOS tracking, while iOS tracking is very accurate and tracks more facial features, but users need an expensive iPhone or iPad to use it.”

However, that barrier to entry is being lowered further with a new collaboration between VTube Studio and NVIDIA. NVIDIA Broadcast’s new face tracking feature reduces the load on GPUs for VTubers looking to keep everything on their computer, and the Live2D program is one of the first to take advantage of it.

It’s been “optimized to run most of the face-tracking AI code… on its high-performance Tensor Cores that all of its RTX-series cards have” – the same thing that makes your AAA games look smooth as silk on PC, but now it can also help with face tracking.

The tracking also looks smoother without affecting performance too much; in fact, it potentially outperforms what’s currently on the market, says Denchi.

“The performance impact will be minimal and tracking can be run with even the most demanding games,” they continued. “NVIDIA face tracking accuracy is also extremely good, coming very close to the quality of current iOS tracking, perhaps even surpassing it in some respects.”

The feature not only helps VTube Studio, but any developer looking to use face tracking on NVIDIA GPUs. It opens up a lot of opportunities for development in the VTubing space that could lower the barrier to entry even further.

It’s a space that NVIDIA is also trying hard to position itself in. Gerardo Delgado Cabrera, a product line manager at NVIDIA Studio working on the new streaming features, said it’s part of long-term plans to help “optimize” the VTubing space.

“As part of NVIDIA Studio, we work with all the major creative apps, as well as upcoming ones,” he told Dexerto. “And one of the hottest development areas in live streaming is VTubing.

“We reached out to all the major VTubing apps months ago and started working with all of them to help them optimize their apps. In fact, improvements have already been shipped via NVIDIA Studio drivers to help with optimization and stability.”

NVIDIA Broadcast face tracking will be released in October, with an update pushed to VTube Studio at the same time. This will help the roughly 30% of users who have RTX GPUs. The update will also be completely free for everyone, and the manufacturer is working with the VTubing community to continually add new features and updates.

This includes a new tool in NVIDIA’s Augmented Reality software development kit, called Facial Expression Estimation, which “helps animate facial meshes that can better convey emotions,” Delgado said.
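Facial Expression Estimation, as Delgado describes it, fits the standard blendshape approach to facial animation: an estimator produces one coefficient per expression each frame, and the mesh is deformed by a weighted sum of per-expression vertex offsets. The snippet below is a minimal, generic sketch of that idea; the array shapes, expression names, and weights are illustrative assumptions, not the NVIDIA AR SDK’s actual interface.

```python
import numpy as np

# Generic blendshape sketch (not the NVIDIA AR SDK's actual API):
# a facial mesh is deformed as base vertices plus a weighted sum of
# per-expression vertex offsets, with the weights coming from an
# expression-estimation step each frame.

def apply_expressions(base_vertices: np.ndarray,
                      blendshape_deltas: np.ndarray,
                      weights: np.ndarray) -> np.ndarray:
    """Deform the mesh: vertices = base + sum_i weights[i] * deltas[i].

    base_vertices:     (V, 3) neutral-face vertex positions
    blendshape_deltas: (N, V, 3) per-expression vertex offsets
    weights:           (N,) expression coefficients, typically in [0, 1]
    """
    return base_vertices + np.tensordot(weights, blendshape_deltas, axes=1)

if __name__ == "__main__":
    # Tiny made-up mesh: 4 vertices, 2 expressions.
    base = np.zeros((4, 3))
    deltas = np.random.default_rng(0).normal(scale=0.01, size=(2, 4, 3))
    # Weights such as these would come from the expression estimator each frame.
    weights = np.array([0.8, 0.2])  # e.g. mostly "smile", a little "jaw open"
    print(apply_expressions(base, deltas, weights))
```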

It comes across as a huge leap for the tech side of the VTubing space, but at the end of the day, it’s only a small part of the experience. There is still a lot of room for growth in terms of what VTubers could become, and Denchi will continue to explore this with VTube Studio.

“I think tracking will definitely improve, but I also think it’s important to remember that tracking is only one aspect of VTubing. Personally, most of the VTubers I see on a regular basis have very basic tracking and often quite simple models.

“At the end of the day, VTubers aren’t really that different from regular streamers. People watch VTubers because they like their personalities and the content they stream. While a good tracking setup can be helpful, nothing can replace a fun personality and engaging streaming content.

“That’s what I want to focus on with VTube Studio. Most of the features I plan to add in the future focus on improving viewer interaction and collaboration with other VTubers. That’s what I personally enjoy the most and also what I think sets VTubers apart from regular streamers.”



Source: www.dexerto.com