NVIDIA has officially launched its eagerly awaited Augmented Reality (AR) Software Development Kit (SDK), optimized for live streaming, gaming, and video conferencing platforms. The technology has debuted in StreamFX, a highly popular OBS plugin, and promises some impressive capabilities, including detecting and locking onto faces and creating a 3D mesh of a face, all using a standard web camera.
The new AR SDK for live streaming brings Artificial Intelligence (AI) capabilities to the increasingly popular remote web conferencing, communication, and collaboration tools, and NVIDIA claims the technology will assist gamers as well. Notably, users can benefit from these AI additions with nothing more than a standard webcam, says NVIDIA.
How Does NVIDIA AI AR SDK For Live Streaming Work?
As expected, the NVIDIA AR SDK, part of the NVIDIA RTX Broadcast Engine, utilizes the powerful and versatile Tensor Cores on NVIDIA RTX GPUs. The SDK enables fast and reportedly high-quality tracking through an exclusive back-end AI network that launched last year. NVIDIA claims the technology can deliver accurate face tracking; facial landmark tracking of face shape, lips, eyes, eyebrows, and nose with 3 degrees of freedom; and an advanced face mesh (a 3D reconstruction of a human face and head pose) with 6 degrees of freedom.
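To make the "6 degrees of freedom" claim concrete: a head pose combines three rotations (yaw, pitch, roll) with three translations. The sketch below is purely illustrative of that idea; the function and parameter names are assumptions, not the AR SDK's actual API.

```python
import math

def head_pose_transform(point, yaw, pitch, roll, tx, ty, tz):
    """Apply a 6-DOF head pose (3 rotations + 3 translations) to a 3D point.

    Illustrative only: a 6-DOF face mesh reports a pose of this shape;
    the names and rotation order here are assumptions, not SDK output.
    """
    x, y, z = point
    # Roll: rotation about the z-axis (tilting the head sideways).
    x, y = (x * math.cos(roll) - y * math.sin(roll),
            x * math.sin(roll) + y * math.cos(roll))
    # Pitch: rotation about the x-axis (nodding up and down).
    y, z = (y * math.cos(pitch) - z * math.sin(pitch),
            y * math.sin(pitch) + z * math.cos(pitch))
    # Yaw: rotation about the y-axis (turning left and right).
    x, z = (x * math.cos(yaw) + z * math.sin(yaw),
            -x * math.sin(yaw) + z * math.cos(yaw))
    # The three translations supply the remaining degrees of freedom.
    return (x + tx, y + ty, z + tz)
```

With all six parameters at zero the pose is the identity, which is a quick sanity check for any pose pipeline.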
The NVIDIA AI AR SDK for live streaming has debuted through StreamFX, a popular plugin for OBS (Open Broadcaster Software). The AR SDK is now available on GitHub, and interested developers can also download it from NVIDIA's website.
The new version of StreamFX uses the SDK to create a smart camera that automatically crops and zooms in on the user, keeping them centered in the frame whenever they move. Speaking about the new development, StreamFX author Michael Fabian 'Xaymar' Dirks said:
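The smart-camera behavior described above boils down to recentering a crop rectangle on the tracked face each frame. The following is a minimal sketch of that logic under assumed inputs (a frame size and a face bounding box); it is not StreamFX's actual algorithm.

```python
def auto_frame(frame_w, frame_h, face, zoom=2.0):
    """Compute a crop rectangle that keeps a detected face centered.

    `face` is an assumed (x, y, w, h) bounding box, the kind of output
    a face tracker might report. Returns (left, top, crop_w, crop_h).
    """
    fx, fy, fw, fh = face
    # Crop dimensions: the full frame scaled down by the zoom factor.
    crop_w = frame_w / zoom
    crop_h = frame_h / zoom
    # Center the crop on the center of the face box...
    cx = fx + fw / 2
    cy = fy + fh / 2
    left = cx - crop_w / 2
    top = cy - crop_h / 2
    # ...then clamp so the crop never leaves the frame.
    left = max(0, min(left, frame_w - crop_w))
    top = max(0, min(top, frame_h - crop_h))
    return (left, top, crop_w, crop_h)
```

A real implementation would also smooth the crop position over time so the virtual camera glides rather than jitters with every tracking update.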
“The new Face Tracking filter in StreamFX required complex math and machine learning that would have taken months if not years to tune to run on many machines. However, thanks to NVIDIA’s AR SDK it was done in no time at all, and runs on any NVIDIA RTX hardware which is becoming more and more common thanks to the NVENC H.264 encoder on it. This has drastically reduced the necessary development time from years to just two weeks, and at no extra cost too.”
NVIDIA Releases New AR SDK to Create 3D Facial Meshes Using a Webcam and RTX GPU
The new AR SDK currently offers new options using nothing but a webcam and tensor cores. https://t.co/D5esIqtERq pic.twitter.com/qTxhIZlV3o
— TheFPSReview (@thefpsreview) April 7, 2020
Fabian's point is that developers now have a powerful SDK from NVIDIA that can substantially boost the capabilities and features of multiple platforms with AI-based AR. Game developers, live conferencing services, and live streaming platforms all stand to gain from it. To summarize, the NVIDIA AR SDK, which brings the power of AI to live streaming, offers the following features through a standard webcam:
- Tracking faces on camera to identify one or several subjects in an image
- Overlaying assets on a person to put costumes or effects on people
- Controlling an animated or game character with your face and head movements
It is important to note that NVIDIA has released the AR SDK as an open beta. Developers should therefore expect a few bugs, and are encouraged to report them to NVIDIA to help fine-tune the platform.