You’re asking for a different project altogether.
100% webcam-based skeletal body and facial blendshape tracking. The models are from Google and are open source.
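For reference, the Google models here are the open-source MediaPipe Face Landmarker, which produces ARKit-style blendshape scores from plain webcam frames. A minimal sketch, assuming the model bundle has been downloaded as "face_landmarker.task" and a single captured frame is on disk (both file names are placeholders):

    import mediapipe as mp
    from mediapipe.tasks import python as mp_python
    from mediapipe.tasks.python import vision

    # Open-source MediaPipe Face Landmarker: webcam frames in,
    # ARKit-style blendshape scores out.
    options = vision.FaceLandmarkerOptions(
        base_options=mp_python.BaseOptions(model_asset_path="face_landmarker.task"),
        output_face_blendshapes=True,
        num_faces=1,
    )
    detector = vision.FaceLandmarker.create_from_options(options)

    frame = mp.Image.create_from_file("webcam_frame.png")  # one captured frame
    result = detector.detect(frame)

    if result.face_blendshapes:
        for category in result.face_blendshapes[0]:
            print(category.category_name, round(category.score, 3))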
https://developer.apple.com/documentation/ARKit/tracking-and...
This Blender plugin is basically just receiving those values from the OS API and applying them. It's a fairly common integration, and as a result almost all alternatives depend on ARKit on an iPhone rather than implementing any tracking algorithms themselves.
Variations of this plugin's functionality have existed since the introduction of the iPhone X in 2017.
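For context on what "applying them" amounts to inside Blender: once the weights arrive, it's little more than writing values onto the mesh's shape keys. A minimal sketch, assuming the character's shape keys are named after the ARKit blendshapes (the object name and example weights are placeholders):

    import bpy

    def apply_blendshape_weights(obj_name, weights):
        """Write ARKit-style blendshape weights (0.0-1.0) onto matching shape keys."""
        obj = bpy.data.objects.get(obj_name)
        if obj is None or getattr(obj.data, "shape_keys", None) is None:
            return
        key_blocks = obj.data.shape_keys.key_blocks
        for name, value in weights.items():
            key = key_blocks.get(name)  # only drive keys that actually exist
            if key is not None:
                key.value = value

    # Example: drive two keys on a hypothetical mesh object named "Face".
    apply_blendshape_weights("Face", {"jawOpen": 0.35, "eyeBlinkLeft": 1.0})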
https://superhivemarket.com/products/faceit
>Faceit is a Blender Add-on that assists you in creating complex facial expressions for arbitrary 3D characters.
>An intuitive, semi-automatic and non-destructive workflow guides you through the creation of facial shape keys that are perfectly adapted to your 3D model's topology and morphology, whether it’s a photorealistic human model or a cartoonish character. You maintain full artistic control while saving a ton of time and energy.
https://faceit-doc.readthedocs.io/en/latest/FAQ/
This is a great explanation of how FaceIt works (facial animation, shape keys, face rigs, ARKit, etc.):
This addon automates Facial Animation (FACEIT Tut 1)
https://www.youtube.com/watch?v=KQ32KRYq6RA&list=PLdcL5aF8Zc...
Edit: nvm, found it https://github.com/nmfisher/blender_livelinkface
It’s not usable standalone as it requires a companion app and a companion device.
If Blender did want to integrate it, there's nothing novel here that would prevent them from writing their own. There are plenty of similar plugins, and it's just forwarding events from the companion device.
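To illustrate how thin that forwarding layer is, here's a sketch of the receiving side. The real Live Link Face stream is a compact binary UDP format, so the JSON parsing below is purely an illustrative assumption, not the actual wire protocol (the port number is a placeholder too):

    import json
    import socket

    def receive_blendshape_frames(port=11111, on_frame=print):
        # Listen for frames forwarded by the companion app and hand each
        # decoded dict of weights to on_frame (e.g. the shape-key writer
        # sketched above).
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))
        while True:
            packet, _addr = sock.recvfrom(4096)
            try:
                weights = json.loads(packet)  # assumed payload: {"jawOpen": 0.4, ...}
            except ValueError:
                continue  # ignore malformed packets
            on_frame(weights)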
The place where it would make the most sense to add this would be Blender on the iPad, where it would require no companion device at all.