I just stumbled across these videos of the making of the digital humans for the series ADAM from Oats Studios.
The series was picked up and expanded by Neill Blomkamp and his team at Oats Studios. Oats has taken a strong interest in real-time rendering and in building everything to work and play in Unity (all three of the ADAM shorts were rendered entirely in Unity).
As you will see, part of what makes this interesting is that they are capturing the actors' facial performances as full moving scans at 30 to 40 fps, rather than using face dots and mocap retargeted to a jointed/blendshaped facial rig, and then importing that data as an Alembic cache straight into Unity. That, combined with body mocap and some animation blending, creates the final performances.
It looks like the Alembic importer for Unity has been around for a couple of years, but it got a major upgrade through the combined efforts of Unity and Oats Studios on the ADAM series.
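For anyone curious what playing one of these caches back looks like in practice, here is a minimal sketch using the AlembicStreamPlayer component from Unity's Alembic package (com.unity.formats.alembic). Treat the FacialCacheDriver script and the exact property names (CurrentTime, Duration) as my assumptions; they reflect how I recall the package API, which has changed between versions.

```csharp
// A minimal sketch of driving a baked Alembic facial cache from script.
// Property names are from the Unity Alembic package as I recall them and
// may differ between package versions.
using UnityEngine;
using UnityEngine.Formats.Alembic.Importer;

public class FacialCacheDriver : MonoBehaviour
{
    // Assign the imported .abc prefab's stream player in the Inspector.
    public AlembicStreamPlayer facialCache;

    // Optional speed scale, e.g. to retime a 40 fps capture for playback.
    public float playbackSpeed = 1f;

    float elapsed;

    void Update()
    {
        elapsed += Time.deltaTime * playbackSpeed;

        // Loop the cache; scrubbing CurrentTime streams new vertex positions
        // into the mesh each frame, which is how a vertex cache bypasses a
        // blendshape/joint rig entirely.
        facialCache.CurrentTime = Mathf.Repeat(elapsed, facialCache.Duration);
    }
}
```

From what I understand, the ADAM team sequenced these caches with Timeline rather than scrubbing them from script like this, but the idea is the same: the cache streams raw vertex positions every frame instead of deforming a rig.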
It makes me wonder how large these data sets can get and still run in real time in Unity, and whether they are working on ways to use the Alembic cache to drive alternate characters, something that is normally handled by transferring mocap data onto a facial rig (like how Weta creates the faces for the "Apes" films: https://youtu.be/txoEDIdbUrg).
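For a rough sense of the size question, here is a back-of-envelope estimate. Every number in it is a hypothetical illustration value I picked, not a figure from the production:

```csharp
// Back-of-envelope estimate of raw (uncompressed) vertex-cache size.
// All inputs are made-up illustration numbers, not production figures.
using System;

class CacheSizeEstimate
{
    static void Main()
    {
        const long vertices = 30_000;    // assumed face mesh density
        const double fps = 40.0;         // capture rate mentioned above
        const double seconds = 60.0;     // one minute of performance
        const long bytesPerVertex = 12;  // 3 floats (x, y, z) per position

        double frames = fps * seconds;
        double bytes = vertices * bytesPerVertex * frames;

        // Roughly 0.8 GB per minute before Alembic's compression and before
        // any normal/UV streams, which hints at why streaming from disk
        // matters more here than fitting everything in memory.
        Console.WriteLine($"{bytes / (1024.0 * 1024 * 1024):F2} GB per minute");
    }
}
```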
This is a nice in-depth article on the technical challenges the ADAM team solved:
https://blogs.unity3d.com/2017/12/04/adam-the-evolution-of-alembic-support-in-unity/