Arrgh. Not a very productive day. Which I feel terrible about. Sorry!
I worked on the pipeline for pose passthrough. Reading the overview of the approach we use (attached), it's really funny for a research paper that it has no actual math in it, just "we use this hocus pocus and here's a flow diagram", but it gets the point across and gives a good high-level view of the approach.
I've mentioned how we use accelerometer data, so I've been looking at that. On Android there's the SensorManager API, which gives you a 3x3 matrix using a right-handed coordinate system. In JavaScript you have the Generic Sensor API, which gives you a 4x4 matrix and uses a left-handed system. Both of these APIs are by Google (well, under the guise of a W3C standard for the JavaScript one): one is written by the Android team, one by the Chrome team. I thought the philosophical differences there were weird within the same company.
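To make the mismatch concrete, here's a minimal sketch of the kind of conversion involved. The layout assumptions are mine (row-major 3x3 in, column-major 4x4 out, handedness flipped by mirroring the Z axis), so treat it as an illustration rather than what either API actually guarantees:

```cpp
#include <array>
#include <cstdio>

// 3x3 row-major rotation matrix (right-handed, as I understand the Android
// side to produce) and a 4x4 column-major homogeneous matrix (the layout
// WebGL-style APIs typically expect). Both layout choices are assumptions.
using Mat3 = std::array<float, 9>;
using Mat4 = std::array<float, 16>;

// Convert R (right-handed) to R' = S * R * S with S = diag(1, 1, -1),
// i.e. mirror the Z axis, then embed the result in a 4x4 homogeneous
// matrix. Entries that couple Z with X or Y flip sign; everything else
// is unchanged.
Mat4 rightHanded3x3ToLeftHanded4x4(const Mat3& r)
{
    Mat4 out{}; // zero-initialized
    for (int row = 0; row < 3; ++row) {
        for (int col = 0; col < 3; ++col) {
            float v = r[row * 3 + col];
            // (S*R*S)_ij = s_i * R_ij * s_j: negate when exactly one
            // of the two indices is the Z axis.
            if ((row == 2) != (col == 2)) {
                v = -v;
            }
            out[col * 4 + row] = v; // write column-major into the 4x4
        }
    }
    out[15] = 1.0f; // homogeneous corner
    return out;
}

int main()
{
    // Sanity check: a 90-degree rotation about X.
    const Mat3 rotX90 = {
        1, 0, 0,
        0, 0, -1,
        0, 1, 0,
    };
    const Mat4 m = rightHanded3x3ToLeftHanded4x4(rotX90);
    for (int row = 0; row < 4; ++row) {
        std::printf("%6.1f %6.1f %6.1f %6.1f\n",
                    m[0 * 4 + row], m[1 * 4 + row],
                    m[2 * 4 + row], m[3 * 4 + row]);
    }
    return 0;
}
```

Mirroring Z turns the 90-degree rotation about X into the opposite-handed one, which is exactly the kind of silent sign flip you hit when you move a pose between these two conventions without converting.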
I tidied up our source code so it can work with both projects and initialize either the Metropolis one or the Ship of the Dead project. I also tidied up the Blueprint add-ons I did to the Epic code, since I'm using screenshots of how we use those in our Epic Mega Grant application. Epic provides some callbacks inside Blueprints for streaming, but only "all connections closed", so you can reset your application. I added a few more, like "new streamer connected", "new streamer starting to connect", and "new streamer has video stream". These can be suggested as part of the Mega Grant application as code changes we'll want to merge back to Epic (which I'd do anyway, it's just good community to do that), but it needs to look "Epic".
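For the curious, this is roughly the pattern those add-ons follow: Unreal's dynamic multicast delegates, exposed to Blueprint graphs via BlueprintAssignable. All the names below are hypothetical stand-ins of mine, not Epic's actual streaming code:

```cpp
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "StreamerEventsComponent.generated.h"

// Hypothetical delegate types; the names are mine, not Epic's.
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnStreamerConnecting);
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnStreamerConnected);
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnStreamerHasVideo);

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UStreamerEventsComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Blueprint graphs can bind to these, mirroring how Epic already
    // exposes the "all connections closed" callback.
    UPROPERTY(BlueprintAssignable, Category = "Streaming")
    FOnStreamerConnecting OnStreamerConnecting;

    UPROPERTY(BlueprintAssignable, Category = "Streaming")
    FOnStreamerConnected OnStreamerConnected;

    UPROPERTY(BlueprintAssignable, Category = "Streaming")
    FOnStreamerHasVideo OnStreamerHasVideo;

    // Called from the C++ streaming layer when the event actually fires.
    void NotifyStreamerConnected() { OnStreamerConnected.Broadcast(); }
};
```

The nice part of this pattern is that the C++ side only ever calls Broadcast(); everything the application does in response lives in Blueprints, which is what makes it easy to show in screenshots.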
We’re close now.
Graeme.