Ok so that Facebook/Oculus livestream on the future of AR was actually super inspiring. (FF'd to the 30-min session w/ Michael Abrash.) Seems like we're gonna see a lot of FB employees walking around w/ employee-only glasses w/ cameras + sensors as FB works to build its 3D map of the world.
I appreciate FB for telegraphing the vision so clearly... "You're gonna see people wearing these FB glasses, don't freak out! We're gonna someday connect them to your brain so you can neural-click, don't freak out!"
IMHO, super-clever strategy to use an army of Facebook employees wearing R&D camera-glasses to crowdsource a 3D model of the real world. Think: Google StreetView cars, except it's people walking thru cities modeling everything in sight... inside + outside + at home + in stores, etc.
Was clear watching the Oculus 2 presentation that the "Workspaces" demo is using Oculus as training wheels for AR glasses, esp. w/ the passthru video from the world-facing camera. Also clever.
Q to any Oculus developers out there -- can you run YOLO object detection / image classification on Oculus using the world-facing camera? This must have already been done, right? (If not, why not? The Oculus Go has a world-facing camera, right?)
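FWIW, the inference half of that question is the easy part -- here's a rough sketch of what it could look like in Python w/ OpenCV's DNN module + standard pretrained YOLOv3 files (file names are placeholders, none of this is Oculus-specific code). The real unknown is whether the Oculus SDK hands third-party apps raw frames from the world-facing cameras at all; the cv2.VideoCapture(0) line is just a stand-in for whatever camera stream you can actually grab.

```python
# Rough sketch: YOLOv3 object detection on a camera feed via OpenCV's DNN module.
# Caveat: the Oculus SDK may not expose raw world-facing camera frames to
# third-party apps, so VideoCapture(0) below is a stand-in, not headset code.
import cv2

# Placeholder paths: standard pretrained YOLOv3 config/weights + COCO labels.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

with open("coco.names") as f:
    class_names = [line.strip() for line in f]

cap = cv2.VideoCapture(0)  # stand-in for the headset's camera stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Run detection with confidence + NMS thresholds, then draw labeled boxes.
    class_ids, confidences, boxes = model.detect(
        frame, confThreshold=0.5, nmsThreshold=0.4
    )
    for cid, conf, (x, y, w, h) in zip(class_ids, confidences, boxes):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"{class_names[int(cid)]} {float(conf):.2f}",
                    (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)
    cv2.imshow("YOLO on camera feed", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

So the model side is a solved problem on a laptop; the interesting q is purely about camera access + on-headset compute.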