How Meta’s ‘Presence Platform’ contributes to its vision for the Metaverse

At Connect last year (2021), Meta shared its vision for the Metaverse: ‘a more connected digital experience that allows you to move seamlessly from one place to another and spend time with people who may be physically distant from you.’

Among the announcements made at the same event was the Presence Platform, a suite of machine perception and AI capabilities designed to help developers build more realistic mixed reality, interaction and voice experiences that seamlessly blend virtual content with someone’s physical world.

Presence Platform capabilities

The platform is currently made up of four Software Development Kits (SDKs), each designed to make it easier for developers to build interaction into their metaverse experiences.

Insight SDK

The Insight SDK includes three pieces of functionality – Passthrough, Spatial Anchors and Scene Understanding.

  • Passthrough – this provides a way to build and test applications that blend the real and virtual worlds, often referred to as ‘augmented reality’. These applications could be used for gaming or remote working, for example. You can find more details about Passthrough on the Oculus blog.
  • Spatial Anchors – these are ‘world-locked frames of reference that will enable you to place virtual content in a physical space that can be persisted across sessions.’ In effect, this lets developers place digital ‘things’ in a physical space, locked in position, even if the user leaves the session and comes back later (see the sketch after this list).
  • Scene Understanding – these capabilities help developers build connected interactions seamlessly across an environment (i.e. a ‘scene’). Developers can place virtual objects within a scene more accurately – such as a virtual painting hanging on a real, physical wall – and bring virtual representations of physical objects into VR.
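To make the Spatial Anchors idea concrete, here is a minimal conceptual sketch in Python. It is purely illustrative – the real Insight SDK is exposed through Meta’s Unity and native SDKs, and every class and function name below is hypothetical. The sketch models the core idea: a world-locked anchor with a fixed pose that can be persisted and restored across sessions, here anchored to a wall pose of the kind scene understanding might detect.

```python
# Illustrative sketch only: hypothetical names, not the real Insight SDK API.
import json
import uuid
from dataclasses import dataclass, asdict

@dataclass
class Pose:
    """Position and orientation in the headset's world space."""
    position: tuple  # (x, y, z)
    rotation: tuple  # quaternion (x, y, z, w)

@dataclass
class SpatialAnchor:
    """A world-locked frame of reference for placing virtual content."""
    id: str
    pose: Pose
    label: str  # the virtual object attached to this anchor

def create_anchor(pose: Pose, label: str) -> SpatialAnchor:
    """Create an anchor at a fixed pose in the user's physical space."""
    return SpatialAnchor(id=str(uuid.uuid4()), pose=pose, label=label)

def save_anchors(anchors: list, path: str) -> None:
    """Persist anchors so content reappears in the same place next session."""
    with open(path, "w") as f:
        json.dump([asdict(a) for a in anchors], f)

def load_anchors(path: str) -> list:
    """Restore previously persisted anchors at the start of a new session."""
    with open(path) as f:
        return [
            SpatialAnchor(id=a["id"],
                          pose=Pose(tuple(a["pose"]["position"]),
                                    tuple(a["pose"]["rotation"])),
                          label=a["label"])
            for a in json.load(f)
        ]

# Place a virtual painting on a wall pose (as scene understanding might
# supply), then persist it so it is still there when the user returns.
wall_pose = Pose(position=(0.0, 1.5, -2.0), rotation=(0.0, 0.0, 0.0, 1.0))
painting = create_anchor(wall_pose, label="virtual_painting")
save_anchors([painting], "anchors.json")
```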

Interaction SDK

This SDK is all about hand and controller interaction. Developers can access a library of ready-to-use common interactions – like grab, poke and select – to build into their metaverse experiences.

It also provides tooling to help developers build their own custom hand gestures, should they want to.
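The pattern here – a built-in library of interactions plus a way to register your own – can be sketched in a few lines of Python. Again, this is a conceptual illustration rather than the real Interaction SDK (which targets Unity); all names are hypothetical.

```python
# Illustrative sketch only: hypothetical names, not the real Interaction SDK.
from typing import Callable, Dict

class InteractionLibrary:
    """A registry of hand/controller interactions: common ones (grab, poke,
    select) come ready to use, and developers can add custom gestures."""

    def __init__(self):
        self._handlers: Dict[str, Callable[[str], None]] = {}
        # "Ready to use" interactions registered out of the box.
        for name in ("grab", "poke", "select"):
            self._handlers[name] = lambda target, n=name: print(f"{n} -> {target}")

    def register_custom(self, gesture: str, handler: Callable[[str], None]):
        """Add a developer-defined gesture, e.g. a thumbs-up."""
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str, target: str):
        """Route a recognised gesture to its handler."""
        handler = self._handlers.get(gesture)
        if handler:
            handler(target)

interactions = InteractionLibrary()
interactions.register_custom("thumbs_up", lambda target: print(f"liked {target}"))
interactions.dispatch("grab", "virtual_mug")    # built-in interaction
interactions.dispatch("thumbs_up", "photo_42")  # custom gesture
```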

Tracked Keyboard SDK

The Tracked Keyboard SDK allows a physical keyboard in the real world to be 3D-modelled and tracked in virtual reality. So, for example, someone using an Oculus Quest 2 will be able to ‘see’ and interact with their real keyboard as normal, from within VR experiences.
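The core loop is simple to picture: each frame, the runtime reports where the physical keyboard is, and the app keeps its 3D model aligned with that pose. A tiny, purely hypothetical sketch (not the real SDK API):

```python
# Illustrative sketch only: hypothetical, not the Tracked Keyboard SDK API.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in world space
    rotation: tuple  # quaternion (x, y, z, w)

class TrackedKeyboard:
    """Keeps a 3D model of the physical keyboard in sync inside VR."""

    def __init__(self, model_name: str):
        self.model_name = model_name
        self.model_pose = Pose((0, 0, 0), (0, 0, 0, 1))

    def update(self, tracked_pose: Pose):
        """Called once per frame with the latest tracked pose, so the
        virtual keyboard stays aligned with the real one."""
        self.model_pose = tracked_pose

keyboard = TrackedKeyboard("generic_keyboard")
# In a real app this pose would come from the headset's tracking each frame.
keyboard.update(Pose(position=(0.1, 0.7, -0.4), rotation=(0, 0, 0, 1)))
print(keyboard.model_pose)
```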

Voice SDK

The Voice SDK contains natural language functionality, allowing developers to create voice-driven, hands-free navigation and gameplay.

So, you could use your voice to navigate within a metaverse environment, or ask a question or request help via Voice FAQs. Powered by Facebook’s ‘Wit.ai’ natural language platform, the potential for the Voice SDK to enrich experiences is enormous. From a gaming perspective, imagine roaming through a virtual Hogwarts, using your own voice to activate spells and incantations…
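The general shape of that Hogwarts example – speech goes to a natural language service, which returns a structured intent that gameplay code can act on – can be sketched as follows. The `fake_nlu` function below is a stand-in for a Wit.ai-style response; none of these names come from the real Voice SDK.

```python
# Illustrative sketch only: hypothetical names. The real Voice SDK sends the
# utterance to Wit.ai, which returns structured intents and entities.
from typing import Callable, Dict

def fake_nlu(utterance: str) -> dict:
    """Stand-in for an NLU response: map speech to an intent + entities."""
    if "lumos" in utterance.lower():
        return {"intent": "cast_spell", "entities": {"spell": "lumos"}}
    if utterance.lower().startswith("go to"):
        return {"intent": "navigate", "entities": {"place": utterance[6:]}}
    return {"intent": "unknown", "entities": {}}

handlers: Dict[str, Callable[[dict], None]] = {
    "cast_spell": lambda e: print(f"Casting {e['spell']}!"),
    "navigate":   lambda e: print(f"Teleporting to {e['place']}..."),
}

def on_voice_command(utterance: str):
    """Resolve the spoken phrase to an intent and dispatch to gameplay."""
    result = fake_nlu(utterance)
    handler = handlers.get(result["intent"])
    if handler:
        handler(result["entities"])
    else:
        print("Sorry, I didn't catch that.")

on_voice_command("Lumos!")            # -> Casting lumos!
on_voice_command("go to the library") # -> Teleporting to the library...
```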

The first building blocks of Meta’s Metaverse future?

It’s fair to say that the Presence Platform is a clear marker of Meta’s intention to play a leading part in the Metaverse. The capabilities being developed in the platform are allowing developers to build ever richer and more engaging VR and AR experiences, weaving human interaction with virtual environments and objects together in an intuitive way.

I, for one, am excited to see what Presence Platform can deliver in the hands of designers and developers.
