Frame at Microsoft Build: Speaking the World Into Existence
Frame is a web-based metaverse platform that makes it simple to create your own corner of the metaverse, right from the web browser on desktop, mobile, and VR. You can create a Frame for free here.
Check out the clip of Frame at Microsoft Build here (skip to 21:40 to go right to the Frame segment, but the whole thing is cool).
We believe that creating virtual worlds in the metaverse should be accessible to anyone with a vision they want to bring to life. Yet it’s still more challenging than it should be: most virtual worlds are created with sophisticated game engines like Unity or Unreal Engine, tools that can be hard to wield without significant design or development experience.
At Frame, we empower users to create social 3D spaces right from the browser without writing code. There are still limits to the complexity of what you can make with Frame, though, and we're always working to make our no-code customization tools both easier and more powerful.
New technologies, though, are opening up intriguing new avenues for metaverse creation. We were proud to collaborate with Microsoft on a prototype for Microsoft Build that showed off one of these avenues: AI-powered creation using natural language input. Or, as we called it, “Speaking the World Into Existence”.
This prototype was shown off during Microsoft CTO Kevin Scott’s keynote. As a small development team, we were thrilled to have the fruits of this collaboration shown on such an incredible stage. We hope it sparked the imaginations of builders around the world.
Let’s dive a bit deeper. In our prototype, you speak or type to an AI-powered bot inside Frame, and the bot manipulates the 3D scene, doing the best job it can with your commands to bring your vision to life. So, rather than fiddling with an intimidating array of widgets with your mouse and keyboard, you can just speak or write what you want to see.
The system that powered this experience was a custom version of OpenAI Codex, an AI model that translates natural language into code. This custom version, called Babylon Codex, is specially tuned to take natural language commands and output code in Babylon.js, the open source 3D development framework from Microsoft that we use to build Frame. To learn more about why we build Frame with Babylon.js, see this blog post from the Babylon team about our partnership.
We had to prompt Babylon Codex to do a few handy tricks for this prototype. For example, if Babylon Codex got the command to “make a whiteboard”, we prompted it to bring in our existing Frame whiteboard, rather than try to have it write the code from scratch to make a multi-user whiteboard. This whiteboard asset is complex and isn’t something Babylon.js or Babylon Codex can create “out-of-the-box” - yet!
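As a rough illustration of the idea, a few-shot prompt can steer a code model toward reusing an existing asset instead of generating it from scratch. The prompt format and the `Frame.spawnAsset` helper below are inventions for this post, not Frame's actual API:

```javascript
// Hypothetical sketch of few-shot prompting for a natural-language-to-Babylon.js
// model. Frame.spawnAsset and the prompt layout are illustrative only.

// Example pairs that nudge the model toward built-in Frame assets
// (like the multi-user whiteboard) instead of writing them from scratch.
const fewShotExamples = [
  {
    command: "make a whiteboard",
    code: "Frame.spawnAsset('whiteboard');" // reuse the existing whiteboard asset
  },
  {
    command: "add a red box",
    code: "const box = BABYLON.MeshBuilder.CreateBox('box', { size: 1 }, scene);\n" +
          "const mat = new BABYLON.StandardMaterial('mat', scene);\n" +
          "mat.diffuseColor = BABYLON.Color3.Red();\n" +
          "box.material = mat;"
  }
];

// Assemble the prompt: an instruction header, the examples, then the
// user's new command for the model to complete.
function buildPrompt(userCommand) {
  const header = "// Translate natural language commands into Babylon.js code.\n";
  const examples = fewShotExamples
    .map(ex => `// Command: ${ex.command}\n${ex.code}\n`)
    .join("\n");
  return `${header}\n${examples}\n// Command: ${userCommand}\n`;
}
```

With a prompt like this, a request such as "make a whiteboard" is far more likely to come back as a call into the existing asset than as a from-scratch attempt at multi-user whiteboard code.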
Also, because Frame is a multi-user metaverse environment, we had to make some tweaks to ensure that the 3D scene updated for every user in the space, not just the user giving the command. This is what kept the scene in sync for everyone in the Frame.
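A minimal sketch of that sync pattern, under the assumption of a simple broadcast helper (Frame's real networking layer is more involved than this): the client that issued the command runs the generated code locally, then broadcasts it so every peer applies the same scene change.

```javascript
// Illustrative only: sendToAllPeers and the message shape are stand-ins
// for a real networking layer (e.g. WebSockets or WebRTC data channels).

const localLog = [];

// Stand-in for a real network broadcast; here we just record the message.
function sendToAllPeers(message) {
  localLog.push(message);
}

// Run the AI-generated code locally, then broadcast it so every
// connected client applies the identical scene change.
function applyGeneratedCode(code, executeInScene) {
  executeInScene(code);                         // update this client's scene
  sendToAllPeers({ type: "ai-command", code }); // replicate to everyone else
}

// Each peer handles the message by executing the same code on its own scene.
function onPeerMessage(message, executeInScene) {
  if (message.type === "ai-command") {
    executeInScene(message.code);
  }
}
```

The key design point is that the generated code, not the resulting scene state, is what gets replicated, so every client ends up with the same scene.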
We were surprised, though, at what Babylon Codex could do without any coaxing from us. The possibilities go well beyond adding assets to the space. In one part of the prototype, we ask Babylon Codex to create a box that users can click to spawn other assets, and it handled setting up that interactivity without issue. We also had it adjust the physics engine's gravity to match that of the Moon, a value it determined without any training from us.
This was still just scratching the surface of what can be done with these tools, and we’ll continue to experiment with Babylon Codex to explore what AI in the metaverse can unlock for people. Based on the compelling results of this collaboration, we believe this will be one of many ways that users create immersive experiences, a complement to other tools and modalities. In the video, you can see there’s a section where someone also manipulates the scene using the regular Frame editing tools.
While this collaboration resulted in something neat for Microsoft Build, the work on this prototype and the combination of technologies therein helped us explore what’s possible now and how this space can evolve in the future. Both Frame and Microsoft are deeply invested in empowering users to create and collaborate in the metaverse, and we think AI-powered tools will be a key piece of the puzzle.
Any questions? Feel free to reach out to firstname.lastname@example.org - we'll get back as soon as we're done shooting hoops on the moon.