8 Unreal Answers From Epic’s Tim Sweeney
Blink and you might’ve missed the dramatic shift taking place at the core of computing and entertainment. Over the last year, Unity and Unreal, two of the leading game engines used by developers around the world to make games, have become the engines of VR experience. If games themselves are the bridge to a new world where virtual reality is the next computing platform after PCs, the Internet and mobile phones, then the tools used to create these immersive 3D worlds are building that bridge.
There’s a reason Crytek’s CryEngine, yet another toolset for creating VR experiences, was licensed by Amazon and offered freely to developers, and it’s the same reason both Unity and Unreal are evolving to allow the creation of VR while in VR. It’s also why 7-year-olds worldwide spend so much time with Minecraft, which Microsoft bought for $2.5 billion. And why Google bought Tilt Brush. And why Oculus made Medium and Quill.
The future of computing is about giving people the tools they need to build the worlds they want.
Tim Sweeney, CEO of Epic Games, is one of the people at the center of this shift. When Oculus CEO Brendan Iribe partnered with Palmer Luckey to co-found a company that would later be bought by Facebook for $2 billion, one of the first steps to rally the game industry was to secure integration with Unreal from Sweeney’s Epic Games.
There are around 400 people at Epic and about 150 of them work on the Unreal Engine. The vast majority of those working on the engine are working on VR in some way. Last week, Epic debuted a VR Editor for its Unreal Engine that was constructed in secret over more than a year by Mike Fricker, technical director at Epic. With hands now brought into VR by the latest consumer hardware from HTC, the ability to use Unreal Engine to literally shape the (virtual) world around you is here.
“This is going to change the world. The ability to build content in VR,” Sweeney told UploadVR in a phone interview. “Right now we’re doing the Unreal Engine as a professional tool, but the general idea will be applicable to everything from that all the way down to Minecraft-type experiences, where today you have 50 million people who have actually built 3D environments using tools. There’s going to be a revolution.”
I talked to Sweeney for a bit about VR, asking a variety of questions about the past, present and future of technology. Check out the edited Q&A below.
What does the future look like for in-VR creation tools from Epic?
The first step is to implement scene editing in VR. So instead of looking at a little 3D viewport on the monitor and interacting with it with a mouse, you’re there in the environment, immersed in it. To support the user interface, you have a little virtual iPad that you can bring up in your hand, which contains arbitrary parts of the existing user interface. So you might be able to, for example, bring up the material system and then create a visual material right there as if you’re doing it on an iPad in the real world…The real trick to getting this thing up and running so quickly is that the entire underlying user interface can be exposed in VR very quickly. But the next step is to start designing custom parts of that user interface entirely for VR. Right now you bring up the content browser and you see on your little iPad a bunch of 2D icons showing content, and then you can drag it out of the iPad and into the real world…it becomes a 3D object that you can move and position. The next logical step would be to have the content browser be something like a shelf full of 3D objects, so you pull that up and now you’re holding in your hand a shelf full of objects in 3D…you can grab them and drop them out into the world. That’s going to be the long-term, evolutionary process…using real-world idioms rather than computer idioms.
Right now one of the best ways to learn Unreal is by watching a YouTube video. But you can pretty easily animate avatars in VR now… is that a route you may go down for teaching people?
I think that is an awesome idea, and it’s a logical way of teaching people anything in VR…You can teach someone how to build a brick wall in the real world using it. I think that’s one of the next steps in teaching and learning. We’ll definitely be working on solutions like that. Also, you have a long-term desire to be in a shared space with other people, so besides watching a pre-recorded 3D animation of somebody in VR doing something, you’d actually be able to go online and work with a real person and have them teach you in real time, actually interact with you.
When did you first meet with Oculus and get interested in VR?
I’ve been in close contact with the Oculus guys ever since Brendan Iribe joined Palmer Luckey. We worked really closely with Brendan in the past when he was at Scaleform…so when Palmer was out building the kit by himself with duct tape and…when Carmack showed it at GDC, I was just watching and reading about it at that point. But as soon as it came together as a company, we readily saw the future potential of it…this crazy idea from the 1990s that suddenly became practical given the advances in computing and display technology. [Independently] I had been talking to Michael Abrash at Valve about VR for some time. Together this all made it very clear that the magical threshold crossed with the iPhone and iPad, where an idea that was impractical in the early days of computing suddenly becomes completely feasible, had been reached, because you now had enough computing power and display resolution that you could actually build a product and have an excellent experience. Michael Abrash explained that to me at one point, before I had actually seen a VR development kit, and it took me about five minutes and I was like, ‘Yeah, ok, I get it. I see the trend there.’ It’s the same effect as going from the Apple Newton to the Apple iPad. The same idea. The only difference is the amount of computing technology at the time it was built.
I’ve been watching as Unreal has morphed into a VR experience engine. When do you see the company focusing more on “VR development” rather than traditional games?
That will definitely happen because the entire world is moving in that direction. We try to be the leading edge, building new technology as soon as it becomes technologically possible and making it available to game developers so that, over the course of their development cycle, they can be early to adopt new technologies. So I think that’s the trajectory the industry is on with the hardware, and that calls for VR and augmented reality. You know, augmented reality is probably the more mainstream consumer form factor of this, where instead of putting on a helmet and blacking out the world around you, you put on something similar to your Oakley sunglasses and see a transparent or opaque view of the world. That is the form factor I think will drive the hardware to billions of users, as opposed to maybe a million this year, maybe 10 million in a couple of years. This transition will happen over the next decade, and as the hardware adoption curve moves up we’ll dedicate more and more of our resources to it.
There’s some debate about how big AR and VR will get and how quickly. Isn’t it possible AR will be a feature added to ever-shrinking VR headsets?
[We] start with VR because we know exactly how to build this, whereas augmented reality relies on some hardware components that are still within the realm of science fiction. So we will see that happen. We will see these headsets increase vastly in resolution; they’ll implement more sensors, cameras of all types mounted around them to pick up the environment…HTC has already announced something along those lines. They’ll also pick up your movement, your hands’ movement, your fingers’ movement and everything else. And that’s all going to be developed very quickly. I imagine the potential size of the VR platform is probably twice the size of the console and PC gaming audience. There’s a huge number of consumers around the world, but not everybody. You have to look at the dichotomy…between the computer and the smartphone. The computer was the first thing out there, not a very portable device but extraordinarily powerful, and over time the number of smartphone owners in the world came to eclipse that, just because of the convenience, making it mainstream for a much larger number of people. There’s no doubt that the VR experience where you have your…field of view filled with completely immersive imagery is going to be the best for any completely immersive experience, but you know, being humans, I don’t think you’ll see a lot of people walking around the street wearing those. It would just kinda be weird, whereas transparent glasses that have some kind of computer display, where people can still see your eyes and you can still interact with them in a normal social way, I think will be much more ubiquitous, [something] suitable for half the people in the world.
Games are important for the initial adoption of VR, but do you see Unreal going beyond games as the technology evolves?
That’s right. It’s happening now, and games are at the center of this revolution because they are the most compelling…but the engine is now being picked up very rapidly by everybody throughout the industry. We have automobile manufacturers using Unreal for design visualizations…they will start previewing them in real-time 3D. Architects from all over the world are using real-time visualizations of environments. You know, you want to have an office building constructed, well, first you can preview it in VR, make sure you’ve got the layout right before you start spending on design. It is happening in all kinds of other fields as well. Storytelling is another major area. For the first 100 years of movies, they’ve been recorded as a series of images, one after another, that are played back to generate the appearance of motion, but that’s not going to work in VR and augmented reality, where you’ve got to be able to move your head and actually feel immersed in the scene with parallax. Every movie in the future is going to require a real-time recorded version of the 3D scene. Instead of taking pictures or frames, we’ll be 3D scanning actors and models and all the motion of a scene in real time and combining it with all kinds of computer-generated graphics to generate a compelling experience. I think that will be the best storytelling medium ever created, and it is all going to be real-time 3D. I think the engine will be the medium of the future, and I guess you need to remove the word ‘game’ from it now because it is more general than that.
Can someone who is not a professional developer pick up and use Unreal in VR more easily than with a traditional monitor?
At GDC in March we will announce an overarching plan and the release timing of Unreal Engine VR support. And it will be coming sooner rather than later. We’ll be getting it out into everyone’s hands as soon as possible, and the date will be announced soon. The real immediate benefit of this stuff is being able to build and tweak environments in VR. I think you’ll be able to pick it up, but it is still a professional tool, so it’s not as easy as a program like Medium could be, where they are designing something from the very beginning for simplicity…but it is certainly in the realm of something you can pick up as a hobby. I think you’ll find it easier than the old-style 2D interface. I’d say for the next couple of years, you’ll definitely be spending some time in VR and some time out of VR. The real highlight of the VR Editor is the immersive experience and the ability to interact with the user interface to change materials, tweak the brightness of lights, do all these things right there in the live 3D environment. But if you’re going to spend 10 minutes editing a text file…at that point it is still worth taking the headset off and going back to the monitor, because the VR resolution is still very low. The pixel density is the equivalent of playing Doom back in 1993.
Are there any moments in VR where you were emotionally impacted?
Two things really stand out. This isn’t me as a connoisseur but more my brain speaking for what it recognized at a very real, visceral level. One was the earliest Oculus…demo built in Unreal Engine 4 where you’re basically Batman standing on top of a tower in a city and looking down. That experience highlighted that VR really does create a sense of presence. It was very, very hard for me to convince my brain that if I walked forward a couple of steps I wouldn’t fall to my death. That’s why people who haven’t seen VR yet don’t really appreciate the idea. It sounds like a more enhanced computer display, but really it puts you there in the environment.
The other thing, which I’ve actually had dreams about, is this first VR Editor. Mike Fricker had been prototyping it in secret for more than a year, but about two months ago it came to the point where it felt real and became extremely usable. I was out there building a scene, picking up objects, moving them around. It was a funny mix of photorealistic objects and…objects that looked out of place or cartoony. The funny thing is I had some dreams about that later. Not dreams where I was at a computer interacting with VR, but dreams where I was actually in the environment and walking around. There was a pink object and a purple object and I’m like, whoa, what are these doing here, and I’m moving them around in my dream. It’s really cool how this technology works.