
Q&A With Matt Bell: How Matterport Started Capturing The Real Estate Market In VR

A Matterport Space, or 3D Space, is a complete three-dimensional representation of a real place that lets you “walk” through it in VR and experience it as if you were there.

Anyone involved at the intersection of real estate and Virtual Reality has probably experienced an immersive walkthrough made with Matterport technology.

Matt Bell

Matt Bell founded Matterport after the release of the Microsoft Kinect, seeing its potential to create 3D environments that give someone an immersive feeling of being in a space. Bell and I discuss what it was like founding the company, building the prototype, and looking ahead to Google's Project Tango.

How did Matterport come to be?

MB: Back in 2011, my co-founder Dave [Gausebeck] and I were very interested in the idea of creating the 3D equivalent of the camera. In the same way that photography became an instant, automated process for capturing moments, we wanted to do the same thing for entire 3D spaces, instead of several 3D artists taking weeks to create a 3D model.

We built the software to process the raw 3D data created by this 3D sensor so that we could have a real-time 3D reconstruction system, and actually get that to work reliably enough that people could use it to build 3D models of buildings.

So it was very much a 3D tech startup, and we had to bring in a lot of computer vision expertise, our own and others', to make it happen.

tl;dr — Bell and his co-founder founded the company to map existing 3D spaces quickly and efficiently.

When you guys got started, did you have VR in mind, was that part of the mission?

MB: Funny enough, we were founded in 2011 and so VR was not on our radar. We were mainly focused on the web and mobile. When the first Oculus DK1 (Developer Kit 1 — the first iteration of the Oculus headset) came out, I remember getting my demo of it, and I realized that we had basically created the perfect way of bringing real spaces into VR.

We very quickly exported some of the 3D models that our camera was already creating so that you could walk around them in VR. We had a handful of really interesting and fun demos that we could show off to people even when VR was very nascent.

tl;dr — VR wasn’t even on the radar when Matterport was founded.

It sounds like you guys had a pivot somewhere around 2011–2012 when you had that realization.

MB: Yeah, I wouldn’t describe it as a pivot as much as an added capability that we could deliver. The content we create — the 3D models created by our cameras — can be published on different mediums and different display types.

So VR is the ideal way of experiencing that, because of the maximum immersion that you get. But you can also experience them on the web and on mobile via WebGL in the browser. That’s actually really good because, although the VR headset penetration numbers are impressive and growing rapidly, there are over a billion smartphones out there, and it’s going to take a few years for VR to reach that level of penetration.

tl;dr — VR is actually an added capability that Matterport delivers, not its original focus.

What was it like building your first prototype camera?

MB: Our very first prototype was literally a Microsoft Kinect plugged into my laptop, and we would walk around the house trailing a power cord. It was good for a proof of concept, but we quickly evolved from that.

We quickly ended up working with PrimeSense, the Israeli company that developed the guts of the Kinect. We ended up incorporating three of their sensors into a camera. What’s nice about that is that we can set it up with an embedded system and batteries that give you 10+ hours of scanning time, and then you can just bring that camera around the space and control it with an iPad, which provides the real-time display and alignment as you’re scanning the space.

tl;dr — Their first prototype was actually a Microsoft Kinect plugged into a laptop.


Does your camera have internal navigation where it maps out the space as well?

MB: It’s taking in color and depth data, and as you move the camera from spot to spot, it’s stitching all of that data together into a coherent 3D model. So as you’re capturing a space, you’ll see it come together in real time.

And to give you an idea of how fast it is, you can scan a typical 3-bedroom house in 30–45 minutes. The alignment is all automated, and then you just upload it to the cloud. Fairly quickly thereafter, you get the final results back, which you can then view on the web, on mobile, or in VR.

tl;dr — The camera has internal systems that map out its location within a structure. No editing necessary; it’s all uploaded to the cloud afterwards.
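To give a rough sense of what stitching color and depth data from multiple spots into one model involves, here is a minimal sketch, not Matterport's actual pipeline: it back-projects a depth image into 3D points using assumed pinhole intrinsics, then transforms frames from different capture positions into a shared world frame so they merge into one point cloud. The intrinsics, poses, and depth frames below are placeholder assumptions for illustration.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, pose):
    """Back-project a depth image (meters) into world-space 3D points.

    depth: (H, W) array of depth readings in meters (0 = no reading).
    fx, fy, cx, cy: assumed pinhole camera intrinsics.
    pose: 4x4 camera-to-world transform for this capture position.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0

    # Pinhole back-projection: pixel (u, v) with depth z -> camera-space (x, y, z).
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)  # homogeneous coords

    # Transform into a shared world frame so frames from different spots line up.
    pts_world = (pose @ pts_cam.T).T[:, :3]
    return pts_world

# Hypothetical example: fuse two depth frames taken from different spots.
fx = fy = 525.0          # assumed focal length in pixels
cx, cy = 319.5, 239.5    # assumed principal point for a 640x480 sensor

frame_a = np.full((480, 640), 2.0)   # fake frame: flat surface 2 m away
pose_a = np.eye(4)                   # first capture spot is the world origin

frame_b = np.full((480, 640), 2.5)   # second fake frame
pose_b = np.eye(4)
pose_b[:3, 3] = [1.0, 0.0, 0.0]      # camera moved 1 m to the right

cloud = np.vstack([
    depth_to_points(frame_a, fx, fy, cx, cy, pose_a),
    depth_to_points(frame_b, fx, fy, cx, cy, pose_b),
])
print(cloud.shape)  # one merged point cloud covering both viewpoints
```

Note that this sketch assumes each capture pose is already known; the automated alignment Bell describes also has to estimate those poses from the overlapping data itself (for example with registration techniques such as ICP) before the frames can be merged.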

The Matterport camera creates the 3D structure of an interior.

Is your core focus Real Estate?

MB: We’re about 80% real estate right now. We’ve had tremendous success there; we’ve sold several thousand cameras at this point, and the total number of buildings captured using Matterport is around 320,000.

So it is the new standard in real estate for giving an immersive tour. We’ve started to see people buying homes without ever visiting them, which is kind of amazing. There are a lot of cases where there’s a remote buyer and they need to make a decision quickly. Sometimes they’re not able to visit the property in person, or if a couple is buying the property and only one person can fly out, the other can experience it remotely in VR and get a sense of what it’s like to really be in the space.

tl;dr — PEOPLE HAVE PURCHASED HOMES WITHOUT SEEING THEM IN PERSON.

Experience a Matterport Home walkthrough here

Where do you see the company going in the future?

MB: We’re really building out the features of 3D content as a medium. Over the last few months, we’ve released a set of features that let you add tags and more specific information inside of a 3D space. We’re going to continue expanding that and also open up the capability for third-party developers to work on and customize the content.

We just launched the beta version of our SDK a couple of weeks ago, and that will allow third parties to incorporate Matterport models into their own VR apps.

We really see 3D content as a platform for creating a wide variety of interesting, rich experiences. That’s one area that we’re moving in.

The second area is that we’re very interested in other capture modalities. So for example, Google’s Project Tango is very interesting to us. We’ve written some software that runs on Project Tango that lets you do handheld scans of small spaces.

It’s not nearly at the visual quality level of our professional product, but it could end up being a handy tool for quickly grabbing all the dimensions in a room and having a visual reference of the size and shape of everything in it.

tl;dr — Soon you’ll be able to capture spaces using Google Project Tango phones.
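As an illustration of what "grabbing all the dimensions in a room" from a handheld scan could look like, here is a minimal sketch, my own assumption rather than Matterport's Tango software: it takes a scanned point cloud and reports the room's rough width, depth, and height from its axis-aligned bounding box.

```python
import numpy as np

def room_dimensions(points):
    """Estimate width, depth, and height (meters) of a scanned room.

    points: (N, 3) array of world-space points, assumed roughly aligned with
    the room's walls (a real tool would first estimate floor and wall planes).
    """
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    width, depth, height = maxs - mins
    return width, depth, height

# Hypothetical scan: random points filling a 4 m x 3 m room with a 2.5 m ceiling.
scan = np.random.rand(10_000, 3) * np.array([4.0, 3.0, 2.5])
print(room_dimensions(scan))  # roughly (4.0, 3.0, 2.5)
```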

This post by Hayim Pinson originally appeared on Medium as part of the “Beyond The Headset” interview series.
