In conversation with Head of Zapbox, Simon Taylor

10 min read

The Zapbox philosophy is to keep trying to push what we've got as far as we can.

Watch the full interview

Behind all-new Zapbox


Matt Scott:
So after starting out in cardboard, what has inspired the quite significant shift now, design-wise, into this whole new form for all-new Zapbox?


Simon Taylor:
Yeah, it's a great question. I think it comes down to the fact that when people try cardboard, they think, "Oh, that's better than I expected." And yeah, we wanted a product that felt well-matched to the quality of experience that you would get. Cardboard also had certain material constraints, so we'd done some interesting experiments around what happens if you do allow people some direct peripheral view as they're moving around. Because even with the wide-angle lens adapter, it doesn't help if you want to look at your feet; you would naturally glance downwards, and obviously the camera's always pointing forwards. So we were always going to be limited in how much field of view we could provide with just a single camera on the device. Yeah, the Zapbox philosophy is to keep trying to push what we've got as far as we can.


Making that open-peripheral-view headset was an important step forward, and when we tried some mock-ups we were, again, impressed by the impact that had on the experience. And at that point, there was no real way of doing it in cardboard anymore; the material just doesn't have the ability to take the stress and be stripped back to the level we wanted, so that you have that really open peripheral view.


Yeah. So those are the two reasons: making the product feel well-matched to the quality of experiences that it provides, and also having the necessary robustness to let us really strip back the supports and material, which opens up the peripheral view we're able to offer users with all-new Zapbox.


Matt Scott:
So whilst designing all-new Zapbox, did the progression of everyday smartphone tech have any impact on the physical form that you wanted it to take?


Simon Taylor:
Yeah. I suppose there have been a few trends. One of them was smartphones getting bigger. Back in the original smartphone days, there were still some iPhone 5S-type four-inch-screen devices that were relatively common, though they were already on the way out really. I think that's what allowed Google to do another version of Cardboard with a larger field of view and bigger lenses; a lot of the reason that was possible was the average smartphone screen getting bigger.


The other aspect that's changed more recently, in the last couple of years, is ultra-wide cameras being built into devices. That's now much more common across both iOS and Android. So we did consider, early on in the all-new Zapbox design process, whether we would just target those devices and not include the lens adapter. We looked into it a little, and it turns out that on some devices the ultra-wide cameras don't offer quite the same level of manual control that you get on the main lens, or they might not offer the 60 FPS that we really want for the smoothest experience.


And we're also obviously super keen to make Zapbox accessible and affordable and to use your existing smartphone. So after looking into it, we decided that for this version we're definitely going to stick with shipping a lens adapter in the box, but we will add support for devices that do have good ultra-wide 60 FPS cameras. And if the field of view is sufficient, you probably won't need the lens adapter on those devices.
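
To make that device-gating logic concrete, here is a minimal sketch of the kind of check being described. The types, names, and field-of-view threshold are all illustrative assumptions for this article, not part of the Zapbox app:

```csharp
// Illustrative sketch only: the capability gate described above, with
// made-up types standing in for whatever camera-query API the real app uses.
struct CameraProfile
{
    public bool IsUltraWide;
    public bool SupportsManualExposure; // short exposures reduce motion blur
    public int MaxFrameRate;            // frames per second
    public float HorizontalFovDegrees;
}

static class LensAdapterGate
{
    // Minimum field of view at which the built-in ultra-wide camera is
    // "enough" to skip the clip-on adapter (illustrative threshold).
    const float MinUsableFovDegrees = 100f;

    public static bool NeedsLensAdapter(CameraProfile cam)
    {
        // Fall back to the adapter unless the ultra-wide camera offers the
        // manual control and 60 FPS the tracking pipeline wants.
        if (!cam.IsUltraWide) return true;
        if (!cam.SupportsManualExposure) return true;
        if (cam.MaxFrameRate < 60) return true;
        return cam.HorizontalFovDegrees < MinUsableFovDegrees;
    }
}
```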


All-new Zapbox Tracking


Matt Scott:
So all-new Zapbox features a brand new approach to visual tracking. You've spoken about how efficient the point codes were in V1 and V2, but this new dot-dot-dash scheme seems like a complete departure from that. What was the development like?


Simon Taylor:
Yeah. Well, the development is still ongoing in terms of the tracking code. The design for it came out of some of the learnings we had from point codes. Point code tracking does actually work really nicely when we have control of camera exposure on the device and we're able to get a really short exposure mode that reduces blur, but not every camera gives us that level of control.


So one of the main driving forces behind changing the marker design this time around was to have a design that would definitely hold up better under blur, and so not be based on needing to detect edges accurately in all directions. That was one of the big driving forces. The other was to try and make the controllers smaller. Ideally, we really wanted a cylindrical wand design, which feels quite natural to hold but is also quite easy to cover up with content when you're trying to render something that isn't the controller. So keeping that profile as small as possible was a key driving force.


These dot-dot-dash patterns are essentially our detection feature. The nice thing about them is they're linear: a lot thinner in one direction than the other, which means you can actually spread them around a cylinder and still have them detectable. The core idea is that a single dot-dot-dash can itself be a marker, but in order to get a good solid track from that, you need to combine multiple ones at different angles to work out the full 3D pose accurately.
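
As a toy illustration of the dot-versus-dash idea, here's a sketch that classifies dark runs along a scan direction by their relative lengths. None of this is Zapbox SDK code; the real detector works on camera images and, as Simon describes, combines several such detections with a pose solver to recover the full pose:

```csharp
// Toy sketch of the dot-dot-dash idea (not Zapbox SDK code).
using System.Collections.Generic;
using System.Linq;

static class DotDashDecoder
{
    // Classify dark-run lengths (in pixels) along a scan direction into a
    // symbol string, e.g. { 4, 5, 13 } -> "..-" for dot-dot-dash.
    public static string Classify(IReadOnlyList<int> darkRunLengths)
    {
        if (darkRunLengths.Count == 0) return "";

        // A dash is assumed to be noticeably longer than a dot, so split the
        // runs at the midpoint between the shortest and longest observed run.
        double threshold = (darkRunLengths.Min() + darkRunLengths.Max()) / 2.0;
        return string.Concat(darkRunLengths.Select(r => r > threshold ? "-" : "."));
    }
}
```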



I love the design; I think it's in a nice place. The world markers themselves have free space in the middle now, which we think is going to be useful for all sorts of extended tracking approaches, and people will be able to put their own logos in there without needing to update tracking files or anything like that.


The other big change I should talk about here is world tracking and extended tracking: how we're aiming to incorporate the natural features in your environment to help with the stability of tracking and to reduce the number of those world codes you need. ARKit and ARCore have become very common on devices, but unfortunately we can't just use them in all-new Zapbox, because as soon as you put the fisheye ultra-wide camera adapter on the phone they don't like it very much. It's fun, you can try running some AR content with it, but it will sort of go all over the place. That's why we need our own tracking solution there.


And markers are still great because they are very solid: when you see one in a frame, you know how big it is, so you know exactly where the camera is relative to it. They define the origin of the content, so you can just physically put one down on a table, stick your headset on, and know your content is going to be there. You don't have to do the ARKit thing of moving the device around and selecting where you want it. And it means you can use tables that are plain white, or reflective, or made of glass, or whatever. You've got the freedom to do that.


But in the old version, because all we were tracking was markers, you needed quite a few of them around and you had to build a map of where they were. With this version, we get essentially the best of both worlds. You've got that really easy physical definition of where you want your content to appear, but once that's placed, we'll also build a map of the natural features in your environment, so that if the code is covered up or you look away from it, we can continue to provide that 6 DoF tracking.
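
A minimal sketch of that hybrid idea, assuming a structure the interview only hints at (this is not the actual Zapbox tracker): prefer the marker pose when the world code is visible, and fall back to the natural-feature map otherwise, so 6 DoF tracking continues either way.

```csharp
// Hypothetical hybrid-tracking fallback logic, for illustration only.
using UnityEngine;

public class HybridTrackerSketch
{
    Pose? _lastMarkerPose; // content origin, defined physically by the code

    public Pose? Update(Pose? markerPose, Pose? featureMapPose)
    {
        if (markerPose.HasValue)
        {
            // Marker visible: its known physical size pins down exactly where
            // the camera is, so treat this as ground truth. A real system
            // would also refine the natural-feature map against it here.
            _lastMarkerPose = markerPose;
            return markerPose;
        }
        // Marker covered or off-screen: the natural-feature map keeps 6 DoF
        // tracking alive relative to the origin the marker established.
        return featureMapPose ?? _lastMarkerPose;
    }
}
```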


All-new Zapbox Unity SDK

Matt Scott:
Development of all-new Zapbox includes an SDK for Unity. Why is this so exciting for both developers and consumers?


Simon Taylor:
Yeah, that's right. It's the first time that we've been able to expose Zapbox tracking to Unity developers, and yeah, we're super excited by the possibilities that unlocks. A lot of the reason is that Unity, in the last couple of years, have worked on their own XR tech stack, where they've separated out what they call provider implementations. So your ARCore and ARKit providers sit at the bottom of their stack, and then they've built cross-device, and even AR/VR-crossover, XR tools on top of that which content developers can then use.


But the really nice thing they've done is allow third parties to supply those provider plugins. So we can provide a Zapbox plugin that maps their cross-platform 6 DoF controller abstraction, if you like, to something that will work with Zapbox. For us, it means there's less Zapbox-specific code we have to write to bring Zapbox to Unity. And from Unity developers' point of view, it means that if they've built their content on that stack already, it should be relatively straightforward to get it running with Zapbox as a native iOS or Android app.


So those are the two great things from a developer perspective. From a consumer's perspective, it's something we hope will allow indie content developers to bring some of their content to Zapbox and publish it to the app store.
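
To illustrate the developer-side point, here is a small Unity script written purely against Unity's cross-platform XR input API. Nothing in it mentions Zapbox, which is exactly why a provider plugin can make existing content work:

```csharp
// Device-agnostic controller reading via Unity's XR input API. Whichever
// provider plugin is active (ARKit/ARCore-backed or, in principle, a Zapbox
// one) supplies the pose; this script doesn't know or care which.
using UnityEngine;
using UnityEngine.XR;

public class ControllerFollower : MonoBehaviour
{
    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid) return;

        // 6 DoF pose, supplied by whichever provider plugin is active.
        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos) &&
            device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
        {
            transform.SetPositionAndRotation(pos, rot);
        }
    }
}
```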


New mixed reality content for all-new Zapbox


Matt Scott:
So what are the plans for new mixed reality experiences with the new Zapbox? And how will they take advantage of the new features?


Simon Taylor:
Yeah, that's a great question. So we've got existing content from the first versions of Zapbox, which will all be working, and that covers quite a range of experiences. There's also some new stuff that I'd really love to build, and I think the Unity plugin will make it easier for us to build some of that in-house.


One of them is something we've talked about for a while called ZapBrush, which is essentially mixed reality or virtual reality painting. One of the reasons the Unity plugin will make that potentially easier is that there are some open source components from Tilt Brush themselves; I think the entire library of Tilt Brush brushes is available, for example. So, by using some of that existing Unity content, we can maybe bring a ZapBrush-type experience to users a bit more easily.


Model viewing, I think, is a really exciting and interesting use case. Sketchfab has a massive collection of models, and they have a download API, so we could build an app that would let you view at least the downloadable Sketchfab models. Model viewing in general is an area where we'll be thinking about the best way of building tooling, so that 3D artists can experience their content interacting with reality and walk around it.
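
As a sketch of how that model-viewer idea might start, the snippet below requests a model's downloadable archive from Sketchfab's public Data API. The endpoint shape and response fields are recalled from the v3 API and should be verified against Sketchfab's documentation; the token and model UID are placeholders:

```csharp
// Hedged sketch: fetch download URLs for a Sketchfab model (Unity 2020+).
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class SketchfabDownloadSketch : MonoBehaviour
{
    const string ApiToken = "YOUR_API_TOKEN"; // placeholder
    const string ModelUid = "MODEL_UID";      // placeholder

    IEnumerator Start()
    {
        string url = $"https://api.sketchfab.com/v3/models/{ModelUid}/download";
        using (UnityWebRequest req = UnityWebRequest.Get(url))
        {
            req.SetRequestHeader("Authorization", "Token " + ApiToken);
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError("Download request failed: " + req.error);
                yield break;
            }
            // The response JSON contains temporary URLs for the archive
            // formats (e.g. a glTF entry); parsing that JSON and loading
            // the model into the scene is left out of this sketch.
            Debug.Log(req.downloadHandler.text);
        }
    }
}
```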


Yeah. And then obviously there's the whole indie developer community. We have a pledge level specifically for developers that will include early beta access to the Unity plugin. We'll support the cardboard, point-code-based Zapbox tracking as well, so people can use their existing Zapboxes, and we're actually going to be sending some more of the cardboard ones out to people who pledge for those kits. So hopefully there'll be some great Unity-built content from the community as well, ready for users to enjoy. Yeah, we're looking forward to seeing what the community comes up with.


Matt Scott:
Awesome. Thanks so much, Simon, for your time today.


Simon Taylor:
No worries. Thanks. Great to chat about it.
