Volu is now in public beta, but how did we get there? — Volograms’ building-in-public series

1st - Friends & family. 2nd - Private beta. 3rd - You!

Rafa Pagés
Volograms


It’s only been a few months since we came out of stealth mode and announced Volu, the first app that allows users to capture people in AR. This week, we are opening our beta to any iOS user.

We believe Volu will unlock user-generated content for AR, but for that we need to nail the user experience: how do you integrate 3D capture and AR content playback without confusing everyone? How do you move from 2D video to AR seamlessly? We, of course, had some ideas (some of us are very opinionated 😅), but we needed to test them as much as we could before making Volu available to everyone. So today I want to tell you a bit more about our beta testing journey and some of the things we needed to do before opening up the Volu beta.

Volu is now in public beta!

Friends & family, our first beta heroes

The purpose of our first beta was to validate that the main flow of the app worked and made sense: capture a vologram, play with it in AR, and share the content you create. As simple as that.

We onboarded every new user ourselves and offered personal support along the way; we needed to understand what was obvious and what wasn’t. It was the first time our 3D reconstruction pipeline was tested by people outside of Volograms, and it didn’t take long for the first issues to show up. For example, testers who were familiar with 3D scanning moved around the subject while capturing, producing very strange and even psychedelic results; testers who were not familiar with volumetric video tried capturing objects such as chairs, stuffed toys, or sometimes their pets… we were not expecting that! All these attempts caused the system to crash, because one of its first steps is detecting the person in the scene: we had to make the pipeline more robust, and also improve our communication skills 🤣.
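To give you an idea of what that robustness fix looks like (without revealing our actual pipeline), here is a minimal sketch in Swift using Apple’s Vision framework: check that a frame actually contains a person before sending it off for reconstruction, and fail gracefully instead of crashing. The function and error names are purely illustrative.

```swift
import Vision
import CoreGraphics

/// Hypothetical error for the capture flow (illustrative, not Volu's real code).
enum CaptureError: Error {
    case noPersonDetected
}

/// Reject a frame early if no person is detected, instead of letting the
/// reconstruction pipeline fail further down the line.
func ensurePersonIsPresent(in frame: CGImage) throws {
    let request = VNDetectHumanRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    guard let humans = request.results, !humans.isEmpty else {
        // Show a friendly "point the camera at a person" message in the UI
        // rather than submitting a chair or a cat for reconstruction.
        throw CaptureError.noPersonDetected
    }
}
```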

Volu’s first beta only allowed vologram capture on LiDAR-enabled iOS devices; however, the rest of the app worked on any ARKit-enabled iOS device (iPhone 6S or newer), so we made sure there were some cool models our first testers could play with — even if they couldn’t capture their own volograms, they could still play with superpowers (cool special effects that can be added to a vologram) and find some bugs.
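If you are curious how an app can tell these device tiers apart, here is a rough sketch using standard ARKit checks (not necessarily how Volu does it internally): scene reconstruction is only offered on LiDAR-equipped devices, so that check doubles as a LiDAR check.

```swift
import ARKit

/// Illustrative capability tiers for an AR capture/playback app.
enum DeviceCapability {
    case captureAndPlayback   // LiDAR-equipped device
    case playbackOnly         // ARKit works, but no LiDAR
    case unsupported          // no ARKit support at all
}

func detectCapability() -> DeviceCapability {
    guard ARWorldTrackingConfiguration.isSupported else {
        return .unsupported
    }
    // Scene reconstruction is only available on LiDAR devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        return .captureAndPlayback
    }
    return .playbackOnly
}
```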

And, after a few weeks of intense testing, we decided to get out of our comfort zone and let more people in.

Volu’s closed beta, learning from our testers

We started inviting some of the people on our waitlist around mid-April, and continued doing so until this week. Over these three months we progressively invited around 1,000 testers, and we learned so much from them! If you are building your own app, I totally recommend going through a process like this.

During these months we got the app and back end ready for self-onboarding, integrated a ticketing system for support, fixed many bugs, and tweaked the UI so the user experience was significantly better. We also added some curated content to help testers get the most out of the app, integrated some tooltips, and extended the capture capabilities to support the iPhone XR and any newer iOS device. Lastly, we kept improving the reconstruction system to make it more robust (I will eventually write another post about how our human segmentation algorithm was chopping off hands and sometimes even heads, and how we fixed it 🤓), and improved our communication so testers know what they can capture with the app (although we still get a chair or a cat every now and then 😆).

But the most important part of this beta was understanding how our testers were using the app, and thanks to that we prioritised features that were initially planned for later in the year. For example, some testers were home alone or self-isolating during the pandemic and had no one to capture them, so we built a simple timer that let them capture themselves.
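The timer itself is a tiny thing; a minimal sketch of the idea in Swift could look like the snippet below, where startCapture is a hypothetical placeholder for whatever actually kicks off the recording.

```swift
import Foundation

/// Minimal self-capture countdown: wait a few seconds so the user can get
/// in front of the camera, then start recording.
/// `startCapture` is a hypothetical callback, not Volu's real API.
func startSelfCapture(after seconds: TimeInterval = 5,
                      startCapture: @escaping () -> Void) {
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
        startCapture()
    }
}
```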

All this communication happened mainly through our Discord server, where more than 700 enthusiasts have been sharing their creations, and some have been very active: check out Russ Johnson, Gabriele Romagnoli, Patricia Russ or Carlos Duque! If you are one of our testers: thanks so much! We wouldn’t have made it without your help!

What’s next 👉 test Volu today!

This week we made Volu’s iOS beta public, and you can find the download link on our website: https://getvolu.com. You will need to download Apple’s TestFlight app first.

Over the next few months we will add cool new functionality, including some features our testers have been asking for; we will launch the next version of our 3D reconstruction pipeline, with more detailed models and fewer artifacts; and we will get the app ready for release, on Android too!

In the meantime, if you are interested, go ahead and give Volu a try! We would love to get your feedback and build something that is cool and useful.

But remember it’s still in beta, so if you try to break it… you will! But have fun in the process! 😃
