We recently published The Condor, our first augmented-reality app, on the app stores. Half a year’s work is behind us, and we now have 22 unique scenes with 3D animation. Since indie mobile developers are only beginning to get to grips with building augmented-reality applications, I’ve decided to share our experience. Perhaps it will prove useful. But first, a word about how we got into augmented reality.


My Canadian company, Saturn Animation Studios, had been making interactive apps for children since 2014. We had published 10 apps based on classic children’s fairy tales. They were all structured like books, with pages consisting of text, limited 2D animation and games – jigsaw puzzles, colouring games and arcade games. If you’re interested, the financial return was lamentable. Each app cost at least 20 thousand dollars to make, but brought in a few dollars a day at most.

Last year, already intending to switch to games and looking into promising avenues, I went to my first PG Connects conference, in London. We even published two games shortly after the conference: a platform runner game inspired by the Stranger Things series, and also a memory-training game. But this was all done pretty much by inertia. The things that had really grabbed me at the exhibition were the developments in augmented reality. I decided to move in that direction sooner or later. But how? With fairy tales? Business apps? Games?

I should mention that Narcos had made an impression on me that lasted a whole year after the series started (until season 3, which ruined everything). Because the series was set in Colombia, and because I’ve lived for four years now in British Columbia, Canada, an “original” rhyme kept popping into my head: Colombia/British Columbia. Step by step, my screenwriter friend Vadim Sveshnikov and I came up with a story about a Colombian condor somehow finding himself in Canada and having this dialogue with a local inhabitant: “Is this Colombia?”. And the local replies: “Yes, this is Columbia. British Columbia.”

It was around that time that the Canada Media Fund announced a competition for the funding of Canadian/Colombian co-production projects. That was when it all came together. Long story short, we quickly finished the screenplay, and in August we got the funding to make a pilot version of the Web series and the mobile app.

The characters we started with changed a lot in the course of the work. And the actual concept changed too. Initially, we planned to create the story using the 2D animation we were familiar with, and add augmented reality elements to certain scenes, allowing children to have their photo taken with the characters or to see what they look like in the home setting. But, as the work progressed, it became clear that this was boring. Dull. So, we decided to use augmented reality technology for the whole story. All 22 scenes.

Several questions remained unanswered. How could we optimize the characters so as to prevent the app from becoming unacceptably bloated? And which augmented reality technology (or rather, which SDK) should we use for development?

The size issue was quickly settled. It turned out that the characters we used for animation couldn’t possibly go into the app as they were: the original models contained over 30 thousand polygons each, and the file sizes were off the scale. We reduced the polygon count step by step and finally hit on a compromise that didn’t significantly affect image quality. As a result, the full two episodes (22 original scenes) now take up about 230 MB on Android, which isn’t too bad – although on iOS they take up 600 MB.

Deciding on augmented reality tools was trickier. Initially, we were inclined toward a cross-platform solution like Vuforia, which would work on most modern devices. But then, in September, Apple and Google went on the offensive with ARKit and ARCore. Now we had a dilemma: who to go with?

We decided to do a test using Vuforia. It had the great advantage of being a cross-platform solution, and its SDK let you turn any image into a marker and launch an animation directly on it. This meant there was no need to print out a predefined marker – whatever is handy will do. Scan any picture with your phone, and it turns into a marker.

We made several test scenes, and they turned out OK. It all worked to plan. But. You couldn’t attach an animation to the floor or a table, and those are obviously ideal platforms for this kind of app. And, more importantly, the picture was unstable and often vanished. The most disheartening thing was that, if the image of the marker was not all that detailed, the animation would start to shake; it would wobble and occasionally disappear.

We then decided to give the native tools a go. This turned out to be more promising. The picture was stable and rich. The animation stayed put as if nailed down – you could walk round it all you liked, move towards it or away from it, and it didn’t go away. True, Google’s solution did seem a bit lacking in polish: the app took longer to find a surface, and the camera would lose the surface if so much as the slightest shadow fell on it. For all their drawbacks, though, the native solutions were clearly more reliable. We therefore started looking for a workaround for the “shadow” problem, and as soon as we found one, we made a final choice in favour of ARKit and ARCore.

Of course, this choice immediately cut off a significant segment of our audience: users with devices that these SDKs won’t run on – anything older than the iPhone 7 or the Samsung Galaxy S8. But it was a conscious decision to sacrifice them. Another, unexpected problem cropped up only after publication. We discovered that the Android version wouldn’t run without special incantations. The user first had to install the ARCore SDK, which (be it noted!) is still not available as an app on Google Play.

The most obvious solution to this problem – making the app check for the presence of the ARCore SDK on the device during installation – is not the way we went. We found no ready-made solutions, and devising one of our own would have taken who knows how long. What’s more, the result is the same for the user either way: if the ARCore SDK is not installed, it has to be installed manually. We therefore decided to simply flag up a warning that ARCore must be installed for the app to work properly. Now it all worked. But…
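The warning flow we settled on boils down to mapping the device’s ARCore state to a single user-facing message before any AR scene loads. Here is a minimal sketch of that logic; the `ArState` enum and the message strings are illustrative stand-ins rather than our actual code (on a real device you would query the state via Google’s `ArCoreApk.getInstance().checkAvailability()` helper rather than hard-code it):

```java
public class ArPreflight {
    // Illustrative stand-in for the availability states ARCore can report.
    enum ArState { INSTALLED, NOT_INSTALLED, UNSUPPORTED }

    // Map the state to the message shown before any AR scene loads.
    static String preflightMessage(ArState state) {
        switch (state) {
            case INSTALLED:
                return "OK"; // proceed straight to the AR scenes
            case NOT_INSTALLED:
                return "ARCore must be installed from Google Play for this app to work.";
            default:
                return "Sorry, this device does not support ARCore.";
        }
    }

    public static void main(String[] args) {
        // A device without ARCore gets the install warning instead of a broken scene.
        System.out.println(preflightMessage(ArState.NOT_INSTALLED));
    }
}
```

The point of keeping this as a dumb state-to-message mapping is that the app never tries to initialize an AR session it knows will fail – the user sees one clear warning instead of a crash or a black camera view.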

To be honest, when I looked at the steps the user has to take, I realized that we would lose a significant proportion of users at this stage. Many would simply not risk clicking “OK” on a succession of pop-up windows with incomprehensible warnings from Google. Why Google doesn’t solve this issue while actively promoting ARCore is a mystery. From this project, then, we have definitely concluded that, for both ease of development and ease of use, Google’s solution is greatly inferior to Apple’s.

That, basically, is the whole story of how we worked on our Condor. Since I set out to talk specifically about implementing augmented reality, I haven’t touched on issues like creating the app’s UI, animation and Unity 3D compatibility, ASO, monetization, and so on.

If you would like to have a look at our Condor, it is free to install via the links on this page.  

I will be happy to receive any comments – and also to collaborate with any talented programmers who would like to try their hand at augmented reality applications.