ARCore for Android developers - pARt 1: The basics



Nowadays, augmented reality sounds like a buzzword, but as an Android developer you actually have a fairly easy-to-use toolset for basic tasks - like displaying a model - with only a few lines of code. The goal of this article is to introduce you to the tools and methods of the ARCore framework, focusing mostly on the Sceneform helper library.

First of all, you should have a look at the following guides:

If you are done with the guides, let's get started. You'll create an application in which you can add a chosen model to your augmented environment!


This guide and sample application will use Kotlin and coroutines with a twist. All long-running tasks in Sceneform should be started from the main thread, and the library handles concurrency for us, but we'll use the suspending capabilities of coroutines anyway.

You'll need Android Studio 3.1 or newer and the Google Sceneform Tools (Beta) plugin installed. Hint: always make sure the plugin version matches the ARCore dependency version; a mismatch can lead to errors that are very hard to debug.

Create a new project with an Empty Activity and a minimum API level of 24. This seems pretty high right now, but Sceneform requires it, and most ARCore-supported devices run at least this API level.


Make sure that your project level build.gradle file contains the google() repository, and add the following to the app level build.gradle:

android {
    compileOptions {
        sourceCompatibility 1.8
        targetCompatibility 1.8
    }
}

dependencies {
    // ARCore
    def ar_core_version = '1.14.0'
    implementation "com.google.ar:core:$ar_core_version"
    implementation "com.google.ar.sceneform.ux:sceneform-ux:$ar_core_version"
    implementation "com.google.ar.sceneform:core:$ar_core_version"

    // Coroutines
    def coroutines_version = '1.2.0'
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-core:$coroutines_version"
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-jdk8:$coroutines_version"
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:$coroutines_version"
}

The compileOptions configuration is necessary because the ARCore library relies on Java 8 features. Alongside the usual coroutine dependencies, you may notice the jdk8 extension library, which you'll use to bridge coroutine functionality with the CompletableFuture type from JDK 8.
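To illustrate what that bridge buys you, here is a minimal, pure-JVM sketch of what an await()-style extension on CompletableFuture does under the hood. The real extension is provided by kotlinx-coroutines-jdk8; awaitSketch below is a hypothetical stand-in built only on the Kotlin standard library:

```kotlin
import java.util.concurrent.CompletableFuture
import kotlin.coroutines.Continuation
import kotlin.coroutines.EmptyCoroutineContext
import kotlin.coroutines.resume
import kotlin.coroutines.resumeWithException
import kotlin.coroutines.startCoroutine
import kotlin.coroutines.suspendCoroutine

// Hypothetical stand-in for kotlinx-coroutines-jdk8's await():
// suspend the coroutine until the CompletableFuture completes,
// then resume it with the value (or the failure).
suspend fun <T> CompletableFuture<T>.awaitSketch(): T =
    suspendCoroutine { cont ->
        whenComplete { value, error ->
            if (error != null) cont.resumeWithException(error)
            else cont.resume(value)
        }
    }

fun main() {
    // An already-completed future resumes the coroutine synchronously,
    // so the result is printed before main() returns.
    val future = CompletableFuture.completedFuture(42)
    suspend { println(future.awaitSketch()) }
        .startCoroutine(Continuation(EmptyCoroutineContext) { it.getOrThrow() })
}
```

In the real application you'll simply call await() from a coroutine, as shown later in the model-loading code; the library version additionally supports cancellation, which this sketch omits.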

Manifest modifications

Next, you'll need to update the AndroidManifest.xml file:

<manifest ...>

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature android:name="android.hardware.camera.ar" />
    <uses-feature android:glEsVersion="0x00030000" android:required="true" />

    <application
        android:largeHeap="true"
        ... >
        <meta-data android:name="com.google.ar.core" android:value="required" />
        ...
    </application>
</manifest>


You're requesting the CAMERA permission, declaring the AR camera feature and the minimum OpenGL ES version, and setting the "AR required" value, which restricts the application on the Play Store to AR-capable devices.

Add the sampledata folder

The next step is to change the project tab's view mode from Android to Project and create a new sampledata folder inside the app folder.

Switching to Project view

The created sampledata folder inside app

You can put all original model files into this folder. These won't be packaged into the final application, but will be part of the project. You'll use this folder later!

Would you be surprised if I said you are already halfway to your goal?

Plane finding

So let's assume you have a MainFragment or MainActivity that starts when the application is launched. Its layout XML should look like this:

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/arView"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>


The root ViewGroup contains only a single fragment element which is referencing ArFragment. This Fragment is a complete, all-in-one solution for handling the basic AR related configuration, checking the ARCore companion application availability, checking the API level, handling permissions, and so on.

Now you can install the application on an emulator - or preferably, a physical device. You should see something like this (with permission and companion app handling at first start, if needed):

The initial run of the application, with plane finding

As you can see, the built-in Fragment shows a hand-waving animation that guides the user to move the phone around, and when the system finds a plane, it highlights it with small white dots. Note that ARCore only works on textured, non-homogeneous surfaces! So, for example, it's nearly impossible for it to detect a plain white wall or floor.

Add your model

Next, you'll need to find a model to use. You could use your own models made in Blender, 3ds Max, Maya, etc., or download one from the Internet. In my opinion, a good source for this is Sketchfab, where you can find free models under CC licensing - with a bonus feature. In many cases, the textures will not appear on your model when you place it in the AR environment. There are many ways to handle this, but the simplest is Sketchfab's bonus: you can download the model automatically converted to glTF, which is one of the supported file formats. If that doesn't work either, I suggest looking for another model; as an Android developer, debugging or fixing 3D models is generally not worth the time.

Because a certain series is so popular right now (and I personally like it too), you will use a Baby Yoda model in the application, this one:

A note about the model: it's made up of around 10,000 triangles and multiple image texture files, which makes it fairly complex. This greater model complexity comes with a greater memory footprint, which is why you added the android:largeHeap="true" option to the AndroidManifest.xml. At least it looks great!

You should save this as an auto-converted glTF, unpack it, and copy the model file with all related files (textures, .bin, etc.) into the previously created sampledata folder. Then, in Android Studio, right-click the .gltf file and select the Import Sceneform Asset option. This opens a dialog:

The Import Sceneform Asset dialog

Here you can leave everything at its defaults and just click Finish.

If everything goes well, a Gradle task will run and convert your model to a Sceneform Asset (.sfa) and a Sceneform Binary (.sfb) file. You will find the latter in your src/main/assets folder, and this is what gets compiled into your application. The relation between the two is that the sfb is generated from the sfa, so you should always modify the sfa file to apply any changes to your binary model. At the end of this tutorial, if you find that your model is shown too small or too large, open the generated sfa file, look for the scale parameter, and set the value to your liking. For the Baby Yoda model, you can try setting it to 0.15.
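For reference, the scale parameter sits inside the model block of the generated .sfa file. The exact contents vary from model to model, so treat this as a rough sketch with the other entries elided:

```
model: {
  ...
  scale: 0.15,
},
```

After editing the sfa, rebuild the project so the sfb is regenerated with the new scale.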

So right now you have a converted model and a working plane detecting application, but how do you add the model to the scene?

Placing the model

First, you should load the binary model into the ARCore framework. I assume you are familiar with coroutines and use a CoroutineScope somewhere in your application to handle background tasks. For the sake of simplicity, you can also use the lifecycleScope of a Fragment.

private fun loadModel() {
    lifecycleScope.launch {
        yodaModel = ModelRenderable
            .builder()
            .setSource(context, Uri.parse("baby_yoda.sfb")) // use your generated .sfb file name
            .build()
            .await()
        Toast.makeText(context, "Model available", Toast.LENGTH_SHORT).show()
    }
}

Here, you build a ModelRenderable with a given source and await() its completion. The build() method returns a CompletableFuture, and the aforementioned JDK8 coroutines library provides the await() extension for it. This component stores the model and is responsible for rendering it. The model name in the Uri.parse() call must match the name of the generated .sfb file.

Then you initialize the tap listener. For this purpose, you need a reference to the contained Fragment instance:

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    arFragment = childFragmentManager.findFragmentById(R.id.arView) as ArFragment
    loadModel()
    initTapListener()
}

With that, the tap listener initialization is as follows:

private fun initTapListener() {
    arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
        val anchorNode = AnchorNode(hitResult.createAnchor())
        anchorNode.setParent(arFragment.arSceneView.scene)
        val yodaNode = Node()
        yodaNode.renderable = yodaModel
        yodaNode.setParent(anchorNode)
    }
}

As you can see, it's pretty easy to add a model to your AR scene. In just a few steps:

  • Assign a tap listener to the Fragment, just like a click listener.
  • Create an AnchorNode from the given hitResult.
  • Set the Fragment's scene as its parent.
  • Create a Node() which will show the ModelRenderable, and set the anchorNode as its parent.

And that's it, you are done! Build and run the application, find a plane, and place the model by tapping on it! Magic.

The final demo with models being added to the scene


This guide should have given you a small introduction to AR development as an Android developer. I hope you liked this article and the small but effective sample application.

You can find the source code here.

We are planning to release more AR related articles, so be sure to follow us!
