When Sceneform was discontinued and archived back in 2020, creating ARCore apps on Android became more difficult. Inspired by the original Sceneform APIs and based on the ARCore samples, I created an AugmentedFaceFragment
and an AugmentedFaceListener
interface that make it easy to build ARCore Augmented Faces features on Android.
I will be releasing 3 articles, each building a different Augmented Faces feature with AugmentedFaceFragment
and the AugmentedFaceListener
interface.
Part 1 of the series covers the basics of AugmentedFaceFragment
and AugmentedFaceListener
, as well as an overview of all the helper classes. At the end of this article, you will build a simple demo by writing just a few lines of code.
Clone the repository:
git clone https://github.com/droid-girl/arfaces_labs.git
Our starting point is a modified version of the ARCore SDK sample for Augmented Faces. The code has been restructured so that we can easily add different textures and objects to a face.
Helpers - original Java files from the sample code
Rendering - original Java files from the sample code that handle rendering of AR objects and the background
Additional files for the repository:
AugmentedFaceFragment.kt - handles rendering and tracking of Augmented Faces
AugmentedFaceListener.kt - handles the Add and Update Augmented Face callbacks
AugmentedFaceNode.kt - handles the rendering of a face, including a texture and associated 3D models
AugmentedFaceRenderer.kt - renders the face texture
FaceRegion.kt - renders a 3D model located on a defined FaceLandmark
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
    <androidx.fragment.app.FragmentContainerView
        android:name="com.ar.arfaces.arface.AugmentedFaceFragment"
        android:id="@+id/face_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_gravity="top" />
</LinearLayout>
class MainActivity : AppCompatActivity(), AugmentedFaceListener {

    private lateinit var binding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityMainBinding.inflate(layoutInflater)
        binding.faceView.getFragment<AugmentedFaceFragment>().setAugmentedFaceListener(this)
        setContentView(binding.root)
    }

    override fun onFaceAdded(face: AugmentedFaceNode) {}

    override fun onFaceUpdate(face: AugmentedFaceNode) {}
}
In activity_main.xml
we add AugmentedFaceFragment
as the main view. To receive events from this fragment, we define an AugmentedFaceListener
and set it on the fragment.
The onFaceAdded
method is called when ARCore detects a new face, and onFaceUpdate
is called on each frame update.
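As a side note, any ARCore app also needs camera access and an ARCore entry in its AndroidManifest.xml. The sample project already declares these, but for reference they look like this:

```xml
<!-- Camera access is required for AR -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Restricts visibility on Google Play to ARCore-supported devices -->
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

<application>
    <!-- "required" means the app cannot run without ARCore installed -->
    <meta-data android:name="com.google.ar.core" android:value="required" />
</application>
```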
Let’s add a face texture as our next step.
We will use the ARCore sample assets for our first face mask. You can find them in the assets/models
folder of the project. Let’s add freckles.png
as the face texture:
In MainActivity.kt
, modify the onFaceAdded
method as follows:
override fun onFaceAdded(face: AugmentedFaceNode) {
    face.setFaceMeshTexture("models/freckles.png")
}
The AugmentedFace class uses the face mesh and center pose to identify face regions. These regions are the nose tip, the left side of the forehead, and the right side of the forehead.
This project includes the AugmentedFaceNode
class. Inspired by Sceneform, it is a node that renders visual effects on a face detected by ARCore. AugmentedFaceNode
defines the same face regions as AugmentedFace
in a companion object:
companion object {
    enum class FaceLandmark {
        FOREHEAD_RIGHT,
        FOREHEAD_LEFT,
        NOSE_TIP
    }
}
Later in this tutorial, we will extend the FaceLandmark
enum class and add our own face regions.
AugmentedFaceNode
includes a faceLandmarks HashMap
that maps each FaceLandmark to a 3D model.
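To make the idea concrete, here is a minimal, self-contained Kotlin sketch of how such a map associates landmarks with model assets. The asset paths and the modelFor helper are hypothetical and for illustration only; in the actual project, faceLandmarks stores renderable face regions rather than plain strings:

```kotlin
// Illustrative sketch only: mirrors the FaceLandmark enum from AugmentedFaceNode.
enum class FaceLandmark {
    FOREHEAD_RIGHT,
    FOREHEAD_LEFT,
    NOSE_TIP
}

// Map each landmark to the model asset that should be rendered there
// (hypothetical paths; the real map holds 3D model objects).
val faceLandmarks = hashMapOf(
    FaceLandmark.NOSE_TIP to "models/nose.obj",
    FaceLandmark.FOREHEAD_LEFT to "models/forehead_left.obj"
)

// Look up the model registered for a landmark, or null if none is set.
fun modelFor(landmark: FaceLandmark): String? = faceLandmarks[landmark]
```

On each frame, a render loop can then iterate over the map and draw the registered model at the pose of the corresponding face region.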
The source code for this project can be found here.