Now that you’ve had a chance to augment reality in your Xamarin.iOS apps with ARKit, it’s time to explore Google’s take on AR in your Xamarin.Android apps.

The new ARCore SDK provides APIs for Augmented Reality features, such as motion tracking, plane detection, and light estimation. These are the building blocks you will use to add AR experiences to your Android apps.

Getting Started

ARCore is currently only available on select devices such as the Google Pixel, Google Pixel 2, and the Samsung Galaxy S8.

In order to use ARCore, you need to prepare your device by downloading and installing arcore-preview.apk.
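If your device is connected over adb, installing the downloaded preview APK is a single command (the `-r` flag reinstalls the package if an older build is already present):

```
adb install -r arcore-preview.apk
```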

After you set up your device for ARCore development, you need to install the ARCore prerelease NuGet package.

ARCore API Basics

To help you detect surfaces to place objects on and calculate their location in space relative to the camera, ARCore uses a few basic types.

  • Session: Your main point of interaction with ARCore. It manages the AR state, keeping track of any anchors you add, surfaces the engine has detected, and current snapshots of the device.
  • Plane: A surface the SDK has detected, onto which you can place an Anchor to describe a fixed real-world location (including orientation) for an object. Currently, upward- and downward-facing surfaces can be detected separately (think floor and ceiling).
  • Frame: The current snapshot of the AR state, returned each time you call the session’s .Update() method. Each frame contains information about the camera’s orientation and its relation to the real world, and helps compute projection matrices for displaying visual representations to the user.
  • HitTest(..): A convenient method on Frame, which can help determine whether tapped coordinates on the display intersect with any detected planes, for example.
  • LightEstimate: Another interesting feature of the SDK; you can obtain a LightEstimate from a given frame, which includes the PixelIntensity of the camera view.

Basic Walkthrough

We’ve ported the HelloAR sample to Xamarin, and you can go check it out on GitHub! Now let’s walk through a few of the basic things going on in this sample.

First, in your activity you need to create a session in OnCreate and make sure ARCore is supported on the device at runtime:
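Based on the preview bindings used by the sample (the `Com.Google.AR.Core` namespace; member names may differ slightly in later versions of the NuGet package), that setup looks roughly like this:

```csharp
using Com.Google.AR.Core;

protected override void OnCreate (Bundle savedInstanceState)
{
    base.OnCreate (savedInstanceState);
    SetContentView (Resource.Layout.Main);

    // Create the ARCore session and a default configuration
    session = new Session (this);
    var config = Config.CreateDefaultConfig ();

    // Bail out gracefully if this device can't run ARCore
    if (!session.IsSupported (config)) {
        Toast.MakeText (this, "This device does not support AR", ToastLength.Long).Show ();
        Finish ();
        return;
    }
}
```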

Remember, you also need to request Android.Manifest.Permission.Camera permissions to display the live camera feed / augmented reality view to the user. In the “HelloAR” sample, we use a GLSurfaceView to render camera and augmentations to the user. Make sure you set up your GL Surface or look at how it’s done in the sample code.
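A sketch of both steps, assuming a runtime permission check on API 23+ and a `surfaceView` field for the sample’s GLSurfaceView (exact setup values follow the HelloAR sample):

```csharp
// Request the camera permission at runtime (required on API 23+)
if (CheckSelfPermission (Android.Manifest.Permission.Camera) != Permission.Granted)
    RequestPermissions (new [] { Android.Manifest.Permission.Camera }, 0);

// Configure the GLSurfaceView that renders the camera feed and augmentations
surfaceView.PreserveEGLContextOnPause = true;
surfaceView.SetEGLContextClientVersion (2);
surfaceView.SetEGLConfigChooser (8, 8, 8, 8, 16, 0); // alpha used for plane blending
surfaceView.SetRenderer (this);
surfaceView.RenderMode = Rendermode.Continuously;
```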

With a session running, we can obtain a snapshot of the AR system state in our GL surface’s OnDrawFrame implementation. After checking that the frame is in a Tracking state, we can look for any hit results and, if they intersect a plane, add an anchor to that plane.
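In outline, and again assuming the preview API surface (a `queuedTaps` collection of touch events is a stand-in for however you queue taps from OnTouch), that looks something like:

```csharp
public void OnDrawFrame (IGL10 gl)
{
    GLES20.GlClear (GLES20.GlColorBufferBit | GLES20.GlDepthBufferBit);

    // Grab the current snapshot of the AR state
    var frame = session.Update ();

    if (frame.GetTrackingState () != Frame.TrackingState.Tracking)
        return;

    // See if any queued taps hit a detected plane
    foreach (var tap in queuedTaps) {
        foreach (var hit in frame.HitTest (tap)) {
            var planeHit = hit as PlaneHitResult;
            if (planeHit != null && planeHit.IsHitInPolygon) {
                // Anchor an object at the real-world location that was tapped
                session.AddAnchor (planeHit.HitPose);
                break;
            }
        }
    }
}
```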

We also want to render the various objects in our scene in our drawing method. The HelloAR sample has various renderers to do the heavy OpenGL lifting and achieve this based on the projections calculated from the frame:
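The drawing itself is driven by the camera and projection matrices that the session and frame compute for us; the renderer names below (`backgroundRenderer`, `planeRenderer`, `objectRenderer`) follow the sample and are otherwise illustrative:

```csharp
// Near/far clipping planes of 0.1m and 100m, as in the sample
float[] projMatrix = new float[16];
session.GetProjectionMatrix (projMatrix, 0, 0.1f, 100.0f);

float[] viewMatrix = new float[16];
frame.GetViewMatrix (viewMatrix, 0);

// Light estimation lets rendered objects roughly match real-world lighting
var lightIntensity = frame.LightEstimate.PixelIntensity;

// Draw the camera image, any detected planes, and each anchored object
backgroundRenderer.Draw (frame);
planeRenderer.DrawPlanes (session.AllPlanes, frame.Pose, projMatrix);
objectRenderer.Draw (viewMatrix, projMatrix, lightIntensity);
```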

Once you’ve dissected the sample, you can see that the actual ARCore code is relatively straightforward; most of the sample code deals with OpenGL rendering.

Again, be sure to check out the HelloAR sample in its entirety on GitHub! We look forward to seeing what Augmented Reality experiences you create with ARCore in your Xamarin.Android apps.

Discuss this post on the forums!