Cool AR apps were possible before, but they have become far more sophisticated thanks to new features in Google's ARCore and Sceneform. Objects can now be placed on vertical planes such as walls or doors, 3D images render faster and more easily, and augmented reality scenes can be shared between devices. Overall, the AR experience is now much more capable and easier to build than in preceding versions.
In a previous post (and another), we discussed what augmented reality is and how it can help e-commerce, as well as how to get AR to recognize vertical planes. In this post, we will dissect our demo app, which can virtually furnish a room with customized, detailed 3D augmented reality images so that shoppers can see how a piece of furniture will look in real life. We break this process down into three parts:
Now that we’ve laid out an overview for the post, let’s discuss how to create 3D objects and make them usable for our augmented reality app.
The first step in the augmented reality process for an e-commerce app is getting a 3D model of a physical object. There are several ways to accomplish this:
For our example, we will create a 3D piece of art with Blender. Here is our artwork in the Blender editor:
Our next step was choosing a file format for the model: whichever format we chose had to work on both iOS and Android, as our app operates on both platforms. After a short investigation, we decided to use an ".obj" file, because it is a universal format for 3D image editors. We then worked out the 3D object's scale: for example, an object in 3D Max might be three meters long and look normal in the editor, but appear far too large when shown through the AR app.
Therefore, we formulated rules that adjust the size of any 3D object that will be displayed on a device screen, so that the object appears the same size on screen as it would in real life. We implemented these rules for 3D Max alone, but the same practice could be applied to any other 3D editor.
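These rules live in the 3D editor's export pipeline, but the same correction can also be applied at runtime. Here is a minimal sketch (the helper name is ours, not part of the demo; Sceneform units are meters) that scales a node so the model's bounding box matches the product's real-world height:

// Hypothetical helper: scale a node so the rendered model matches the
// product's real height in meters. Sceneform units are meters, so the
// correction is a simple ratio against the model's bounding box.
private void scaleToRealWorldHeight(Node node, ModelRenderable renderable, float realHeightMeters) {
    // The collision shape of a loaded model is its axis-aligned bounding box.
    Vector3 modelSize = ((Box) renderable.getCollisionShape()).getSize();
    float factor = realHeightMeters / modelSize.y;
    node.setLocalScale(new Vector3(factor, factor, factor));
}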
Now that we have created a 3D model and edited it to size, the next step is rendering it so that it can be displayed virtually. Before the ARCore updates, the best option for rendering was OpenGL, a low-level graphics API used separately from ARCore. Now we can use Sceneform, a framework that Google has added to the ARCore SDK (Software Development Kit) to provide built-in rendering of 3D objects. It comprises a high-level scene graph API, a physically based renderer, and an Android Studio plug-in for importing, viewing, and building 3D assets. These tools make integration with ARCore easy, simplifying the process of building AR apps.
Before we continue with the rendering process, here is an explanation of how the various elements of Sceneform work:
Now that Sceneform has been explained, let’s look at how you can use it to put a 3D model onto the augmented reality app.
The app build.gradle includes two Sceneform dependencies, the plug-in, and a rule for converting sampledata assets such as the 3D art model that we built above. This rule creates a resource (in the .sfb file format) that is packaged with your augmented reality app and that Sceneform can load at runtime:
…
dependencies {
    …
    implementation 'com.google.ar.sceneform:core:1.4.0'
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.4.0'
}

apply plugin: 'com.google.ar.sceneform.plugin'
sceneform.asset(
        'sampledata/models/armchair.obj', // source asset to convert
        'default',                        // material to apply
        'sampledata/models/armchair.sfa', // generated asset description
        'src/main/res/raw/armchair'       // output .sfb resource
)
The next step is to load the resource as a renderable onto our augmented reality app:
// Build the renderable from the .sfb resource generated by the Sceneform plug-in.
ModelRenderable.builder()
        .setSource(this, R.raw.armchair)
        .build()
        .thenAccept(renderable -> armchairRenderable = renderable)
        .exceptionally(
                throwable -> {
                    Log.e(TAG, "Unable to load Renderable.", throwable);
                    return null;
                });
The Android Studio plug-in provided by Sceneform then helps to import, convert, and preview the 3D models. With the 3D object made and rendered, we will review some basic augmented reality concepts and terms to make the walkthrough of our app a little easier to follow.
Before we proceed with the step-by-step walkthrough of how to furnish a room with an augmented reality app, we need to understand five main concepts of the ARCore SDK:
Since 2017, Google has invested heavily in ARCore. Here is a brief summary of the recent improvements:
The final step to using our augmented reality app is adding an ArFragment to an Android layout file:
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context="com.google.ar.sceneform.samples.hellosceneform.HelloSceneformActivity">

    <fragment
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:id="@+id/ux_fragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>
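In the Activity's onCreate, the fragment is then retrieved by its id, just as in Google's HelloSceneform sample:

// Grab the ArFragment declared in the layout; it manages the ARCore session,
// camera permission requests, and plane discovery for us.
arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);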
Having explained how to build 3D models, convert them into Sceneform resources, and load them into ARCore, it’s time to talk about how to actually use our augmented reality app itself.
If we run the demo application, we see a hand holding a phone, which is the app’s equivalent of a cursor:
The app immediately starts detecting feature points to anchor world coordinates, showing the detected point cloud on the surface. If several points belong to one surface, a plane has been identified:
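Under the hood, ARCore exposes every plane it has found through the Session API. As a small illustrative sketch (not code from our demo), you can iterate over the detected planes and check whether they are being tracked:

// Ask the ARCore session for all planes detected so far and check whether
// at least one of them is currently being tracked.
Session session = arFragment.getArSceneView().getSession();
for (Plane plane : session.getAllTrackables(Plane.class)) {
    if (plane.getTrackingState() == TrackingState.TRACKING) {
        // A usable surface (floor, wall, or ceiling) has been identified.
    }
}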
The next step is selecting an image to place on the selected plane. In this case, we will use furniture to demonstrate what our app can do for e-commerce purposes.
We embedded several 3D objects into our demo application: a piece of artwork and a TV set to hang on the wall; several furniture items such as a sofa, a chair, and a table; a chandelier to attach to the ceiling; and some other decorative objects such as vases. In a real-world application, all of these objects would be stored on a remote server, and there might be hundreds of thousands of them.
We also created a helper class, Model, to simplify operations with 3D objects stored in .sfb files. Model is defined below:
import com.google.ar.core.Plane;

/** Describes a 3D object: its .sfb resource, a display name, and the plane type it attaches to. */
public class Model {

    private int modelSfbRes;
    // Optional second .sfb resource (e.g., an alternate variant of the model); may be null.
    private Integer modelSfbRes2;
    private String displayableName;
    private Plane.Type planeType;

    public Model(int modelSfbRes, String displayableName, Plane.Type planeType) {
        this.modelSfbRes = modelSfbRes;
        this.displayableName = displayableName;
        this.planeType = planeType;
    }

    public int getModelSfbRes() {
        return modelSfbRes;
    }

    public String getDisplayableName() {
        return displayableName;
    }

    public Integer getModelSfbRes2() {
        return modelSfbRes2;
    }

    public Plane.Type getPlaneType() {
        return planeType;
    }
}
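As a usage example, the demo's embedded catalog could then be declared as a simple list (every resource id except R.raw.armchair is hypothetical here):

// Hypothetical catalog of the demo's embedded objects; in a production
// app this list would be fetched from a remote server instead.
List<Model> models = Arrays.asList(
        new Model(R.raw.armchair, "Armchair", Plane.Type.HORIZONTAL_UPWARD_FACING),
        new Model(R.raw.sofa, "Sofa", Plane.Type.HORIZONTAL_UPWARD_FACING),
        new Model(R.raw.artwork, "Artwork", Plane.Type.VERTICAL),
        new Model(R.raw.chandelier, "Chandelier", Plane.Type.HORIZONTAL_DOWNWARD_FACING));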
After we have selected the furnishing and created a corresponding Model for it, the next step is to actually place the object in the room. In order to do so, when the user taps on the screen, the app needs to:
All of these steps are handled by the following code:
arFragment.setOnTapArPlaneListener(
        (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
            // Reject the tap if the chosen model belongs on a different plane type
            // (e.g., a chandelier tapped onto the floor).
            if (plane.getType() != modelInteractor.getChosenModel().getPlaneType()) {
                Toast toast =
                        Toast.makeText(this, "Model plane doesn't match tapped plane", Toast.LENGTH_LONG);
                toast.setGravity(Gravity.CENTER, 0, 0);
                toast.show();
                return;
            }
            // Build the renderable for the chosen model and place it at the hit point.
            ModelRenderable.builder()
                    .setSource(this, modelInteractor.getChosenModel().getModelSfbRes())
                    .build()
                    .thenAccept(model ->
                            placeModel(model, hitResult, modelInteractor.getChosenModel().getPlaneType()))
                    .exceptionally(this::showModelBuildError);
        });
There are two steps to displaying an object. First, we establish an anchor, which fixes a point of reference in the real world. Then we create a TransformableNode, bind it to the anchor, and use it to move the object. Doing so results in an image like the armchair below:
private void placeModel(ModelRenderable renderable, HitResult hitResult, Plane.Type planeType) {
    Vector3 size = ((Box) renderable.getCollisionShape()).getSize();

    // Create the Anchor.
    Anchor anchor = hitResult.createAnchor();
    changeAnchorForPainting(anchor);
    AnchorNode anchorNode = new AnchorNode(anchor);
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    // Create the transformable node and add it to the anchor.
    TransformableNode node = new TransformableNode(arFragment.getTransformationSystem());
    node.setParent(anchorNode);

    if (planeType == Type.HORIZONTAL_DOWNWARD_FACING) {
        // Ceiling: shift the model down by its own height and flip it 180°
        // around Z so that it hangs from the plane instead of standing on it.
        Node downward = new Node();
        downward.setParent(node);
        downward.setLocalPosition(new Vector3(0, size.y, 0));
        downward.setLocalRotation(new Quaternion(0, 0, 1, 0));
        downward.setRenderable(renderable);
    } else if (planeType == Type.VERTICAL) {
        // Wall: turn the model to face away from the wall's normal.
        Node downward = new Node();
        downward.setParent(node);
        downward.setLookDirection(curPlaneNormal.negated());
        downward.setRenderable(renderable);
    } else {
        // Floor and other upward-facing planes need no extra rotation.
        node.setRenderable(renderable);
    }

    // Restrict dragging to planes of the same type, and select the node
    // so the user can transform it immediately.
    TranslationController controller = node.getTranslationController();
    controller.setAllowedPlaneTypes(EnumSet.of(planeType));
    node.select();
}
Vertical plane recognition is a new feature introduced to ARCore in recent months, enabling the placement of objects on vertical planes such as walls or doors without any lengthy processes or separate algorithms. This recognition also helps with depth perception: a sofa can now be placed up against a wall, letting users see how far the sofa extends from the wall. Ceilings are likewise detected (as downward-facing horizontal planes), and objects such as chandeliers need to be rotated so that they are displayed correctly. Here is the code for rotating and placing a ceiling object, with the resulting chandelier below:
TransformableNode node = new TransformableNode(arFragment.getTransformationSystem());
// "size" is the model's bounding box, taken from placeModel() above.
Node downward = new Node();
downward.setParent(node);
// Shift the model down by its own height and flip it 180° around Z
// so that it hangs from the ceiling plane.
downward.setLocalPosition(new Vector3(0, size.y, 0));
downward.setLocalRotation(new Quaternion(0, 0, 1, 0));
downward.setRenderable(renderable);
ARCore cannot detect a vertical plane when the wall is a single solid color, because there are no feature points to track. In these cases, we can determine the vertical surface with the algorithm we described in our last blog post on augmented reality:
private MotionEvent getNewTapOnBottomCloud(MotionEvent tap, Frame frame) {
    if (tap == null) {
        return null;
    }
    // Shift the tap point step by step until it hits the floor plane in front
    // of the wall; the distance to that hit approximates the distance to the wall.
    for (int i = 0; i < 70; i++) {
        for (HitResult hit : frame.hitTest(tap)) {
            Trackable trackable = hit.getTrackable();
            if (trackable instanceof Plane
                    && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())
                    && ((Plane) trackable).getType() == Type.HORIZONTAL_UPWARD_FACING) {
                float distance = PlaneRenderer.calculateDistanceToPlane(
                        hit.getHitPose(), arFragment.getArSceneView().getArFrame().getCamera().getPose());
                scale = distance;
                ModelRenderable.builder()
                        .setSource(this, modelInteractor.getChosenModel().getModelSfbRes())
                        .build()
                        .thenAccept(model ->
                                placeModel(model, hit, modelInteractor.getChosenModel().getPlaneType()))
                        .exceptionally(this::showModelBuildError);
                return tap;
            }
        }
        // No hit yet: move the tap 15 pixels and try again, giving up
        // once it leaves the screen.
        tap.offsetLocation(-15, 0);
        if (tap.getX() < -100) {
            break;
        }
    }
    return tap;
}
After a piece of furniture has been placed, we can move and rotate it. This functionality is provided by TransformableNode: after you select an object, setAllowedPlaneTypes specifies the type of surface on which the object can be moved. We can also select another object to modify. All objects can be moved and rotated using two fingers:
TransformableNode node = new TransformableNode(arFragment.getTransformationSystem());
node.setRenderable(renderable);
// Restrict dragging to the plane type the model belongs on.
TranslationController controller = node.getTranslationController();
controller.setAllowedPlaneTypes(EnumSet.of(planeType));
node.setParent(anchorNode); // anchorNode was created when the model was placed
node.select();
Now that the object has been moved so that it fits correctly in the room, the next step is to adjust its appearance. We implemented this feature as a dialog that changes the color of the selected object, applying the chosen color to the object's material with the setFloat3 method. You can see the results of the color change in the two images below:
private void changeColor() {
    ColorPickerDialog.Builder dialog = new ColorPickerDialog.Builder(this);
    dialog.setTitle("Choose a color");
    dialog.setNegativeButton(R.string.cancel, (dialog1, which) -> {
    });
    dialog.setPositiveButton(getString(R.string.ok), colorEnvelope -> {
        // Tint the material of the currently selected node.
        BaseTransformableNode node = arFragment.getTransformationSystem().getSelectedNode();
        if (node != null) {
            node.getRenderable().getMaterial().setFloat3("baseColorTint", new Color(colorEnvelope.getColor()));
        }
    });
    dialog.show();
}
You can use the same approach to change the actual texture of the model, as seen in the two couches below:
Texture.builder()
        .setSource(this, texturesInteractor.getRes(position))
        .build()
        .thenAccept(t -> {
            // Swap the base color texture on the selected node's material.
            BaseTransformableNode node = arFragment.getTransformationSystem().getSelectedNode();
            if (node != null) {
                node.getRenderable().getMaterial().setTexture("baseColor", t);
            }
        })
        .exceptionally(this::showModelBuildError);
We have put together a demo video that demonstrates all of the features described above. Seeing the app work as it would for a regular user really puts into perspective how useful augmented reality apps will be for shoppers. Check out the video, and see for yourself!
ARCore’s new features have allowed us to build a more sophisticated augmented reality app. Augmented reality apps like ours will open up all kinds of doors for retailers, mostly because of the power of visualization over browsing. AR can blend products with the environment in which they will be used, all from the comfort of a shopper's home. An entire room can now be furnished virtually, and customized to the user's specifications with ease, enabling convenient shopping and a pleasant experience.
Developments in AR over the past year have surpassed our predictions from a year ago. However, while there has already been a substantial expansion of AR-capable devices over the past year, we expect greater growth in the year to come. Additional features, such as face detection, could also arrive via extensions of ARCore with the Google ML (Machine Learning) Kit. Finally, we hope to soon see the development of custom chips on mobile devices that can implement 3D algorithms on the hardware level. These would greatly speed up the processing of real-time video, making for a superior customer experience.
We are excited to see where augmented reality goes over the years to come, and look forward to exploring further updates with blog posts as they become available. If you want to learn more about our augmented reality application, or about the new features in ARCore, please contact us — we will be happy to help!