May 12, 2025

Cross-platform AR/VR With the Web: WebXR with A-Frame, Angular, and Capacitor (Part II)

Logan Brade

This is the second blog post in a multipart series. While it’s not necessary to read the first part, it will definitely add some much-needed context: Cross-platform AR/VR With the Web: WebXR with A-Frame, Angular, and Capacitor (Part I)

Last year I covered how to incorporate WebXR into your Capacitor projects to create cross-platform virtual reality experiences. While that post was a great starting point for using WebXR, it only scratched the surface of what’s possible with A-Frame. Recently, I revisited this project to create a low-code AR/VR video tutorial for OutSystems Developer Cloud (ODC) on the OutSystems YouTube channel, and it inspired me to expand on the original blog post and show how the same can be done in Capacitor.

For the second part of this blog post I want to accomplish three things:

  • Incorporate a custom animated character
  • Expand by adding AR functionality
  • Run the project in a mobile app using native device functionality (i.e. camera)

The reason I want to tackle these tasks is to add enough depth that you have flexibility with your own version of the project. In the original blog post I demonstrated a controlled version of the project that stayed within the lines of the tutorial; this time I want to explore how you would go on to create that experience yourself. That means using custom animated 3D models you might have created in Blender, covering libraries that enhance your XR projects that I didn’t touch on before, and showing how to overcome the challenges you’ll face with native device functionality when you run this AR project on mobile.

To accomplish this, I’m going to use a 3D Neo model that I custom-animated in Blender, and use A-Frame, A-Frame Extras, AR.js, and Angular to build a project around Neo. Then we’ll use Capacitor to make the project cross-platform and run it on an iOS device with the proper camera permissions.

Let’s jump into it!

Setting up the Project

  1. Create an Ionic-Angular Project
ionic start webAR blank --type=angular

2. Install A-Frame and A-Frame Extras into your project

npm install --save aframe
npm install --save aframe-extras

Note: The A-Frame Extras library provides the animation-mixer component, which lets us play the animations attached to our model.

3. Import A-Frame and A-Frame Extras in polyfills.ts under Browser Polyfills

/**
 * BROWSER POLYFILLS
 */
import 'aframe';
import 'aframe-extras';

4. Download aframe-ar-nft.js and add it to the assets folder

https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar-nft.js

Note: When I tried installing this library through npm, it required a separate import of Three.js, whereas the downloaded build does not. That’s why we’re downloading it.

5. Add aframe-ar-nft.js to your angular.json file scripts

"scripts": [
  "src/assets/aframe-ar-nft.js"
]

Animated Neo

A really important aspect of this project is the 3D model we’ll be using in our Ionic-Angular project, the software we’re using to build and animate our character, and where we plan to store that character:

In this case, I used Blender to animate a 3D Neo provided by the OutSystems design team and named the animation Waving, since that’s what Neo is doing. I don’t want to get into the details of animating the character (this would quickly become an intermediate Blender tutorial), but there are many resources on animating 3D models online. For this tutorial, you just need to export the model from your 3D animation software in .glb or .gltf format, then upload it somewhere accessible, like an S3 bucket in AWS, so you can reference it in your HTML.

Once your 3D model is exported and hosted where your app can reach it, you can start referencing it. If you don’t have a 3D model available, or want to work with one that’s already animated, definitely check out marketplaces like TurboSquid that have tons of assets.
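As a quick sanity check before wiring your hosted URL into the scene, you can verify the file is in one of the formats A-Frame’s gltf-model component can load. A minimal TypeScript sketch (the helper name and example URLs are my own, not part of A-Frame):

```typescript
// Returns true when the URL points to a glTF asset that the
// gltf-model component can load (.glb binary or .gltf JSON).
function isSupportedModelFormat(url: string): boolean {
  // Ignore any query string (e.g. a signed S3 URL) before checking.
  const path = url.split('?')[0].toLowerCase();
  return path.endsWith('.glb') || path.endsWith('.gltf');
}

// Hypothetical S3-hosted export of the Neo model:
isSupportedModelFormat('https://my-bucket.s3.amazonaws.com/neo-waving.glb'); // true
isSupportedModelFormat('neo-waving.fbx'); // false — re-export from Blender as .glb/.gltf
```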

AR

Next we’ll need to modify our project to recognize the custom elements that are a part of A-Frame then download the marker we’ll be using as a reference for our app to place the 3D model on:

  1. Import CUSTOM_ELEMENTS_SCHEMA to home.page.ts and inject CUSTOM_ELEMENTS_SCHEMA as a schema into our home component
import { Component, CUSTOM_ELEMENTS_SCHEMA } from '@angular/core';
import { IonHeader, IonToolbar, IonTitle, IonContent } from '@ionic/angular/standalone';

@Component({
  selector: 'app-home',
  templateUrl: 'home.page.html',
  styleUrls: ['home.page.scss'],
  imports: [IonHeader, IonToolbar, IonTitle, IonContent],
  schemas: [CUSTOM_ELEMENTS_SCHEMA]
})
export class HomePage {
  constructor() {}
}

Note: If you don’t define CUSTOM_ELEMENTS_SCHEMA, your project will throw an error on the A-Frame tags.
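The reason Angular needs this schema is that A-Frame tags like a-scene and a-marker are custom elements, and per the Custom Elements spec a custom element name must contain a hyphen. A simplified illustration of that naming rule (the helper name and regex are mine, a rough sketch of the spec’s rule):

```typescript
// Simplified check: a custom element name starts with a lowercase
// ASCII letter and contains at least one hyphen.
function looksLikeCustomElement(tagName: string): boolean {
  return /^[a-z][a-z0-9]*-[a-z0-9-]*$/.test(tagName);
}

looksLikeCustomElement('a-scene'); // true — A-Frame tag, needs CUSTOM_ELEMENTS_SCHEMA
looksLikeCustomElement('div');     // false — standard HTML element
```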

2. Add AR HTML to home.page.html

<a-scene embedded arjs>
  <a-marker preset='hiro'>
    <a-entity
      position='0 0 0'
      scale='0.5 0.5 0.5'
      gltf-model='Place .glb URL here'
      animation-mixer='clip: Waving'
    ></a-entity>
  </a-marker>
  <a-entity camera></a-entity>
</a-scene>
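If you later set these attributes from TypeScript instead of the template (for example, to reposition or rescale Neo at runtime), the values are just space-separated strings. A tiny sketch of helpers for building them (the helper names are my own):

```typescript
// A-Frame vector attributes like position and scale are
// space-separated "x y z" strings.
function vec3(x: number, y: number, z: number): string {
  return `${x} ${y} ${z}`;
}

// The animation-mixer component from aframe-extras takes a
// property list; here we only set the clip to play.
function animationMixer(clip: string): string {
  return `clip: ${clip}`;
}

vec3(0.5, 0.5, 0.5);      // "0.5 0.5 0.5" — the scale used in the markup above
animationMixer('Waving'); // "clip: Waving"
```

With a reference to the entity element you would then call something like `entity.setAttribute('position', vec3(0, 0, 0))`.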

3. Download and print the Hiro marker

If you checked out my previous blog post on WebXR, this should look familiar, but there are a few differences. In our scene, the embedded arjs attributes tell A-Frame and AR.js that this is an embedded AR experience, and the <a-marker> tag declares that we’ll be placing our object on a Hiro marker. We use an <a-entity> tag because we’re moving away from preset shapes to a custom 3D model, and we add animation-mixer so the aframe-extras library knows to play the attached animation clip Waving. Finally, we add another <a-entity> tag defined as a camera, because the camera in our scene operates as its own entity.

4. (Optional) Test the application to make sure the experience is working correctly with a camera

ionic capacitor build
ionic serve

Mobile

For our project to run on mobile, we’ll need to add iOS as a platform to our Capacitor project and configure permissions so we have access to the camera:

  1. Add, build, sync, and run on your platform of choice
ionic capacitor add ios

ionic capacitor build
ionic capacitor sync

# Works best if you run it on a physical device
ionic capacitor run ios

Note: You should be able to add Android and run it on a WebXR capable Android device. I don’t have an Android device to test this project so I’m excluding it as a platform.

2. Adjust permissions in Info.plist in Xcode

<!-- Info.plist (iOS) -->
<key>NSCameraUsageDescription</key>
<string>Allow camera</string>

With that complete, you can now run your project from Xcode on an iPhone! Given the experimental nature of this library with Capacitor, it’s possible to run into sizing issues across orientations, and you will likely need to tweak the HTML to fit your particular use case. In my experience, this works best in landscape mode, which minimizes having to code around the interface scaling in multiple orientations. It also reduces how much you need to scale and/or position your 3D model in the scene to get it to fit on your Hiro marker.
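One lightweight way to lean into landscape without coding for every layout is to detect orientation from the viewport and prompt the user to rotate. A sketch (the helper name is my own; Capacitor’s screen-orientation plugin is another option if you want to lock orientation natively):

```typescript
// True when the viewport is wider than it is tall — the orientation
// where the AR scene above fits best.
function isLandscape(width: number, height: number): boolean {
  return width > height;
}

// In the app you would pass the live viewport size, e.g.:
// if (!isLandscape(window.innerWidth, window.innerHeight)) {
//   /* show a "rotate your device" hint */
// }
isLandscape(844, 390); // true — iPhone-sized viewport in landscape
isLandscape(390, 844); // false — portrait
```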

Overall, this project was a lot of fun but there’s still more projects you can do with A-Frame, AR.js, and Capacitor! Be sure to check out some of the other features available and feel free to share your projects with us on Discord, X, or Bluesky!
