Building AR Experiences with ARCore & ARKit in Flutter

Summary

With ar_flutter_plugin, Flutter developers can create immersive AR experiences on Android and iOS by integrating plane detection, 3D models, and anchors. This guide covers project setup, rendering assets, and optimizing performance. Combined with Vibe Studio, these tools enable fast deployment of rich AR-enabled Flutter apps.


Key insights:
  • Cross-Platform Setup: Use ar_flutter_plugin to bridge ARKit and ARCore in one Flutter codebase.

  • 3D Model Placement: Load glTF assets and render them interactively with gestures like drag and rotate.

  • Anchors & Tracking: Pin objects in space using AR anchors and detect images or surfaces.

  • iOS & Android Support: Configure Info.plist, AndroidManifest.xml, and asset paths for both platforms.

  • Performance Optimization: Minimize nodes, disable extra visuals, and recycle objects for efficiency.

  • Advanced Interactions: Extend with gesture input, shaders, and real-time updates for deeper immersion.

Introduction

Building immersive AR experiences in Flutter has become more accessible thanks to plugins that bridge ARCore on Android and ARKit on iOS. In this advanced tutorial, you will learn how to set up a Flutter project for both platforms, integrate plane detection and anchors, and render 3D models with gestures. We leverage the ar_flutter_plugin package to unify your codebase across devices. By the end, you’ll understand how to add custom interactions for objects in space and optimize performance for production.

Setting up the Flutter Project

Begin by creating a Flutter app and adding ar_flutter_plugin to your pubspec.yaml:

dependencies:
  flutter:
    sdk: flutter
  ar_flutter_plugin: ^0.7.3

Run flutter pub get. For Android, edit AndroidManifest.xml to request camera and enable ARCore:

<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera.ar" android:required="true"/>

Set minSdkVersion to 24+. In build.gradle (app level), ensure:

minSdkVersion 24

For iOS, open ios/Runner/Info.plist and add:

<key>NSCameraUsageDescription</key>
<string>Camera permission is required for augmented reality.</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
  <string>arkit</string>
</array>

Enable the ARKit capability in Xcode under Signing & Capabilities.

Integrating ARCore & ARKit

Import the plugin in your Dart code:

import 'package:ar_flutter_plugin/ar_flutter_plugin.dart';

Declare an ARSessionManager field in your StatefulWidget. The plugin constructs and hands you the manager through the ARView's onARViewCreated callback, so you only need the declaration here:

late ARSessionManager arSessionManager;

Embed the AR view in build():

ARView(
  onARViewCreated: onARViewCreated,
  planeDetectionConfig: PlaneDetectionConfig.horizontal,
)

Implement the callback:

void onARViewCreated(
    ARSessionManager sessionManager,
    ARObjectManager objectManager,
    ARAnchorManager anchorManager,
    ARLocationManager locationManager) {
  arSessionManager = sessionManager;
  arSessionManager.onInitialize(showFeaturePoints: false, showPlanes: true);
  // Additional config here
}

Rendering 3D Models and Interactions

Use ARObjectManager to place 3D assets. First, bundle a glTF file in assets and register it in pubspec.yaml. Then load it when the user taps a detected plane:

late ARObjectManager arObjectManager;

void onARViewCreated(
    ARSessionManager sessionManager,
    ARObjectManager objectManager,
    ARAnchorManager anchorManager,
    ARLocationManager locationManager) {
  arSessionManager = sessionManager;
  arObjectManager = objectManager;
  arObjectManager.onNodeTap = onNodeTap;
}

Future<void> addModel(ARHitTestResult hit) async {
  final node = ARNode(
    type: NodeType.localGLTF2, // bundled glTF asset
    uri: "assets/models/spaceship.gltf",
    scale: Vector3(0.2, 0.2, 0.2),
    position: hit.worldTransform.getTranslation(),
    rotation: Vector4(1.0, 0.0, 0.0, 0.0), // axis-angle; identity rotation
  );
  await arObjectManager.addNode(node);
}

void onNodeTap(List<String> nodeNames) {
  // Handle selection or animation.
  print("Tapped on: ${nodeNames.first}");
}
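The addModel handler above still needs a trigger. A minimal sketch of the wiring, assuming your plugin version exposes the onPlaneOrPointTap callback on ARSessionManager (check the API of your installed version):

```dart
// Hypothetical wiring inside onARViewCreated: forward plane taps to addModel.
arSessionManager.onPlaneOrPointTap = (List<ARHitTestResult> hits) {
  if (hits.isEmpty) return;
  // Prefer a plane hit; fall back to the first result otherwise.
  final planeHit = hits.firstWhere(
    (h) => h.type == ARHitTestResultType.plane,
    orElse: () => hits.first,
  );
  addModel(planeHit);
};
```

With this in place, tapping a detected surface spawns the model at the hit location.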

Use gesture detectors if you need drag, scale, or rotate. Update node.transform accordingly.
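One way to sketch drag and rotate, under the assumption that your ar_flutter_plugin version exposes the pan/rotation handling flags and the corresponding ARObjectManager callbacks (verify against your installed version's API):

```dart
// Enable native gesture handling when initializing the session.
arSessionManager.onInitialize(
  showFeaturePoints: false,
  showPlanes: true,
  handleTaps: true,
  handlePans: true,     // drag nodes along detected planes
  handleRotation: true, // two-finger rotate
);

// React when the user finishes moving or rotating a node.
arObjectManager.onPanEnd = (String nodeName, Matrix4 transform) {
  print("Node $nodeName moved to ${transform.getTranslation()}");
};
arObjectManager.onRotationEnd = (String nodeName, Matrix4 transform) {
  print("Node $nodeName rotated");
};
```

Letting the platform layer handle the gestures keeps tracking smooth; your Dart code only observes the resulting transforms.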

Advanced Tracking and Anchors

Anchors let you lock objects in world coordinates. Use ARAnchorManager:

late ARAnchorManager arAnchorManager;

void onARViewCreated(
    ARSessionManager sessionManager,
    ARObjectManager objectManager,
    ARAnchorManager anchorManager,
    ARLocationManager locationManager) {
  arAnchorManager = anchorManager;
}

Future<void> addAnchor(Vector3 position) async {
  final anchor = ARPlaneAnchor(
    transformation: Matrix4.identity()..setTranslation(position),
  );
  await arAnchorManager.addAnchor(anchor);
  // Attach a child node to this anchor, e.g.:
  // arObjectManager.addNode(node, planeAnchor: anchor);
}

To track custom images or objects, configure ARSessionManager:

arSessionManager.onInitialize(
  showFeaturePoints: false,
  showPlanes: false,
  customPlaneTexturePath: null,
  handleTaps: false,
  detectionImagesGroupName: "ARResources",
);

Place your reference images in Xcode’s AR Resource Group (Assets.xcassets) for image detection on iOS. For Android, pack a set of images in src/main/assets.

Optimizing performance:

• Disable feature points and world origin.

• Limit the number of active nodes.

• Recycle nodes instead of recreating them.
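The recycling point can be sketched as a small node pool. This is a hypothetical helper, assuming ARNode position updates propagate to the platform in your plugin version; the asset path is illustrative:

```dart
// Minimal node-pool sketch: reposition idle nodes instead of
// tearing them down and re-adding them, which is far cheaper.
class NodePool {
  NodePool(this.objectManager);
  final ARObjectManager objectManager;
  final List<ARNode> _idle = [];

  Future<ARNode> acquire(Vector3 position) async {
    if (_idle.isNotEmpty) {
      final node = _idle.removeLast();
      node.position = position; // reuse: move instead of recreate
      return node;
    }
    final node = ARNode(
      type: NodeType.localGLTF2,
      uri: "assets/models/spaceship.gltf", // illustrative asset path
      position: position,
      scale: Vector3(0.2, 0.2, 0.2),
    );
    await objectManager.addNode(node);
    return node;
  }

  // Return a node to the pool when it leaves the scene.
  void release(ARNode node) => _idle.add(node);
}
```

Pooling matters most when objects appear and disappear frequently, since each addNode call crosses the platform channel and re-parses the asset.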

Vibe Studio

Vibe Studio, powered by Steve’s advanced AI agents, is a revolutionary no-code, conversational platform that empowers users to quickly and efficiently create full-stack Flutter applications integrated seamlessly with Firebase backend services. Ideal for solo founders, startups, and agile engineering teams, Vibe Studio allows users to visually manage and deploy Flutter apps, greatly accelerating the development process. The intuitive conversational interface simplifies complex development tasks, making app creation accessible even for non-coders.

Conclusion

In this tutorial, you learned how to bootstrap a Flutter AR project, unify ARCore and ARKit integration, render interactive 3D models, and implement advanced anchors and image tracking. These techniques let you craft compelling augmented reality scenes with precise control over placement and performance.

Build AR Apps Visually in Flutter

Use Vibe Studio to design and deploy powerful AR experiences—fast, cross-platform, and no code required.


Join a growing community of builders today


© Steve • All Rights Reserved 2025
