Building a Plugin: Integrating Device Camera Streams for ML Vision in Flutter

Summary

This tutorial guides you through creating a Flutter plugin that captures device camera streams on Android and iOS, transmits byte buffers to Dart via EventChannel, and processes frames in real time with ML Vision frameworks. Learn project setup, native API usage, Dart bridging, and model inference integration, forming a scalable template for cross-platform mobile computer vision.


Key insights:
  • Setting Up the Plugin Project: Scaffold a federated plugin and configure platform permissions.

  • Integrating Native Camera Streams: Use Camera2 on Android and AVCaptureSession on iOS to capture frames.

  • Bridging Streams to Dart: Expose native buffers via EventChannel and manage subscription lifecycle.

  • Processing Frames with ML Vision: Decode frames in Dart, run inference with ML Kit or tflite, and isolate heavy computation.

  • Overall Workflow: End-to-end pattern for native data capture, Dart streaming, and ML integration.

Introduction

Building a Flutter plugin to access device camera streams and feed frames into an ML Vision pipeline unlocks custom computer-vision features across Android and iOS. In this tutorial, you’ll create a plugin that captures live camera data in native code, sends byte buffers to Dart via event channels, and processes each frame with a machine-learning model. You’ll learn best practices for platform APIs, thread management, and efficient serialization.

Setting Up the Plugin Project

Start by generating a federated plugin skeleton:

flutter create --template=plugin \
  --platforms=android,ios camera_ml_stream

This creates:

• lib/camera_ml_stream.dart – the Dart API

• android/src/main/kotlin/.../CameraMlStreamPlugin.kt

• ios/Classes/CameraMlStreamPlugin.swift

In pubspec.yaml, depend on camera_ml_stream_platform_interface if you split out federated logic. Update Android and iOS manifest permissions:

• AndroidManifest.xml:

<uses-permission android:name="android.permission.CAMERA"/>

• Info.plist:

<key>NSCameraUsageDescription</key>
<string>Camera access for ML vision</string>
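
On Android 6.0+ the manifest entry alone is not enough: the CAMERA permission must also be granted at runtime before opening the device. A minimal sketch, assuming the plugin has access to the host Activity (the helper name and request code are illustrative):

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Illustrative request code; any value unique within the app works.
private const val CAMERA_REQUEST_CODE = 1001

/** Returns true if CAMERA is already granted; otherwise prompts the user. */
fun ensureCameraPermission(activity: Activity): Boolean {
    val granted = ContextCompat.checkSelfPermission(
        activity, Manifest.permission.CAMERA
    ) == PackageManager.PERMISSION_GRANTED
    if (!granted) {
        ActivityCompat.requestPermissions(
            activity, arrayOf(Manifest.permission.CAMERA), CAMERA_REQUEST_CODE
        )
    }
    return granted
}
```

Check this before starting the camera session and handle the result in the activity's permission callback.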

Integrating Native Camera Streams

On Android, use Camera2 API to open a CameraDevice and receive ImageReader buffers:

val reader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 2)
reader.setOnImageAvailableListener({ reader ->
  // acquireNextImage() returns null if no frame is ready yet
  val image = reader.acquireNextImage() ?: return@setOnImageAvailableListener
  val buffer = image.planes[0].buffer
  sendFrame(buffer)
  image.close()
}, backgroundHandler)

Implement sendFrame(ByteBuffer) to post the raw NV21 or YUV data over an EventChannel.
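
A sketch of sendFrame, assuming the plugin stores the EventChannel.EventSink it receives in onListen (the class and property names here are illustrative). EventChannel events must be delivered on the main thread, and the buffer must be copied before the Image is closed:

```kotlin
import android.os.Handler
import android.os.Looper
import io.flutter.plugin.common.EventChannel
import java.nio.ByteBuffer

class FrameSender {
    // Assigned in EventChannel.StreamHandler.onListen; null when nobody listens.
    var eventSink: EventChannel.EventSink? = null
    private val mainHandler = Handler(Looper.getMainLooper())

    /** Copies the native buffer and posts it to Dart as a Uint8List. */
    fun sendFrame(buffer: ByteBuffer) {
        val bytes = ByteArray(buffer.remaining())
        buffer.get(bytes)                       // copy before image.close()
        mainHandler.post { eventSink?.success(bytes) }
    }
}
```

The copy is unavoidable here because the ImageReader recycles its buffers; the Handler hop satisfies Flutter's requirement that platform-channel messages originate on the main thread.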

On iOS, configure AVCaptureSession:

session = AVCaptureSession()
guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, ...) else { return }
let input = try AVCaptureDeviceInput(device: device)
session.addInput(input)
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "cam"))
session.addOutput(output)
session.startRunning()

In captureOutput, convert CMSampleBuffer to a Data object and call the Dart sink:

if let dataBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
  CVPixelBufferLockBaseAddress(dataBuffer,[])
  // extract bytes...
  eventSink?(frameData)
  CVPixelBufferUnlockBaseAddress(dataBuffer,[])
}
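
The elided extraction step can be sketched in full. This simplified version assumes a tightly packed single-plane buffer (real YUV output is bi-planar and may include row padding) and an eventSink property set from onListen:

```swift
import AVFoundation
import Flutter
import Foundation

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    // Simplified: treats the buffer as one contiguous plane.
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let size = CVPixelBufferGetBytesPerRow(pixelBuffer)
        * CVPixelBufferGetHeight(pixelBuffer)
    let frameData = FlutterStandardTypedData(bytes: Data(bytes: base, count: size))
    eventSink?(frameData)
}
```

Wrapping the bytes in FlutterStandardTypedData lets the standard codec deliver them to Dart as a Uint8List without extra boxing.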

Bridging Streams to Dart

Back in Dart, define an EventChannel subscriber in your plugin interface:

final EventChannel _channel = EventChannel('camera_stream');
Stream<Uint8List> get frameStream =>
  _channel.receiveBroadcastStream().cast<Uint8List>();

Expose frameStream to the app. On subscription, the native side will start the camera feed. Implement onListen and onCancel in each platform to manage session lifecycle.
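
On Android, that lifecycle hookup can be sketched as a StreamHandler; the startCamera and stopCamera callbacks stand in for whatever session-management code your plugin already has:

```kotlin
import io.flutter.plugin.common.EventChannel

class CameraStreamHandler(
    private val startCamera: (EventChannel.EventSink) -> Unit,
    private val stopCamera: () -> Unit
) : EventChannel.StreamHandler {

    // Called when the first Dart listener subscribes to the broadcast stream.
    override fun onListen(arguments: Any?, events: EventChannel.EventSink) {
        startCamera(events)
    }

    // Called when the last Dart listener cancels; release the camera here.
    override fun onCancel(arguments: Any?) {
        stopCamera()
    }
}
```

Register it with EventChannel.setStreamHandler during plugin attachment; the iOS side implements the analogous FlutterStreamHandler protocol.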

Processing Frames with ML Vision

In your Dart application, import the plugin and subscribe:

import 'package:camera_ml_stream/camera_ml_stream.dart';

void main() {
  final plugin = CameraMlStream();
  plugin.frameStream.listen((frame) {
    // Convert bytes to image and feed to ML model
    processImage(frame);
  });
}

Use a package like google_ml_kit or TensorFlow Lite via tflite_flutter to run inference on each frame. Tune the input resolution and cropping to match your model’s expected dimensions. Offload heavy computation to isolates:

• Spawn an isolate to decode and preprocess.

• Send only results or bounding boxes back to the main isolate.
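
The two bullets above can be sketched with Isolate.run (Dart 3); decodeAndDetect and Detection are placeholders for your actual preprocessing and model output:

```dart
import 'dart:isolate';
import 'dart:typed_data';

// Hypothetical result type; replace with your model's output.
class Detection {
  final List<double> boundingBox;
  Detection(this.boundingBox);
}

// Stand-in for decode + preprocess + inference; runs off the main isolate.
Detection decodeAndDetect(Uint8List frame) {
  // ... decode YUV, resize to the model's input, run inference ...
  return Detection([0, 0, 1, 1]);
}

Future<void> handleFrame(Uint8List frame) async {
  // Isolate.run copies `frame` into a worker isolate and returns only
  // the small Detection result, keeping the UI isolate responsive.
  final result = await Isolate.run(() => decodeAndDetect(frame));
  // ... draw result.boundingBox over the camera preview ...
}
```

Sending only the compact result back avoids copying full frames between isolates twice per frame.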

Vibe Studio

Vibe Studio, powered by Steve’s advanced AI agents, is a revolutionary no-code, conversational platform that empowers users to quickly and efficiently create full-stack Flutter applications integrated seamlessly with Firebase backend services. Ideal for solo founders, startups, and agile engineering teams, Vibe Studio allows users to visually manage and deploy Flutter apps, greatly accelerating the development process. The intuitive conversational interface simplifies complex development tasks, making app creation accessible even for non-coders.

Conclusion

You have built a cross-platform Flutter plugin that captures raw camera frames in native code, streams them to Dart, and integrates with an ML Vision pipeline. This foundation supports real-time object detection, face recognition, or custom models. Extend the plugin with configuration options (resolution, frame rate) and optimize performance by minimizing copies and using GPU delegates. With this pattern, you can integrate any native sensor stream or hardware acceleration into Flutter apps for powerful mobile vision features.


Build Flutter Apps Faster with Vibe Studio


Vibe Studio is your AI-powered Flutter development companion. Skip boilerplate, build in real-time, and deploy without hassle. Start creating apps at lightning speed with zero setup.


Other Insights


Join a growing community of builders today


© Steve • All Rights Reserved 2025
