
Prototyping Event Based Vision Systems with the FRAMOS FSM-IMX636 Devkit on the NVIDIA Jetson Platform for Edge AI and Robotics

FRAMOS


October 31, 2023


Unlike conventional imaging technologies that transmit entire images, the IMX636 sensor, developed jointly by Sony and PROPHESEE, is an event-based vision sensor (also called a dynamic vision sensor) that captures asynchronous changes at the pixel level, much as the human retina does. Built on advances in neuromorphic engineering and part of the PROPHESEE Metavision technology suite, it detects only changes in luminance, making it possible to pick up the slightest vibrations, subtle movements, and fast-moving objects with high accuracy, and to track people without creating privacy issues.

With its high dynamic range, the IMX636 operates effectively in challenging lighting conditions, including low light, inherently avoids motion blur, and delivers low-latency data output. These features make it ideal for scenarios where traditional cameras struggle, ensuring reliable performance in both low-light and high-contrast environments.

Event cameras like the IMX636 represent a paradigm shift in computer and machine vision: they differ fundamentally from conventional frame-based systems by producing continuous, asynchronous data streams. This shift enables new approaches to visual perception tasks and improves the robustness and speed of vision algorithms.
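To make the contrast with frame-based systems concrete, the sketch below models the asynchronous output as a list of per-pixel event records. This is a simplified toy model for illustration, not the IMX636's actual event encoding: an EVS pixel fires when the log-intensity change at that pixel crosses a contrast threshold, and the `Event` fields and `frames_to_events` helper are illustrative names.

```python
import math
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 = brightness increase, -1 = decrease
    t_us: int      # timestamp in microseconds

def frames_to_events(prev, curr, t_us, threshold=0.2):
    """Toy model of an EVS pixel array: emit an event for every pixel
    whose log-intensity change exceeds the contrast threshold."""
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            delta = math.log(c + 1e-6) - math.log(p + 1e-6)
            if abs(delta) >= threshold:
                events.append(Event(x, y, 1 if delta > 0 else -1, t_us))
    return events

prev = [[0.5, 0.5], [0.5, 0.5]]
curr = [[0.9, 0.5], [0.5, 0.1]]
evts = frames_to_events(prev, curr, t_us=1000)
# only the two changed pixels fire; the unchanged pixels stay silent
```

Note how the unchanged pixels produce no output at all: this is the source of the bandwidth and redundancy savings discussed throughout this article.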

With its performance and its reduced processing-power and bandwidth requirements, event-based vision sensing (EVS) technology pushes the limits of embedded vision systems, overcoming the limitations of conventional solutions.

The FRAMOS IMX636 Development Kit allows you to easily test Sony EVS technology on the NVIDIA Jetson AGX Orin and Jetson AGX Xavier system-on-modules to discover new capabilities or improve the current performance of embedded vision systems.

Benefits of Event-Based Sensing Technology

Event-based sensors achieve high temporal resolution (up to microseconds) and dynamic range (up to 140dB) while reducing data redundancy and power consumption.
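A back-of-envelope calculation shows where the data reduction comes from. All figures below (resolution, frame rate, pixel activity, bits per encoded event) are illustrative assumptions for the comparison, not IMX636 specifications:

```python
# Frame camera: transmits every pixel of every frame.
width, height, fps, bits_per_px = 1280, 720, 60, 8
frame_rate_bps = width * height * fps * bits_per_px

# Event camera: only changed pixels report. Assume 1% of pixels fire
# per frame-equivalent interval, at 64 bits per encoded event.
activity = 0.01
bits_per_event = 64
event_rate_bps = width * height * fps * activity * bits_per_event

print(f"frame camera: {frame_rate_bps / 1e6:.0f} Mbit/s")  # ~442 Mbit/s
print(f"event camera: {event_rate_bps / 1e6:.0f} Mbit/s")  # ~35 Mbit/s
```

Under these assumptions the event stream carries roughly an order of magnitude less data; in a mostly static scene the activity fraction, and therefore the data rate, drops even further.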

Event data produced by these sensors enables advanced object detection, pattern recognition, and image-processing tasks, leveraging temporal information for real-time analysis. Neural networks, including self-supervised learning approaches, further improve the processing of event data and its generalizability. Software tools such as the Metavision Intelligence Suite are designed to analyze event data efficiently, and the compact data output enables accurate, timely analysis in downstream systems across different industries.
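One common bridge between event data and conventional processing pipelines is to accumulate events into a frame-like 2-D histogram, which standard detection and image-processing tools can then consume. The sketch below assumes events are simple `(x, y, polarity, timestamp)` tuples; the function name is illustrative:

```python
def events_to_frame(events, width, height):
    """Accumulate (x, y, polarity, t) events into a signed 2-D
    histogram so frame-based tools can consume the event stream."""
    frame = [[0] * width for _ in range(height)]
    for x, y, p, _t in events:
        frame[y][x] += p
    return frame

events = [(0, 0, +1, 10), (0, 0, +1, 15), (1, 1, -1, 20)]
frame = events_to_frame(events, width=2, height=2)
# frame == [[2, 0], [0, -1]]
```

The accumulation window length is a key design choice: shorter windows preserve more of the sensor's temporal resolution, while longer windows give denser frames for downstream networks.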

Their high precision, ability to capture even the slightest changes, invisibility to other sensors, reduced data transfer, and power efficiency from operating only on changing data open new horizons in fields such as robotics, autonomous driving, industrial automation, and biomedical imaging.

The FRAMOS FSM IMX636 Development Kit allows you to test Sony IMX636 image sensor capabilities and then quickly scale them to production, simplifying the product development process.

Event-based Sensing Technology Use Cases

Here are a few use cases where EVS technology exceeds conventional sensing technology, allowing developers to capture data with unprecedented speed and precision.

The event stream generated by event cameras consists of asynchronous events, enabling advanced processing such as optical-flow and motion estimation for high-speed, low-latency applications. Unlike frame-based systems that capture full images at fixed intervals and generate redundant data, event-based approaches focus only on relevant changes, reducing redundancy and motion blur. The unique properties of event-camera data also enable techniques such as event-based shape reconstruction, including event-based shape from polarization.
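A representation frequently used as input to event-based optical-flow and motion-estimation methods is the time surface: a per-pixel map of the most recent event time, exponentially decayed toward the present. The sketch below is a minimal illustration assuming `(x, y, polarity, timestamp)` tuples; the decay constant `tau_us` and the function name are illustrative choices, not a specific library API:

```python
import math

def time_surface(events, width, height, t_now, tau_us=50_000):
    """Exponentially decayed map of the most recent event time per
    pixel -- recently active pixels approach 1.0, stale ones fade to 0."""
    last_t = [[None] * width for _ in range(height)]
    for x, y, _p, t in events:
        last_t[y][x] = t  # keep only the most recent timestamp
    surface = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if last_t[y][x] is not None:
                surface[y][x] = math.exp(-(t_now - last_t[y][x]) / tau_us)
    return surface

events = [(0, 0, 1, 0), (1, 0, 1, 50_000)]
s = time_surface(events, width=2, height=1, t_now=50_000)
# fresh event -> 1.0; the 50 ms old event has decayed to exp(-1) ~= 0.37
```

Gradients across such a surface encode local motion direction and speed, which is why it is a popular front end for flow estimation on event data.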

Preventive Maintenance through Vibration Monitoring

When machinery starts to develop faults, such as bearing defects or misaligned belts, it generates abnormal vibrations. EVS can identify even the slightest deviations from standard vibration patterns, enabling early fault detection and repair before the fault causes severe and expensive damage.
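Because a vibrating edge triggers events at the vibration frequency, one simple analysis is to bin the event timestamps at a pixel and take a small DFT to find the dominant frequency. The sketch below is a hedged illustration of that idea in pure Python (a real pipeline would use an FFT library); the function name and parameters are assumptions for the example:

```python
import cmath

def dominant_frequency_hz(event_times_us, window_us, bins):
    """Estimate the dominant vibration frequency at one pixel by
    binning event counts over a window and taking a naive DFT."""
    counts = [0] * bins
    bin_us = window_us / bins
    for t in event_times_us:
        counts[min(int(t / bin_us), bins - 1)] += 1
    mean = sum(counts) / bins
    signal = [c - mean for c in counts]  # remove the DC component
    best_k, best_mag = 0, 0.0
    for k in range(1, bins // 2):
        coeff = sum(s * cmath.exp(-2j * cmath.pi * k * n / bins)
                    for n, s in enumerate(signal))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k / (window_us / 1_000_000)  # cycles per second

# Synthetic 100 Hz vibration: a burst of events every 10 ms over 0.1 s.
times = [cycle * 10_000 + jitter
         for cycle in range(10) for jitter in (0, 100, 200)]
f = dominant_frequency_hz(times, window_us=100_000, bins=50)
# f == 100.0
```

A drift of this estimate away from a machine's known baseline frequency is the kind of early-warning signal preventive maintenance looks for.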

Robotics Navigation

EVS technology’s fast, reliable real-time data processing brings higher performance to robotics navigation, allowing robots to avoid obstacles, navigate in a more agile and responsive manner, and make fast, well-informed decisions in dynamic environments. Event-based vision systems also enhance the perception and navigation capabilities of intelligent robots, enabling them to operate more effectively in complex settings.

Privacy-Compliant People Counting

EVS can provide more regulation-compliant security and safety solutions since it can avoid capturing facial details, instead focusing on tracking movements – i.e., it can count people without compromising privacy. That is very important in regions with strict data privacy regulations, such as the EU.

EVS allows retailers to gather valuable marketing and sales data without violating individuals’ privacy rights.

High-Speed Object Counting and Tracking

Event-based algorithms, the core of EVS technology, are very reliable at high-speed object counting, exceeding a thousand counts per second. These sensors maintain accurate detection at speeds where traditional sensors often struggle, and the high resolution of modern event-based sensors further improves tracking precision. This makes EVS ideal for applications like counting sparks in welding or tracking golf balls in flight, where RGB cameras suffer from motion blur and increased latency.
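At a fixed detection line, each passing object produces a short burst of events separated by silence, so counting reduces to splitting the timestamp stream at gaps. The sketch below illustrates that idea; the gap threshold and function name are assumptions for the example, not a production algorithm:

```python
def count_objects(event_times_us, gap_us=500):
    """Count distinct fast-moving objects crossing a detection line:
    a silence longer than gap_us separates one object from the next."""
    if not event_times_us:
        return 0
    times = sorted(event_times_us)
    count = 1
    for prev, curr in zip(times, times[1:]):
        if curr - prev > gap_us:
            count += 1
    return count

# Three bursts of events -> three objects, resolved in microseconds.
times = [0, 50, 120, 1000, 1030, 2000, 2100]
n = count_objects(times)
# n == 3
```

Because the gaps here are hundreds of microseconds, throughputs well beyond a thousand objects per second remain resolvable, which is where frame-based counting breaks down.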

Benefits of Testing EVS Technology With the FSM IMX636 Development Kit on NVIDIA Jetson AGX Orin and Jetson AGX Xavier

The FSM-IMX636 Devkit provides an all-in-one package, enabling you to test event-based vision sensors on NVIDIA Jetson AGX Orin and Jetson AGX Xavier modules. It is part of a broader range of evaluation kits designed for event-based vision development. The NVIDIA Jetson platform provides a rich set of tools and libraries to help developers integrate event-based sensors with AI models and optimize their performance and efficiency, allowing them to test all the FSM IMX636 Development Kit’s capabilities easily.

Innovations and research using the development kit are often shared and documented in computer-vision and robotics workshops and journals, providing valuable resources and forums for the community.

Besides that, the NVIDIA Jetson platform also offers advanced AI and machine learning capabilities, excellent connectivity, and I/O options.

As mentioned above, a seamless transition to practical use is one of the most important advantages of testing EVS technology with FSM IMX636 on NVIDIA Jetson AGX Orin and Jetson AGX Xavier. If EVS benefits your embedded vision systems, the testing kit provides a foundation to develop your product rapidly, significantly reducing costs and time to market.

Learn more about the FSM IMX636 on this page, or follow this link for [more details about EVS technology](https://framos.com/news/).