
Event-based Vision Systems:
Technology and R&D Trend Analysis


Jan 2020 | 92 pages

Get up to speed on the commercial readiness of state-of-the-art event-based vision systems across the globe


Get your copy



What’s included:

The rising scope of, and opportunities for, event-based vision systems

Analysis of over 100 active entities working on event-based vision systems across imaging, automotive, semiconductor, electronics, and software, as well as research labs and universities

Analysis of 200+ patents from 2010 to 2019

10 companies, startups, and research institutes working on event-based cameras, including established firms such as Samsung and Sony

Seven major challenges with event-based cameras targeted by patenting entities

Top event-based vision system patentees for automotive-specific applications

Top event-based vision system patentees in the semiconductor industry

Global competitive landscape

Key partnerships and alliances in developing powerful event-based vision systems and their future roadmaps

Crucial insights into 8 major ongoing and completed projects in the domain



About

Present computer vision systems rely heavily on frame-based approaches for capturing objects in motion. This approach produces large volumes of data, increasing overall transmission and computation time. Event-based vision systems overcome this challenge. Unlike traditional cameras that capture frames at a fixed rate, they respond only to pixel-level brightness changes. As a result, event cameras operate with low latency, consume less power, and are more sensitive to changes in light. These advantages make them well suited for object tracking, pose estimation, 3D scene reconstruction, depth estimation, feature tracking, and other perception tasks required by connected devices.
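
To make the contrast with frame-based capture concrete, the short sketch below models a generic event stream using the widely used (timestamp, x, y, polarity) representation and accumulates a time slice into a frame-like image. This is a minimal illustration under that assumption only; the Event and accumulate_events names are ours and are not tied to any specific camera vendor's SDK.

# Minimal sketch of an event stream, assuming the common
# (timestamp, x, y, polarity) representation. Illustrative only;
# not based on any particular camera SDK.
from dataclasses import dataclass
from typing import Iterable
import numpy as np

@dataclass
class Event:
    t: float  # timestamp in seconds
    x: int    # pixel column
    y: int    # pixel row
    p: int    # polarity: +1 brightness increase, -1 decrease

def accumulate_events(events: Iterable[Event], height: int, width: int) -> np.ndarray:
    """Sum event polarities per pixel over a time slice into a frame-like image.

    Pixels that see no brightness change contribute no data at all,
    which is where the bandwidth and latency savings come from.
    """
    img = np.zeros((height, width), dtype=np.int32)
    for e in events:
        img[e.y, e.x] += e.p
    return img

# Example: three events within a 10 ms window on a 4x4 sensor.
events = [Event(0.001, 1, 2, +1), Event(0.004, 1, 2, +1), Event(0.009, 3, 0, -1)]
print(accumulate_events(events, height=4, width=4))

In practice, event streams usually feed asynchronous algorithms directly; the accumulation step above is only a convenient way to visualize a slice of the stream next to a conventional frame.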

The latest AI-driven advancements in computer vision are enabling more human-like vision sensor systems. Also known as neuromorphic cameras, event-based vision systems, or dynamic vision sensor (DVS) cameras, these systems have the potential to transform the computer vision landscape through reduced latency and lower power consumption. Their application areas include autonomous vehicles (lower latency, HDR object detection, and reduced memory and storage needs), robotics, IoT (low-power, always-on devices), augmented and virtual reality (AR/VR) (low-power, low-latency tracking), and other industrial automation use cases.

This report details the state-of-the-art event-based vision systems that have the potential to transform traditional vision-sensing architectures. It describes how event-based vision systems are finding applications across sectors and their potential to replace frame-based solutions in critical, real-time applications. Get a deeper understanding of the computer vision ecosystem through the lens of emerging event-based vision systems, and identify opportunities for investments and partnerships. Additionally, the report includes a detailed analysis of relevant patents to help you develop IP strategies related to event-based vision systems for automotive applications.

This report also assesses the challenges involved in the adoption of event-based vision systems, as well as the solutions and approaches that active participants are developing to introduce innovative products. It combines a comprehensive analysis of patent filings, companies active in the space, and R&D activities from universities and research labs across the world, delivering key insights into the maturity and evolution of the technology.


Get your copy now


1. Introduction

  • Significance of event-based vision technologies and their rise from 2010 to 2019
  • Differences in the output from frame-based and event-based approaches

2. Methodology of the Study

3. Entities Active in Event-based Vision Systems

4. Patent Trend Analysis

  • Filing Trends
  • Assignee Landscape
  • Patenting Activities by Startups
  • Patent Trend Focused on Key Challenges
  • Patent Publications Mapped to Automotive Applications
    • Collision Avoidance
    • Monitoring of Parked Vehicles
    • Always On Operations
    • Analysis of a Road Surface
    • In-car Installation of DVS Cameras
    • Object Detection and Classification
    • Multi-object Tracking
    • Inaccuracies Introduced by Non-event Pixel Points
    • LiDAR and 3D Point Cloud
    • 3D Pose Estimation
    • Hardware Security
    • Edge Processing
    • Other Highlights
    • Key Takeaways

5. Competitive Landscape

  • Prophesee
  • iniVation
  • Insightness
  • Qelzal
  • MindTrace
  • CelePixel
  • Sunia
  • Austrian Institute of Technology
  • Samsung
  • Sony
  • Benchmarking of Commercialized/In-pipeline Event-based Vision Products
  • Key Takeaways

6. Projects

  • Project 1 – Ultra-Low Power Event-Based Camera (ULPEC)
  • Project 2 – The Internet of Silicon Retinas (IoSiRe): Machine to machine communications for neuromorphic vision sensing data
  • Project 3 – Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices (ECOMODE)
  • Project 4 – Convolution Address-Event-Representation (AER) Vision Architecture for Real Time (CAVIAR)
  • Project 5 – Embedded Neuromorphic Sensory Processor – NeuroPsense
  • Project 6 – Event–Driven Morphological Computation for Embodied Systems (eMorph)
  • Project 7 – EB-SLAM: Event-based simultaneous localization and mapping
  • Project 8 – SLAMCore

7. Research Laboratories

  • Lab 1: Robotics and Perception Group
  • Lab 2: Neuroscientific System Theory (NST)
  • Lab 3: Perception and Robotics Labs
  • Lab 4: Robot Vision Group
  • Key Takeaways

8. Research Institutes Focusing on Event Cameras

9. Insights and Recommendations

10. Concluding Remarks

11. Acronyms

12. References


Get your copy now


Onam Prasad

Onam holds a Master’s degree from the University of Illinois at Chicago with a specialization in telecommunications. Her course of study covered subjects including wireless communications, RF systems, semiconductors, networking, and digital communication. She was also a member of the Society of Women Engineers and attended many conferences and sessions on initiatives by women technologists and on building technology awareness across diverse groups.

Onam has research expertise in topics related to IoT, 5G connectivity solutions, autonomous systems, RF devices, semiconductor fabrication, telecommunication equipment, and ICs and chipsets. She works on custom research projects and provides technology consulting and advisory services to clients on a range of aspects, including the creation of strategic partnerships, the development and adoption of innovative or disruptive technologies, and R&D and product roadmap creation, by leveraging robust research approaches and methodologies that combine competitive intelligence, patent landscaping, M&A analysis, trend analysis, and other important parameters. The combination of her education, expertise, and professional experience makes her one of Netscribes’ lead analysts in the electronics, telecom, and semiconductor domains.


Faizal Shaikh

Faizal has been associated with Netscribes’ Innovation Research team for the last three years. He has worked on emerging technology domains including IoT connectivity solutions and management, sensing/antenna technologies for autonomous vehicles and connected cars, 5G technology trends, hardware security solutions, RF front-ends for portable devices, advanced semiconductor chipsets, and futuristic display technologies.

His areas of interest include semiconductor fabrication and processing, networking and 5G-related studies, telecommunications, display technologies, user interfaces, and sensors and robotics, among others. He has been actively involved in technology assessment and consulting, competitor analysis and benchmarking, and technology roadmap studies that require a complete understanding of the ecosystem.

Faizal graduated with a Bachelor’s degree in Engineering with a specialization in Electronics and Telecommunications from Mumbai University.


Get your copy now



