
Event-Based Vision Systems

  • Report Code : 289722
  • Industry : Telecom and IT
  • Published On : May 2019
  • Pages : 92
  • Publisher : Netscribes
  • Format : WMR PPT Format, WMR PDF Format

The latest AI-driven advancements in computer vision focus on emulating the characteristics of the human eye in the vision sensor itself. Known as a neuromorphic or event-based vision system, or a dynamic vision sensor (DVS) camera, the technology can potentially transform the computer vision landscape by reducing latency and power consumption in upcoming solutions. Potential application areas include autonomous vehicles (lower latency, HDR object detection, and reduced memory and storage needs), robotics, IoT (low-power, always-on devices), augmented reality/virtual reality (AR/VR) (low-power, low-latency tracking), and other industrial automation use cases.
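To make the sensing principle concrete, the minimal Python sketch below models how a DVS-style pixel reports an event only when its log-intensity change exceeds a contrast threshold, so unchanged pixels produce no data. This is an illustrative sketch under assumed values; the function name, resolution, and threshold are not taken from the report.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, timestamp, threshold=0.2):
    """Illustrative DVS-style event generation (assumed model, not from the report).

    A pixel emits an event (x, y, t, polarity) only when its log-intensity
    change since the previous observation exceeds `threshold`; static pixels
    contribute nothing, which underpins the latency, bandwidth, and power
    advantages described above.
    """
    eps = 1e-6  # avoid log(0) on dark pixels
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)

    events = []
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    for y, x in zip(ys, xs):
        polarity = 1 if delta[y, x] > 0 else -1  # brighter (+1) or darker (-1)
        events.append((int(x), int(y), timestamp, polarity))
    return events

# Example: a mostly static 320x240 scene in which a single pixel brightens
prev = np.full((240, 320), 0.5)
curr = prev.copy()
curr[120, 160] = 0.9
print(frames_to_events(prev, curr, timestamp=0.001))
# -> [(160, 120, 0.001, 1)]; the unchanged pixels generate no events
```

In an actual DVS, this comparison happens asynchronously and independently at every pixel, rather than against a stored previous frame.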
This report focuses on assessing the challenges involved in the adoption of event-based vision systems, along with the solutions and approaches that active participants are developing to introduce innovative products. It combines a comprehensive analysis of patent filings, companies active in the space, and R&D activities from universities and research labs across the world, delivering key insights into the maturity and evolution of the technology.
Patent Trend Analysis
Patent filings from the last decade (2010-2019) were analyzed to evaluate the level of participation of various entities in the R&D space. This section details the assignee landscape and key patents in the domain, and examines the filings to identify the key challenges they address. Additionally, patent filings related to event-based vision technologies with a focus on automotive applications are described in depth to highlight the different deployment scenarios across the sector.
Competitive Intelligence
This section provides a detailed description of established companies, startups, and research institutes working on event-based cameras. Different parameters, including company overview, technology stack, partnerships, key personnel, future roadmap, and limitations, have been considered for comprehensive competitive profiling.
A key highlight emerging from this analysis is that several European startups are competing directly with Samsung in the event-based vision technology domain.
Further, a benchmarking matrix of commercialized and in-pipeline products is included for in-depth analysis.
Companies mentioned in the report
1. Prophesee
2. iniVation
3. Insightness
4. Qelzal
5. MindTrace
6. CelePixel
7. Sunia
8. Austrian Institute of Technology
9. Samsung
10. Sony
Key Insights
• Event-based vision systems overcome the redundant-information problem inherent in traditional frame-based vision systems (illustrated in the sketch after this list).
• Event-based technology is at an early stage of development, and significant research and investment are focused on accelerating the development of such systems.
• Event-based vision techniques are being explored in the automotive sector, both for in-car applications and for scenarios outside the vehicle.
• Some early adopters of the technology are focusing on DVS fabrication processes and on pixel size reduction.
• Samsung is among the earliest adopters of DVS technology.
• Research laboratories are focusing on emulating various DVS parameters to address challenges such as low dynamic range, pixel size, motion blur, and high latency.
• Event-based vision systems are finding applications in self-driving cars, drones, IoT, robotics, wearable devices, and surveillance.
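As a back-of-the-envelope illustration of the first insight above, the sketch below compares the data generated per second by a conventional frame-based camera and by an event-based sensor observing a sparse scene. Every figure (resolution, frame rate, average event rate, bytes per event) is an assumption chosen for illustration, not a number from the report.

```python
# Illustrative comparison of per-second data volume; all figures are assumed.
WIDTH, HEIGHT = 640, 480          # assumed sensor resolution
FPS = 30                          # assumed frame rate of the frame-based camera
BYTES_PER_PIXEL = 1               # 8-bit grayscale frames
EVENTS_PER_SECOND = 300_000       # assumed average event rate for a sparse scene
BYTES_PER_EVENT = 8               # assumed packed (x, y, timestamp, polarity) record

frame_based = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS   # every pixel, every frame
event_based = EVENTS_PER_SECOND * BYTES_PER_EVENT      # only changed pixels

print(f"Frame-based: {frame_based / 1e6:.1f} MB/s")    # 9.2 MB/s
print(f"Event-based: {event_based / 1e6:.1f} MB/s")    # 2.4 MB/s
print(f"Ratio:       {frame_based / event_based:.1f}x less data")
```

The gap widens further for mostly static scenes, where the event rate drops while the frame-based camera keeps streaming full frames.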
Key questions addressed in the report:
• What is the difference between frame-based vision and event-based vision?
• How do event-based vision techniques overcome the limitations of a traditional frame-based vision system?
• What are the key challenges faced by the technology, and which are the entities addressing them?
• What was the patent filing trend for the technology between 2010 and 2019?
• How is event-based vision technology transforming the automotive sector?
• What is the competitive scenario for event-based vision technology?
• What are the projects and research activities related to the technology?
• Which universities and research institutes active in the domain should you watch?
• What are the different collaborations and investment opportunities for new companies seeking to explore event-based vision technologies?

1 Introduction
2 Methodology of the Study
3 Entities Active in the Event-Based Vision System
4 Patent Trend Analysis
4.1 Filing Trends
4.2 Assignee Landscape
4.3 Patenting Activities by Startups
4.4 Patent Trend Focused on Key Challenges
4.5 Patent Publications Mapped to Automotive Applications
4.5.1 Collision Avoidance
4.5.2 Monitoring of Parked Vehicles
4.5.3 Always On Operations
4.5.4 Analysis of a Road Surface
4.5.5 In-car Installation of DVS Camera
4.5.6 Object Detection and Classification
4.5.7 Multi-object Tracking
4.5.8 Inaccuracies Introduced by Non-event Pixel Points
4.5.9 LiDAR and 3D Point Cloud
4.5.10 3D Pose Estimation
4.5.11 Hardware Security
4.5.12 Edge Processing
4.5.13 Other Highlights
4.5.14 Key Takeaways
5 Competitive Landscape
5.1 Prophesee
5.2 iniVation
5.3 Insightness
5.4 Qelzal
5.5 MindTrace
5.6 CelePixel
5.7 Sunia
5.8 Austrian Institute of Technology
5.9 Samsung
5.10 Sony
5.11 Benchmarking of the Commercialized/In-pipeline Event-based Vision Products
5.12 Key Takeaways
6 Projects
6.1 Project 1 - Ultra-Low Power Event-Based Camera (ULPEC)
6.2 Project 2 - The Internet of Silicon Retinas (IoSiRe): Machine to machine communications for neuromorphic vision sensing data
6.3 Project 3 - Event-Driven Compressive Vision for Multimodal Interaction with Mobile Devices (ECOMODE)
6.4 Project 4 - Convolution Address-Event-Representation (AER) Vision Architecture for Real Time (CAVIAR)
6.5 Project 5 - Embedded Neuromorphic Sensory Processor - NeuroPsense
6.6 Project 6 - Event-Driven Morphological Computation for Embodied Systems (eMorph)
6.7 Project 7 - EB-SLAM: Event-based simultaneous localization and mapping
6.8 Project 8 - SLAMCore
7 Research Laboratories
7.1 Lab 1: Robotics and Perception Group
7.2 Lab 2: Neuroscientific System Theory (NST)
7.3 Lab 3: Perception and Robotics Labs
7.4 Lab 4: Robot Vision Group
7.5 Key Takeaways
8 Research Institutes Focusing on Event Cameras
9 Insights and Recommendations
10 Concluding Remarks
11 Acronyms
12 References
