{"id":54111,"date":"2019-07-28T11:52:02","date_gmt":"2019-07-28T09:52:02","guid":{"rendered":"http:\/\/www.prophesee.ai\/?p=54111"},"modified":"2023-05-14T21:29:06","modified_gmt":"2023-05-14T13:29:06","slug":"event-based-vision-2","status":"publish","type":"post","link":"https:\/\/www.prophesee-cn.com\/en\/2019\/07\/28\/event-based-vision-2\/","title":{"rendered":"What is event based vision?"},"content":{"rendered":"

[et_pb_section fb_built=”1″ next_background_color=”#ffffff” _builder_version=”4.16″ background_image=”\/wp-content\/uploads\/2018\/04\/PROPHESEE-Email-Header-2000×1000-ONBOARD-tests-2.jpg” parallax=”on” custom_padding=”224px|0px|224px|0px|true|false” bottom_divider_style=”asymmetric4″ bottom_divider_height=”50px” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ header_font_size=”38px” header_letter_spacing=”19px” header_line_height=”1.3em” global_colors_info=”{}”]<\/p>\n

WHAT IS EVENT BASED VISION?<\/b><\/span><\/h1>\n

[\/et_pb_text][et_pb_button button_url=”\/whitepaper-download\/” button_text=”Get the White Paper” button_alignment=”center” _builder_version=”4.16″ custom_button=”on” button_text_color=”#1e2534″ button_bg_color=”#ffffff” button_border_color=”#407ec9″ button_border_radius=”0px” button_font=”||||||||” button_use_icon=”off” button_text_color_hover=”#ffffff” button_border_color_hover=”#407ec9″ button_bg_color_hover=”#407EC9″ global_colors_info=”{}” button_text_size__hover_enabled=”off” button_one_text_size__hover_enabled=”off” button_two_text_size__hover_enabled=”off” button_text_color__hover_enabled=”on” button_text_color__hover=”#ffffff” button_one_text_color__hover_enabled=”off” button_two_text_color__hover_enabled=”off” button_border_width__hover_enabled=”off” button_one_border_width__hover_enabled=”off” button_two_border_width__hover_enabled=”off” button_border_color__hover_enabled=”on” button_border_color__hover=”rgba(0,0,0,0)” button_one_border_color__hover_enabled=”off” button_two_border_color__hover_enabled=”off” button_border_radius__hover_enabled=”off” button_one_border_radius__hover_enabled=”off” button_two_border_radius__hover_enabled=”off” button_letter_spacing__hover_enabled=”off” button_one_letter_spacing__hover_enabled=”off” button_two_letter_spacing__hover_enabled=”off” button_bg_color__hover_enabled=”on” button_bg_color__hover=”#407EC9″ button_one_bg_color__hover_enabled=”off” button_two_bg_color__hover_enabled=”off”][\/et_pb_button][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”47px|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”8px|0px|0px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_code disabled_on=”on|on|off” _builder_version=”4.21.0″ _module_preset=”default” hover_enabled=”0″ global_colors_info=”{}” sticky_enabled=”0″]\t

<\/div>[\/et_pb_code][et_pb_video src=”https:\/\/v.prophesee-cn.com\/sv\/2e0dd8bc-1880160e549\/2e0dd8bc-1880160e549.mp4″ image_src=”http:\/\/www.prophesee-cn.com\/wp-content\/uploads\/2023\/01\/vlcsnap-2023-01-04-11h25m41s764.jpg” disabled_on=”off|off|on” _builder_version=”4.21.0″ _module_preset=”default” hover_enabled=”0″ global_colors_info=”{}” sticky_enabled=”0″][\/et_pb_video][et_pb_text _builder_version=”4.16″ custom_padding=”32px|||||” global_colors_info=”{}”]<\/p>\n

EVENT-BASED VISION FUNCTIONS LIKE THE EYE AND THE BRAIN TO OVERCOME <\/strong><\/span><\/h3>\n

INHERENT LIMITATIONS OF CONVENTIONAL MACHINE VISION<\/strong><\/span><\/h3>\n

[\/et_pb_text][et_pb_divider color=”#000000″ divider_weight=”5px” _builder_version=”4.16″ max_width=”10%” module_alignment=”center” height=”0px” global_colors_info=”{}”][\/et_pb_divider][et_pb_divider show_divider=”off” _builder_version=”4.16″ global_colors_info=”{}”][\/et_pb_divider][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”24px|0px|24px|0px|true|false” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”0|0px|0px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”Montserrat||||||||” text_text_color=”#1e2534″ header_font=”||||||||” text_orientation=”justified” global_colors_info=”{}”]<\/p>\n

The human eye shares little with a conventional video camera.<\/strong><\/span><\/span><\/h3>\n

Since their inception 150 years ago, all conventional video tools have represented motion by capturing a series of still frames each second. Displayed in rapid succession, these images create an illusion of continuous movement. From the flip book to the movie camera, the illusion became more convincing, but <\/strong>the underlying principle never changed<\/strong>.\u00a0<\/span><\/p>\n

For a computer, this representation of motion is of little use. The camera is blind between frames<\/strong>, losing information about moving objects. Even while the camera is recording, each of its “snapshot” images contains no information about the motion of elements in the scene. Worse still, within each image, the same irrelevant background objects are recorded over and over,<\/strong> generating large amounts of redundant data.\u00a0<\/span><\/p>\n

[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”2px|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ background_size=”initial” background_position=”top_left” background_repeat=”repeat” max_width=”500px” custom_padding=”27px|0px|0|0px|false|false” use_custom_width=”on” custom_width_px=”500px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_text_color=”#1e2534″ global_colors_info=”{}”]<\/p>\n

Consider a video of a golfer taking a swing<\/h3>\n

[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ background_size=”initial” background_position=”top_left” background_repeat=”repeat” max_width=”500px” custom_padding=”10px|0px|27px|0px|false|false” use_custom_width=”on” custom_width_px=”500px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_gallery gallery_ids=”54397,54398,54399,54400″ fullwidth=”on” _builder_version=”4.19.4″ auto=”on” auto_speed=”1200″ global_colors_info=”{}”][\/et_pb_gallery][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ background_size=”initial” background_position=”top_left” background_repeat=”repeat” custom_padding=”0|0px|3px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”]<\/p>\n

Oversampled: Sky, grass and trees<\/span><\/strong><\/span><\/p>\n

Undersampled: Motion of golfer, club and ball<\/span><\/strong><\/span><\/p>\n

[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”21px|0px|21px|0px|true|false” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”0|0px|0px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”Montserrat||||||||” header_font=”||||||||” text_orientation=”justified” global_colors_info=”{}”]<\/p>\n

Consider a video of a golfer taking a swing. A conventional sensor applies an arbitrary frame rate to the whole scene, let’s say 30 frames per second. The important information is the<\/span> swing of the club and the movement of the ball<\/span>, yet the sensor will miss segments of this information<\/strong> while repeatedly taking an extensive<\/span> inventory of the sky, trees and grass <\/span>behind the golfer.<\/span><\/p>\n
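To put rough numbers on this (the club-head speed and exposure time below are assumptions chosen for illustration, not measured values), a short Python sketch shows how much of the swing falls into the sensor’s blind time:<\/p>\n

<pre>
# Back-of-the-envelope sketch; the speed and exposure figures are assumptions.
fps = 30                       # frame rate applied to the whole scene
frame_period_s = 1 / fps       # ~33 ms between the start of consecutive frames
exposure_s = 0.001             # assumed 1 ms exposure per frame
blind_s = frame_period_s - exposure_s

club_head_speed_mps = 40       # assumed club-head speed (roughly 145 km/h)
travel_between_frames_m = club_head_speed_mps * frame_period_s

print(f"blind for {blind_s * 1000:.0f} ms of every {frame_period_s * 1000:.0f} ms")
print(f"club head travels ~{travel_between_frames_m:.2f} m from one frame to the next")
<\/pre>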

[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”0px|0px|0px|0px” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”34px|0px|34px|0px|true|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_button button_url=”\/whitepaper-download\/” button_text=”Get the White Paper” button_alignment=”center” _builder_version=”4.16″ custom_button=”on” button_text_color=”#1e2534″ button_bg_color=”#ffffff” button_border_color=”#407ec9″ button_border_radius=”0px” button_font=”||||||||” button_use_icon=”off” animation_style=”fade” button_text_color_hover=”#ffffff” button_border_color_hover=”#407ec9″ button_bg_color_hover=”#407ec9″ global_colors_info=”{}” button_text_size__hover_enabled=”off” button_one_text_size__hover_enabled=”off” button_two_text_size__hover_enabled=”off” button_text_color__hover_enabled=”on” button_text_color__hover=”#ffffff” button_one_text_color__hover_enabled=”off” button_two_text_color__hover_enabled=”off” button_border_width__hover_enabled=”off” button_one_border_width__hover_enabled=”off” button_two_border_width__hover_enabled=”off” button_border_color__hover_enabled=”on” button_border_color__hover=”rgba(0,0,0,0)” button_one_border_color__hover_enabled=”off” button_two_border_color__hover_enabled=”off” button_border_radius__hover_enabled=”off” button_one_border_radius__hover_enabled=”off” button_two_border_radius__hover_enabled=”off” button_letter_spacing__hover_enabled=”off” button_one_letter_spacing__hover_enabled=”off” button_two_letter_spacing__hover_enabled=”off” button_bg_color__hover_enabled=”on” button_bg_color__hover=”#407EC9″ button_one_bg_color__hover_enabled=”off” button_two_bg_color__hover_enabled=”off”][\/et_pb_button][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ custom_padding=”27px|0px|0px|0px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”]<\/p>\n

NATURE’S ELEGANT SOLUTION<\/b><\/span><\/h3>\n

[\/et_pb_text][et_pb_divider color=”#000000″ divider_weight=”5px” _builder_version=”4.16″ max_width=”10%” module_alignment=”center” height=”0px” global_colors_info=”{}”][\/et_pb_divider][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”50px|0px|50px|0px|false|false” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”0px|0px|0px|0px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_text_color=”#1e2534″ text_orientation=”justified” global_colors_info=”{}”]<\/p>\n

Evolution developed an elegant solution so that natural vision never encounters these problems. It doesn’t take frames<\/strong>. Cells in your eye report back to the brain when they detect a change in the scene – an event<\/strong>. If nothing changes, the cell doesn\u2019t report anything. The more an object moves, the more your eye and brain sample it.<\/span><\/p>\n

This process allows human vision to collect all the information it needs, without wasting time and energy reprocessing images of the unchanging parts of the scene.<\/span><\/p>\n

By recording only what changes, the eye and brain can gather useful information from changes occurring up to 1,000 times a second, without needing to engage enormous amounts of brain power. Neither the predator nor the prey has\u00a0time to waste processing irrelevant information.<\/strong><\/span><\/p>\n

This is event-based vision –\u00a0<\/strong>independent receptors collecting all the essential information, and nothing else.\u00a0<\/strong><\/span><\/p>\n
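For readers who want to see the principle in code, here is a minimal, hypothetical sketch of a single change-detecting pixel, working much like the receptors described above (it is an illustration only, not Prophesee’s implementation): the pixel stays silent until the intensity it observes changes by more than a contrast threshold, then emits an event carrying its position, a polarity and a timestamp.<\/p>\n

<pre>
# Minimal, hypothetical model of one change-detecting pixel (illustration only).
import math
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 brighter, -1 darker
    t: float        # timestamp in microseconds

class ChangeDetectingPixel:
    def __init__(self, x, y, threshold=0.15):
        self.x, self.y = x, y
        self.threshold = threshold        # assumed contrast threshold (log units)
        self.last_log_intensity = None    # reference level of the last event

    def observe(self, intensity, t):
        """Return an Event if the change since the last event is large enough."""
        log_i = math.log(max(intensity, 1e-6))
        if self.last_log_intensity is None:
            self.last_log_intensity = log_i
            return None
        delta = log_i - self.last_log_intensity
        if abs(delta) >= self.threshold:
            self.last_log_intensity = log_i
            return Event(self.x, self.y, 1 if delta > 0 else -1, t)
        return None                       # nothing changed enough: no output

# A static scene produces no output; a sufficient change produces a single event.
pixel = ChangeDetectingPixel(x=5, y=7)
print(pixel.observe(100.0, t=0))      # None: first observation sets the reference
print(pixel.observe(101.0, t=500))    # None: change below threshold
print(pixel.observe(130.0, t=900))    # Event(...): brightness rose enough
<\/pre>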

[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ fullwidth=”on” disabled_on=”on|on|off” _builder_version=”4.16″ global_colors_info=”{}”][et_pb_fullwidth_header text_orientation=”center” disabled_on=”off|off|off” _builder_version=”4.16″ title_font_size=”70px” content_font_size=”30px” background_color=”rgba(255,255,255,0)” background_image=”\/wp-content\/uploads\/2018\/04\/PROPHESEE-Email-Header-2000×1000-ONBOARD-tests-2.jpg” parallax=”on” background_layout=”light” custom_margin_tablet=”” custom_margin_phone=”” custom_margin_last_edited=”on|phone” custom_padding=”||200px|” title_font_size_tablet=”30px” title_font_size_phone=”” title_font_size_last_edited=”on|phone” content_font_size_tablet=”20px” content_font_size_phone=”15px” content_font_size_last_edited=”on|phone” custom_css_main_element=”250px 0 250px” global_colors_info=”{}” button_one_text_size__hover_enabled=”off” button_two_text_size__hover_enabled=”off” button_one_text_color__hover_enabled=”off” button_two_text_color__hover_enabled=”off” button_one_border_width__hover_enabled=”off” button_two_border_width__hover_enabled=”off” button_one_border_color__hover_enabled=”off” button_two_border_color__hover_enabled=”off” button_one_border_radius__hover_enabled=”off” button_two_border_radius__hover_enabled=”off” button_one_letter_spacing__hover_enabled=”off” button_two_letter_spacing__hover_enabled=”off” button_one_bg_color__hover_enabled=”off” button_two_bg_color__hover_enabled=”off”][\/et_pb_fullwidth_header][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ background_color=”#f7f7f7″ custom_padding=”62px|0px|62px|0px|true|false” animation_direction=”top” border_radii=”on|20px|20px|20px|20px” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”50px|0px|50px|0px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”]<\/p>\n

EVENT-BASED VISION SYSTEMS PERCEIVE THE VITALITY OF THE SCENE<\/b><\/h3>\n

AND OVERLOOK THE\u00a0<\/b>IRRELEVANT<\/b><\/h3>\n

[\/et_pb_text][et_pb_divider color=”#000000″ divider_weight=”5px” _builder_version=”4.16″ max_width=”10%” module_alignment=”center” height=”0px” global_colors_info=”{}”][\/et_pb_divider][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=”2_3,1_3″ _builder_version=”4.16″ custom_padding=”3px|0px|12px|0px” global_colors_info=”{}”][et_pb_column type=”2_3″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_text_color=”#1e2534″ text_orientation=”justified” global_colors_info=”{}”]<\/p>\n

PROPHESEE creates both neuromorphic sensors and bio-inspired algorithms that function like the eye and brain. This holistic approach is a fundamental shift in computer vision – the departure from frame-based sensors<\/strong> to event-based vision systems, also known as event cameras.<\/p>\n

Each pixel reports only when it senses a change in the scene. Whereas in a frame-based sensor all pixels record at the same time, in an event-based sensor each pixel operates fully independently.\u00a0<\/strong><\/p>\n
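As a rough illustration of that independence (a sketch under assumed numbers, not a sensor driver), the two readout models can be contrasted in a few lines of Python: the frame-based readout returns every pixel on every frame, while the event-based readout returns only the pixels whose intensity changed enough.<\/p>\n

<pre>
# Hypothetical comparison of frame-based vs event-based readout (illustration only).
import numpy as np

def frame_based_readout(frame):
    """A frame-based sensor returns every pixel on every frame, changed or not."""
    return frame.copy()

def event_based_readout(prev_log, curr_log, t, threshold=0.15):
    """An event-based sensor reports only the pixels whose log intensity changed enough."""
    delta = curr_log - prev_log
    ys, xs = np.nonzero(np.abs(delta) >= threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)
    # Each event is (x, y, polarity, timestamp); silent pixels produce nothing at all.
    return [(int(x), int(y), int(p), t) for x, y, p in zip(xs, ys, polarity)]

# Example: a mostly static 4x4 scene in which a single pixel brightens.
prev = np.log(np.full((4, 4), 100.0))
curr = prev.copy()
curr[2, 1] += 0.5                        # only this pixel changes
print(frame_based_readout(curr).size, "values from the frame-based readout")
print(event_based_readout(prev, curr, t=1000), "events from the event-based readout")
<\/pre>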

[\/et_pb_text][\/et_pb_column][et_pb_column type=”1_3″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_image src=”\/wp-content\/uploads\/2018\/06\/PROPHESEE_Whitepaper_Event-based_sensing.jpg” alt=”Prophesee event based vision sensor ” title_text=”Second generation Prophesee silicon sensor” align_tablet=”center” align_phone=”” align_last_edited=”on|desktop” _builder_version=”4.16″ global_colors_info=”{}”][\/et_pb_image][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ custom_padding=”20px|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_text_color=”#1e2534″ text_orientation=”justified” global_colors_info=”{}”]<\/p>\n

NOTHING IS LOST BETWEEN THE FRAMES<\/b><\/h3>\n

[\/et_pb_text][et_pb_divider color=”#000000″ divider_weight=”5px” _builder_version=”4.16″ max_width=”10%” module_alignment=”center” height=”0px” custom_margin=”||47px|” global_colors_info=”{}”][\/et_pb_divider][et_pb_text _builder_version=”4.16″ text_font=”||||||||” global_colors_info=”{}”]<\/p>\n

When each pixel is free to\u00a0record only when it is triggered, the information created does not arrive frame by frame. Rather, <\/span>movement<\/span><\/strong> is captured as a continuous stream of information<\/strong>. Nothing is lost between frames.\u00a0<\/span><\/p>\n
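In software, this output is naturally handled as a time-ordered stream rather than as a stack of frames. The sketch below is purely illustrative (the stream and its timestamps are invented); it shows events being processed the moment they arrive, at arbitrary sub-millisecond intervals.<\/p>\n

<pre>
# Hypothetical event-stream consumer; the stream and its timestamps are invented.
from typing import Iterator, NamedTuple

class Event(NamedTuple):
    x: int
    y: int
    polarity: int
    t_us: int                   # timestamp in microseconds

def fake_event_stream() -> Iterator[Event]:
    """Stand-in for a sensor driver: yields events as they occur, not per frame."""
    for i, t_us in enumerate([5, 130, 410, 988, 1502, 2075]):
        yield Event(x=10 + i, y=20, polarity=1, t_us=t_us)

def process(stream: Iterator[Event]) -> None:
    last_t = None
    for ev in stream:
        gap = ev.t_us - last_t if last_t is not None else 0
        last_t = ev.t_us
        # Each event is handled as it arrives; there is no frame period,
        # so no motion is lost waiting for the next frame.
        print(f"({ev.x},{ev.y}) polarity={ev.polarity} t={ev.t_us}us (+{gap}us)")

process(fake_event_stream())
<\/pre>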

[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ custom_padding=”0|0px|2px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_image src=”\/wp-content\/uploads\/2018\/06\/robot-arm-demo.jpg” alt=”event based vision vs frame-based vision ” title_text=”Robotic arm demonstration of event-based vision” align_tablet=”center” align_phone=”” align_last_edited=”on|desktop” _builder_version=”4.16″ custom_margin=”50px|||” global_colors_info=”{}”][\/et_pb_image][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=”2_3,1_3″ _builder_version=”4.16″ custom_padding=”0|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_column type=”2_3″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”][\/et_pb_text][\/et_pb_column][et_pb_column type=”1_3″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_font_size=”11px” global_colors_info=”{}”]<\/p>\n

The Prophesee sensor records a rotating robotic arm as a continuous stream of movement.<\/p>\n

[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section][et_pb_section fb_built=”1″ fullwidth=”on” disabled_on=”on|on|off” _builder_version=”4.16″ custom_margin=”-3px|||” custom_padding=”0|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_fullwidth_header text_orientation=”center” disabled_on=”off|off|off” _builder_version=”4.16″ title_font_size=”70px” content_font_size=”30px” background_color=”rgba(255,255,255,0)” background_image=”\/wp-content\/uploads\/2018\/04\/PROPHESEE-Email-Header-2000×1000-ONBOARD-tests-2.jpg” parallax=”on” background_layout=”light” custom_margin_tablet=”” custom_margin_phone=”” custom_margin_last_edited=”on|” custom_padding=”||300px|” custom_padding_tablet=”” custom_padding_phone=”||200px|” custom_padding_last_edited=”on|phone” title_font_size_tablet=”30px” title_font_size_phone=”” title_font_size_last_edited=”on|phone” content_font_size_tablet=”20px” content_font_size_phone=”15px” content_font_size_last_edited=”on|phone” custom_css_main_element=”250px 0 250px” global_colors_info=”{}” button_one_text_size__hover_enabled=”off” button_two_text_size__hover_enabled=”off” button_one_text_color__hover_enabled=”off” button_two_text_color__hover_enabled=”off” button_one_border_width__hover_enabled=”off” button_two_border_width__hover_enabled=”off” button_one_border_color__hover_enabled=”off” button_two_border_color__hover_enabled=”off” button_one_border_radius__hover_enabled=”off” button_two_border_radius__hover_enabled=”off” button_one_letter_spacing__hover_enabled=”off” button_two_letter_spacing__hover_enabled=”off” button_one_bg_color__hover_enabled=”off” button_two_bg_color__hover_enabled=”off”][\/et_pb_fullwidth_header][\/et_pb_section][et_pb_section fb_built=”1″ _builder_version=”4.16″ custom_padding=”0px|0px|0px|0px” global_colors_info=”{}”][et_pb_row _builder_version=”4.16″ custom_padding=”50px|0px|0px|0px” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”]<\/p>\n

A DISRUPTIVE TECHNOLOGY<\/b><\/h3>\n

[\/et_pb_text][et_pb_divider color=”#000000″ divider_weight=”5px” _builder_version=”4.16″ max_width=”10%” module_alignment=”center” height=”0px” global_colors_info=”{}”][\/et_pb_divider][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ custom_padding=”50px|0px|0|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ text_font=”||||||||” text_text_color=”#1e2534″ text_orientation=”justified” inline_fonts=”Montserrat” global_colors_info=”{}”]<\/p>\n

Event-based vision systems\u00a0perceive the vitality of the scene and overlook the irrelevant<\/strong>. They produce up<\/strong>\u00a0to 1,000 times less data<\/strong> than a conventional sensor while achieving a higher equivalent temporal resolution\u00a0<\/strong>of\u00a0>10,000 fps<\/em>.\u00a0<\/em><\/span><\/p>\n
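As a back-of-the-envelope comparison (every figure below is an illustrative assumption, not a Prophesee specification, and the ratio depends heavily on how much of the scene is moving):<\/p>\n

<pre>
# Back-of-the-envelope data-rate comparison; every figure here is an assumption.
frame_resolution = 1280 * 720          # pixels, assumed HD sensor
frame_rate = 30                        # frames per second
bytes_per_pixel = 1                    # 8-bit grayscale
frame_bytes_per_s = frame_resolution * frame_rate * bytes_per_pixel

events_per_s = 100_000                 # assumed activity for a moderately dynamic scene
bytes_per_event = 8                    # assumed packed (x, y, polarity, timestamp) encoding
event_bytes_per_s = events_per_s * bytes_per_event

print(f"frame-based: {frame_bytes_per_s / 1e6:.1f} MB/s")
print(f"event-based: {event_bytes_per_s / 1e6:.1f} MB/s")
print(f"roughly {frame_bytes_per_s / event_bytes_per_s:.0f}x less data in this scenario")
<\/pre>

In a nearly static scene the event rate drops much further, which is where savings approaching the “up to 1,000 times” figure come from.<\/p>\n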

By bypassing the inherent\u00a0limitations of conventional computer vision, event-based vision is disrupting current technology in fields such as automotive, artificial intelligence & deep learning, industrial automation, IoT, security, surveillance and healthcare, among others.<\/span><\/p>\n

 <\/p>\n

[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ custom_padding=”50px|0px|53px|0px|false|false” global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_text _builder_version=”4.16″ global_colors_info=”{}”]<\/p>\n

Download the White Paper to learn more about the fundamental shift happening in machine vision.\u00a0<\/strong><\/h4>\n

[\/et_pb_text][et_pb_button button_url=”\/whitepaper-download\/” button_text=”Get the White Paper” button_alignment=”center” _builder_version=”4.16″ custom_button=”on” button_text_color=”#1e2534″ button_bg_color=”#ffffff” button_border_color=”#407ec9″ button_border_radius=”0px” button_font=”||||||||” button_use_icon=”off” animation_style=”fade” button_text_color_hover=”#ffffff” button_border_color_hover=”#407ec9″ button_bg_color_hover=”#407ec9″ global_colors_info=”{}” button_text_size__hover_enabled=”off” button_one_text_size__hover_enabled=”off” button_two_text_size__hover_enabled=”off” button_text_color__hover_enabled=”on” button_text_color__hover=”#ffffff” button_one_text_color__hover_enabled=”off” button_two_text_color__hover_enabled=”off” button_border_width__hover_enabled=”off” button_one_border_width__hover_enabled=”off” button_two_border_width__hover_enabled=”off” button_border_color__hover_enabled=”on” button_border_color__hover=”rgba(0,0,0,0)” button_one_border_color__hover_enabled=”off” button_two_border_color__hover_enabled=”off” button_border_radius__hover_enabled=”off” button_one_border_radius__hover_enabled=”off” button_two_border_radius__hover_enabled=”off” button_letter_spacing__hover_enabled=”off” button_one_letter_spacing__hover_enabled=”off” button_two_letter_spacing__hover_enabled=”off” button_bg_color__hover_enabled=”on” button_bg_color__hover=”#407EC9″ button_one_bg_color__hover_enabled=”off” button_two_bg_color__hover_enabled=”off”][\/et_pb_button][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=”4.16″ global_colors_info=”{}”][et_pb_column type=”4_4″ _builder_version=”4.16″ custom_padding=”|||” global_colors_info=”{}” custom_padding__hover=”|||”][et_pb_code _builder_version=”4.16″ global_colors_info=”{}”]\n\t\t\t
