Publication Number
WO2015152304A1

Title
DRIVING ASSISTANCE DEVICE AND DRIVING SUPPORT SYSTEM

Priority
2014-03-31 JP2014-072419

Publication
2015-10-08 WO2015152304A1
Abstract

Provided is a driving assistance device equipped with: a detection means that detects conditions around a vehicle; a recognition means that recognizes objects around the vehicle on the basis of the detection result of the detection means; an analysis means that analyzes the objects recognized by the recognition means; a setting means that sets, on the basis of the analysis result of the analysis means, degrees of caution to be exercised with respect to the objects; a generation means that generates, on the basis of the degrees set by the setting means, an image for causing a driver to visually recognize the objects; and a display means that displays the image generated by the generation means.
Claim

1. A driving support device comprising:
a detection means for detecting a situation around a vehicle;
a recognition means for recognizing an object around the vehicle on the basis of the detection result of the detection means;
an analysis means for analyzing the object recognized by the recognition means;
a setting means for setting, on the basis of the analysis result of the analysis means, a degree of caution to be exercised with respect to the object;
a generation means for generating, on the basis of the degree set by the setting means, an image for allowing the driver of the vehicle to visually recognize the object; and
a display means for displaying the image generated by the generation means.
2. The driving support device according to claim 1, further comprising:
a determination means for determining whether or not the object recognized by the recognition means is a person; and
a storage means for storing image data of a symbol indicating the state of the person,
wherein the analysis means analyzes the state of the object determined to be a person by the determination means,
the generation means is provided with a reading means for reading out, from the storage means, the image data corresponding to the symbol indicating the state of the person on the basis of the analysis result of the analysis means, and
when the image data of the symbol is read by the reading means, the display means displays the symbol represented by the image data.
3. The driving support device according to claim 1 or 2, further comprising:
a line-of-sight detection means for detecting the line of sight of the driver of the vehicle;
an identification means for identifying the image recognized by the driver, out of the images displayed by the display means, from the moving state of the driver's line of sight obtained as the detection result of the line-of-sight detection means; and
an erasing means for erasing the image identified by the identification means as having been recognized by the driver.
Description
Driving support device and driving support system

Cross-Reference to Related Application

[0001] This international application claims priority based on Japanese Patent Application No. 2014-72419, filed Mar. 31, 2014, the entire contents of which are incorporated into this international application by reference.
Technical field

[0002]
The present invention relates to a driving assistance device and a driving assistance system that notify a driver of a vehicle of a risk according to the situation around the vehicle.
Background art

[0003]
Conventionally, a display device capable of projecting outside situations and scenery onto a window of a vehicle is known (see, for example, Patent Document 1). The display device of Patent Document 1 is provided with: an observation device for observing the state (position, speed, etc.) of the vehicle; and a storage device for storing image information of the outside scenery in advance. On the basis of the information representing the position of the vehicle observed by the observation device, the image information of the scenery that would appear outside the vehicle at the observed position is acquired from the storage device, and the image represented by that image information is displayed on the window of the vehicle.

Prior Art Document

Patent Document

[0004]
Patent Document 1: Japanese Patent Application Laid-Open No. 2004-20223
SUMMARY OF THE INVENTION

[0005]
However, the conventional display device described above merely displays, on the window of the vehicle, the landscape image stored in advance in the storage device. That is, it does not detect the situation around the vehicle in real time and cannot notify the driver of the vehicle of that situation. Therefore, the driver cannot recognize a risk (for example, the risk of collision with an object around the vehicle) arising during driving.

[0006]
It is desirable that the danger corresponding to the situation around the vehicle can be reported to the driver of the vehicle in an easy-to-understand manner.
MEANS FOR SOLVING THE PROBLEM

[0007]
A driving support device according to a first aspect of the present invention is provided with: a detection means for detecting a situation around a vehicle; a recognition means for recognizing an object around the vehicle on the basis of the detection result of the detection means; an analysis means for analyzing the object recognized by the recognition means; a setting means for setting, on the basis of the analysis result of the analysis means, a degree of caution to be exercised with respect to the object; a generation means for generating an image for allowing the driver of the vehicle to visually recognize the object on the basis of the degree set by the setting means; and a display means for displaying the image generated by the generation means.
[0008]
The "analysis" is intended to determine, judge, or estimate the type and state of the object by various analyses. According to the driving support device, a degree of caution (hereinafter referred to as an alert level) is set for each object according to the type and state of the object around the vehicle, and an image corresponding to the alert level is displayed. Therefore, the driver of the vehicle can grasp what degree of caution should be exercised (for example, whether or not the object should be watched) in accordance with the type and state of the object.
[0009]
Thus, the vehicle can be operated properly. For example, when an image indicating a high alert level is displayed for an object, the driver can decelerate or perform a predetermined steering operation to avoid danger. The "danger" refers to the risk of collision or the like.
[0010]
The generation means may generate, as the image for allowing the driver of the vehicle to visually recognize the object, for example:
an image surrounding the object;
an image pointing to the object (for example, an image of an arrow);
an image symbolizing the object (such as an image of an illustration of the object);
an image of a message notifying the presence of the object.
These images may be displayed by the display means alone or in combination.
[0011]
Thus, the driver can easily recognize the presence of the object. When the images are displayed in combination, a plurality of images may be displayed simultaneously or with a time difference. When displayed with a time difference, the next image may be displayed a predetermined time after a certain image is displayed. For example, an image of an arrow pointing to the object may be displayed a predetermined time after an image surrounding the object is displayed.
[0012]
In one example, when an object is detected, an image surrounding the object is displayed first, and an image of an arrow pointing to the object may then be displayed at the stage where the distance from the own vehicle to the object becomes equal to or less than a predetermined distance. According to such an aspect, the driver can be supported continuously so that the object is easy to recognize.
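To make this staged display concrete, the following is a minimal Python sketch. The Display class, its method names, and the 30 m threshold are illustrative assumptions, not elements defined in this publication.

```python
# Staged display per [0012]: frame the object on detection, then add a
# pointing arrow once it comes within an assumed "predetermined distance".

NEAR_THRESHOLD_M = 30.0  # assumed threshold; the publication leaves it unspecified

class Display:
    def show_frame(self, obj_id: str) -> None:
        print(f"frame around {obj_id}")

    def show_arrow(self, obj_id: str) -> None:
        print(f"arrow pointing at {obj_id}")

def update_display(display: Display, obj_id: str, distance_m: float) -> None:
    display.show_frame(obj_id)            # stage 1: surround the detected object
    if distance_m <= NEAR_THRESHOLD_M:
        display.show_arrow(obj_id)        # stage 2: point at it once it is near

update_display(Display(), "pedestrian-1", 42.0)  # far: frame only
update_display(Display(), "pedestrian-1", 18.5)  # near: frame and arrow
```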
[0013]
In addition, the driving support device may be provided with: a determination means for determining whether or not the object recognized by the recognition means is a person; and a storage means for storing image data of a symbol indicating the state of the person. The analysis means analyzes the state of the object determined to be a person by the determination means, and the generation means is provided with a reading means for reading out, from the storage means, the image data corresponding to the symbol indicating the state of the person on the basis of the analysis result of the analysis means. When the image data of the symbol is read by the reading means, the symbol represented by the image data may be displayed.
[0014]
Thus, the image can be displayed when the object is a person, so that the safety of driving can be further improved. Further, by displaying a symbol indicating the state of the person according to that state, the driver can recognize the state of a person around the vehicle and can drive appropriately in consideration of that state.
[0015]
The driving support device may also include: a line-of-sight detection means for detecting the line of sight of the driver of the vehicle; an identification means for identifying the image recognized by the driver, among the images displayed by the display means, from the moving state of the driver's line of sight, which is the detection result of the line-of-sight detection means; and an erasing means for erasing the image identified by the identification means as recognized by the driver.
[0016]
Thus, when the driver recognizes an object, the image for viewing that object can be erased. Accordingly, once the driver has recognized the image (in other words, has recognized that the object is present and grasped its state), continuing to display the image (in other words, continuing to warn the driver) can be avoided.
[0017]
On the other hand, an image that has not been recognized by the driver continues to be displayed, and the warning continues for that part. Thus, the effect of enhancing driving safety is not lost, and usability and the effect of driving support can both be achieved at a high level.
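As an illustration of the erasing logic of [0015]-[0017], here is a minimal Python sketch. It assumes a hypothetical gaze tracker that reports which displayed image the driver's line of sight is resting on; the 0.5 s dwell threshold used to decide "recognized" is an assumption, since the publication does not specify how recognition is inferred from the moving state of the line of sight.

```python
# Erase images the driver has dwelt on; keep warning with the rest ([0016]-[0017]).
import time

DWELL_THRESHOLD_S = 0.5  # assumed dwell time that counts as "recognized"

def prune_recognized(displayed: dict, gazed_id: str | None, now: float) -> None:
    """Drop any displayed image the gaze has rested on long enough."""
    for img_id, state in list(displayed.items()):
        if img_id == gazed_id:
            state["dwell_start"] = state.get("dwell_start") or now
            if now - state["dwell_start"] >= DWELL_THRESHOLD_S:
                del displayed[img_id]    # recognized: stop warning here
        else:
            state["dwell_start"] = None  # gaze moved away: reset the timer

displayed = {"arrow-1": {"dwell_start": None}, "frame-2": {"dwell_start": None}}
t0 = time.monotonic()
prune_recognized(displayed, "arrow-1", t0)
prune_recognized(displayed, "arrow-1", t0 + 0.6)
print(displayed)  # only "frame-2" remains; the recognized image was erased
```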
[0018]
In one example, the driving support device may include at least one imaging device as a detection means. When an image of the periphery of the vehicle is picked up by the imaging device, the type and state of an object around the vehicle can be analyzed in more detail by image analysis. Moreover, various methods of image analysis are known, and the analysis can be performed relatively easily by using conventional techniques.
[0019]
When the detection means includes a plurality of imaging devices, the driving support device (or the detection means) may be configured to compare the captured images of the respective imaging devices and to select, on the basis of the comparison result, the captured images to be adopted. It is also possible to extract a highly accurate part (for example, a part having a small noise component) from the data of each captured image and to integrate those parts to generate one set of data. Thus, the accuracy of the image analysis can be further enhanced, and the situation around the vehicle can be detected and grasped at a high level.
[0020]
Further, the detection means may reproduce parallax (a difference in the position and viewing direction of an image) by using a plurality of imaging devices (specifically, two units), and may acquire stereoscopic information (specifically, depth information) about the object on the basis of the parallax. Thus, the driving support device can set the alert level on the basis of the three-dimensional information about the object. In one example, the driving support device may display a stereoscopic image (3D image) of the object. In this case, the display means may be configured to display the stereoscopic image (3D image) of the object on the basis of the detection result (stereoscopic information) of the detection means.
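The depth recovery described here follows the standard stereo relation: for a rectified two-camera pair, the distance is Z = f·B/d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the disparity. The sketch below assumes illustrative values for f and B; the publication does not give concrete camera parameters.

```python
# Depth from two-camera parallax ([0020]): Z = f * B / d for a rectified pair.

FOCAL_LENGTH_PX = 1200.0   # assumed focal length in pixels
BASELINE_M = 0.30          # assumed spacing between the two cameras

def depth_from_disparity(disparity_px: float) -> float:
    """Distance (m) to a point whose left/right images are disparity_px apart."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A feature 24 px apart between the left and right images lies 15 m away:
print(f"{depth_from_disparity(24.0):.1f} m")
```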
[0021]
The driving support device (or the detection means) may be configured to identify a plurality of objects overlapping in the field of view of the imaging device from the three-dimensional information (depth information) of each object. For example, the recognition means may be configured to recognize a plurality of overlapping objects as separate objects from the three-dimensional information (depth information) of each object.
[0022]
When viewed from an imaging device, a plurality of objects which are present in the same direction and appear to partially overlap would be recognized as one and the same object by image analysis based on two-dimensional information alone. On the basis of the stereoscopic information (depth information), on the other hand, the plurality of objects can be identified as separate objects.
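One simple way to realize this separation is to cluster depth samples that overlap in the 2-D view: a large jump in depth indicates a different object. The following sketch is an assumed minimal clustering; the 1.5 m split threshold and the depth values are illustrative, not from the publication.

```python
# Separating overlapping objects by depth ([0021]-[0022]).

def split_by_depth(depths_m: list[float], gap_m: float = 1.5) -> list[list[float]]:
    """Group sorted depth samples into clusters separated by more than gap_m."""
    clusters: list[list[float]] = []
    for z in sorted(depths_m):
        if clusters and z - clusters[-1][-1] <= gap_m:
            clusters[-1].append(z)      # depth is continuous: same object
        else:
            clusters.append([z])        # depth jump: start a new object
    return clusters

# Two pedestrians overlapping in the image but about 4 m apart in depth:
print(split_by_depth([12.1, 12.4, 12.3, 16.2, 16.5]))
# -> [[12.1, 12.3, 12.4], [16.2, 16.5]]  (recognized as separate objects)
```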
[0023]
In this case, the generation means may be configured to generate an image for allowing the driver to visually recognize each of the plurality of partially overlapping objects. By varying the mode of the image, each of the plurality of objects may be made easier to recognize.
[0024]
According to the above configuration, the driver can be informed more accurately of the situation around the vehicle and can grasp that situation more easily. For example, it may become easier for the driver to recognize a separate object hidden behind another object.
[0025]
The analysis means may be configured to analyze whether the object is present on the travel route of the own vehicle (the vehicle on which the driving support device is mounted). Specifically, the road is recognized by the detection means and the recognition means, and the analysis means may analyze whether or not the object is present on that road. When the course of the vehicle is estimated from the motion state or the like of the vehicle, it is also possible to analyze whether or not the object exists on that course.
[0026]
In this case, the setting means may set the alert level relatively high for an object existing on the travel route, and relatively low for an object not present on the travel route.
[0027]
The analysis means may be configured to analyze the distance from the own vehicle to the object. For example, the analysis can be performed on the basis of the detection result of the detection means or the like. In one example, the distance can be calculated by image analysis of the captured image of the imaging device serving as a detection means. When the detection means includes a distance sensor, the distance to the object can be calculated on the basis of the output result (output signal) of the distance sensor.
[0028]
In this case, the setting means may set the alert level relatively high for an object whose distance from the own vehicle is relatively small, and relatively low for an object whose distance from the own vehicle is relatively large.
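As a concrete illustration of how the rules of [0026] and [0028] could be combined, here is a minimal Python sketch. The three-level scale, the enum names, and the 20 m cutoff are illustrative assumptions; the publication only says "relatively high" and "relatively low".

```python
# Alert-level setting combining route membership ([0026]) and distance ([0028]).
from dataclasses import dataclass
from enum import IntEnum

class AlertLevel(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

NEAR_M = 20.0  # assumed "relatively small" distance

@dataclass
class DetectedObject:
    on_route: bool
    distance_m: float

def set_alert_level(obj: DetectedObject) -> AlertLevel:
    if obj.on_route and obj.distance_m <= NEAR_M:
        return AlertLevel.HIGH      # on the travel route and close
    if obj.on_route or obj.distance_m <= NEAR_M:
        return AlertLevel.MEDIUM    # one risk factor present
    return AlertLevel.LOW           # off the route and far away

print(set_alert_level(DetectedObject(on_route=True, distance_m=12.0)).name)   # HIGH
print(set_alert_level(DetectedObject(on_route=False, distance_m=50.0)).name)  # LOW
```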
[0029]
The analysis means may be configured to analyze, when the object is a person, whether or not the person carries a portable terminal such as a cellular phone, a smartphone, or a tablet. Specifically, the captured image of the imaging device serving as the detection means may be subjected to image analysis. For example, during operation of the portable terminal its display portion becomes bright, and the boundary of the display can be detected as an edge in image analysis. On the basis of recognizing the display by such edge detection, it is possible to analyze whether or not the portable terminal is in operation (and hence whether or not the portable terminal is present).
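A simple reading of this bright-display cue can be sketched with OpenCV's standard thresholding and contour extraction (the OpenCV 4 API is assumed). The brightness threshold and minimum area below are assumptions to be tuned; a real system would combine this cue with person detection.

```python
# Detect bright, screen-like regions as a proxy for a lit portable terminal ([0029]).
import cv2
import numpy as np

MIN_BRIGHTNESS = 200   # assumed: lit phone screens are near-white at night
MIN_AREA_PX = 150      # assumed: ignore tiny bright specks

def find_bright_screens(gray: np.ndarray) -> list[tuple[int, int, int, int]]:
    """Return bounding boxes (x, y, w, h) of bright regions with clear edges."""
    _, mask = cv2.threshold(gray, MIN_BRIGHTNESS, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) >= MIN_AREA_PX:
            boxes.append(cv2.boundingRect(c))  # boundary of the display region
    return boxes

# Synthetic night frame: dark background with one bright screen-like patch.
frame = np.full((240, 320), 30, dtype=np.uint8)
frame[100:140, 150:180] = 240
print(find_bright_screens(frame))  # -> [(150, 100, 30, 40)]
```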
[0030]
The analysis means may be configured to analyze whether or not the person is operating the portable terminal. The analysis may include analyzing the relationship between the position of the person's body (especially the hands and face) and the position of the portable terminal, the direction of the face, the positional relationship of both eyes to the portable terminal, or the like. On the basis of such analysis, it is possible to determine whether or not the person is operating the portable terminal. The analysis means may also be configured to analyze whether or not the person is talking on the portable terminal.
[0031]
In this case, the setting means may set the alert level relatively high for a person who is operating the portable terminal or talking on it, and relatively low for a person who is neither operating the portable terminal nor talking on it.
[0032]
The analysis means may be configured to analyze (or estimate) whether or not the person recognizes the presence of the own vehicle when the person is operating the portable terminal. For example, if the portion of the person's face is analyzed and both eyes can be extracted, it is determined that the face is directed toward the own vehicle, and it may be determined that the person recognizes the presence of the own vehicle. On the other hand, if the eyes cannot be extracted, it is determined that the face is not directed toward the own vehicle, and it may be determined that the person does not recognize the presence of the own vehicle.
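The "both eyes extracted means facing the vehicle" heuristic could be prototyped with OpenCV's stock Haar cascades, which ship with the cv2 distribution. The harness below is an assumed illustration; the input file name is hypothetical, and the publication does not prescribe any particular detector.

```python
# Awareness heuristic from [0032]: a frontal face with two detected eyes
# is treated as a person facing (and hence likely aware of) the vehicle.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def appears_aware(gray_frame) -> bool:
    """True if a frontal face with both eyes is found in the frame."""
    for (x, y, w, h) in face_cascade.detectMultiScale(gray_frame, 1.1, 5):
        roi = gray_frame[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
        if len(eyes) >= 2:          # both eyes extracted: facing the vehicle
            return True
    return False                    # no frontal face/eyes: raise the alert level

frame = cv2.imread("pedestrian.png", cv2.IMREAD_GRAYSCALE)  # illustrative input
if frame is not None:
    print("aware" if appears_aware(frame) else "not aware")
```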
[0033]
In this case, the setting means may set the alert level relatively high for a person who has not recognized the presence of the own vehicle. On the other hand, the setting means may set the alert level relatively low for a person who recognizes the presence of the own vehicle, or may lower the alert level further.
[0034]
The analysis means may be configured to analyze the reaction of the person after the driving support device issues an alarm to the person, and to analyze whether or not the person has become aware of the presence of the own vehicle (in other words, whether the person notices the own vehicle). For example, both eyes may be extracted as described above, or the movement of the face may be analyzed. For example, when it is detected that the person's face turns toward the own vehicle, it may be determined that the person has become aware of the presence of the own vehicle.
[0035]
The analysis means may be configured to analyze the sex and age of the person by using face recognition techniques. The analysis means may also be configured to analyze whether or not the person is using headphones and, when the person is using headphones, to analyze (or estimate) whether or not the person recognizes the presence of the own vehicle.
[0036]
The analysis means may be configured to analyze whether or not a person is in a conversation and, when the person is in a conversation, to analyze (or estimate) whether or not the person recognizes the presence of the own vehicle.
[0037]
The analysis means may be configured to analyze the movement state of the person. Specifically, the moving direction of the person may be determined, and it may also be analyzed whether the person is approaching the own vehicle or moving away from it.
[0038]
In this case, the generation means may be configured to generate an image representing the moving direction. Further, the analysis means may calculate the movement speed of the person, and the generation means may be configured to generate an image representing the movement speed of the person.
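Direction, speed, and approach can all be derived from two successive position samples. The sketch below assumes a vehicle-centered coordinate frame (x to the right, y forward, in meters) and illustrative sample values; the publication does not specify the coordinate convention.

```python
# Movement state of a person from two successive positions ([0037]-[0038]).
import math

def motion_state(p_prev: tuple[float, float],
                 p_now: tuple[float, float],
                 dt_s: float) -> tuple[float, float, bool]:
    """Return (heading_deg, speed_m_s, approaching) between two samples."""
    dx, dy = p_now[0] - p_prev[0], p_now[1] - p_prev[1]
    speed = math.hypot(dx, dy) / dt_s
    heading = math.degrees(math.atan2(dy, dx))
    # Approaching if the distance to the vehicle (the origin) is shrinking.
    approaching = math.hypot(*p_now) < math.hypot(*p_prev)
    return heading, speed, approaching

# A pedestrian moved from (5.0, 20.0) to (4.6, 18.9) over 0.5 s:
print(motion_state((5.0, 20.0), (4.6, 18.9), 0.5))  # approaching at ~2.3 m/s
```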
[0039]
The analysis means may determine whether the person is a child or an adult on the basis of the size (specifically, the height) of the person. More specifically, it is possible to determine, for example, whether or not the person is of junior high school age or younger, and the average height of a predetermined age published as statistical data may be used as a threshold in the determination.
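In code, this rule reduces to a single comparison against a published average height. The 150 cm figure below is an illustrative stand-in for whatever statistical threshold is chosen.

```python
# Child/adult determination by height threshold ([0039]).

CHILD_HEIGHT_THRESHOLD_CM = 150.0  # assumed statistical average used as threshold

def is_child(estimated_height_cm: float) -> bool:
    """Classify a person as a child if shorter than the statistical threshold."""
    return estimated_height_cm < CHILD_HEIGHT_THRESHOLD_CM

print(is_child(128.0))  # True  -> set the alert level relatively high ([0040])
print(is_child(175.0))  # False
```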
[0040]
The setting means may set the alert level relatively high when the person is a child.
In another aspect, the present invention may be a system (driving support system) provided with the driving support device described above.
[0041]
In another aspect, the present invention is a driving support system provided with: a detection means for detecting a situation around a vehicle; a recognition means for recognizing an object around the vehicle on the basis of the detection result of the detection means; an analysis means for analyzing the object recognized by the recognition means; a setting means for setting, on the basis of the analysis result of the analysis means, a degree of caution to be exercised with respect to the object; a generation means for generating an image for allowing the driver of the vehicle to visually recognize the object on the basis of the degree set by the setting means; and a display means for displaying the image generated by the generation means.
[0042]
The driving support system may have a configuration similar to that of the above-described driving support device.
BRIEF DESCRIPTION OF THE DRAWINGS

[0043]
FIG. 1 is a diagram showing an example of application of a driving support device according to an embodiment to a vehicle.
FIG. 2 is a block diagram showing a configuration of a driving support device according to a first embodiment.
FIG. 3 is a flowchart showing the flow of driving support processing executed by a control ECU.
FIG. 4 is a flowchart showing the flow of an extraction process executed by a control ECU.
FIG. 5 is a flowchart showing the flow of the analysis processing 1.
FIG. 6 is a flowchart showing the flow of the analysis processing 2.
FIG. 7 is a flowchart showing the flow of the analysis processing 3.
FIG. 8 is a flowchart showing the flow of the analysis processing 4.
FIG. 9 is a flowchart showing the flow of the analysis processing 5.
FIG. 10 is a flowchart showing the flow of the analysis processing 6.
FIG. 11 is a flowchart showing the flow of a subroutine of the analysis processing 6.
FIG. 12 is a flowchart showing the flow of the analysis processing 7.
FIG. 13 is a flowchart showing the flow of vehicle recognition determination processing.
FIG. 14 is a flowchart showing the flow of display data generation processing.
FIG. 15 is a flowchart showing the flow of emphasis image generation processing.
FIG. 16 is a flowchart showing the flow of display processing.
FIG. 17 is a diagram showing a display example.
FIGS. 18A-18B show a display example.
FIG. 19 is a diagram showing a display example.
FIG. 20 is a diagram showing a display example.
FIG. 21 is a block diagram showing a configuration of a driving support device according to a second embodiment.
FIG. 22 is a flowchart showing the flow of the recognition processing 2.
FIG. 23 is a flowchart showing the flow of screen determination processing.
FIG. 24 is a flowchart showing the flow of the analysis processing 8.
FIG. 25 is a flowchart showing the flow of the display processing 2.
FIG. 26 is a diagram explaining the operation of the driving support apparatus according to the second embodiment.
FIG. 27 is a block diagram showing a configuration of a driving support device according to a third embodiment.
FIGS. 28A-28B illustrate line-of-sight detection.
FIG. 29 is a flowchart showing the flow of the driving support processing 2.
FIG. 30 is a flowchart showing the flow of correction determination processing.
FIG. 31 is a flowchart showing the flow of display correction processing.
FIG. 32 is a flowchart showing the flow of the recognition determination processing 2.
FIG. 33 is a flowchart showing the flow of the analysis processing 9.
FIG. 34 is a diagram showing a modification example 1.
FIG. 35 is a diagram showing a modification example 2.
FIG. 36 is a diagram showing a modification example 3.
FIG. 37 is a flowchart showing the flow of the analysis processing 10.
FIGS. 38A-38B illustrate the detection of raindrops.
FIG. 39 is a flowchart showing the flow of the analysis processing 11.
FIG. 40 is a flowchart showing the flow of the analysis processing 12.
FIGS. 41A-41B are flowcharts showing the flow of vehicle control processing.
FIGS. 42A-42B are drawings explaining an example of a display mode.
FIG. 43 is a diagram illustrating an example of a display mode.
FIG. 44 is a diagram illustrating an example of a display mode.
FIGS. 45A-45C are drawings explaining an example of a display mode.
Description of the Code

[0044]
1, 100, 101 ... Driving assistance device; 2 ... Infrared radar; 3 ... Millimeter wave radar; 4 ... Infrared camera; 5 ... Visible light camera; 6 ... Motion amount detection unit; 7 ... Head-up display (HUD); 8 ... Image projector; 9 ... Speaker unit; 10 ... Line-of-sight detection unit; 11 ... Inter-vehicle communication unit; 12 ... Vehicle position sensor; 20 ... Control ECU.
DETAILED DESCRIPTION OF THE INVENTION

[0045]
An embodiment to which the present invention is applied will be described below with reference to the drawings.

<First Embodiment>
[Whole Configuration]
As shown in FIG. 1, a driving assistance device 1 according to the first embodiment is provided with an infrared radar 2, a millimeter wave radar 3, an infrared camera 4, a visible light camera 5, a motion amount detection unit 6, a head-up display 7, a speaker unit 9, and a control ECU 20.
[0046]
In addition, FIG. 1 also shows an image projection device 8, a line-of-sight detection unit 10, a vehicle-to-vehicle communication unit 11, and a vehicle position sensor 12. The respective configurations of the driving support apparatus 1 will be described with reference to FIGS. 1 and 2.
[0047]
[Infrared Radar]
The infrared radar 2 detects the surrounding situation by using infrared rays; in other words, it detects the presence or absence of an object and the distance to the object.
[0048]
As shown in FIG. 2, the infrared radar 2 is provided with an infrared transmission/reception unit 2a, a signal processing unit 2b, and an external interface 2c. The infrared radar 2 irradiates infrared rays from the infrared transmission/reception part 2a and receives the reflected light that is reflected by the object and returned. The signal processing part 2b calculates the distance to the object on the basis of the time difference between the irradiation time of the infrared rays and the reception time of the reflected light. Data representing the calculated distance is transmitted to the control ECU 20 via the external interface 2c.
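The calculation performed by the signal processing part 2b is a standard time-of-flight computation: the round-trip time multiplied by the speed of light, halved. The sketch below illustrates it; the function name and the sample timing value are illustrative.

```python
# Time-of-flight distance as computed by signal processing unit 2b ([0048]).

C_M_PER_S = 299_792_458.0  # speed of light

def tof_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the reflecting object from a round-trip time of flight."""
    round_trip_s = receive_time_s - emit_time_s
    return C_M_PER_S * round_trip_s / 2.0

# A reflection arriving 167 ns after emission corresponds to about 25 m:
print(f"{tof_distance_m(0.0, 167e-9):.1f} m")
```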
[0049]
The distance detectable by the infrared radar 2 is several tens of meters (for example, 20-30 m). As shown in FIG. 1, infrared radars 2 may be provided on the front part, the side parts, and the rear part of the vehicle.
[0050]
[Millimeter Wave Radar]
The millimeter wave radar 3 is a radar for detecting the surrounding situation by using radio waves in the millimeter wave band. As shown in FIG. 2, the millimeter wave radar 3 is provided with a millimeter wave transmission/reception unit 3a, a signal processing unit 3b, and an external interface 3c.
[0051]
The millimeter wave radar 3 irradiates millimeter waves from the millimeter wave transmission/reception part 3a and receives the reflected waves that are reflected by the object and returned. The signal processing part 3b calculates the distance to the object on the basis of the time difference between the irradiation time of the millimeter waves and the reception time of the reflected waves. Data representing the calculated distance is transmitted to the control ECU 20 via the external interface 3c.
[0052]
The distance detectable by the millimeter wave radar 3 is about 150 m (or more), and resolutions of about several tens of centimeters to 1 m are known. In the first embodiment, objects at short distances (up to several tens of meters) are detected by the infrared radar 2, and objects at long distances (several tens of meters to 150 m) are detected by the millimeter wave radar 3.
[0053]
[Infrared Camera]
The infrared camera 4 is a camera for detecting the surrounding situation by detecting infrared rays emitted from the object.
[0054]
As shown in FIG. 2, the infrared camera 4 is provided with an infrared image sensor 4a, an image processing unit 4b, and an external interface 4c. The infrared camera 4 detects light in the infrared region by the infrared image sensor 4a. The image processing part 4b converts the wavelength and intensity of the infrared rays detected by the infrared image sensor 4a into an electric signal and generates an image on the basis of that signal. Data representing the generated image is transmitted to the control ECU 20 via the external interface 4c.
[0055]
Since the infrared camera 4 forms an image by detecting infrared rays emitted from the object, the object can be detected even in a state where ambient light (such as sunlight) or headlight illumination is absent. Therefore, the object can be detected even at night.
[0056]
In the first embodiment, as shown in FIGS. 1 and 2, two infrared cameras 4a, 4b arranged at different positions are provided as the infrared camera 4. The parallax (the difference in the position and viewing direction of the image) is reproduced by the infrared camera 4a and the infrared camera 4b. Since the parallax is correlated with the distance to the object, the distance to the object can be calculated according to the parallax.
[0057]
Hereinafter, unless otherwise specified, references to the infrared camera 4 mean both of the infrared cameras 4a and 4b.

[Visible Light Camera]
The visible light camera 5 is a camera for detecting the surrounding situation by detecting ambient light and the reflected light of headlight illumination.
[0058]
As shown in FIG. 2, the visible light camera 5 is provided with a CCD image sensor 5a as an imaging element, an image processing part 5b, and an external interface 5c. The visible light camera 5 detects light by the CCD image sensor 5a and photoelectrically converts the brightness of the detected light into a charge amount. The charge amount data are transferred to the image processing part 5b, which reproduces the color and brightness on the basis of the charge amount data for each pixel to generate a color image. The information of the generated image is transmitted to the control ECU 20 via the external interface 5c.
[0059]
In the first embodiment, two visible light cameras 5a, 5b arranged at different positions are provided as the visible light camera 5. The parallax is reproduced by the visible light camera 5a and the visible light camera 5b, thereby generating a stereoscopic image. Similarly to the case of the infrared camera 4, the distance to the object can be calculated.
[0060]
Hereinafter, unless otherwise specified, references to the visible light camera 5 mean both of the visible light cameras 5a and 5b.

[Motion Amount Detection Unit]
The motion amount detection unit 6 is a unit for detecting the amount of motion of the own vehicle, and is provided with a vehicle speed sensor 6a, a yaw rate sensor 6b, and a steering angle sensor 6c.
[0061]
Specifically, the traveling speed of the own vehicle is detected by the vehicle speed sensor 6a, a yaw rate
