* NOTICE *

JPO and INPIT are not responsible for any damages caused by the use of this translation.

1. This document has been translated by computer, so the translation may not reflect the original precisely.

2. "**" shows a word which cannot be translated.

3. In the drawings, any words are not translated.
Publication Number

JPH11083530A

Bibliography

(19) [Publication country] JP

(12) [Kind of official gazette] A

(11) [Publication number] 11083530

(43) [Date of publication of application] 19990326

(54) [Title of the invention] OPTICAL FLOW DETECTOR FOR IMAGE AND SELF-POSITION RECOGNIZING SYSTEM FOR MOBILE BODY

(51) [International Patent Classification, 6th Edition]

G01C 21/00
G01B 11/24
G06T 7/00
G06T 7/60
G05D 1/02

[FI]

G01C 21/00
G01B 11/24
G05D 1/02 Z
K
K
G06F 15/62 415
15/70 3508

(21) [Application number] 09247320

(22) [Filing date] 19970911

(71) [Applicant]

[Name] FUJI HEAVY IND LTD

(72) [Inventor]

[Full name] KISE KATSUYUKI
Abstract

(57) [Overview]

PROBLEM TO BE SOLVED: To recognize a self-position by detecting an optical flow in real time, at high speed, from an imaged picture with a small device which can be mounted on a mobile body.

SOLUTION: A distance image is generated at a stereo processing part 30 of a hardware circuit for a picture imaged with a camera assembly 10, and then, from the distance image, a histogram for evaluating a No.1 block which can be used for an optical flow is generated. Then, at an optical flow processing part 60 of a hardware circuit, a city block distance is calculated to quickly search for a No.2 block corresponding to the No.1 block. Then, at a navigation processing part 90, a speed component of the mobile body is calculated from the optical flow of a lower part image, a rotational speed component calculated from the optical flow of a distant place image is subtracted from that speed component to obtain a net translational speed component, and the net translational speed component is accumulated to calculate a navigation locus.
Claim

[Patent Claims]

[Claim 1] An optical flow detection device for an image, which uses one of two images captured at different timings as a reference image and the other as a comparison image, searches for corresponding positions between the images, and obtains an optical flow between the images, the optical flow detection device comprising:

a data transfer buffer configured to rearrange the luminance data for each pixel in a predetermined region of the reference image and the luminance data for each pixel in a search region of the comparison image into data series each having a predetermined number of bytes, and to transfer each data series in a ring shape; and

an arithmetic processing circuit configured to calculate absolute values of differences between the two series of data transferred from the transfer buffer in parallel by a plurality of arithmetic units, add the outputs of the arithmetic units in series by a plurality of stages of adders to calculate a city block distance, and output the address at which the city block distance becomes a minimum value as the corresponding position of the comparison image with respect to the predetermined region of the reference image.
[Claim 2] A self-position recognition system for a moving body, comprising:

an imaging unit that is mounted on the moving body and includes a pair of stereo cameras for imaging a distant scene and a pair of stereo cameras for imaging a lower scene;

a distance image generation unit that searches for corresponding positions in the two images captured by a stereo camera, obtains a pixel shift amount generated according to a distance to an object, and generates a distance image in which perspective information to the object, obtained from the pixel shift amount, is digitized;

a histogram generation unit configured to hold the gradation data and the frequency of the distance image in respective latches, and to generate histogram data by integrating the frequency of the held gradation data based on a match output from a comparator that compares newly input gradation data with the held gradation data;

an image processing unit configured to extract, with reference to the histogram data generated by the histogram generation unit from the distance image, a plurality of regions of a characteristic pattern suitable for a navigation calculation as first blocks from the captured image, and to set a search range for searching for a region corresponding to each first block as a second block from an image having an imaging timing different from that of the image from which the first block is extracted;

a data transfer buffer configured to rearrange the luminance data for each pixel in the first block and the luminance data for each pixel in the search range of the second block into data series each having a predetermined number of bytes, and to transfer each data series in a ring shape;

an arithmetic processing circuit configured to calculate absolute values of differences between the data of the two systems transferred from the transfer buffer in parallel by a plurality of arithmetic units, add the outputs of the arithmetic units in series by a plurality of stages of adders to calculate a city block distance, and output the address at which the city block distance becomes a minimum value; and

a navigation processing unit configured to obtain an optical flow from the first block to the second block based on an output from the optical flow processing unit for a captured image of the distant scene and the distance image to calculate a rotational velocity component of the moving body between frames, to obtain an optical flow from the first block to the second block based on an output from the optical flow processing unit for a captured image of the lower scene and the distance image to calculate a velocity component of the moving body between frames, to convert a translational velocity component between frames, obtained by removing the rotational velocity component from the velocity component, into a translational velocity component viewed from a distance measurement start point, and to accumulate the converted translational velocity components to obtain a navigation trajectory in a three dimensional space, thereby recognizing the self-position of the moving body.
Description

[Detailed description of the invention]

[0001]

[Technical field of the invention] The present invention relates to an optical flow detection device for detecting an optical flow of an image at high speed by a hardware circuit, and to a self-position recognition system for a moving body that processes a captured image in real time to detect an optical flow of the image at high speed and recognizes the self-position of the moving body.
[0002]

[Prior art] Various technologies, such as movement control, path detection, course detection, and location detection, have been developed for autonomously moving mobile bodies such as unmanned robots, autonomously traveling work vehicles, and unmanned helicopters. Among these technologies, self-position recognition is one of the important ones.
[0003] As a technique for self-position recognition in a mobile body autonomously traveling on the ground, such as an autonomously traveling work vehicle, there is a technique in which a two dimensional angular velocity is detected by a vibration gyro or an optical gyro, a translation speed is detected by a sensor that measures a ground speed, and a movement amount from a reference position is calculated to measure the self-position. In a flying mobile body such as an unmanned helicopter, there is a technique in which gravitational acceleration is detected by an inertial navigation device to detect the acceleration of the flying body, and the acceleration is integrated to obtain the movement amount.
[0004] Further, in recent years, techniques for recognizing the self-position of a moving body by applying image processing have been developed. In particular, in a technique in which the surrounding environment is captured by a camera mounted on the moving body and the motion of the moving body is detected by obtaining an optical flow between two images captured at different timings, the surrounding environment can be analyzed from the huge amount of information an image carries, making it possible to realize accurate autonomous navigation by discriminating complicated topography.
[0005]

[Problem to be solved by the invention] However, when the movement of the moving body is detected by the optical flow of an image, a processing capability equivalent to that of a workstation has conventionally been required in order to capture the image and recognize the self-position in real time, which leads to an increase in the size and weight of the apparatus.
[0006] For this reason, in a conventional apparatus that can be mounted on an autonomous mobile body, a captured image is acquired and then processed offline. It is therefore difficult to apply this method to a moving body such as an unmanned helicopter, which needs to recognize its own position in real time, and the application range is extremely limited.
[0007] The present invention has been made in light of the above circumstances, and an object thereof is to provide an optical flow detection device for an image that enables detection of an optical flow at high speed in real time and can be mounted on a mobile body without enlarging the apparatus, and a self-position recognition system for a moving body capable of recognizing the self-position of the moving body from the optical flow of an image by processing a picked-up image in real time.
[0008]

[Means for solving the problem] The invention according to claim 1 is an optical flow detection device for an image, which uses one of two images captured at different timings as a reference image and the other as a comparison image, searches for mutually corresponding positions, and obtains the optical flow between the images, the device comprising: a data transfer buffer configured to rearrange the luminance data for each pixel in a predetermined region of the reference image and the luminance data for each pixel in a search region of the comparison image into data series each having a predetermined number of bytes, and to transfer each data series in a ring shape; and an arithmetic processing circuit configured to calculate, in parallel by a plurality of arithmetic units, absolute values of differences between the data of the two systems transferred from the transfer buffer, add the outputs of the respective arithmetic units in series by a plurality of stages of adders to calculate a city block distance, and output the address where the city block distance becomes a minimum value as the corresponding position of the comparison image with respect to the prescribed area of the reference image.
[0009] According to a second aspect of the present invention, there is provided a self-position recognition system for a moving body, including: an imaging unit that is mounted on the moving body and includes a pair of stereo cameras for imaging a distant scene and a pair of stereo cameras for imaging a lower scene; a distance image generation unit configured to search for corresponding positions in the two images captured by a stereo camera, obtain a pixel shift amount generated according to the distance to an object, and generate a distance image in which perspective information to the object, obtained from the pixel shift amount, is digitized; a histogram generation unit configured to hold the gradation data and the frequency of the distance image in respective latches and generate histogram data by integrating the frequency of the held gradation data based on a match output from a comparator that compares newly input gradation data with the held gradation data; an image processing unit configured to extract, with reference to the histogram data generated by the histogram generation unit from the distance image, a plurality of regions of a characteristic pattern suitable for a navigation calculation as first blocks from the captured image, and to set a search range for searching for a region corresponding to the first block as a second block from an image having an imaging timing different from that of the image from which the first block is extracted; a data transfer buffer for rearranging the luminance data of each pixel of the first block and the luminance data of each pixel in the search range of the second block into data series each having a predetermined number of bytes and transferring each data series in a ring shape; an arithmetic processing circuit for calculating, in parallel by a plurality of arithmetic units, absolute values of differences between the data of the two systems, calculating a city block distance by adding the outputs of the respective arithmetic units in series by a plurality of stages of adders, and outputting the address at which the city block distance becomes a minimum value; and a navigation processing part configured to obtain an optical flow from the first block to the second block based on an output from the optical flow processing unit for a captured image of the distant scene and the distance image to calculate a rotational speed component of the moving body between frames, obtain an optical flow from the first block to the second block based on an output from the optical flow processing unit for a captured image of the lower scene and the distance image to calculate a velocity component of the moving body between frames, convert a translational velocity component between frames, obtained by removing the rotational velocity component from the velocity component, into a translational velocity component viewed from a distance measurement start point, and accumulate the converted translational velocity components to obtain a navigation locus in a three dimensional space, thereby recognizing the self-position of the moving body.
[0010] That is, in the optical flow detection apparatus for an image according to the first aspect of the present invention, when luminance data for each pixel in a predetermined region of the reference image and luminance data for each pixel in a search region of the comparison image, taken from two images captured at different timings, are input to the data transfer buffer, each is rearranged into a data series of a predetermined number of bytes and transferred to the arithmetic processing circuit in a ring shape. After the absolute values of the differences between the data of the two systems are calculated in parallel by a plurality of arithmetic units, the outputs of the respective arithmetic units are added in series by a plurality of stages of adders to calculate a city block distance, and the address at which the city block distance becomes a minimum value is output as the corresponding position of the comparison image with respect to the predetermined area of the reference image. The optical flow can therefore be obtained from the search result of the corresponding position between the two images.
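For reference, this corresponding-position search can be mirrored in software. The following Python sketch is illustrative only: it assumes 8-bit luminance arrays, a 12 × 6 block (the size used in the embodiment described later), and a rectangular search range, and it replaces the parallel arithmetic units and multi-stage adder tree of the actual circuit with serial NumPy arithmetic; the function and argument names are assumptions, not terms from the patent.

```python
import numpy as np

def match_block(reference, comparison, top, left, search=8, bh=6, bw=12):
    """Find the position in `comparison` whose city block distance to the
    reference block is minimal (a software stand-in for the circuit)."""
    block = reference[top:top + bh, left:left + bw].astype(np.int32)
    best_addr, best_dist = None, None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + bh > comparison.shape[0] or x + bw > comparison.shape[1]:
                continue  # candidate block would fall outside the image
            cand = comparison[y:y + bh, x:x + bw].astype(np.int32)
            # City block distance: the sum of |difference| over all pixels,
            # i.e. the quantity the adder stages accumulate in hardware.
            dist = int(np.abs(block - cand).sum())
            if best_dist is None or dist < best_dist:
                best_dist, best_addr = dist, (y, x)
    return best_addr, best_dist  # address of the minimum and its distance
```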
[0011] In the self-position recognition system for a moving body according to the second aspect of the present invention, a distant scene and a lower scene are captured by the respective stereo cameras. The distance image generation unit searches for corresponding positions in the two images from a stereo camera to obtain the pixel shift amount generated according to the distance to an object, and generates a distance image in which perspective information to the object, obtained from the pixel shift amount, is digitized. In the histogram generation part, the gradation data and the frequency of the distance image are respectively held in latches, and the frequency of the held gradation data is integrated by a coincidence output from a comparator, which compares newly input gradation data with the held gradation data, to generate histogram data.
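In software terms, the histogram that the latches and comparators accumulate for one region reduces to a counting loop. A minimal sketch, assuming 8-bit distance (gradation) values; the names are illustrative:

```python
import numpy as np

def distance_histogram(distance_region, gradations=256):
    """Histogram of distance image values over one region; each comparator
    coincidence in the hardware corresponds to one increment here."""
    hist = np.zeros(gradations, dtype=np.int32)
    for value in distance_region.ravel():
        hist[int(value)] += 1
    return hist
```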
[0012] The image processing unit extracts a plurality of regions of a characteristic pattern suitable for the navigation calculation as first blocks from the captured image with reference to the histogram data of the distance image, and sets a search range for searching for a region corresponding to each first block as a second block from an image having an imaging timing different from that of the image from which the first block is extracted.
[0013] Next, in the optical flow processing unit, the luminance data for each pixel of the first block and the luminance data for each pixel of the search range of the second block are processed for the captured image of the distant scene and the captured image of the lower scene. When each data series is rearranged into a data series of a predetermined number of bytes by the data transfer buffer and transferred to the arithmetic processing circuit in a ring shape, the arithmetic processing circuit calculates the absolute values of the differences between the luminance data of the two systems in parallel by a plurality of arithmetic units, adds the outputs of the respective arithmetic units in series by a plurality of stages of adders to calculate a city block distance, and outputs the address where the city block distance becomes a minimum value.
[0014] Then, the navigation processing part obtains an optical flow on the basis of the output from the optical flow processing part for the picked-up image of the distant scenery and the distance image, and calculates the rotational speed component of the moving body between frames. It likewise obtains an optical flow based on the output from the optical flow processing part for the captured image of the lower scene and the distance image, and calculates the velocity component of the moving body between frames. The translational velocity component between frames, obtained by removing the rotational velocity component from the velocity component, is converted into a translational velocity component viewed from the distance measurement start point, and the converted translational velocity components are accumulated to obtain a navigation trajectory in a three dimensional space, whereby the self-position of the moving body is recognized.
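This navigation calculation can be sketched as follows, assuming the per-frame lower-image velocity vectors, the rotational velocity vectors from the distant image, and the frame-to-frame rotation matrices have already been recovered from the two optical flows; those inputs, the fixed frame interval, and all names are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def navigation_locus(lower_velocities, rot_velocities, frame_rotations, dt=1.0 / 30.0):
    """Remove the rotational component, convert to start-point coordinates,
    and accumulate the translational velocity into a trajectory."""
    position = np.zeros(3)
    attitude = np.eye(3)            # orientation relative to the start point
    locus = [position.copy()]
    for v_total, v_rot, rot in zip(lower_velocities, rot_velocities, frame_rotations):
        v_trans = np.asarray(v_total) - np.asarray(v_rot)   # net translation per frame
        position = position + attitude @ (v_trans * dt)     # seen from the start point
        attitude = attitude @ np.asarray(rot)               # accumulate attitude change
        locus.append(position.copy())
    return np.array(locus)          # navigation locus in three dimensional space
```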
[0015]

[Embodiment of the invention] Hereinafter, embodiments of the present invention will be described with reference to the drawings. Fig. 1 is an overall configuration diagram of a self-position recognition system, Fig. 2 is a configuration diagram of a camera assembly, Fig. 3 is a block diagram of a stereo processing unit, Fig. 4 is a block diagram of an optical flow processing unit, and Fig. 5 is a configuration diagram of a histogram processing circuit. Fig. 6 is a circuit configuration diagram of a self-addition type gradation/frequency module, Fig. 7 is a configuration diagram of a data transfer buffer, Fig. 8 is a configuration diagram of an arithmetic processing circuit, Fig. 9 is an explanatory view of stereo processing, Fig. 10 is an explanatory view of histogram processing, Fig. 11 is a time chart of the self-addition type gradation/frequency module, Fig. 12 is an explanatory view of optical flow processing, Fig. 13 is a flow chart of a No.1 block group acquisition processing routine, Fig. 14 is a flow chart of a rotation data processing routine, Fig. 15 is a flow chart of a translation data processing routine, and Fig. 16 is a flow chart of a navigation data processing routine.
[0016] Fig. 1 illustrates a camera assembly 10 including two sets of stereo cameras for three dimensional space measurement, and a self-position recognition device 20 for recognizing the self-position on the basis of the optical flow (the distribution of movement vectors representing movement on the imaging coordinate plane between frames) between images obtained by imaging the surrounding environment with the camera assembly 10 at fixed time intervals. In the present embodiment, the self-position recognition system is mounted on, for example, an unmanned helicopter used for agrochemical application or the like, and can perform navigation by processing the images captured by the camera assembly 10 in real time. The trajectory data is transmitted to a flight control computer (not shown) to enable precise control of the altitude of the helicopter with respect to the ground and of its flight direction along a flight path set in advance in absolute coordinates or a flight path set with respect to a target on the ground.
[0017] The self-position recognition device 20 is provided with a stereo processing part 30 for stereo-processing the images picked up by the camera assembly 10, an optical flow processing part 60 for obtaining the optical flow between time-series picked-up images, and a navigation processing part 90 for calculating the rotation and translation components of the moving body (helicopter) from the inter-frame movement amount based on the optical flow obtained by the optical flow processing part 60 and calculating a navigation locus. The stereo processing part 30 and the optical flow processing part 60 are constituted of hardware circuits. The navigation processing unit 90 has a multiprocessor configuration in which a plurality of RISC processors operate in parallel, so that on-line navigation data can be acquired by high-speed processing.
[0018] As shown in Fig. 2, the camera assembly 10 serves as a distance measuring sensor for capturing changes in the surrounding environment on a screen-by-screen basis and acquiring distance information corresponding to a horizontal deviation. A pair of distant stereo cameras 12 for imaging the distant scene necessary for calculating the rotational speed component of the moving body and a pair of lower stereo cameras 13 for imaging the lower scene (ground surface) necessary for calculating the translational speed component of the moving body are installed on a frame 11.
[0019] The distant stereo camera 12 is composed of a main camera (reference camera) 12a and a sub camera 12b, which are synchronized with each other and whose shutter speeds are variable; the main camera 12a and the sub camera 12b are disposed with a base line length LS between them so that the vertical axes of their imaging surfaces are parallel to each other.
[0020] Similarly, the lower stereo camera 13 includes a main camera (reference camera) 13a and a sub camera 13b that are synchronized with each other and have variable shutter speeds; the main camera 13a and the sub camera 13b, which have the same specifications, are disposed with a base line length SS between them so that the vertical axes of their imaging surfaces are parallel to each other. The distant stereo camera 12 and the lower stereo camera 13 are also synchronized with each other.
[0021] In the self-position recognition system of the present embodiment, for each of the captured image of the distant scene and the captured image of the lower scene, the motion of the moving body is detected from the optical flow, that is, the motion between images having different imaging times (timings). The motion of the distant image and the motion of the lower image are converted into movement magnitudes in real space based on the respective distance images, the rotational speed component given by the motion of the distant image is removed from the velocity component given by the motion of the lower image, and the pure translational velocity component is obtained. Thereafter, this is converted into a translational velocity component as viewed from the distance measurement start point (start point) and accumulated to obtain a navigation trajectory in a three dimensional space, thereby recognizing the self-position of the moving body.
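Assuming a pinhole camera model (real-space coordinate X = Z·x/f), the conversion of an image-plane flow vector into a real-space movement magnitude using the distance image can be sketched as below; the focal length in pixels, the frame interval, and the names are illustrative assumptions rather than values from the patent.

```python
def flow_to_metric(flow_u_px, flow_v_px, depth_m, focal_px, dt=1.0 / 30.0):
    """Convert a pixel flow vector into a real-space velocity [m/s] at the
    depth read from the distance image (pinhole model assumption)."""
    vx = depth_m * flow_u_px / focal_px / dt
    vy = depth_m * flow_v_px / focal_px / dt
    return vx, vy
```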
[0022] In this case, in order to detect the rotational speed of the moving body with the distant stereo camera 12 and the rotational and translational speeds of the moving body with the lower stereo camera 13, it is ideal that the vertical axis of the imaging plane of the distant stereo camera 12 and the vertical axis of the imaging plane of the lower stereo camera 13 be orthogonal to each other, and that the distant main camera axis and the lower main camera axis lie on the same plane with their reference points coinciding.
[0023] However, since it is practically difficult to make the reference points of the two main cameras 12a and 13a coincide, the imaging plane vertical axes of the distant stereo camera 12 and the lower stereo camera 13 are arranged to be orthogonal to each other, the distant main camera 12a and the lower main camera 13a are arranged close to each other, and each of the three axes of one camera is arranged in parallel with one of the three axes of the other camera, so that the movements of the distant image and the lower image obtained from the two sets of stereo cameras 12 and 13 can be handled in one real-space coordinate system.
[0024] In this case, the center of the three axes is placed on the lower stereo camera 13; even if the distant stereo camera 12 rotates around the lower stereo camera 13 and an offset arises between the two, the movement of the distant image is not affected, owing to the nature of distant images described later. Since the lower camera axis and the distant camera axis are orthogonal to each other, it is possible to simplify the process of removing the rotational speed component from the speed component containing translation and rotation, as well as the navigation calculation process viewed from the origin (the origin of the XYZ coordinate system fixed in space at the distance measurement start point), and to grasp the movement of the image accurately.
[0025] The camera used in the camera assembly 10 may be an EIA-based black-and-white camera that performs area scanning, a color CCD camera (including a 3-CCD camera), an infrared camera, a night vision camera, or the like, or a camera that digitally outputs information from an image pickup device.
[0026] The stereo processing unit 30 digitizes the analog images captured by the camera assembly 10 and performs stereo matching between the image (main image) captured by the main camera 12a (13a) and the image (sub image) captured by the sub camera 12b (13b). An area having the same pattern as the imaged object area of the main camera 12a (13a) is searched for in the imaging coordinates of the sub camera 12b (13b) to obtain the deviation (= disparity) of pixels generated according to the range from the imaging device to the object, and three dimensional image information (distance image), obtained by digitizing the perspective information to the object derived from the deviation amount, is acquired.
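In software, this matching step can be sketched as a one dimensional search along the scanning line, after which the pinhole relation Z = f·B/d (f: focal length in pixels, B: base line length, d: disparity) digitizes the pixel shift into distance. The block size, search width, and names below are assumptions; in the device this is performed in hardware by the stereo matching circuit 39.

```python
import numpy as np

def block_disparity(main, sub, top, left, max_disp=64, bh=4, bw=4):
    """Return the horizontal pixel shift (disparity) of the sub-image patch
    that best matches the main-image patch at (top, left)."""
    ref = main[top:top + bh, left:left + bw].astype(np.int32)
    dists = []
    for d in range(min(max_disp, left + 1)):         # shift along the scan line
        cand = sub[top:top + bh, left - d:left - d + bw].astype(np.int32)
        dists.append(int(np.abs(ref - cand).sum()))  # city block distance again
    return int(np.argmin(dists))

# distance_m = focal_px * baseline_m / disparity (for disparity > 0) is the
# perspective information that is digitized into the distance image.
```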
[0027] Fig. 3 is a block diagram of the stereo processing unit 30. It comprises: an analog I/F (interface) 31, having a gain control amplifier (GCA) 31a, for processing the image signals from the camera assembly 10; an A/D converter 32 for A/D-converting the output from the analog I/F 31; a shading correction memory 33, consisting of a ROM and a RAM, for storing shading correction data used to correct the brightness distortion of each pixel; a D/A converter 34 for generating control voltages for the GCA 31a and the A/D converter 32; an FPGA (field programmable gate array) 35 in which various functions for digital processing, such as image correction and conversion and camera control, are configured as gates; an image memory 36 for storing the corrected and converted main and sub images of the distant and lower views; a shutter control original image memory 37 for storing image data for camera shutter control; an original image memory 38 for storing image data for the optical flow; a stereo matching circuit 39 for generating a distance image by performing stereo matching between the main image and the sub image; a distance image memory 40 for storing the distance image; and a histogram distance data memory 41 for storing distance data for the histogram processing described later.
[0028] The various functions for digital signal processing configured inside the FPGA 35 include: a D/A controller 45 for controlling the D/A converter 34; a look-up table (LUT, composed of a ROM) 46 for changing the gain and offset amount of the image signal to correct the dispersion between the image signals caused by differences in the output characteristics of the image sensors of the respective cameras; a multiplier 47 for multiplying the image data corrected by the LUT 46 by the shading correction data from the shading correction memory 33; a log conversion table (composed of a ROM) 48 serving as sensitivity adjusting means for logarithmically converting the light and dark parts of an image; an address controller 49; a shading controller 50 for controlling the transfer of shading correction data from the shading correction memory 33 to the multiplier 47; a camera controller 51 for controlling the shutter speeds of the cameras 12a, 12b, 13a, and 13b; an external I/F 52 for externally rewriting or reading various parameters in the FPGA 35; and the like.
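This per-pixel correction chain can be approximated in software as a table look-up, a fixed-point multiplication, and a second table look-up. The 8-bit data path, the 8.8 fixed-point shading gain, and the placeholder table contents below are assumptions for illustration, not values from the patent.

```python
import numpy as np

def correct_pixels(raw, lut, shading_gain, log_table):
    """LUT 46 (gain/offset) -> multiplier 47 (shading data) -> Log table 48."""
    leveled = lut[raw]                                         # sensor gain/offset match
    shaded = (leveled.astype(np.uint32) * shading_gain) >> 8   # assumed 8.8 fixed point
    return log_table[np.clip(shaded, 0, 255)]                  # light/dark log conversion

# Placeholder tables: identity LUT and an 8-bit logarithmic curve.
lut = np.arange(256, dtype=np.uint16)
log_table = (255.0 * np.log1p(np.arange(256)) / np.log(256.0)).astype(np.uint8)
raw = np.zeros((240, 320), dtype=np.uint8)
unity_gain = np.full(raw.shape, 256, dtype=np.uint32)          # x1.0 shading gain
corrected = correct_pixels(raw, lut, unity_gain, log_table)
```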
[0029] On the other hand, as shown in Fig. 4, the above-mentioned optical flow processing part 60 is provided with an FPGA 61 in which are configured a histogram processing circuit 70, which creates the histogram for evaluating a pattern part of the imaging range suitable for the navigation data processing (hereinafter called a No.1 block), and an optical flow processing circuit 80, which searches an image of a different imaging time for the part having the same pattern as the No.1 block (hereinafter referred to as a No.2 block), together with a histogram memory 62 and an optical flow address memory 63 for storing the block addresses and the No.1 block scan start address.
[0030] The No.1 block is determined by the navigation processing unit 90. By using an appropriately small area, the No.1 block can be reliably matched with its No.2 block by parallel movement alone even if the image is slightly rotated, and the range to the range-finding point can be reliably narrowed down from the large amount of information. In the present embodiment, the size of the No.1 block is a small region of 12 × 6 pixels.
[0031] The histogram processing circuit 70 generates a histogram of the distance image values appearing in the distance image region corresponding to a 12 × 6 pixel region that is a candidate for a No.1 block. This histogram is referred to when the navigation processing unit 90 determines the No.1 block, and the reliability of the captured image is evaluated with it. That is, if the image patterns captured by the left and right cameras are the same, a distance image value corresponding to the shift amount in the scanning line direction is obtained. Using this property, if a predetermined number of identical distance image values are present in the region, it is determined that the target image is a certain image that actually exists.
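A minimal sketch of this reliability test, with an assumed threshold and an assumed convention that distance value 0 marks pixels with no valid range:

```python
def block_is_reliable(hist, min_count=12, invalid_value=0):
    """Accept a No.1 block candidate only if enough pixels in its distance
    image region share the same distance value (histogram peak test)."""
    counts = list(hist)
    counts[invalid_value] = 0          # ignore pixels without a valid range
    return max(counts) >= min_count
```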
[0032] The configuration of the histogram processing circuit 70 is shown in Fig. 5. It is composed of n self-addition type gradation/frequency modules 71 (#1 to #n) connected in parallel, a control circuit 72 for controlling the operation of each gradation/frequency module 71, and a preprocessing circuit 73 for calculating the frequency of valid data in advance. The number of gradation/frequency modules 71 used corresponds to the larger of the number of samples and the number of gradations in the input image.
[0033] As shown in Fig. 6, each gradation/frequency module 71 is composed of a gradation latch 74, a coincidence detection circuit 75 using a comparator, an OR circuit 76, a frequency latch 77, and an adder 78. The distance data (gradation data) from the histogram distance data memory 41, the synchronization clock, the load signal (latch enable signal) from the control circuit 72, and a clear signal for initialization are input to the D input terminal, the CK terminal, the E terminal, and the SET terminal of the gradation latch 74, respectively; the gradation data from the histogram distance data memory 41 and the latch data from the Q terminal of the gradation latch 74 are input to the respective input terminals of the coincidence detection circuit 75.
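Abstracting away the clock and enable timing, the data path of one such module can be modeled as follows. The method names, and the simplification that each coincidence adds a frequency of 1 (the preprocessing circuit 73 may instead supply a pre-accumulated frequency), are assumptions for illustration.

```python
class GradationFrequencyModule:
    """Software model of one self-addition type gradation/frequency module."""

    def __init__(self):
        self.gradation = None    # gradation latch 74
        self.frequency = 0       # frequency latch 77

    def clock(self, gradation_data, load, frequency_data=1):
        if load:                 # load signal from the control circuit 72
            self.gradation = gradation_data
            self.frequency = frequency_data
        elif gradation_data == self.gradation:   # coincidence detection circuit 75
            self.frequency += frequency_data     # adder 78 -> frequency latch 77
```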
[0034] The value obtained by adding the frequency data from the preprocessing circuit 73 and the latch data from the Q terminal of the frequency latch 77 with the adder 78 is input to the D input terminal of the frequency latch 77. The synchronization clock, the output of the OR circuit 76, and the clear signal from the control circuit 72 are input to the CK terminal, the E terminal, and the CL terminal of the frequency latch 77, respectively. The OR circuit 76 receives the coincidence determination signal fr
