* NOTICE *
JPO and INPIT are not responsible for any damages caused by the use of this translation.
1. This document has been translated by computer, so the translation may not reflect the original precisely.
2. "****" shows a word which cannot be translated.
3. In the drawings, any words are not translated.
Patent Number
JP2002260166A

Bibliography
(19) [Publication country] JP
(12) [Kind of official gazette] A
(11) [Publication number] 2002260166
(43) [Date of publication of application] 20020913
(54) [Title of the invention] INFORMATION DISTRIBUTION DEVICE AND INFORMATION DISTRIBUTION SYSTEM
(51) [International Patent Classification, 7th Edition]
G08G 1/04
H04N 5/225
G06T 1/00
G08G 1/13
H04B 7/26
H04N 7/173
G08G 1/00
G08G 1/09
[FI]
G08G 1/04 D
G06T 1/00 330A
G08G 1/13
H04N 5/225 C
H04N 7/173 610Z
G08G 1/00 A
G08G 1/09
H04B 7/26 M
(21) [Application number] 2001057082
(22) [Filing date] 20010301
(71) [Applicant]
[Name] OMRON CORP
(72) [Inventor]
[Full name] KOBAYASHI HIDEYUKI
[Full name] ANDO TANICHI
[Full name] MUKOGAWA SHINICHI
[Full name] SHIMIZU ATSUSHI
[Full name] MITSUDA MASA
Abstract
(57) [Overview]
PROBLEM TO BE SOLVED: To provide an information distribution device that lets even a user with little knowledge and experience make determinations easily.
SOLUTION: Image processing is performed on image data D from an image pick-up means to extract meaningful information I associated with the image data. The meaningful information is distributed to a terminal 5 of a user 5' together with the picked-up image D or the appropriately processed image D'. The user can easily make a determination by referring to the meaningful information.
Claim
[Patent Claims]
[Claim 1] An information distribution device for distributing information to a user via a network, comprising: an information extraction means for performing image processing on image data from an imaging means and extracting meaningful information accompanying the image data; a means for generating distribution image data from the image data; a distribution information generating means for synthesizing the distribution image data and the meaningful information to generate distribution information; and a communication means for outputting the distribution information to the network.
[Claim 2] The information distribution apparatus according to claim 1, further provided with a sensor for acquiring information other than an image, wherein said distribution information generating means has a function of synthesizing and outputting the meaningful information obtained from said sensor.
[Claim 3] The information distribution apparatus according to claim 1 or 2, wherein said distribution information generation means determines the type of meaningful information to be combined in response to a request received via said network.
[Claim 4] The information distribution apparatus according to any one of claims 1 to 3, wherein said imaging means images a road and said meaningful information is traffic-related information of said road.
[Claim 5] An information distribution system comprising: a plurality of information distribution devices, each including an information extracting means for performing image processing on image data from an imaging means and extracting meaningful information accompanying the image data, a means for generating distribution image data from the image data, and a distribution information generation means for synthesizing the distribution image data and the meaningful information to generate distribution information; and an integration processing device which receives the information output from the plurality of information distribution devices, integrates the information to generate integrated meaningful information, and outputs the integrated meaningful information to a network.
Description
[Detailed description of the invention]
[0001]
[Technical field of the invention] The present invention relates to an information distribution apparatus and an information distribution system, and more particularly to a system which uses bidirectional wireless communication such as mobile phones to distribute traffic congestion information and other traffic-related information.
[0002]
[Prior art] Typical traffic information includes highway radio on expressways and traffic reports inserted into general radio broadcasts. This information is hard to use when the user actually wants to listen, because the area where the broadcast can be received is limited (in the former case) and the broadcast time is fixed (in the latter case), so it is difficult to obtain the necessary information. Furthermore, it has not been possible to obtain current information in real time.
[0003] In order to solve such problems, a video camera captures situations such as a traffic jam on a road, and the captured image data is stored in a server. The image data stored in the server is then distributed to a predetermined terminal via the Internet or the like. In other words, the user operates a terminal capable of displaying image data, accesses the server, and selects image information captured at a desired point. With this selection, the image data stored in the server is output to the display screen of the terminal.
[0004] Since the image data currently being picked up by the video camera is stored in the server, the image data can be distributed to the terminal while it is being captured. Therefore, when desired, the user can acquire the state of the necessary location (a predetermined location where the video camera is installed) as image information. Accordingly, by looking at the acquired image information, the user can know the degree of congestion, the cause of the congestion, and the like.
[0005]
[Problem to be solved by the invention] However, in the conventional system, although much information can be obtained from the distributed image data, the user must view the image data and understand its contents in order to obtain that information. In other words, to obtain useful information, a certain amount of knowledge and experience is required on the user's side. Further, it is still difficult for the user to utilize the acquired information effectively, and it is difficult for all users to obtain and utilize the correct information.
[0006] The object of the present invention is therefore to provide an information distribution device and an information distribution system which make the distributed image information easy to use and allow even a user with little knowledge or experience to make determinations easily.
[0007]
[Means for solving the problem] An information distribution apparatus according to the present invention is an apparatus for distributing information to a user through a network. It has an information extracting means for performing image processing on image data from an imaging means and extracting meaningful information accompanying the image data; a means for generating distribution image data from the image data; a distribution information generating means which synthesizes the distribution image data and the meaningful information to generate distribution information; and a communication means which outputs the distribution information to a network.
[0008] "Distributing information to a user" includes both the case in which the information distribution apparatus transmits information directly to a user terminal, and the case in which the information is first transmitted to another apparatus and distributed indirectly to a user terminal via that apparatus. The meaningful information is information suited to making an accurate and quick determination when judging from the distributed image. Information that is difficult or time-consuming to determine from an image alone, or that requires experience, knowledge, or the like, is extracted in advance as meaningful information. In the embodiment, the meaningful information is displayed as character information, but various kinds of symbols may also be used.
[0009] The meaningful information extracting means may directly receive and process the image data sent from the imaging means, or may temporarily store the image data in an image storage means and then read and process it. In the embodiment, the means for generating the distribution image data is realized by the preprocessing unit 17a. Note that although this is called "generating image data for distribution", it is not always necessary to process the image as in the embodiment; when the image data sent from the image pickup means can be distributed as it is, only processing such as selecting the necessary image data is performed, without processing the image itself.
[0010] According to the present invention, since the meaningful information is distributed along with the image data, the receiving user can make determinations quickly and accurately by viewing the meaningful information together with the image, regardless of his or her degree of knowledge and experience. Furthermore, since the image itself is still sent as usual, various kinds of information not covered by the meaningful information can also be obtained, depending on the ability of the individual.
[0011] A preferred embodiment of the present invention is provided with a sensor for acquiring information other than an image, and the distribution information generating means has a function of synthesizing and outputting the meaningful information obtained from the sensor. Of course, such a sensor is not necessarily provided. When the sensor is provided as described above, information which is difficult to grasp from an image can be distributed together with it, so that a more accurate determination can be made quickly.
[0012] Further, various kinds of meaningful information to be distributed are prepared. All of them may be distributed, or the distribution information generating means may determine the type of meaningful information to be combined in response to a request received via the network. In the latter case, since only the information required by the user is transmitted, there is no useless information for the user, and the necessary information can be viewed and acted on immediately. In particular, when the terminal has a small display screen, delivering too much meaningful information makes the image hard to see and may obstruct it at the very moment the user wants to examine it; sending only the necessary information solves this problem.
[0013] The image pickup means may be intended to image a road, with the meaningful information used as traffic-related information of that road. In such an application, a traffic jam situation or the like can be acquired in real time, which is useful in deciding the route to travel. Note that the traffic-related information covers various traffic conditions, such as the traveling state of vehicles (for example, a traffic jam) and the state of the road surface or other aspects of the road.
[0014] An information distribution system according to the present invention comprises a plurality of information distribution devices, each provided with an information extracting unit for performing image processing on image data from an imaging unit and extracting meaningful information associated with the image data, a means for generating distribution image data from the image data, and a distribution information generation means for synthesizing the distribution image data and the meaningful information to generate distribution information. The system further comprises an integration processing device which receives the information output from the plurality of information distribution devices, integrates it to generate integrated meaningful information, and outputs the integrated meaningful information to a network.
[0015] It is not always necessary for each information distribution apparatus to be of the same type; they may have different functions. It is of course also possible to add further functions, such as an external sensor, to those described above. By making an integrated judgment in this way and generating the integrated meaningful information, more complicated and advanced information can be distributed.
[0016] The above-described components of the invention can be combined wherever possible. Each means constituting an information distribution apparatus according to the present invention can be realized by a dedicated hardware circuit or by a programmed computer.
[0017]
[Embodiment of the invention] FIG. 1 shows an example of an entire information distribution system using a network to which the present invention is applied. As shown in FIG. 1, a plurality of information collecting apparatuses 1 and a server 2 are connected to a network 3 such as the Internet, and the images and other information collected by the information collecting apparatuses 1 are sequentially transmitted to the server 2 via the network 3. The transmitted information is processed and arranged on the server 2 and converted into a form suitable for distribution.
[0018] Further, by connecting the terminal 5 to the network 3, the user downloads the information collected by the desired information collecting apparatus 1 to the terminal 5 and acquires it. In other words, in response to a request issued from a user (terminal), the server 2 distributes the stored predetermined information to the user terminal 5 via the network 3.
[0019] Thus, the user can obtain the information collected by the information collection device 1 whenever the environment and situation allow a connection to the network 3. By acquiring information with the information collection device 1 in real time and storing it on the server 2, the user can obtain the necessary information and make determinations at any time.
[0020] The information collecting device 1 is provided with an imaging device which is installed near a road and captures the situation of that road; the specific configuration is as shown in FIG. 2 and FIG. 3. The imaging device 10 is composed of a video camera using a CCD (charge-coupled device) and outputs digital data (digital moving picture information) in, for example, a Motion JPEG format. This output digital moving picture information is supplied to the data processing apparatus 11.
[0021] The imaging device 10, composed of a video camera, captures the state of the road in the monitoring area and obtains moving image data. This photographing process operates so as to execute the flowchart shown in FIG. 5. That is, the imaging device 10 has an internal timer and takes an image with the CCD camera each time a prescribed time (the photographing interval) elapses (ST11 to ST13). Here the video is assumed to be a moving image with a relatively high resolution of 720 x 480 pixels, and the imaging interval is, for example, 30 frames/second. The image data obtained at these predetermined intervals is converted into digital data (ST14), temporarily stored in the temporary image storage device (ST15), and transferred to the data processing device 11. The photographing process executes the above steps repeatedly.
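To make the capture loop of FIG. 5 concrete, here is a minimal sketch in Python with OpenCV. It is not part of the patent: the camera index, the list standing in for the temporary image storage device, and the use of cv2.VideoCapture are assumptions for illustration.

```python
import time

import cv2  # OpenCV; stands in for the CCD camera interface

CAPTURE_INTERVAL_S = 1.0 / 30  # prescribed photographing interval: 30 frames/second

def capture_loop(camera_index: int = 0) -> None:
    """Periodic capture corresponding to ST11-ST15 of FIG. 5."""
    cap = cv2.VideoCapture(camera_index)   # hypothetical camera source
    buffer = []                            # stands in for the temporary image storage device
    try:
        while True:
            started = time.monotonic()     # ST11: internal timer
            ok, frame = cap.read()         # ST12-ST13: take an image
            if ok:
                # ST14 (conversion to digital data) is implicit here, since
                # cap.read() already returns a digital frame.
                buffer.append(frame)       # ST15: temporary storage
                # ...transfer to the data processing device 11 would go here...
            elapsed = time.monotonic() - started
            time.sleep(max(0.0, CAPTURE_INTERVAL_S - elapsed))
    finally:
        cap.release()
```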
[0022] The data processing device 11 has the processing capability of an ordinary PC. It transforms the digital moving image information captured by the imaging device 10 so as to extract information that is easy to use in later processing, and also reduces the data amount, for example by compression or by deleting unnecessary data, so that the result can easily be distributed to a terminal connected over a network with small line capacity.
[0023] Further, the data processing apparatus 11 is connected to a storage 12 configured of a hard disk or other storage medium, and stores there the images processed by the data processing apparatus 11 and the data used for processing. The data used for processing may include an image picked up by the image pickup device 10 for comparison with the image being processed, an image whose resolution has been changed, and extracted data.
[0024] The image and other information processed by the data processing apparatus 11 are given to the communication apparatus 13, which transmits them to the network 3. In other words, by generating a transmission frame or the like carrying the address of the destination server 2 and transmitting it on the network 3, the processed image and other information are collected and accumulated in the server 2.
[0025] FIG. 3 shows the internal configuration of the data processing apparatus 11. It has an input unit 15 which receives the output data from the imaging device 10 and acquires information received via the communication device 13. The information (data) acquired by the input unit 15 is supplied to the CPU 17.
[0026] The CPU 17 reads out predetermined data stored in an external storage device (storage) 12 and executes predetermined processing on the image data received from the input unit 15, using a memory 14 as a work memory as appropriate. The generated processing data is then passed to the output unit 16, which transmits it to the communication device 13 for transmission. Each processing unit is connected to the bus and transfers data via the bus. The data processing device 11 and the communication device 13 can be realized by, for example, a personal computer.
[0027] Here, in the present invention, as shown in FIG. 4, the processing in the data processing apparatus 11 performs feature extraction based on the acquired image data D, or on pre-processed image data D' obtained from the image data D, and generates the meaningful information I. In practice, in consideration of transmission over the Internet, the image to be distributed is pre-processed to obtain image data D' with a reduced data capacity, while the feature extraction is performed on the basis of the image data D (or on a D' corrected for clarity). The image data (D') and the meaningful information I are then paired and distributed; in other words, the meaningful information I extracted from the image is distributed simultaneously with the image. In practice, the related image data D' and meaningful information I are stored in the server 2, and the user 5' accesses the server 2 to download the image data D' and the related meaningful information I to the terminal 5 and display them.
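As a concrete illustration of this pairing, the sketch below models one distribution unit as a small Python data class. The field names and example values are invented for illustration; the patent requires only that the image data D' and the meaningful information I travel together.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DistributionPayload:
    """One distribution unit: image data D' paired with meaningful information I."""
    image: bytes                  # pre-processed image data D', e.g. JPEG bytes
    meaningful_info: dict         # meaningful information I, as character information
    captured_at: datetime = field(default_factory=datetime.now)

# What the server 2 might hold for one monitored point (values invented):
payload = DistributionPayload(
    image=b"<jpeg bytes>",
    meaningful_info={
        "passes_per_minute": "42",
        "average_speed_kmh": "18",
        "congestion_distance_m": "350",
    },
)
```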
[0028] The user 5' then makes the final determination while viewing the displayed image data D' and the meaningful information I. At this time, since the meaningful information I is displayed together with the image, the user can make an accurate determination easily and quickly on the basis of the meaningful information I, even with little of the experience and knowledge that the determination would otherwise require. The resulting determinations are, for example, "change the route" or "stay on the route because the traffic jam is clearing".
[0029] FIG. 6 shows the software configuration of the CPU 17 for extracting the meaningful information I. Since the image data is transferred from the image pickup device 10 one frame at a time at predetermined intervals, the CPU 17 acquires the image data (shooting data) via the input unit 15 and supplies it to the pre-processing unit 17a. The image data is also stored as-is in the external storage device (storage) 12.
[0030] In order to extract the meaningful information correctly, the imaging device 10 captures image data with high accuracy. As a result, the capacity required for one frame becomes large, and if left as it is, the video becomes difficult to distribute over the Internet. The preprocessing unit 17a therefore reduces the number of colors to be expressed, the resolution, and the frame rate so that moving images can be distributed over the Internet. Further, processing is performed to make blurred portions of the image easier to see and to improve the dynamic range of the contrast and the balance of brightness and darkness. The preprocessed image data is stored once in the storage 12 and used for other operations and for image distribution.
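A minimal sketch of such preprocessing, assuming OpenCV, is shown below: it reduces the resolution and quantizes the colors, with a rough contrast adjustment standing in for the dynamic-range processing. The output size, the number of color levels, and the adjustment coefficients are invented; frame-rate reduction would be achieved upstream simply by dropping frames.

```python
import cv2
import numpy as np

def preprocess_frame(frame: np.ndarray,
                     out_size: tuple = (320, 240),
                     color_levels: int = 32) -> np.ndarray:
    """Sketch of the preprocessing unit 17a: shrink the frame and reduce the
    number of expressed colors so the stream suits Internet distribution."""
    small = cv2.resize(frame, out_size, interpolation=cv2.INTER_AREA)
    step = 256 // color_levels
    quantized = (small // step) * step     # quantize each channel
    # Rough stand-in for the contrast / brightness-balance processing the
    # patent mentions; the exact method is not specified there.
    return cv2.convertScaleAbs(quantized, alpha=1.2, beta=10)
```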
[0031] In this example, the preprocessing is executed for all of the captured and transferred image data, but it is also possible to process, for example, only the images actually needed once a distribution request has been received and information is actually being distributed. In that case, the procedure of the flowchart shown in FIG. 7 can be used. That is, upon receiving the input of a requested image, the necessary image is acquired from the storage (temporary image storage device) 12 (ST21, ST22). If there is no suitable image, the process returns to ST21 and waits for an input associated with the transfer of a new image (ST23).
[0032] When a suitable image is obtained, image processing such as changing the number of colors and changing the resolution is performed (ST24, ST25). It is then determined whether the request received in ST21 is for a moving image (ST26); in the case of a moving image, the necessary images are collected (ST27) and the changes to the number of colors, the resolution, and so on are applied to them as well. Then the format is changed (ST28). The image data (distribution image data) generated by this format change may be temporarily stored in, for example, a distribution image data storage unit in the storage 12, or may be passed to the distribution information generation unit 17c.
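The request-driven variant of FIG. 7 might look like the sketch below, which reuses preprocess_frame from above. The request and storage objects and every method called on them (find_image, wait_for_new_image, collect_sequence), as well as encode_for_distribution, are hypothetical names; the patent specifies only the steps ST21 to ST28.

```python
def handle_distribution_request(request, storage):
    """On-demand preprocessing per FIG. 7 (ST21-ST28); helper names are assumed."""
    while True:
        image = storage.find_image(request.point)            # ST21-ST22
        if image is not None:
            break
        storage.wait_for_new_image()                         # ST23: wait for a new transfer
    frames = [preprocess_frame(image)]                       # ST24-ST25: colors, resolution
    if request.wants_video:                                  # ST26: moving image requested?
        frames = [preprocess_frame(f)
                  for f in storage.collect_sequence(request.point)]  # ST27
    return encode_for_distribution(frames, request.format)   # ST28: format change
```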
[0033] Further, the information extracting unit 17b, which extracts the meaningful information, reads the photographed image data stored in the storage 12 and extracts the meaningful information from it. Specifically, it implements the flowchart shown in FIG. 8.
[0034] That is, as shown in FIG. 4, the data processing device 11 reads the highly accurate image data transferred from the imaging device 10 from the storage 12, which serves as the temporary image storage device (ST31). A necessary portion of the image is then cut out (ST32); for example, the cutting-out step extracts the carriageway portion of the image on the basis of carriageway region data stored in the storage 12 or the like.
[0035] A moving object is then detected in the cut-out image (ST33). The moving body can be detected using a "background difference method", a "time difference method", or the like, provided as general vehicle extraction algorithms. In the background difference method, an image containing no vehicles or fallen objects (the background image) is prepared in advance, and a moving object is detected by taking the difference between the background image and the image being processed. The time difference method extracts an object moving between two images by comparing images taken at different times separated by a fixed interval.
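Both methods reduce to an image difference followed by thresholding. The sketch below, continuing the earlier OpenCV sketches (cv2 is imported there), shows one plausible form of each; the threshold value is an assumption.

```python
def background_difference(frame, background, threshold=30):
    """Background difference method: compare against a prepared vehicle-free
    background image; also catches stationary objects such as parked cars."""
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return mask

def time_difference(frame_now, frame_prev, threshold=30):
    """Time difference method: compare two frames taken a short interval
    apart; robust to lighting changes but blind to stationary objects."""
    diff = cv2.absdiff(frame_now, frame_prev)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return mask
```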
[0036] Either of the two vehicle extraction algorithms may be used alone, but since each has its own strengths and weaknesses, this embodiment employs both and switches between them appropriately according to the situation. That is, while the background difference method is suitable for detecting both moving objects and stationary objects (e.g., parked vehicles), its detection accuracy depends on the accuracy of the background image, and the reliability of the background image decreases when, for example, there is a sudden change in sunlight or shadow; in other words, it copes poorly with environmental variation. The time difference method, on the other hand, copes with environmental variation and extracts moving objects with high accuracy, but cannot extract stationary objects. Therefore, the presence or absence (and degree) of environmental variation is determined: when the variation is small, extraction is performed by the background subtraction method, and when the variation is large, extraction is performed by the time difference method.
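The switching logic could then be sketched as follows, building on the two functions above. The patent does not say how "environmental variation" is quantified, so the mean frame-to-frame change and its threshold used here are assumptions.

```python
def detect_moving_objects(frame, frame_prev, background,
                          variation_threshold=0.15):
    """Switch between the two extraction methods per paragraph [0036]."""
    # Assumed variation measure: mean absolute change of the whole scene.
    variation = cv2.absdiff(frame, frame_prev).mean() / 255.0
    if variation < variation_threshold:
        # Small variation: background difference also finds parked vehicles.
        return background_difference(frame, background)
    # Large variation (e.g. sudden sunlight or shadow change).
    return time_difference(frame, frame_prev)
```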
[0037] Next, vehicles are identified among the extracted moving objects, and the number of passing vehicles is counted (ST34). The number of passes is output as character information; in practice, it is stored in the storage 12 together with the date and time at which it was counted.
[0038] The counting of vehicles can be handled, for example, by the following algorithm. A rectangular passing-vehicle detection area is set at a predetermined position in the cut-out region; the rectangle is sized so that the front of a vehicle fits inside it, and one is installed for each lane. Then, exploiting the fact that vehicle fronts contain a large number of horizontal edges, a differential filter is applied within the passing-vehicle detection area. When the extracted moving object is a vehicle and its front passes through the area, a peak appears in the output of the differential filter, so the presence or absence of this peak is determined. The number of vehicles detected per unit time (the number of peaks) is the required number of passes.
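One way to realize this counting, again as a hedged sketch on top of the earlier imports (cv2 and numpy as np): a y-direction Sobel filter serves as the differential filter responding to horizontal edges, and local maxima of its summed response over time are counted as vehicle fronts. The peak threshold is an invented value that would need calibration.

```python
def count_passes(frames, detection_area, peak_threshold=1.5e5):
    """Count vehicle fronts crossing a per-lane detection area (x, y, w, h)."""
    x, y, w, h = detection_area
    responses = []
    for frame in frames:
        roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        # Sobel in the y direction responds to the horizontal edges that
        # are abundant on the front of a vehicle.
        edge_response = cv2.Sobel(roi, cv2.CV_64F, dx=0, dy=1, ksize=3)
        responses.append(np.abs(edge_response).sum())
    # Each local maximum above the threshold is one vehicle front crossing.
    count = 0
    for prev, cur, nxt in zip(responses, responses[1:], responses[2:]):
        if cur > peak_threshold and cur > prev and cur >= nxt:
            count += 1
    return count
```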
[0039] Further, the speed of a vehicle is measured by tracking it (ST35). The vehicle speed is output as character information; in practice, it is stored in the vehicle speed storage area of the storage 12 together with the date and time of the measurement.
[0040] The speed measurement of a vehicle can be handled, for example, by the following algorithm. A speed measurement area is set in each lane, corresponding to a constant distance along the lane (a distance sufficiently longer than the length of one vehicle). A vehicle moving in this area is tracked; the traveled distance is calculated from the positions at which the tracking started and ended, and the speed is calculated from this distance and the tracking time.
[0041] In the actual tracking, a vehicle to be tracked is detected by a vehicle detection algorithm similar to the counting algorithm, and image data of the vehicle is registered as a template. Template matching is then performed on the subsequently arriving image data, and when a vehicle matching the template is detected in the area, that vehicle is tracked. The tracking can be performed by assuming a moving direction for the vehicle and carrying out the template matching over the assumed region.
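A sketch of this tracking-based speed measurement, using OpenCV's cv2.matchTemplate, follows. The pixel-to-metre scale, the frame rate, and the match threshold are illustrative assumptions, and for brevity the search runs over the whole frame rather than only over the assumed movement region mentioned above.

```python
def measure_speed(frames, template, fps=30.0, metres_per_pixel=0.05,
                  match_threshold=0.7):
    """Track a registered vehicle template and derive speed in km/h.
    `template` is a grayscale patch registered when the vehicle was detected."""
    positions = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, best, _, best_loc = cv2.minMaxLoc(scores)
        if best >= match_threshold:        # vehicle matching the template found
            positions.append(best_loc)
    if len(positions) < 2:
        return None                        # not tracked long enough to measure
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    distance_m = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 * metres_per_pixel
    elapsed_s = (len(positions) - 1) / fps
    return distance_m / elapsed_s * 3.6    # m/s -> km/h
```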
[0042] Further, a congestion distance is calculated from the number of passes and the speed determined as described above (ST36). The calculated congestion distance is output as character information; in practice, it is stored in the congestion information storage unit of the storage 12 together with date and time data.
[0043] The calculation of the congestion distance can be handled, for example, by the following algorithm. When the traveling speed is high, it can be estimated that no congestion has occurred. Even when the traveling speed is extremely slow, if the number of passes within a certain period of time is also extremely small, it can be concluded that the vehicles are merely traveling slowly and that no traffic jam has occurred. On the other hand, even if the speed is slow, it can be estimated that a traffic jam has occurred if a certain number of vehicles have passed. Further, since vehicles in a traffic jam characteristically repeat stopping and starting, frequent speed variation within a low speed range also suggests congestion. In addition, in the case of congestion, the longer the stopped time and the slower the speed, the longer the traffic jam distance can be estimated to be. A table associating the traveling conditions of the vehicles (input conditions such as the number of passes and the speed) with the traffic jam distance can therefore be created in advance and held in the storage 12 or the like, and the distance can be obtained by referring to this table in the calculation processing of ST36.
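A toy version of such a lookup is sketched below. All band boundaries and distances are invented; the patent says only that a pre-built table maps traveling conditions such as the number of passes and the speed to a congestion distance.

```python
# Invented stand-in for the pre-built table of paragraph [0043]:
# (upper speed bound in km/h, estimated congestion distance in metres).
SPEED_BANDS_M = [(5, 1000), (10, 600), (20, 300), (40, 100)]

def congestion_distance(speed_kmh, passes_per_min):
    """ST36: estimate the congestion distance from the traveling conditions."""
    if speed_kmh >= 40:
        return 0        # fast traffic: no congestion
    if passes_per_min < 2:
        return 0        # slow but sparse: merely traveling slowly, no jam
    for max_speed, distance_m in SPEED_BANDS_M:
        if speed_kmh < max_speed:
            return distance_m   # slower traffic with enough passes: longer jam
    return 0
```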
[0044] Of course, it is also possible to install a plurality of cameras along a road, determine the head and the tail of a traffic jam, and calculate the traffic jam distance from the distance between them.
[0045] Further, it is also possible to determine from the time-series data of the congestion distance whether the current traffic jam is easing or expanding, and to store this as well. It may be expressed, for example, with the average of the past congestion distances as 0%, an increase as +X%, and a decrease as -X%. Although the comparison reference in this example is an average, the congestion distance may instead be compared with the immediately preceding congestion distance or with a predetermined number of preceding ones. Further, it is also possible to predict the congestion distance after a lapse of a predetermined time from the trajectory of the change in the congestion distance.
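Expressed against the average of the past distances, the trend computation is short; the sketch below is one reading of this paragraph, not the patent's own formula.

```python
def congestion_trend(history_m):
    """Latest congestion distance relative to the average of the past ones:
    0% means no change, positive an expanding jam, negative a shrinking one."""
    *past, current = history_m
    baseline = sum(past) / len(past)
    return (current - baseline) / baseline * 100.0

# congestion_trend([300, 320, 310, 400]) -> about +29 (%)
```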
[0046] Besides the above, the meaningful information to be extracted can include, for example, the ratio of vehicle types determined from vehicle orientation, spacing, and size; the brightness of the photographed range, determined by taking information from the camera into account; and the rate of vehicles with their lights on, measured by searching for and tracking bright portions.
[0047] The distribution information generation unit 17c integrates and distributes the meaningful information obtained by the information extraction unit 17b and the image data processed by the preprocessing unit 17a; specifically, it implements the flowchart shown in FIG. 9. That is, it waits for a distribution request (ST41, ST42), and when a request arrives, the corresponding distribution image data is read from the distribution image data storage unit in the storage 12 (ST43).
[0048] It is then judged whether there is character information to be synthesized (normally there is), and if so, the corresponding meaningful information is read out (ST44, ST45). The type of meaningful information to be read may be fixed in advance, or may be chosen according to a condition sent together with the distribution request from the user.
[0049] The read character information and the pre-processed distribution image data are then combined, and the resulting distribution image (with the character information) is output (ST46, ST47). The output destination of the distribution image may be the communication device 13 via the output unit, with the image transmitted to the network as it is, or the image may be temporarily stored in the storage 12 and transmitted afterwards. Note that the video to be distributed is basically live video with little delay, providing information that does not lose its real-time character.
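The combining step ST46 can be as simple as overlaying the character information on the frame. The sketch below does this with OpenCV's cv2.putText; the layout, font, and colors are illustrative choices, not taken from the patent.

```python
def compose_distribution_image(image, meaningful_info):
    """ST46: overlay the meaningful information I, as character information,
    on the distribution image D'."""
    out = image.copy()
    for i, (label, value) in enumerate(meaningful_info.items()):
        cv2.putText(out, f"{label}: {value}", org=(10, 25 + 22 * i),
                    fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=0.6,
                    color=(255, 255, 255), thickness=2)
    return out

# e.g. composed = compose_distribution_image(frame, payload.meaningful_info)
```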
[0050] In this way, the information and the image itself can be integrated and distributed according to the user's request received through the network. For example, whereas a conventional system provides only an image of the road situation in the monitoring area as shown in FIG. 10, according to the present embodiment character information is also displayed on the display screen of the terminal 5, as shown in FIG. 11. By transmitting the data extracted from the video together with the video data in this way, information which is difficult to judge from the video alone can be received easily.
[0051] For the detection of the moving object (vehicle) and the processing accompanying it, see, for example, "Development of the traffic flow monitoring image processing