UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 16/828,059
FILING DATE: 03/24/2020
FIRST NAMED INVENTOR: Toshiyasu SUGIO
ATTORNEY DOCKET NO.: 2020-0644A
CONFIRMATION NO.: 1011

Wenderoth, Lind & Ponack, L.L.P.
1025 Connecticut Avenue, NW
Suite 500
Washington, DC 20036

EXAMINER: MAZUMDER, TAPAS
ART UNIT: 2616
NOTIFICATION DATE: 06/06/2022
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):
eoa@wenderoth.com
kmiller@wenderoth.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*
5)  Claim(s) 1-15 is/are pending in the application.
    5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6)  Claim(s) ___ is/are allowed.
7)  Claim(s) 1-15 is/are rejected.
8)  Claim(s) ___ is/are objected to.
9)  Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 03/24/2020 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
        1. [X] Certified copies of the priority documents have been received.
        2. [ ] Certified copies of the priority documents have been received in Application No. ___.
        3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)
1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)
Part of Paper No./Mail Date 20220529

Office Action Summary

Application No.: 16/828,059
Applicant(s): SUGIO et al.
Examiner: Tapas Mazumder
Art Unit: 2616
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 03/24/2020.
    [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

    A person shall be entitled to a patent unless —

    (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 6, 9-11 and 15 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Fukui et al. (US patent publication: 20190019087, “Fukui”).

Regarding claim 15, Fukui teaches, A server (element 10) that is capable of communicating with a client device (element 30), the server comprising: a processor; and memory; wherein the processor uses the memory to: (Fig. 16 and [0204] indicate that element 10, the server, has a processor (CPU 871) and memory (RAM 873). “[0204] Next, a hardware configuration example common to the environment generating apparatus 10, the control learning apparatus 20, and the information processing apparatus 30 according to the present disclosure will be described. FIG. 16 is a block diagram illustrating a hardware configuration example of each of the environment generating apparatus 10, the control learning apparatus 20, and the information processing apparatus 30 according to the present disclosure. Referring to FIG. 16, each of the environment generating apparatus 10, the control learning apparatus 20, and the information processing apparatus 30 includes, for example, a CPU 871, a ROM 872, a RAM 873, ..... and a communication apparatus 883.”)

receive sensor information from the client device that is obtained through a sensor equipped in the client device and indicates a surrounding condition of the client device; (Fig. 4: element 10 has a communication unit 130 which receives sensor data from the client 30 (see [0080]). The acquiring unit 310 of the client device 30 acquires the sensor data covering surrounding data of the client (see [0068] and [0032]). “[0068] Further, the communication unit 130 has a function of receiving sensor information acquired from one or a plurality of sensors provided at a second control target.” “[0032] The self-driving car recognizes a surrounding environment from information acquired by various kinds of sensors and realizes autonomous travelling in accordance with the recognized environment.” “[0080] The acquiring unit 310 may have a function as a sensor information acquiring unit which acquires sensor information from one or more sensors. In the case where the information processing apparatus 30 is an automated driving apparatus, the acquiring unit 310 can acquire the above-described sensor information from sensors provided at the vehicle 40 which is a control target.”) and

create three-dimensional data of a surrounding area of the client device using the sensor information received. (“[0196] For example, the environment generating apparatus 10 may generate a cluster relating to a predetermined unknown object X by generating an unknown object cluster using an unknown object determination device and performing determination as to an identical object in the cluster. In this event, the environment generating apparatus 10 may, for example, constitute property of material such as a shape in three dimensions from information relating to the unknown object X on the basis that an appearance frequency of the unknown object X in a predetermined area is high, and capture the three-dimensional property of material as a new environmental model.”)

Claim 6 is directed to a method and its steps are similar in scope and functions to the elements of the device claim 15, and therefore claim 6 is rejected with the same rationales as specified in the rejection of claim 15.
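
For orientation only, the server arrangement recited in claims 15 and 6 (receive sensor information from a client, then build three-dimensional data of the client's surroundings from it) can be pictured with a minimal Python sketch; all names here (SensorInfo, Server, etc.) are hypothetical illustrations and are not drawn from Fukui or from the claims themselves:

    from dataclasses import dataclass

    @dataclass
    class SensorInfo:
        client_pos: tuple   # (x, y, z) position reported by the client device
        points: list        # surrounding points measured by the client's sensor

    class Server:
        def __init__(self):
            self.cloud = []  # accumulated three-dimensional data

        def receive_sensor_info(self, info):
            # "receive sensor information from the client device ..."
            self.create_three_dimensional_data(info)

        def create_three_dimensional_data(self, info):
            # "... create three-dimensional data of a surrounding area of the
            # client device using the sensor information received"
            cx, cy, cz = info.client_pos
            for (x, y, z) in info.points:
                self.cloud.append((cx + x, cy + y, cz + z))

    server = Server()
    server.receive_sensor_info(SensorInfo((10.0, 0.0, 0.0), [(1.0, 2.0, 0.5)]))
    print(len(server.cloud))  # -> 1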

Regarding claim 9, Fukui teaches, wherein the sensor information includes at least one of information obtained by a laser sensor, a luminance image, an infrared image, a depth image, sensor position information, or sensor speed information. (“[0054] .... Here, the vehicle 40 may have various sensors for observing a state of the real world. The above-described sensor includes, for example, a RGB-D camera, a laser range finder, a GPS, Wi-Fi (registered trademark), a geomagnetic sensor, a pressure sensor, an acceleration sensor, a gyro sensor, a vibration sensor, or the like.”)

Regarding claim 10, Fukui teaches, wherein the sensor information includes information that indicates a performance of the sensor. (“[0107] .... Further, the sensor system information may include information relating to each sensor such as an image sensor, a lidar, a millimeter wave radar, a depth sensor and a microphone. Still further, the sensor system information may include information of a position where each sensor is attached, a search range, sensor performance, variation relating to the position where each sensor is attached, or the like.”)

Regarding claim 11, Fukui teaches, correcting the three-dimensional data in accordance with the performance of the sensor. (Fig. 12 shows that different parameters are sent to the server and are used by the generating unit to generate three-dimensional data. These parameters include sensor performance. See: “[0105] .... The environment generating apparatus 10 according to the present embodiment can perform simulation in accordance with an individual difference of the vehicle 40, for example, by capturing the internal parameters relating to a sensor and a drive system provided at the vehicle 40. That is, according to the environment generating apparatus 10 according to the present embodiment, it is possible to effectively realize calibration for absorbing an individual difference of apparatuses. [0106] Here, the above-described vehicle body information may include characteristics information, installation position information, or the like, of each part. Specifically, the vehicle body information may include information relating to age of service (aged degradation index) of each part or variation of performance. Further, the vehicle body information may include, for example, information in accordance with characteristics of each part, such as a drive system, a steering wheel, a brake system and a sensor system. [0107] For example, the drive system information may include information of a temperature, a torque, response characteristics, or the like. The steering wheel information may include information of response characteristics, or the like. The brake system information may include information of abrasion, a friction coefficient, temperature characteristics, a degree of degradation, or the like. Further, the sensor system information may include information relating to each sensor such as an image sensor, a lidar, a millimeter wave radar, a depth sensor and a microphone. Still further, the sensor system information may include information of a position where each sensor is attached, a search range, sensor performance, variation relating to the position where each sensor is attached, or the like.”)
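
Purely as an illustration of what the claim 11 limitation could amount to in practice, correcting three-dimensional data in accordance with a reported sensor performance might look like the following Python sketch; the weighting scheme and the performance fields (range_accuracy_m, max_range_m) are assumptions for illustration, not details taken from Fukui:

    def correct_three_dimensional_data(points, performance):
        # points: list of (x, y, z, confidence); performance: dict of sensor specs
        corrected = []
        for (x, y, z, conf) in points:
            dist = (x * x + y * y + z * z) ** 0.5
            if dist > performance["max_range_m"]:
                continue  # beyond the sensor's rated range: drop the point
            # De-rate confidence with distance, scaled by the rated accuracy.
            conf /= 1.0 + performance["range_accuracy_m"] * dist
            corrected.append((x, y, z, conf))
        return corrected

    pts = [(1.0, 0.0, 0.0, 1.0), (100.0, 0.0, 0.0, 1.0)]
    print(correct_three_dimensional_data(pts, {"range_accuracy_m": 0.05, "max_range_m": 80.0}))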

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Knorr et al. (US Patent publication: 20180245928, “Knorr”) and Fukui et al. (US patent publication: 20190019087, “Fukui”).

Regarding claim 14, Knorr teaches, A client device (Fig. 1 element 100), the client device comprising:

create three-dimensional data of a surrounding area of the client device using sensor information that is obtained through a sensor equipped in the client device and indicates a surrounding condition of the client device; (The client device has a sensor that captures the surrounding environment in point cloud form (three-dimensional environment data). “[0037] FIG. 1 schematically shows components of a system 100 for localizing an automated motor vehicle, according to an example embodiment of the present invention. System 100 includes a sensor device 10 of the automated motor vehicle, preferably a video sensor, radar sensor, etc., which captures a driving environment and thereby ascertains driving environment data of the motor vehicle. For this purpose, sensor device 10 captures image data, in particular in the form of point clouds, line features, etc. The mentioned data can also include semantic features, such as trees, street lighting devices, buildings, etc.”)

estimate self-location of the client device using the three-dimensional data created; (Element 20 performs self-localization based on the captured sensor data from element 10 and localization reference data. “[0039] The automated motor vehicle is localized, and a localization accuracy, or rather localization quality, of the motor vehicle is ascertained on the basis of a required or predefined localization accuracy using a localization device 20 configured in the motor vehicle. The following data are examples of the data that can be included in the determination of the localization quality: the location accuracies and/or the volume of the localization reference data, and the degree of unique recognition of localization references (for example, traffic signs, objects in the driving environment, etc.) in the current vehicle driving environment using sensor device 10. Localization device 20 is thus able to ascertain or estimate the position of the motor vehicle and a corresponding accuracy of the position estimate on the basis of the existing localization reference data.”) and

transmit the sensor information obtained to a server or an other client device. (“[0058] Furthermore, the driving environment data captured by sensor device 10 and linked to location information are communicated via second interface S2 to server device 40.”)

Though Knorr teaches a client device performing the functionality of this claim, Knorr doesn't expressly teach that the client device has a processor and memory, wherein the processor uses the memory to perform the limitations of the claim.

However, Fukui has a processor and memory where the processor uses the memory to perform similar functionality of creating three-dimensional data of a surrounding area (Fig. 4 environment generating apparatus element 10, which has a processor and a memory according to Fig. 16 and [0204], indicating that element 10, the server, has a processor (CPU 871) and memory (RAM 873)).

Knorr and Fukui are analogous as they are from the field of generating environment data of a space.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Knorr to have included a processor and a memory, the processor using the memory to perform the functionality of the claim as taught by Fukui, for the purpose of using a standard way of generating 3D environment data.

Claim 1 is directed to a method and its steps are similar in scope and functions to the elements of the device claim 14, and therefore claim 1 is rejected with the same rationales as specified in the rejection of claim 14.
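
As a concrete picture of the client recited in claims 14 and 1 (create three-dimensional data from its own sensor, estimate self-location from that data, and transmit the sensor information onward), consider this minimal Python sketch; the class and method names are hypothetical, and the centroid "localization" is only a stand-in for real scan matching of the kind Knorr's localization device 20 performs:

    class ClientDevice:
        def __init__(self, sensor, server):
            self.sensor = sensor   # assumed to expose read() -> list of (range, direction)
            self.server = server   # assumed to expose receive_sensor_info()

        def step(self):
            sensor_info = self.sensor.read()                 # surrounding condition
            cloud = self.create_three_dimensional_data(sensor_info)
            pose = self.estimate_self_location(cloud)
            self.server.receive_sensor_info(sensor_info)     # transmit to the server
            return pose

        def create_three_dimensional_data(self, sensor_info):
            # Turn (range, unit-direction) readings into a local point cloud.
            return [(r * dx, r * dy, r * dz) for (r, (dx, dy, dz)) in sensor_info]

        def estimate_self_location(self, cloud):
            # Placeholder: a real device would match the cloud against reference data.
            n = max(len(cloud), 1)
            return tuple(sum(p[i] for p in cloud) / n for i in range(3))

    class _StubSensor:
        def read(self):
            return [(2.0, (1.0, 0.0, 0.0)), (4.0, (0.0, 1.0, 0.0))]

    class _StubServer:
        def receive_sensor_info(self, info):
            pass

    print(ClientDevice(_StubSensor(), _StubServer()).step())  # -> (1.0, 2.0, 0.0)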

Regarding claim 2, Knorr as modified by Fukui teaches, transmitting a transmission request for a three-dimensional map to the server; and receiving the three-dimensional map from the server, (Fukui Fig. 3 requests a three-dimensional map from a database. Knorr “[0063] Besides the driving environment data that is presently linked to location information and that has been communicated, already previously communicated driving environment data that are still available on server device 40, and/or previous localization reference data, already present on server device 40, can be included in the generation of the localization reference data. [0064] FIG. 2 schematically shows a basic sequence of a method according to an example embodiment of the present invention. In a step 200, a localization accuracy is specified for the automated motor vehicle during operation, localization reference data for a defined location being requested by the motor vehicle at the defined localization accuracy to be attained and being communicated to the automated motor vehicle.”)

wherein in the estimating of the self-location, the self-location is estimated using the three-dimensional data and the three-dimensional map. (Knorr: Element 20 performs self-localization based on the captured sensor data from element 10 and localization reference data. “[0039] The automated motor vehicle is localized, and a localization accuracy, or rather localization quality, of the motor vehicle is ascertained on the basis of a required or predefined localization accuracy using a localization device 20 configured in the motor vehicle. The following data are examples of the data that can be included in the determination of the localization quality: the location accuracies and/or the volume of the localization reference data, and the degree of unique recognition of localization references (for example, traffic signs, objects in the driving environment, etc.) in the current vehicle driving environment using sensor device 10. Localization device 20 is thus able to ascertain or estimate the position of the motor vehicle and a corresponding accuracy of the position estimate on the basis of the existing localization reference data.”)
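
To make the claim 2 exchange concrete, a request/receive/localize round trip might be sketched as below; get_map() and the overlap score are illustrative assumptions, not APIs or algorithms from Knorr or Fukui:

    def request_three_dimensional_map(server, region):
        # "transmitting a transmission request for a three-dimensional map to the
        # server; and receiving the three-dimensional map from the server"
        return server.get_map(region)   # assumed server API

    def estimate_self_location(local_cloud, three_d_map):
        # Toy alignment: choose the candidate pose whose stored cloud overlaps
        # the locally created cloud best (a stand-in for real scan matching).
        def overlap(a, b):
            return len(set(a) & set(b))
        best_pose, _ = max(three_d_map.items(),
                           key=lambda kv: overlap(kv[1], local_cloud))
        return best_pose

    ref = {(0, 0): [(1, 2, 0)], (5, 5): [(1, 2, 0), (3, 1, 0)]}
    print(estimate_self_location([(1, 2, 0), (3, 1, 0)], ref))  # -> (5, 5)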

Regarding claim 3, Knorr as modified by Fukui teaches, wherein the sensor information includes at least one of information obtained by a laser sensor, a luminance image, an infrared image, a depth image, sensor position information, or sensor speed information. (Fukui, “[0054] .... Here, the vehicle 40 may have various sensors for observing a state of the real world. The above-described sensor includes, for example, a RGB-D camera, a laser range finder, a GPS, Wi-Fi (registered trademark), a geomagnetic sensor, a pressure sensor, an acceleration sensor, a gyro sensor, a vibration sensor, or the like.”)

Regarding claim 4, Knorr as modified by Fukui teaches, wherein the sensor information includes information that indicates a performance of the sensor. (Fukui, “[0107] .... Further, the sensor system information may include information relating to each sensor such as an image sensor, a lidar, a millimeter wave radar, a depth sensor and a microphone. Still further, the sensor system information may include information of a position where each sensor is attached, a search range, sensor performance, variation relating to the position where each sensor is attached, or the like.”)

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Knorr as modified by Fukui, and further in view of Berlier et al. (US Patent Publication: 2018/0348734, “Berlier”).

Regarding claim 5, Knorr as modified by Fukui doesn't expressly teach, encoding or compressing the sensor information, wherein in the transmitting of the sensor information, the sensor information that has been encoded or compressed is transmitted to the server or the other client device.

However, Berlier teaches, encoding or compressing the sensor information, wherein in the transmitting of the sensor information, the sensor information that has been encoded or compressed is transmitted to the server or the other client device. (“[0055] The sensor data is adaptively compressed based on the scoring data 610 as described above, i.e., fidelity of the compression process applied to each portion of the part being fabricated is controlled by corresponding values of the scoring data. The scoring data may be configured so that less complicated portions of the geometry are compressed with less accuracy than the identified complex structures. In disclosed embodiments, each type of identified structure may have a defined weight so that, for example, higher fidelity compression can be used for corners than for edges. The compressed sensor data is transmitted via a network 615. [0056] The compressed sensor data is received via the network by a data host/server 620 and then may be transmitted to a user interface device.”)

Knorr as modified by Fukui and Berlier are analogous as they are from the field of image processing.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Knorr as modified by Fukui to have included encoding or compressing the sensor information, wherein in the transmitting of the sensor information, the sensor information that has been encoded or compressed is transmitted to the server or the other client device, as taught by Berlier, for the purpose of sending data with less bandwidth and securely to another device.
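
As a simple illustration of the claim 5 limitation (encode or compress the sensor information, then transmit the encoded/compressed form), the following Python sketch uses generic JSON + zlib coding; Berlier's adaptive, score-driven compression is more elaborate, so this is only a generic stand-in:

    import json, zlib

    def transmit_sensor_information(sensor_info, send):
        # Encode, compress, then hand the bytes to the transport.
        payload = zlib.compress(json.dumps(sensor_info).encode("utf-8"))
        send(payload)
        return len(payload)

    info = {"sensor": "lidar", "points": [[1.0, 2.0, 0.5]] * 100}
    n = transmit_sensor_information(info, send=lambda b: None)  # transport elided
    print(n, "compressed bytes")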

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Fukui and further in view of Whaley et al. (US patent Publication: 20180181741, “Whaley”).

Regarding claim 7, Fukui doesn't expressly teach, transmitting a transmission request for the sensor information to the client device.

However, Whaley teaches, transmitting a transmission request for the sensor information to the client device. (“[0092] FIG. 7B illustrates a variation of the system illustrated in FIG. 7A that uses an adaptive notification server 707 to trigger collection of sensor data in accordance with the disclosed embodiments. During operation, adaptive notification server 707 requests sensor data from mobile client 703 at variable intervals, based on previous feedback from the learner 105.”)

Fukui and Whaley are analogous as they are from the field of sensor data processing.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Fukui to have included transmitting a transmission request for the sensor information to the client device as taught by Whaley, for the purpose of controlling the sensor or data transmission based on the need of the server or 3D data generating device, and thereby avoiding unnecessary data collection and processing.
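
A transmission request of the kind claim 7 recites can be pictured as a one-line server-to-client call; the sketch below is illustrative only (Whaley's adaptive notification server additionally varies the request interval based on learner feedback, which is omitted here):

    class SensorClient:
        def on_transmission_request(self):
            # The client answers a request with its latest sensor information.
            return {"sensor": "lidar", "points": [(0.0, 1.0, 0.0)]}

    class RequestingServer:
        def __init__(self, client):
            self.client = client

        def request_sensor_information(self):
            # "transmitting a transmission request for the sensor information
            # to the client device"
            return self.client.on_transmission_request()

    print(RequestingServer(SensorClient()).request_sensor_information())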

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Fukui and further in view of Sequeira et al. (US patent Publication: 20180075643, “Sequeira”).

Regarding claim 8, Fukui doesn't expressly teach, updating a three-dimensional map using the three-dimensional data created.

However, Sequeira teaches, updating a three-dimensional map using the three-dimensional data created. (“[0030] In a second aspect, the invention relates to a method for real-time mapping, localization and change analysis of an environment, i.e. relative to the 3D reference map of the environment which is available from a method according to the first aspect of the invention as described above or from such a 3D reference map already updated or modified through a previous run of the present method, in particular in a GPS-denied environment, preferably comprising the following steps: [0031] (a) acquiring (3D) scanner data of the environment with a real-time laser range scanner at a rate of at least 5 frames (point clouds) per second, [0032] (b) during place recognition, identifying a current location of the laser range scanner inside a known environment (i.e. within the 3D reference map) with no prior knowledge of the scanner pose during place recognition and pre-computing of simple and compact descriptors of the scene acquired by the laser range scanner using a reduced search space within the scene;”)

Fukui and Sequeira are analogous as they are from the field of mapping of environment.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Fukui to update a three-dimensional map using the three-dimensional data created, as taught by Sequeira, for the purpose of dynamically updating a 3D map for the user.

Fukui as modified by Sequeira teaches, transmitting the three-dimensional map to the client device in response to a transmission request for the three-dimensional map from the client device. (Fukui [0182] ...., “the server communication unit 340 transmits the sensor information, the environmental parameters and the control information to the environment generating apparatus 10 (S1303). [0183] Subsequently, the information processing apparatus 30 may notify a passenger, or the like (S1304). Specifically, when the determining unit 330 determines an unknown environment or a dangerous environment, the determining unit 330 can generate notification data based on the determination. The server communication unit 340 may transmit the above-described notification data to a display unit, or the like, to cause notification content to be displayed.”)
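
For illustration, the claim 8 pair of steps (update a three-dimensional map with newly created three-dimensional data, then transmit the map on request) might be sketched as below; the voxel-count map is a hypothetical simplification, not the map representation of Fukui or Sequeira:

    class MapServer:
        def __init__(self):
            self.three_d_map = {}   # voxel index -> observation count

        def update_map(self, created_points, voxel=0.5):
            # "updating a three-dimensional map using the three-dimensional
            # data created": accumulate points into a coarse voxel grid.
            for (x, y, z) in created_points:
                key = (round(x / voxel), round(y / voxel), round(z / voxel))
                self.three_d_map[key] = self.three_d_map.get(key, 0) + 1

        def handle_map_request(self):
            # "transmitting the three-dimensional map to the client device in
            # response to a transmission request" (serialization elided).
            return dict(self.three_d_map)

    srv = MapServer()
    srv.update_map([(1.0, 2.0, 0.5), (1.1, 2.1, 0.4)])
    print(srv.handle_map_request())  # -> {(2, 4, 1): 2}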

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Fukui and further in view of Clausen et al. (US patent Publication: 20060195197, “Clausen”).

Regarding claim 12, Fukui doesn't expressly teach, wherein in the receiving of the sensor information: a plurality of pieces of the sensor information are received from a plurality of client devices each being the client device; and the sensor information to be used in the creating of the three-dimensional data is selected, based on a plurality of pieces of information that each indicates the performance of the sensor included in the plurality of pieces of the sensor information.

However, Clausen teaches, a plurality of pieces of the sensor information are received from a plurality of client devices each being the client device; and the sensor information to be used is selected, based on a plurality of pieces of information that each indicates the performance of the sensor included in the plurality of pieces of the sensor information. (“[0023] One embodiment of the invention includes a method of selecting a device associated with a limb of a user. The selection method of one embodiment includes: providing a device having a sensor secured thereto; measuring with the sensor a performance characteristic of the device while the device is in use by a user; comparing the performance with a pre-determined matrix of performance data of different devices; and selecting an appropriate device for the user based on the comparison.”)

Fukui and Clausen are analogous as they are from the field of sensor usage for measurement or capturing images.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Fukui to have included receiving a plurality of pieces of the sensor information from a plurality of client devices, and selecting the sensor information to be used in the creating of the three-dimensional data based on a plurality of pieces of information that each indicates the performance of the sensor included in the plurality of pieces of the sensor information, as taught by Clausen.

The motivation to include the modification is to correctly create three-dimensional data based on available input data from the clients.
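
The selection step of claim 12 reduces, in the simplest reading, to ranking submissions from several clients by an embedded sensor-performance figure; the field names in this Python sketch are hypothetical:

    def select_sensor_information(submissions):
        # Choose the submission whose sensor reports the best (smallest)
        # range accuracy; ties and richer performance matrices are ignored.
        return min(submissions, key=lambda s: s["performance"]["range_accuracy_m"])

    subs = [
        {"client": "A", "performance": {"range_accuracy_m": 0.05}, "points": []},
        {"client": "B", "performance": {"range_accuracy_m": 0.02}, "points": []},
    ]
    print(select_sensor_information(subs)["client"])  # -> B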

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Fukui and further in view of Berlier.

Regarding claim 13, Fukui doesn't expressly teach, decoding or decompressing the sensor information received; and creating the three-dimensional data using the sensor information that has been decoded or decompressed.

However, Berlier teaches, decoding or decompressing the sensor information received. (“[0056] The compressed sensor data is received via the network by a data host/server 620 and then may be transmitted to a user interface device. The scoring data (relative to working tool positions) can be reproduced 625 at a receiving end using the known scoring function/algorithm and the process data (e.g., the CLI file 320). The scoring data is used to decompress the sensor data 630, as described above. The decompressed sensor data is correlated with the working tool positions to produce sensor data values v. working tool positions 635.”)

Fukui and Berlier are analogous as they are from the field of image processing.

Therefore it would have been obvious for an ordinary skilled person in the art before the effective filing date of the claimed invention to have modified Fukui to have included decoding or decompressing the sensor information received, as taught by Berlier, for the purpose of sending data with less bandwidth and securely to another device, and as a standard method of converting compressed data back to original sensor data.

Fukui as modified by Berlier teaches, creating the three-dimensional data using the sensor information that has been decoded or decompressed. (Fukui uses the decompressed sensor data (integrated from Berlier) to generate 3D data. Fukui, “[0196] For example, the environment generating apparatus 10 may generate a cluster relating to a predetermined unknown object X by generating an unknown object cluster using an unknown object determination device and performing determination as to an identical object in the cluster. In this event, the environment generating apparatus 10 may, for example, constitute property of material such as a shape in three dimensions from information relating to the unknown object X on the basis that an appearance frequency of the unknown object X in a predetermined area is high, and capture the three-dimensional property of material as a new environmental model.”)
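
Finally, the claim 13 sequence (decode or decompress the received sensor information, then create the three-dimensional data from it) is the receiving-side mirror of the claim 5 sketch above; again JSON + zlib stand in for whatever codec is actually used, and the "creation" step is a toy placeholder:

    import json, zlib

    def receive_and_create(payload):
        # "decoding or decompressing the sensor information received; and
        # creating the three-dimensional data using the sensor information
        # that has been decoded or decompressed"
        sensor_info = json.loads(zlib.decompress(payload).decode("utf-8"))
        return [tuple(p) for p in sensor_info["points"]]

    packed = zlib.compress(json.dumps({"points": [[1.0, 2.0, 0.5]]}).encode("utf-8"))
    print(receive_and_create(packed))  # -> [(1.0, 2.0, 0.5)]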