UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

Application No.: 16/299,564
Filing Date: 03/12/2019
First Named Inventor: Takahiro NISHI
Attorney Docket No.: 2019-0413A
Confirmation No.: 2642

Wenderoth, Lind & Ponack, L.L.P.
1025 Connecticut Avenue, NW
Suite 500
Washington, DC 20036

Examiner: ZAYKOVA-FELDMAN, LYUDMILA
Art Unit: 2865
Notification Date: 06/03/2022
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the following e-mail address(es):
eoa@wenderoth.com
kmiller@wenderoth.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*
5) [X] Claim(s) 13-19 is/are pending in the application.
    5a) Of the above claim(s) ____ is/are withdrawn from consideration.
6) [ ] Claim(s) ____ is/are allowed.
7) [X] Claim(s) 13-19 is/are rejected.
8) [ ] Claim(s) ____ is/are objected to.
9) [ ] Claim(s) ____ are subject to restriction and/or election requirement(s).
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 03/12/2019 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ____.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)
1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ____
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ____
4) [ ] Other: ____

U.S. Patent and Trademark Office    PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20220518A

Office Action Summary
Application No.: 16/299,564
Applicant(s): NISHI et al.
Examiner: Lyudmila Zaykova-Feldman
Art Unit: 2865
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 04/08/2022.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ____.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ____; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 04/08/2022 has been entered.

Response to Amendment

The action is responsive to the Amendment filed on 04/08/2022. Claims 13-19 are pending. Claims 1-12 have been canceled.

Regarding the Notice of References Cited form (PTO-892): the reference US20180032823 to Ohizumi is included in the updated PTO-892 form.

Response to Arguments

Regarding the rejections under 35 U.S.C. § 103:

Applicant's arguments, filed on 04/08/2022 with respect to the 35 U.S.C. 103 rejection of claims 13-19, have been considered but are moot because of the new ground of rejection necessitated by Applicant's amendment.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 13-19 are rejected under 35 U.S.C. 103(a) as being unpatentable over US20070030212 to Shibata (hereinafter "Shibata") in view of US20050134440 to Breed (hereinafter "Breed").

Regarding Claim 13: Shibata discloses:

"a three-dimensional data generation method" (para 0021 - "a three-dimensional measuring unit adapted to perform (interpreted as a three-dimensional data generation method, added by examiner), by a three-dimensional measuring instrument, a three-dimensional measurement for an environment... a measurement value receiving unit adapted to receive a three-dimensional measurement value from outside"), comprising

"wirelessly receiving first data by a device" (para 0066 - "The vehicle outside-image display apparatus 201 receives the vehicle outside-image information 222 and the three-dimensional measurement value 262 (i.e. first data, added by examiner), and transmits, to a further rearward vehicle, renewed vehicle outside-image information 227"; para 0067 - "The wireless communication interface 252 performs data transmission/reception with the first vehicle A"),

"the device being provided on a first moving body which moves along a traveling direction" (Figs. 9 and 10; para 0050 - "a vehicle outside-image display apparatus 101 provided for a first vehicle A"),

"the first data representing three-dimensional positions" (para 0066 - "a three-dimensional measurement value 262 (i.e. first data, added by examiner) obtained from a three-dimensional measuring instrument 232 that performs a three-dimensional measurement of an environment"),

"generating second data based on detection by a sensor provided on the first moving body, the second data representing three-dimensional positions in a space in front of the first moving body in the traveling direction" (Fig. 9; para 0066 - "a vehicle outside-image display apparatus 201 provided for the first vehicle A... The vehicle outside-image display apparatus 201 also transmits, to the rearward vehicle, a renewed three-dimensional measurement value 261 (i.e. second data, added by examiner)... with a three-dimensional measurement value obtained by the three-dimensional measuring instrument 231 (i.e. sensor, added by examiner) that performs a three-dimensional measurement for an environment"), and

"wirelessly transmitting third data to at least one of a second moving body following the first moving body in the traveling direction, the third data representing three-dimensional positions generated based on the first data and the second data" (Figs. 9 and 10; para 0067 - "The wireless communication interface 252 performs data transmission/reception with the first vehicle A"; para 0068 - "The measurement value receiving unit 236 receives the three-dimensional measurement value 262 from the second vehicle B through the wireless communication interface 251. The measurement value combining unit 237 combines the three-dimensional measurement value 262 received by the measurement value receiving unit 236 with the three-dimensional measurement value acquired by the three-dimensional measuring unit 234. The measurement value transmitting unit 235 transmits renewed three-dimensional measurement value 261 (i.e. third data, added by examiner) produced by the measurement value combining unit 237, to the third vehicle C through the wireless communication interface 251").

Shibata is silent on:

"a traffic monitoring system".

However, Breed discloses:

"a traffic monitoring system" (Figs. 13-15, 15A, 16B; para 0010 - "the invention relates to the use of a Global Positioning System ("GPS") (i.e. traffic monitoring system, added by examiner), differential GPS ("DGPS"), other infrastructure-based location aids, cameras, radar, laser radar, terahertz radar and an inertial navigation system as the primary host vehicle and target locating system with centimeter level accuracy. The invention is further supplemented by a processor to detect, recognize and track all relevant potential obstacles, including other vehicles, pedestrians, animals, and other objects on or near the roadway").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the three-dimensional data creation method disclosed by Shibata, as taught by Breed, in order to improve the safety of multi-vehicle combined operation and to avoid collisions or abrupt stops by using a system that registers the objects ahead of the vehicle on the road and transmits this information.
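
For orientation, the data flow that the rejection reads onto Shibata (first data received wirelessly from another vehicle, second data generated by the on-board sensor, and third data produced from both and transmitted to the following vehicle) can be summarized in the brief sketch below. The sketch is illustrative only; the function and type names are hypothetical and appear in neither the claims nor the cited references.

```python
# Illustrative sketch of the first/second/third data relay described above.
# All names are hypothetical; 3D positions are modeled as (x, y, z) tuples.
from typing import List, Tuple

Point3D = Tuple[float, float, float]

def generate_second_data(sensor_points: List[Point3D]) -> List[Point3D]:
    # Second data: 3D positions detected by the sensor on the first moving body.
    return list(sensor_points)

def generate_third_data(first_data: List[Point3D],
                        second_data: List[Point3D]) -> List[Point3D]:
    # Third data: 3D positions generated from the first and second data
    # (the role the rejection assigns to Shibata's combining unit 237).
    return first_data + second_data

def relay_step(first_data: List[Point3D],
               sensor_points: List[Point3D]) -> List[Point3D]:
    # Receive first data wirelessly, generate second data from the sensor,
    # and return third data for wireless transmission to the following vehicle.
    return generate_third_data(first_data, generate_second_data(sensor_points))

if __name__ == "__main__":
    received = [(30.0, 0.0, 0.5)]                    # first data from a preceding vehicle
    sensed = [(12.0, 1.5, 0.4), (15.0, -1.0, 0.4)]   # on-board sensor detections
    print(relay_step(received, sensed))              # third data forwarded rearward
```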

Regarding Claim 14: Shibata/Breed combination discloses the three-dimensional data generation method according to claim 13 (see the rejection for Claim 13).

Shibata further discloses:

"wherein the first data received has been transmitted from a third moving body preceding the first moving body in the traveling direction" (para 0066 - "The vehicle outside-image display apparatus 201 receives the vehicle outside-image information 222 and the three-dimensional measurement value 262 (i.e. first data, added by examiner), and transmits, to a further rearward vehicle, renewed vehicle outside-image information 227"), and

"the first data represents three-dimensional positions around the third moving body" (para 0068 - "The measurement value transmitting unit 235 transmits renewed three-dimensional measurement value 261 (interpreted as three-dimensional positions around the third moving body, added by examiner) produced by the measurement value combining unit 237").

Regarding Claim 15: Shibata/Breed combination discloses the three-dimensional data generation method according to claim 13 (see the rejection for Claim 13).

Shibata further discloses:

"the third data is part of merge data of the first data and the second data" (para 0066 - "The vehicle outside-image display apparatus 201 also transmits, to the rearward vehicle, a renewed three-dimensional measurement value 261, which is a measurement value obtained by combining (interpreted as merging, added by examiner) the received three-dimensional measurement value 262 with a three-dimensional measurement value obtained by the three-dimensional measuring instrument 231 that performs a three-dimensional measurement for an environment in a picture range of the camera 211."), and

"the three-dimensional positions represented by the third data include at least three-dimensional positions in a region hidden from view of the second moving body" (para 0058 - "part of the image, captured by the installed camera 111 or 113, in which the field of vision is blocked by the forward vehicle is combined with a portion of the received vehicle outside-image information 122 or 121 to produce the renewed vehicle outside-image information 121 or 123 in which the field of vision is cleared"; para 0085 - "it is also possible to transmit the vehicle outside-image information 321 produced by the combining to a further rearward vehicle by the image transmitting unit 315. Moreover, it is possible to combine the image information 322 with received vehicle outside-image information from a forward vehicle (interpreted as the information from a region hidden from view, added by examiner)").

Regarding Claim 16: Shibata/Breed combination discloses the three-dimensional data generation method according to claim 13 (see the rejection for Claim 13).

Shibata further discloses:

"updating the third data when the three-dimensional positions represented by the third data are changed" (para 0068 - "The measurement value receiving unit 236 receives the three-dimensional measurement value 262 from the second vehicle B through the wireless communication interface 251. The measurement value combining unit 237 combines the three-dimensional measurement value 262 received by the measurement value receiving unit 236 with the three-dimensional measurement value acquired by the three-dimensional measuring unit 234. The measurement value transmitting unit 235 transmits renewed (i.e. updated, added by examiner) three-dimensional measurement value 261 (i.e. third data, added by examiner) produced by the measurement value combining unit 237, to the third vehicle C through the wireless communication interface 251."), and

"wirelessly retransmitting the third data updated to the second moving body" (para 0085 - "it is also possible to transmit the vehicle outside-image information 321 produced by the combining to a further rearward vehicle by the image transmitting unit 315.").

Regarding Claim 17: Shibata/Breed combination discloses the three-dimensional data generation method according to claim 13 (see the rejection for Claim 13).

Regarding the limitation "wherein a first distance in the traveling direction between the first moving body and the space varies depending on a moving speed of the first moving body": it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to recognize that this distance varies in accordance with the traveling speed of the mobile object, since, by the definition of speed, the faster the object moves, the longer the distance it can cover during the same time interval.
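
As a purely arithmetic illustration of this reasoning (the two-second look-ahead horizon below is an assumed value, not taken from the record): for a fixed time interval, the distance covered, and hence the distance to the space monitored ahead, grows in proportion to the speed.

```python
# Illustrative only: distance covered in a fixed time interval scales with speed (d = v * t).
LOOKAHEAD_TIME_S = 2.0  # assumed time horizon, not from the claims or the cited art

for speed_m_s in (10.0, 20.0, 30.0):   # roughly 36, 72, and 108 km/h
    distance_m = speed_m_s * LOOKAHEAD_TIME_S
    print(f"{speed_m_s:4.1f} m/s -> {distance_m:4.1f} m ahead")
```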

Regarding Claim 18: Shibata discloses:

"a three-dimensional data generation device provided on a first moving body which moves along a traveling direction" (para 0021 - "the invention provides a vehicle outside-image display apparatus... comprising: a three-dimensional measuring unit adapted to perform, by a three-dimensional measuring instrument, a three-dimensional measurement for an environment... a measurement value receiving unit adapted to receive a three-dimensional measurement value from outside; and a viewpoint converting unit adapted to convert the vehicle outside-image information... and to supply the converted information to the image combining unit"), the three-dimensional data generation device comprising:

"a communication circuit configured to wirelessly receive first data representing three-dimensional positions" (Fig. 10; para 0051 - "The wireless communication interface 152 (i.e. communication circuit, added by examiner) performs data transmission/reception with the first vehicle A"),

"a processor coupled to the communication circuit to generate second data based on detection by a sensor provided on the first moving body" (para 0021 - "a three-dimensional measuring unit (interpreted as a processor coupled to the communication circuit, added by examiner) adapted to perform, by a three-dimensional measuring instrument (i.e. sensor, added by examiner), a three-dimensional measurement for an environment"),

"the second data representing three-dimensional positions in a space in front of the first moving body in the traveling direction" (Fig. 9; para 0066 - "a vehicle outside-image display apparatus 201 provided for the first vehicle A... The vehicle outside-image display apparatus 201 also transmits, to the rearward vehicle, a renewed three-dimensional measurement value 261 (i.e. second data, added by examiner)... with a three-dimensional measurement value obtained by the three-dimensional measuring instrument 231 (i.e. sensor, added by examiner) that performs a three-dimensional measurement for an environment"), and

"the communication circuit configured to wirelessly transmit third data to at least one of a second moving body following the first moving body in the traveling direction" (Fig. 10; para 0051 - "The wireless communication interface 152 (i.e. communication circuit, added by examiner) performs data transmission/reception with the first vehicle A"),

"the third data representing three-dimensional positions generated based on the first data and the second data" (para 0066 - "The measurement value receiving unit 236 receives the three-dimensional measurement value 262 from the second vehicle B through the wireless communication interface 251. The measurement value combining unit 237 combines the three-dimensional measurement value 262 received by the measurement value receiving unit 236 with the three-dimensional measurement value acquired by the three-dimensional measuring unit 234. The measurement value transmitting unit 235 transmits renewed three-dimensional measurement value 261 (i.e. third data, added by examiner) produced by the measurement value combining unit 237, to the third vehicle C through the wireless communication interface 251").

Shibata is silent on "a traffic monitoring system".

However, Breed discloses:

"a traffic monitoring system" (Figs. 13-15, 15A, 16B; para 0010 - "the invention relates to the use of a Global Positioning System ("GPS") (i.e. traffic monitoring system, added by examiner), differential GPS ("DGPS"), other infrastructure-based location aids, cameras, radar, laser radar, terahertz radar and an inertial navigation system as the primary host vehicle and target locating system with centimeter level accuracy. The invention is further supplemented by a processor to detect, recognize and track all relevant potential obstacles, including other vehicles, pedestrians, animals, and other objects on or near the roadway").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the three-dimensional data creation method disclosed by Shibata, as taught by Breed, in order to improve the safety of multi-vehicle combined operation and to avoid collisions or abrupt stops by using a system that registers the objects ahead of the vehicle on the road and transmits this information.
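
The device-side structure mapped above (a communication circuit plus a processor coupled to it) can be sketched in the same illustrative, hypothetical terms as the method sketch given after the Claim 13 rejection; the class and attribute names below are assumptions for illustration only.

```python
# Hypothetical sketch of the claimed device: a communication circuit for wireless
# receive/transmit and a coupled processing step that builds the second and third data.
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class CommunicationCircuit:
    # Stand-in for a wireless interface (the role of Shibata's interface 152/251/252).
    inbox: List[List[Point3D]] = field(default_factory=list)
    outbox: List[List[Point3D]] = field(default_factory=list)

    def receive_first_data(self) -> List[Point3D]:
        return self.inbox.pop(0) if self.inbox else []

    def transmit_third_data(self, third_data: List[Point3D]) -> None:
        self.outbox.append(third_data)

@dataclass
class ThreeDimensionalDataGenerationDevice:
    comm: CommunicationCircuit

    def step(self, sensor_points: List[Point3D]) -> None:
        first_data = self.comm.receive_first_data()   # wirelessly received first data
        second_data = list(sensor_points)             # second data from the on-board sensor
        third_data = first_data + second_data         # generated from the first and second data
        self.comm.transmit_third_data(third_data)     # sent toward the following vehicle
```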

Regarding Claim 19: Shibata discloses:

"wirelessly transmitting first data to a first moving body which moves along a traveling direction, the first data representing three-dimensional positions" (para 0066 - "The vehicle outside-image display apparatus 201 receives the vehicle outside-image information 222 and the three-dimensional measurement value 262 (i.e. first data, added by examiner), and transmits, to a further rearward vehicle, renewed vehicle outside-image information 221"; para 0067 - "The wireless communication interface 252 performs data transmission/reception with the first vehicle A"), and

"wirelessly receiving third data from the first moving body, the third data representing three-dimensional positions generated based on the first data and second data, the second data being generated based on detection by a sensor provided on the first moving body, the second data representing three-dimensional positions in a space in front of the first moving body in the traveling direction" (Figs. 9 and 10; para 0067 - "The wireless communication interface 252 performs data transmission/reception with the first vehicle A"; para 0068 - "The measurement value receiving unit 236 receives the three-dimensional measurement value 262 from the second vehicle B through the wireless communication interface 251. The measurement value combining unit 237 combines the three-dimensional measurement value 262 received by the measurement value receiving unit 236 with the three-dimensional measurement value acquired by the three-dimensional measuring unit 234. The measurement value transmitting unit 235 transmits renewed three-dimensional measurement value 261 (i.e. third data, added by examiner) produced by the measurement value combining unit 237, to the third vehicle C through the wireless communication interface 251").

Shibata is silent on:

"A traffic monitoring method, performed by a traffic monitoring system".

However, Breed discloses:

"A traffic monitoring method, performed by a traffic monitoring system" (Figs. 13-15, 15A, 16B; para 0010 - "the invention relates to the use of a Global Positioning System ("GPS") (i.e. traffic monitoring system, added by examiner), differential GPS ("DGPS"), other infrastructure-based location aids, cameras, radar, laser radar, terahertz radar and an inertial navigation system as the primary host vehicle and target locating system with centimeter level accuracy. The invention is further supplemented by a processor to detect, recognize and track all relevant potential obstacles, including other vehicles, pedestrians, animals, and other objects on or near the roadway").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the three-dimensional data creation method disclosed by Shibata, as taught by Breed, in order to improve the safety of multi-vehicle combined operation and to avoid collisions or abrupt stops by using a system that registers the objects ahead of the vehicle on the road and transmits this information.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US20170120814A1 to Kentley-Klay et al. (hereinafter Kentley-Klay) discloses a method for robotic vehicle communication with an external environment configured to determine a position relative to the autonomous vehicle.

US20070146136 to Chen et al. (hereinafter Chen) discloses a navigation system and method applied in a predetermined space.

US20200026303 to Ferguson et al. (hereinafter Ferguson) discloses methods and devices for actively modifying a field of view of an autonomous vehicle in view of constraints.

US8798841 to Nickolaou et al. (hereinafter Nickolaou) discloses a system and method of improving sensor visibility for a host vehicle operating in an autonomous driving mode when one or more forward-looking sensors are being occluded or obstructed.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Lyudmila Zaykova-Feldman, whose telephone number is (469) 295-9269. The examiner can normally be reached 7:30 am - 4:30 pm, Monday through Friday.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Arleen M. Vazquez, can be reached on 571-272-2619. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LYUDMILA ZAYKOVA-FELDMAN/
Examiner, Art Unit 2865

/ALEXANDER SATANOVSKY/
Primary Examiner, Art Unit 2863