`
`UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
`Address: COMMISSIONER FOR PATENTS
`P.O. Box 1450
`Alexandria, Virginia 22313-1450
`
`17/403,488
`
`08/16/2021
`
`Fabio DALLA LIBERA
`
`083710-3399
`
`6468
`
`Rimon PC - Pansonic Corporation
`423 WashingtonStreet
`Suite 600
`San Francisco, CA 94111
`
`GLENNII, FRANK T
`
`3662
`
`PAPER NUMBER
`
`NOTIFICATION DATE
`
`DELIVERY MODE
`
`11/14/2023
`
`ELECTRONIC
`
`Please find below and/or attached an Office communication concerning this application or proceeding.
`
The time period for reply, if any, is set in the attached communication.
`
`Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the
`following e-mail address(es):
`USPTOmail@rimonlaw.com
`
`PTOL-90A (Rev. 04/07)
`
`
`
`
`
Disposition of Claims*
5) [X] Claim(s) 1-7 is/are pending in the application.
  5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-6 is/are rejected.
8) [X] Claim(s) 7 is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.
`
`) ) ) )
`
Application Papers
10) [X] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 16 August 2021 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).
`
Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
Certified copies:
  a) [X] All   b) [ ] Some**   c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.
`
Attachment(s)
1) [X] Notice of References Cited (PTO-892)
2) [X] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office    PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20231031A
`
Application No.: 17/403,488    Applicant(s): DALLA LIBERA, Fabio
Examiner: FRANK T GLENN III    Art Unit: 3662    AIA (FITF) Status: Yes
`
`
`
-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --
Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).
`
Status

1) [X] Responsive to communication(s) filed on 16 August 2021.
  [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.
`
`
`
`
`DETAILED ACTION
`
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

2. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

3. The information disclosure statement (IDS) submitted on 08/16/2021 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
`
`4.
`
`The disclosure is objected to because of the following informalities:
`
`Specification
`
`e On Pg. 26, within Mathematical Expression 16, Formulae 15-18 include text whichis
`
`rendered difficult to read due to the small font size. In particular, the superscripts and
`
`subscripts of variables within Formulae 15-18 are unclear. The Examiner recommends
`
`increasing the font size of these formulae so that each variable and associated
`
`superscripts and subscripts are legible.
`
`Appropriate correction is required.
`
`Claim Rejections - 35 USC § 102
`
`5.
`
`In the event the determination ofthe status of the application as subject to AIA 35 U.S.C. 102 and
`
`103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis
`
`
`
`Application/Control Number: 17/403,488
`Art Unit: 3662
`
`Page 3
`
`(i.e., changing from AIA to pre-AIA)for the rejection will not be considered a new groundofrejection if
`
`the prior art relied upon, and the rationale supporting the rejection, would be the same undereitherstatus.
`
`6.
`
`The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis
`
`for the rejections under this section madein this Office action:
`
`A personshall be entitled to a patent unless —
`
`(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale,
`or otherwise available to the public before the effective filing date of the claimed invention.
`
`(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for
`patent published or deemed published under section 122(b), in which the patent or application, as the
`case may be, names another inventor and waseffectively filed before the effective filing date of the
`claimed invention.
`
`7.
`
`Claim(s) 1 and 4-5 is/are rejected under 35 U.S.C. 102(a)(1)/(a)(2)as being anticipated by
`
`Saneyoshi (US 6,025,790 A).
`
Regarding claim 1, Saneyoshi discloses a mobile robot that autonomously travels in a predetermined space, comprising:
`
• a housing;

  o Saneyoshi discloses (Col. 6 lines 10-24): "The camera assembly 10 serves as a ranging sensor, which is mounted on a moving object..."
`
• a first camera attached to the housing and configured to generate a first lower image by photographing below the housing;

  o Saneyoshi discloses (Col. 6 lines 10-24): "The camera assembly 10 serves as a ranging sensor, which is mounted on a moving object, for imaging a surrounding environment every a predetermined amount of time to capture a variation in the surrounding environment every one screen and to capture a displacement information in accordance with a horizontal displacement. As shown in FIG. 2, the camera assembly 10 comprises: ... a set of stereo cameras (each of which will be hereinafter referred to as a "down-view stereo camera") 13a and 13b, provided on the frame 11, for imaging a downward landscape (the ground surface) required to calculate a component of translation speed of the moving object."
`
• a detector attached to the housing and configured to detect an attitude of the housing;

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." FIG. 1, included below, demonstrates that the position recognizing unit 30 containing slave processor 43 is attached to the camera assembly 10 through stereo image processing unit 20. Here the detector is being interpreted as the rotation detection functions of slave processor 43. Saneyoshi further discloses (Col. 17 lines 23-38): "At subsequent step S54, an angular velocity (roll, pitch, yaw) corresponding to a difference between the positions of the No. 1 and No. 2 block groups is derived by the nonlinear least square method... Moreover, the linear approximation of the difference is carried out, and the partial differential of the linear-approximated difference is carried out every component of rotation about three axes (X, Y, Z axes). Then, an appropriate deflection angle is added thereto to derive variations (dx, dy) in the difference between the positions of the No. 1 and No. 2 blocks on the imaging element for each of roll, pitch, and yaw. The respective components of roll, pitch, and yaw are obtained as the limits of the variations (dx, dy)."
`
`
`
`
`
`
`
[FIG. 1 of Saneyoshi (US 6,025,790 A) is reproduced here: a block diagram showing, among other parts, the No. 1 and No. 2 block extraction parts and the rotation, translation, and navigation processing parts of the position recognizing unit.]

FIG. 1
`
• a calculator configured to calculate a velocity of the mobile robot based on the attitude and the first lower image;

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the calculator is being interpreted as the rotational velocity calculation functions of slave processor 43. Saneyoshi further discloses (Col. 8 lines 30-36): "The No. 1 block group capturing part 32 clips pattern portions (each of which will be hereinafter referred to as a "No. 1 block") of an image region suitable for the navigation processing from each of the distant-view and down-view original images at regular intervals, and captures a plurality of No. 1 blocks (a No. 1 block group) with respect to each of the distant-view and down-view images." Saneyoshi even further discloses (Col. 17 lines 23-38): "At subsequent step S54, an angular velocity (roll, pitch, yaw) corresponding to a difference between the positions of the No. 1 and No. 2 block groups is derived by the nonlinear least square method... Moreover, the linear approximation of the difference is carried out, and the partial differential of the linear-approximated difference is carried out every component of rotation about three axes (X, Y, Z axes). Then, an appropriate deflection angle is added thereto to derive variations (dx, dy) in the difference between the positions of the No. 1 and No. 2 blocks on the imaging element for each of roll, pitch, and yaw. The respective components of roll, pitch, and yaw are obtained as the limits of the variations (dx, dy)."
`
• an estimator configured to estimate a self-position of the mobile robot in the predetermined space based on the velocity;

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the estimator is being interpreted as the position determination functions of slave processor 43. Saneyoshi further discloses (Col. 2 line 42 - Col. 3 line 7): "In addition, a component of velocity of the autonomous running vehicle between the frames is derived on the basis of the movement of the imaged picture of the downward landscape from the first block to the second block and on the basis of an elapsed time between the frames, and the component of rotational speed is removed from the component of velocity to derive a component of translational speed of the autonomous running vehicle between the frames. Then, the component of translation speed between the frames is converted to a component of translation speed viewed from a ranging starting point to accumulate the converted component of translation speed to derive a navigation locus in a three-dimensional space to recognize the position of the autonomous running vehicle."
`
• and a controller configured to control the mobile robot to travel based on the self-position.

  o Saneyoshi discloses (Col. 5 line 58 - Col. 6 line 9): "The position recognizing system of a moving object, such as an autonomous running vehicle generally comprises: ... a position recognizing unit 30 for recognizing the moving object's own position in the three-dimensional space on the basis of an image information to output a navigational data to an external target control system 100." Saneyoshi further discloses (Col. 10 lines 22-32): "Then, on the basis of the moving-object's own position obtained by the navigation data, the terrain clearance and flying direction of the helicopter are precisely controlled so that the helicopter takes a route in a preset absolute coordinate or a route set to a target on the ground."
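The passages quoted for the calculator and estimator limitations describe a per-frame dead-reckoning update: a velocity is derived from the motion of the downward image between frames, the rotational component is removed, and the remaining translational component is converted to the frame of the ranging starting point and accumulated into a navigation locus. Purely as an editorial illustration of that mechanism (this is not code from Saneyoshi or from the present application, and all names are hypothetical), a minimal Python sketch assuming a nadir-looking camera at a known height over a flat floor and an externally supplied angular velocity:

    import numpy as np

    def update_navigation_locus(position, R_world_body, flow_px, dt, omega, height, focal_px):
        # Pixel displacement of the ground pattern -> metric velocity in the body frame,
        # assuming a nadir-looking camera at a known height above a roughly flat floor.
        v_body = np.array([flow_px[0], flow_px[1], 0.0]) * (height / focal_px) / dt
        # Remove the velocity induced purely by rotation (omega x lever arm to the ground
        # point below the camera), leaving the translational component.
        lever = np.array([0.0, 0.0, -height])
        v_trans_body = v_body - np.cross(omega, lever)
        # Convert to the frame of the ranging starting point and accumulate the locus.
        return position + (R_world_body @ v_trans_body) * dt

    # Example: one 50 ms frame with a 12-pixel shift and a slow yaw of 0.1 rad/s.
    pos = update_navigation_locus(np.zeros(3), np.eye(3), (12.0, 0.0), 0.05,
                                  np.array([0.0, 0.0, 0.1]), height=0.3, focal_px=600.0)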
`
Regarding claim 4, Saneyoshi discloses the aforementioned limitations of claim 1. Saneyoshi further discloses:
`
• an angular velocity sensor attached to the housing and configured to measure an angular velocity of the mobile robot,

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." FIG. 1, included above, demonstrates that the position recognizing unit 30 containing slave processor 43 is attached to the camera assembly 10 through stereo image processing unit 20. Saneyoshi further discloses (Col. 17 lines 23-38): "At subsequent step S54, an angular velocity (roll, pitch, yaw) corresponding to a difference between the positions of the No. 1 and No. 2 block groups is derived by the nonlinear least square method... Moreover, the linear approximation of the difference is carried out, and the partial differential of the linear-approximated difference is carried out every component of rotation about three axes (X, Y, Z axes). Then, an appropriate deflection angle is added thereto to derive variations (dx, dy) in the difference between the positions of the No. 1 and No. 2 blocks on the imaging element for each of roll, pitch, and yaw. The respective components of roll, pitch, and yaw are obtained as the limits of the variations (dx, dy)."
`
• wherein the estimator estimates the self-position based on the angular velocity and the velocity.

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the estimator is being interpreted as the position determination functions of slave processor 43. Saneyoshi further discloses (Col. 2 line 42 - Col. 3 line 7): "In addition, a component of velocity of the autonomous running vehicle between the frames is derived on the basis of the movement of the imaged picture of the downward landscape from the first block to the second block and on the basis of an elapsed time between the frames, and the component of rotational speed is removed from the component of velocity to derive a component of translational speed of the autonomous running vehicle between the frames. Then, the component of translation speed between the frames is converted to a component of translation speed viewed from a ranging starting point to accumulate the converted component of translation speed to derive a navigation locus in a three-dimensional space to recognize the position of the autonomous running vehicle."
`
`
`
`
Regarding claim 5, Saneyoshi discloses the aforementioned limitations of claim 1. Saneyoshi further discloses:
`
• a second camera attached to the housing and configured to generate a second lower image by photographing below the housing,

  o Saneyoshi discloses (Col. 6 lines 10-24): "The camera assembly 10 serves as a ranging sensor, which is mounted on a moving object, for imaging a surrounding environment every a predetermined amount of time to capture a variation in the surrounding environment every one screen and to capture a displacement information in accordance with a horizontal displacement. As shown in FIG. 2, the camera assembly 10 comprises: ... a set of stereo cameras (each of which will be hereinafter referred to as a "down-view stereo camera") 13a and 13b, provided on the frame 11, for imaging a downward landscape (the ground surface) required to calculate a component of translation speed of the moving object."
`
• wherein the calculator calculates an angular velocity of the mobile robot based on the first lower image and the second lower image,

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the calculator is being interpreted as the rotational velocity calculation functions of slave processor 43. Saneyoshi further discloses (Col. 8 lines 30-36): "The No. 1 block group capturing part 32 clips pattern portions (each of which will be hereinafter referred to as a "No. 1 block") of an image region suitable for the navigation processing from each of the distant-view and down-view original images at regular intervals, and captures a plurality of No. 1 blocks (a No. 1 block group) with respect to each of the distant-view and down-view images." Saneyoshi even further discloses (Col. 17 lines 23-38): "At subsequent step S54, an angular velocity (roll, pitch, yaw) corresponding to a difference between the positions of the No. 1 and No. 2 block groups is derived by the nonlinear least square method... Moreover, the linear approximation of the difference is carried out, and the partial differential of the linear-approximated difference is carried out every component of rotation about three axes (X, Y, Z axes). Then, an appropriate deflection angle is added thereto to derive variations (dx, dy) in the difference between the positions of the No. 1 and No. 2 blocks on the imaging element for each of roll, pitch, and yaw. The respective components of roll, pitch, and yaw are obtained as the limits of the variations (dx, dy)."
`
• and the estimator estimates the self-position based on the angular velocity and the velocity.

  o Saneyoshi discloses (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the estimator is being interpreted as the position determination functions of slave processor 43. Saneyoshi further discloses (Col. 2 line 42 - Col. 3 line 7): "In addition, a component of velocity of the autonomous running vehicle between the frames is derived on the basis of the movement of the imaged picture of the downward landscape from the first block to the second block and on the basis of an elapsed time between the frames, and the component of rotational speed is removed from the component of velocity to derive a component of translational speed of the autonomous running vehicle between the frames. Then, the component of translation speed between the frames is converted to a component of translation speed viewed from a ranging starting point to accumulate the converted component of translation speed to derive a navigation locus in a three-dimensional space to recognize the position of the autonomous running vehicle."
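For illustration only, the combination of a first and a second downward camera addressed in this claim allows both the angular velocity and the translational velocity to be recovered: under a rigid planar-motion assumption, the translational part cancels in the difference between the ground flows seen by the two cameras, leaving the rotation. The following Python sketch is not Saneyoshi's algorithm and its names are hypothetical:

    import numpy as np

    def planar_twist_from_two_down_cameras(flow1_mps, flow2_mps, r1, r2):
        # Rigid planar motion: the ground flow under a camera at body position r_i is
        # flow_i = -(v + omega_z x r_i), so the translation cancels in the difference.
        flow1, flow2, r1, r2 = map(np.asarray, (flow1_mps, flow2_mps, r1, r2))
        dr = r1 - r2                      # baseline between the two cameras
        df = flow1 - flow2
        # For planar rotation, omega_z x r = omega_z * (-r_y, r_x), so df = omega_z * (dr_y, -dr_x).
        basis = np.array([dr[1], -dr[0]])
        omega_z = float(df @ basis) / float(basis @ basis)
        v_body = -flow1 - omega_z * np.array([-r1[1], r1[0]])
        return omega_z, v_body

    # Example: cameras 0.2 m apart along x; a pure yaw of 0.1 rad/s and no translation.
    print(planar_twist_from_two_down_cameras((0.0, -0.01), (0.0, 0.01), (0.1, 0.0), (-0.1, 0.0)))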
`
`Claim Rejections - 35 USC § 103
`
`8.
`
`The following is a quotation of 35 U.S.C. 103 which formsthe basis for all obviousness
`
`rejections set forth in this Office action:
`
`A patent for a claimed invention maynotbe obtained, notwithstanding that the claimed inventionis not
`identically disclosed as set forth in section 102, if the differences between the claimed invention and the
`prior art are such that the claimed invention as a whole would have been obviousbefore the effective
`filing date of the claimed invention to a person having ordinaryskill in the art to which the claimed
`invention pertains. Patentability shall not be negated by the mannerin which the invention was made.
`
`9.
`
`The factual inquiries for establishing a background for determining obviousness under 35 U.S.C.
`
`103 are summarizedas follows:
`
`1. Determining the scope and contents of the priorart.
`
`2. Ascertaining the differences between the prior art and the claimsat issue.
`
`3. Resolving the level of ordinary skill in the pertinentart.
`
`4. Considering objective evidence present in the application indicating obviousness or
`
`nonobviousness.
`
`10.
`
`Claim(s) 2 is/are rejected under 35 U.S.C. 103 as being unpatentable over Saneyoshi in view of
`
`Tokuraet al. (US 2021/0373169 A1), hereinafter Tokura.
`
Regarding claim 2, Saneyoshi teaches the aforementioned limitations of claim 1. However, Saneyoshi does not outright teach that the detector includes three or more distance measurement sensors, each of the three or more distance measurement sensors measuring a distance between a floor surface on which the mobile robot travels and the housing, and the calculator calculates the attitude based on the distance obtained from each of the three or more distance measurement sensors. Tokura teaches a movable object and distance measurement method, comprising:
`
`
`
`
• the detector includes three or more distance measurement sensors, each of the three or more distance measurement sensors measuring a distance between a floor surface on which the mobile robot travels and the housing,

  o Tokura teaches ([0067]): "The orientation measurement parts 211B are sensors provided at four corners of the vehicle body 201 in a plan view, and measure distances to a floor F. The control device 220 can acquire an orientation angle A of the vehicle body 201 from the distances to the floor F measured by the orientation measurement parts 211B."

• and the calculator calculates the attitude based on the distance obtained from each of the three or more distance measurement sensors.

  o Tokura teaches ([0067]): "The orientation measurement parts 211B are sensors provided at four corners of the vehicle body 201 in a plan view, and measure distances to a floor F. The control device 220 can acquire an orientation angle A of the vehicle body 201 from the distances to the floor F measured by the orientation measurement parts 211B."
`
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Saneyoshi to incorporate the teachings of Tokura to provide that the detector includes three or more distance measurement sensors, each of the three or more distance measurement sensors measuring a distance between a floor surface on which the mobile robot travels and the housing, and the calculator calculates the attitude based on the distance obtained from each of the three or more distance measurement sensors. Saneyoshi and Tokura are each directed towards similar pursuits in the field of moving object navigation including imaging below the moving object. Accordingly, one of ordinary skill in the art would find it advantageous to incorporate the teachings of Tokura, as utilizing the four distance measurement sensors of Tokura advantageously allows for prevention of erroneous detection due to a tilt of a singular distance measurement sensor, as recognized by Tokura ([0070]).
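As a purely illustrative aside (not Tokura's disclosed implementation), the mechanism quoted above, in which an orientation angle is acquired from floor distances measured at the corners of the body, can be sketched as follows under a small-angle, flat-floor assumption; the function, parameter names, and sensor placement are hypothetical:

    import numpy as np

    def attitude_from_corner_distances(distances, half_length, half_width):
        # distances: (front-left, front-right, rear-left, rear-right) floor distances in meters,
        # measured by sensors offset half_length forward/backward and half_width left/right.
        d_fl, d_fr, d_rl, d_rr = distances
        # Pitch: compare the mean front distance with the mean rear distance over the wheelbase.
        pitch = np.arctan2((d_rl + d_rr) / 2 - (d_fl + d_fr) / 2, 2 * half_length)
        # Roll: compare the mean left distance with the mean right distance over the track width.
        roll = np.arctan2((d_fr + d_rr) / 2 - (d_fl + d_rl) / 2, 2 * half_width)
        return roll, pitch

    # Example: rear corners 1 cm farther from the floor than the front corners (slight pitch).
    print(attitude_from_corner_distances((0.10, 0.10, 0.11, 0.11), 0.25, 0.15))

Using four sensors also leaves one redundant measurement, which is consistent with the advantage noted above of guarding against an erroneous reading from a single tilted sensor.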
`
`
`
`
`11.
`
`Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Saneyoshi in view of
`
`Smith et al. (US 2016/0231426 A1), hereinafter Smith.
`
Regarding claim 3, Saneyoshi teaches the aforementioned limitations of claim 1. Saneyoshi further teaches:
`
• and the calculator calculates the attitude and the velocity based on the first lower image.

  o Saneyoshi teaches (Col. 9 lines 37-49): "That is, the position recognizing unit 30 basically comprises: ... a slave processor (SYS2 slave) 43 for calculating rotation, translation, and navigation..." Here, the calculator is being interpreted as the rotational velocity calculation functions of slave processor 43. Saneyoshi further teaches (Col. 8 lines 30-36): "The No. 1 block group capturing part 32 clips pattern portions (each of which will be hereinafter referred to as a "No. 1 block") of an image region suitable for the navigation processing from each of the distant-view and down-view original images at regular intervals, and captures a plurality of No. 1 blocks (a No. 1 block group) with respect to each of the distant-view and down-view images." Saneyoshi even further teaches (Col. 17 lines 23-38): "At subsequent step S54, an angular velocity (roll, pitch, yaw) corresponding to a difference between the positions of the No. 1 and No. 2 block groups is derived by the nonlinear least square method... Moreover, the linear approximation of the difference is carried out, and the partial differential of the linear-approximated difference is carried out every component of rotation about three axes (X, Y, Z axes). Then, an appropriate deflection angle is added thereto to derive variations (dx, dy) in the difference between the positions of the No. 1 and No. 2 blocks on the imaging element for each of roll, pitch, and yaw. The respective components of roll, pitch, and yaw are obtained as the limits of the variations (dx, dy)."
`
However, Saneyoshi does not outright teach that the detector includes a light source that emits structured light toward below the mobile robot, wherein the first camera generates the first lower image by detecting reflected light of the structured light emitted from the light source and reflected on a floor surface on which the mobile robot travels. Smith teaches a machine positioning system, comprising:
`
• the detector includes a light source that emits structured light toward below the mobile robot,

  o Smith teaches ([0028]): "The light sources are lasers that emit light that reflects off of, for example, the surfaces of side walls 24 and/or other surfaces of objects in worksite 20 within a field of view of the Lidar unit 32. The light may be emitted by a single laser that is reflected by a rotating mirror to scan the portion of the worksite in the field of view. Alternatively, the light may be emitted by multiple lasers directed to different angles within the field of view so to radiate light across the field of view a non-scanning manner. One or more detectors of the Lidar unit 32 receive the reflected light and send signals to controller 18 indicative of the light received. Controller 18 then calculates distances to the various points on the surfaces. The calculated distances are based on the elapsed time between emission of the light and detection of the light, the distance to the surface being half of the elapsed time multiplied by the speed of light."
`
• the first camera generates the first lower image by detecting reflected light of the structured light emitted from the light source and reflected on a floor surface on which the mobile robot travels,

  o Smith teaches ([0028]): "The light sources are lasers that emit light that reflects off of, for example, the surfaces of side walls 24 and/or other surfaces of objects in worksite 20 within a field of view of the Lidar unit 32. The light may be emitted by a single laser that is reflected by a rotating mirror to scan the portion of the worksite in the field of view. Alternatively, the light may be emitted by multiple lasers directed to different angles within the field of view so to radiate light across the field of view a non-scanning manner. One or more detectors of the Lidar unit 32 receive the reflected light and send signals to controller 18 indicative of the light received. Controller 18 then calculates distances to the various points on the surfaces. The calculated distances are based on the elapsed time between emission of the light and detection of the light, the distance to the surface being half of the elapsed time multiplied by the speed of light."
`
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Saneyoshi to incorporate the teachings of Smith to provide that the detector includes a light source that emits structured light toward below the mobile robot, wherein the first camera generates the first lower image by detecting reflected light of the structured light emitted from the light source and reflected on a floor surface on which the mobile robot travels. Saneyoshi and Smith are each directed towards similar pursuits in the field of moving object navigation including imaging below the moving object. Accordingly, one of ordinary skill in the art would find it advantageous to incorporate the teachings of Smith, as in addition to determining distances to the surface (see at least [0028]), Smith anticipates that obtained LIDAR data can be used to determine shapes of scanned objects or portions of the worksite (see at least [0030]). This worksite data can be stored for later access as worksite data, as recognized by Smith ([0033]), thereby allowing for later access of the LIDAR information.
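For reference only, the time-of-flight relation quoted from Smith, in which the distance to the surface is half of the elapsed emit-to-detect time multiplied by the speed of light, can be written directly; the snippet below is an editorial sketch with hypothetical names, not code from Smith:

    # Range from a single Lidar time-of-flight measurement.
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def lidar_range_m(elapsed_time_s: float) -> float:
        # Half the round-trip time multiplied by the speed of light.
        return 0.5 * elapsed_time_s * SPEED_OF_LIGHT_M_S

    # A 100 ns round trip corresponds to roughly 15 m.
    print(lidar_range_m(100e-9))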
`
`12.
`
`Claim(s) 6 is/are rejected under 35 U.S.C. 103 as being unpatentable over Saneyoshi in view of
`
`Milici (US 2020/0134853 A1).
`
`
`
`
Regarding claim 6, Saneyoshi teaches the aforementioned limitations of claim 1. Saneyoshi further teaches:
`
• a second camera attached to the housing and configured to generate a second lower image by photographing below the housing,

  o Saneyoshi teaches (Col. 6 lines 10-24): "The camera assembly 10 serves as a ranging sensor, which is mounted on a moving object, for imaging a surrounding environment every a predetermined amount of time to capture a variation in the surrounding environment every one screen and to capture a displacement information in accordance with a horizontal displacement. As shown in FIG. 2, the camera assembly 10 comprises: ... a set of stereo cameras (each of which will be hereinafter referred to as a "down-view stereo camera") 13a and 13b, provided on the frame 11, for imaging a downward landscape (the ground surface) required to calculate a component of translation speed of the moving object."
`
• the first camera and the second camera are attached to the housing,

  o Saneyoshi teaches (Col. 6 lines 10-24): "The camera assembly 10 serves as a ranging sensor, which is mounted on a moving object, for imaging a surrounding environment every a predetermined amount of time to capture a variation in the surrounding environment every one screen and to capture a displacement information in accordance with a horizontal displacement. As shown in FIG. 2, the camera assembly 10 comprises: ... a set of stereo cameras (each of which will be hereinafter referred to as a "down-view stereo camera") 13a and 13b, provided on the frame 11, for imaging a downward landscape (the ground surface) required to calculate a component of translation speed of the moving object."
`
`
`