UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

Application No.: 17/316,760
Filing Date: 05/11/2021
First Named Inventor: Yumiko OHNO
Attorney Docket No.: 083710-3336
Confirmation No.: 6311

McDermott Will and Emery LLP
The McDermott Building
500 North Capitol Street, N.W.
Washington, DC 20001

Examiner: BRUTUS, JOEL F
Art Unit: 3793

Notification Date: 09/26/2023
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

mweipdocket@mwe.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*
5) Claim(s) 1-14 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) Claim(s) ___ is/are allowed.
7) Claim(s) 1-14 is/are rejected.
8) Claim(s) ___ is/are objected to.
9) Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers
10) The specification is objected to by the Examiner.
11) The drawing(s) filed on ___ is/are: a)( ) accepted or b)( ) objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119
12)[X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) All   b) Some**   c) None of the:
    1.[X] Certified copies of the priority documents have been received.
    2.( ) Certified copies of the priority documents have been received in Application No. ___.
    3.( ) Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)
1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) ( ) Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) ( ) Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20230913

Office Action Summary

Application No.: 17/316,760
Applicant(s): OHNO et al.
Examiner: JOEL F BRUTUS
Art Unit: 3793
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1)[X] Responsive to communication(s) filed on 5/11/2021.
   ( ) A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a)( ) This action is FINAL.   2b)[X] This action is non-final.
3)( ) An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4)( ) Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Application/Control Number: 17/316,760
Art Unit: 3793

Page 2

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

1. Claims 1-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites the limitation "the data". There is insufficient antecedent basis for this limitation in the claim. Claims 2-12 are rejected for the same reasons as set forth in claim 1.

Claims 13-14 recite the limitation "the data". There is insufficient antecedent basis for this limitation in the claims.

The term "amount of light" in claims 1 and 13-14 is a relative term which renders the claims indefinite. The term "amount of light" is not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Claims 2-12 are rejected for the same reasons as set forth in claim 1.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

2. Claim(s) 1-3, 9 and 12-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Fuji et al (Pub. No.: US 2017/0289568).

Regarding claims 1, 13-14, Fuji et al disclose a biometric apparatus comprising:

a light source that emits a light pulse radiated onto a target part including a head of a target [see 0046, 0061];

an image sensor that receives a reflected light pulse which is caused as the light pulse is radiated onto the target part, and that outputs first image data indicating appearance of a face of the target [see abstract, 0068, 0078, 0097], by disclosing that as a consequence of the light emission, a prescribed light image (an irradiation dot pattern) is formed on the head 102 (the forehead) [see 0097], and by disclosing that control circuit 114 converts signal charges accumulated as a consequence of the light reception into digital data, and instructs the first signal processing circuit 115 and the second signal processing circuit 116 to process image data thus obtained [see 0078];

second image data according to distribution of an amount of light of at least one of components of the reflected light pulse [see 0078, 0046-0047, 0069, 0072, 0078, 0081-0082];

a control circuit (114, fig 1) that controls the light source and the image sensor [see 0046, 0061], by disclosing a control circuit 114, and a light source 600 and an image sensor 602 which are controlled by the control circuit 114 [see 0061];

a signal processing circuit (604, fig 1), wherein the control circuit causes the light source to emit the light pulse repeatedly and the image sensor to output the first image data and the second image data [see 0046];

the signal processing circuit generates data indicating a state of the target based on a temporal change in the first image data and a temporal change in the second image data and outputs the data [see 0071-0072, 0081], by disclosing that the image pickup apparatus 1001 detects density distribution conditions and temporal changes of oxidized hemoglobin and deoxidized hemoglobin in the brain to be observed, and constructs the density distribution conditions in the form of two-dimensional images [see 0072], and by disclosing that the signal processing circuit 604 obtains changes in density of oxidized hemoglobin and deoxidized hemoglobin from acquired information on brightness and darkness, and outputs the brain activity in the form of imaging data by means of computation using the changes in density of oxidized hemoglobin and deoxidized hemoglobin thus obtained [see 0071].

Regarding claim 2, Fuji et al disclose wherein the control circuit causes the image sensor to generate the second image data by causing the image sensor to detect a component of the reflected light pulse in a period including at least a part of a falling period, the falling period being a period from a beginning to an end of a decrease in intensity of the reflected light pulse, after the falling period starts [see 0102-0103, 0162].

Regarding claim 3, Fuji et al disclose wherein the control circuit causes the image sensor to generate the first image data by causing the image sensor to detect a component of the reflected light pulse in a period including at least a part of a period before the falling period of the reflected light pulse starts [see 0102-0103, 0069].

Regarding claim 9, Fuji et al disclose wherein the image sensor includes light detection cells arranged in two dimensions [see 0004, 0046]; each of the light detection cells includes a photoelectric conversion element, a first charge accumulator, and a second charge accumulator [see 0046]; the control circuit causes the first charge accumulator to accumulate first charge and the first image data is generated based on the first charge [see 0046]; and the control circuit causes the second charge accumulator to accumulate second charge and the second image data is generated based on the second charge [see 0046].

Regarding claim 12, Fuji et al disclose wherein the signal processing circuit presents the data indicating the state (brain activity) of the target to the target through an information device [see 0071-0072, 0081], by disclosing that the apparatus "outputs the brain activity in the form of imaging data by means of computation using the changes in density of oxidized hemoglobin and deoxidized hemoglobin thus obtained" [see 0071].

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

3. Claim(s) 4-8 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Fuji et al (Pub. No.: US 2017/0289568) in view of Aoyama et al (Pub. No.: US 2017/0310743).

Regarding claims 4-5, Fuji et al don't disclose wherein resolution of the first image data and resolution of the second image data are different from each other, and wherein resolution of the first image data is higher than resolution of the second image data.

Nonetheless, Aoyama et al disclose wherein resolution of the first image data and resolution of the second image data are different (by using a zoom method, emphasis added, 1641) from each other [see 1639], and wherein resolution of the first image data is higher (by using a zoom method, emphasis added) than resolution of the second image data [see 0752, 1148, 1188, 1639, 1641, 1649-1652], by disclosing that image sensor 10080a captures an image having a resolution of 16 pixels in width and 12 pixels in height, out of the 32 by 24 imaging elements included in the image sensor 10080a [see 1639], and further disclose that only odd-numbered or even-numbered imaging elements in each of the heightwise and widthwise arrangements of imaging elements is used to capture an image. By doing so, an image 10080b having a desired resolution is obtained [see 1639].

Therefore, it would have been obvious to one skilled in the art at the time the invention was filed, who would have been motivated to combine Fuji et al and Aoyama et al by making resolution of the first image data and resolution of the second image data different from each other, with resolution of the first image data higher than resolution of the second image data; to reduce image noise.

Regarding claim 6, Fuji et al don't disclose performing a process for changing at least one resolution selected from the group consisting of resolution of at least a part of an image indicated by the first image data and resolution of at least a part of an image indicated by the second image data, and generating the data indicating the state of the target based on the temporal change in the first image data and the temporal change in the second image data after the process is performed.

Nonetheless, Aoyama et al disclose performing a process for changing at least one resolution selected from the group consisting of resolution of at least a part of an image indicated by the first image data and resolution of at least a part of an image indicated by the second image data, and generating the data indicating the state (position, emphasis added, and/or gaze of the user after blinking and/or direction of the eye, see 0686, 0727) of the target based on the temporal change in the first image data and the temporal change in the second image data after the process is performed [see 0758, 0686, 0729, 0731], by disclosing estimating the position and direction of the head and the gaze direction (the position and direction of the eye) of the user 7510e by image processing [see 0727].

Therefore, it would have been obvious to one skilled in the art at the time the invention was filed, who would have been motivated to combine Fuji et al and Aoyama et al by performing a process for changing at least one resolution selected from the group consisting of resolution of at least a part of an image indicated by the first image data and resolution of at least a part of an image indicated by the second image data, and generating the data indicating the state (position, emphasis added) of the target based on the temporal change in the first image data and the temporal change in the second image data after the process is performed; to reduce image noise.

Regarding claims 7-8, Fuji et al don't disclose wherein the image sensor outputs the first image data at a first frame rate, the image sensor outputs the second image data at a second frame rate, the first frame rate and the second frame rate are different from each other, and the first frame rate is higher than the second frame rate.

Nonetheless, Aoyama et al disclose outputting the first image data at a first frame rate and the second image data at a second frame rate, wherein the first frame rate and the second frame rate are different from each other and the first frame rate is higher than the second frame rate [see 1708-1710].

Therefore, it would have been obvious to one skilled in the art at the time the invention was filed, who would have been motivated to combine Fuji et al and Aoyama et al by outputting the first image data at a first frame rate and the second image data at a second frame rate, with the first frame rate and the second frame rate different from each other and the first frame rate higher than the second frame rate; in order to receive the visible light signal with increased speed [see 1708].

Regarding claim 10, Fuji et al don't disclose a temporal change in appearance information indicating at least one selected from the group consisting of a line of sight of the target, size of a pupil of the target, frequency of blinking of the target, time intervals of blinking of the target, and facial expression of the target, and generating the data indicating the state of the target based on the temporal change in the appearance information and the temporal change in the second image data.

Nonetheless, Aoyama et al disclose a temporal change in appearance information indicating at least one selected from the group consisting of a line of sight of the target [see 1660], size of a pupil of the target, frequency of blinking of the target [see 2013, 2017, 2804], time intervals of blinking of the target [see 2804], and facial expression of the target, and generating the data indicating the state of the target based on the temporal change in the appearance information and the temporal change in the second image data [see 0758, 0686, 0729, 0731].

Therefore, it would have been obvious to one skilled in the art at the time the invention was filed, who would have been motivated to combine Fuji et al and Aoyama et al by using a temporal change in appearance information indicating at least one selected from the group consisting of a line of sight of the target, size of a pupil of the target, frequency of blinking of the target, time intervals of blinking of the target, and facial expression of the target, and generating the data indicating the state of the target based on the temporal change in the appearance information and the temporal change in the second image data; to reduce image noise.

4. Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Fuji et al (Pub. No.: US 2017/0289568) in view of Kogut et al (Pub. No.: US 2014/0375785).

Regarding claim 11, Fuji et al don't disclose the light source emitting the light pulse and the image sensor generating the first image data and the second image data with a stimulus given to the target, wherein the data indicating the state of the target indicates at least one state selected from the group consisting of interest of the target, comfort of the target, sleepiness of the target, and concentration of the target in reaction to the stimulus.

Nonetheless, Kogut et al disclose emitting the light pulse and the image sensor generating the first image data and the second image data with a stimulus given to the target [see abstract, 0041-0046, claim 8], wherein the data indicating the state of the target indicates at least one state selected from the group consisting of interest of the target, comfort of the target, sleepiness (fatigue) of the target, and concentration of the target in reaction to the stimulus [see abstract, 0041-0046], by disclosing directing illuminating light onto a face of the subject. The illuminating light can reflect off the face of the subject to form reflected light. The system can include collection optics that collect a portion of the reflected light and produce video-rate images of the face of the subject [see abstract].

Therefore, it would have been obvious to one skilled in the art at the time the invention was filed, who would have been motivated to combine Fuji et al and Kogut et al by emitting the light pulse and having the image sensor generate the first image data and the second image data with a stimulus given to the target, the data indicating the state of the target indicating at least one state selected from the group consisting of interest of the target, comfort of the target, sleepiness of the target, and concentration of the target in reaction to the stimulus; to determine a stress level of the subject from the stress signatures.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOEL F BRUTUS, whose telephone number is (571) 270-3847. The examiner can normally be reached Mon-Fri, 10:00 AM to 7:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christopher Koharski, can be reached at 571-272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOEL F BRUTUS/
Primary Examiner, Art Unit 3793