UNITED STATES PATENT AND TRADEMARK OFFICE

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 14/169,874
FILING DATE: 01/31/2014
FIRST NAMED INVENTOR: Takeshi Yamaguchi
ATTORNEY DOCKET NO.: WASHM-52186
CONFIRMATION NO.: 9823

Pearne & Gordon LLP
1801 East 9th Street, Suite 1200
Cleveland, OH 44114-3108

EXAMINER: TRUONG, NGUYEN H
ART UNIT: 2622
NOTIFICATION DATE: 08/12/2016
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the
following e-mail address(es):

patdocket@pearne.com
jcholley@pearne.com

PTOL-90A (Rev. 04/07)
`
`
`
`
`
Application No.: 14/169,874
Applicant(s): YAMAGUCHI ET AL.

Office Action Summary

Examiner: NGUYEN H. TRUONG
Art Unit: 2622
AIA (First Inventor to File) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF
THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed
  after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
  Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any
  earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 06/29/2016.
       [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on _____.
2a) [ ] This action is FINAL.          2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on _____;
       the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is
       closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1-17 is/are pending in the application.
       5a) Of the above claim(s) _____ is/are withdrawn from consideration.
6) [ ] Claim(s) _____ is/are allowed.
7) [X] Claim(s) 1-17 is/are rejected.
8) [ ] Claim(s) _____ is/are objected to.
9) [ ] Claim(s) _____ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a
  participating intellectual property office for the corresponding application. For more information, please see
  http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [ ] The drawing(s) filed on _____ is/are: a) [ ] accepted or b) [ ] objected to by the Examiner.
        Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
        Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [ ] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
        Certified copies:
        a) [ ] All   b) [ ] Some**   c) [ ] None of the:
        1. [ ] Certified copies of the priority documents have been received.
        2. [ ] Certified copies of the priority documents have been received in Application No. _____.
        3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage
               application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date _____
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date _____
4) [ ] Other: _____

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)                    Office Action Summary                    Part of Paper No./Mail Date 20160808
`
`
`
`
`DETAILED ACTION
`
`Notice of Pre-AIA or AIA Status
`
`The present application, filed on or after March 16, 2013, is being examined under the first
`
`inventor to file provisions of the AIA.
`
Continued Examination Under 37 CFR 1.114
`
`A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR
`
`1.17(e), was filed in this application after final rejection. Since this application is eligible for continued
`
examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the
`
`finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's
`
`submission filed on 06/29/2016 has been entered.
`
`Response to Arguments
`
`Applicant’s arguments with respect to claims 1,16,17 have been considered but are moot
`
`because the arguments do not apply to any of the references being used in the current rejection.
`
`Claim Rejections - 35 USC § 103
`
`In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and
`
`103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for
`
`the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale
`
`supporting the rejection, would be the same under either status.
`
`The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections
`
`set forth in this Office action:
`
`A patent for a claimed invention may not be obtained, notwithstanding that the claimed
`invention is not identically disclosed as set forth in section 102 of this title, if the
`differences between the claimed invention and the prior art are such that the claimed
`invention as a whole would have been obvious before the effective filing date of the
`claimed invention to a person having ordinary skill in the art to which the claimed
`invention pertains. Patentability shall not be negated by the manner in which the
`invention was made.
`
`This application currently names joint inventors. In considering patentability of the claims the
`
`examiner presumes that the subject matter of the various claims was commonly owned as of the effective
`
`
`
`
`filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the
`
`obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not
`
`commonly owned as of the effective filing date of the later invention in order for the examiner to consider
`
`the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later
`
`invention.
`
`1.
`
Claims 1-2, 4-8, 10-13, and 16-17 are rejected under 35 U.S.C. 103 as being unpatentable over Baard

(U.S. Pub. No. 2013/0201136 A1) in view of Honda et al. (U.S. Pub. No. 2011/0175845 A1), and Tan et al.

(U.S. Pub. No. 2012/0306772 A1).
`
`Regarding Claim 1, Baard teaches an electronic device (portable electronic device 1, Figs.1-3)
`
`comprising:
`
`a display section (a proximity-sensing user interface 2, Fig.1. Paragraph [0021], the proximity-
`
`sensing user interface comprises a capacitive touch sensor panel overlaid on a display panel);
`
`a depression detecting section (force sensor 19, Fig.2) that detects whether a first
`
`transparent member overlapped with the display section is depressed by an indicator (paragraphs
`
`[0022,0037], the force sensor senses a force applied onto the touch sensor panel 12);
`
`a coordinate detecting section that continuously detects a distance between the indicator
`
`and a touch panel layer overlapped with the display section; and coordinates along a surface of
`
`the display section corresponding to a position of the indicator regardless of whether the first
`
`transparent member is detected to be depressed by the indicator (paragraph [0006], the proximity-
`
`sensing user interface is configured to capture position information and distance information between a
`
`user's skin and a surface of the proximity-sensing user interface. Paragraphs [0020, 0059, 0062], a
`
`proximity-sensing user interface provides position information indicative of a position of the user interface
`
`contact by the user's skin or above which the user's finger is hovering); and
`
`a state determiner that determines, when the detected distance is longer than a first
`
`distance and shorter than a second distance (Figs.4-10 illustrate situations in which a finger (i.e.,
`
`gloved finger or non-gloved finger) is operating the touch sensor such as performing a touch action or a
`
`hover action. A distance between a user’s skin and the touch sensor panel is from 0 (i.e., when a non-
`
`
`
`
`gloved finger contacts with the touch sensor panel) to an upper threshold distance (e.g., upper threshold
`
`34 or 37) when a gloved finger or a non-gloved finger hovers over the touch sensor panel. More
`
`specifically, Figs.7-8, the user is operating the touch sensor panel in a first input mode (e.g., gloved
`
finger). An input action is a touch action when a distance between the user's skin and the touch sensor panel is from 0 to 32 (Fig.7).

An input action is a hover action when a distance between the user's skin and the touch sensor panel is from 32 to 34 (Fig.8).
`
`Similarly, it is determined that a non-gloved finger is operating the touch sensor (i.e., in a second input
`
`mode) in response to detecting that the user's skin contacts with the touch sensor panel (Fig.9). An input
`
`action is a hover action when the user does not wear a glove and a distance between the user's skin and
`
`the touch sensor panel being from 36 to 37 (Fig.10)), that:
`
`(a) a touch input is indicated when the touch sensor panel is detected to be
`
depressed (paragraph [0106], an output signal of a force sensor may be utilized to discriminate touch

actions from hover actions in the first input mode (e.g., gloved finger) or the second input mode (e.g., a
`
`non-gloved finger). Accordingly, a skilled person in the art would have appreciated that if it is determined
`
`that a force is applied to the touch sensor panel; the input action is a touch action. Reversely, if it is
`
`determined that a force is not applied to the touch sensor panel, the input action is a hover action), and
`
`(c) a hover input is indicated if the touch sensor panel is not detected to be
`
`depressed (as described above, if it is determined that a force is not applied onto the touch sensor panel,
`
`the input action would be a hover action); and
`
`a processor (a central processing unit 6, Fig.2) that performs:
`
`when the state determiner determines the touch input is indicated, a processing
`
`associated with the touch input at the detected coordinates, and when the state determiner
`
`determines the hover input is indicated, a processing associated with the hover input at the
`
`detected coordinates, the processing associated with the hover input being different from the
`
`processing associated with the touch input (Paragraph [0066], in both input modes, touch actions and
`
`hover actions are used to control the portable electronic device. The controller may perform different
`
`functions depending on whether an input action at a given position is identified to be a touch action or a
`
`hover action).
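
For illustration only, and not part of Baard, Honda, Tan, or the claims, the determination described above can be summarized in the following Python sketch; the function, parameter names, and example values are hypothetical.

def determine_input_state(depressed, distance, first_distance, second_distance):
    """Illustrative touch/hover determination for a detected distance inside the
    (first_distance, second_distance) band; returns None outside that band."""
    if not (first_distance < distance < second_distance):
        return None
    if depressed:
        return "touch"   # (a) depression detected by the force sensor indicates a touch input
    return "hover"       # (c) no depression detected indicates a hover input

# Example: an indicator hovering 5 units above the panel with no force applied.
print(determine_input_state(depressed=False, distance=5, first_distance=0, second_distance=34))  # prints "hover"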
`
`
`
`
`Baard does not teach a first transparent member overlapped with the display section.
`
Honda teaches a first transparent member overlapped with the display section (Fig.1, a

display cover 5 is disposed while being overlapped with a display 7. The display cover 5 should be

transparent so that a user can see an image displayed on the display 7).
`
At the time the invention was filed, it would have been obvious to one of ordinary skill in the art to
`
`modify the touch screen of Baard to include the display cover of Honda placed on top of the touch sensor
`
`panel. Accordingly, a force applied onto the display cover would be detected by the force sensor so as to
`
`discriminate a touch action from a hover action. Such a modification is a result of combining prior art
`
`elements according to known methods to yield predictable results. More specifically, the portable
`
`electronic device of Baard as modified by Honda is known to yield a predictable result of providing a
`
`protection for the touch display screen. Thus, a person of ordinary skill would have appreciated including
`
`in the portable electronic device of Baard the ability to include the display cover of Honda since the
`
`claimed invention is merely a combination of old elements, and in the combination each element merely
`
`would have performed the same function as it did separately, and one of ordinary skill in the art would
`
`have recognized that the results of the combination were predictable.
`
Baard and Honda do not teach that the coordinate detecting section detects two dimensional

coordinates along a surface of the display section corresponding to a changing position of the indicator in
`
`a vertical or horizontal direction regardless of whether the first transparent member is detected to be
`
`depressed by the indicator, determining that (b) the touch input is indicated, regardless of whether
`
`the first transparent member is detected to be depressed, for at least a predetermined time period
`
`after the touch input has been determined to be indicated in response to detecting that the first
`
`transparent member is depressed.
`
`Tan teaches the coordinate detecting section detects two dimensional coordinates along a
`
`surface of the display section corresponding to a changing position of the indicator in a vertical or
`
`horizontal direction regardless of whether the first transparent member is detected to be depressed
`
`by the indicator, determining that (b) the touch input is indicated, regardless of whether the first
`
`transparent member is detected to be depressed, for at least a predetermined time period after the
`
`
`
`
`touch input has been determined to be indicated in response to detecting that the first transparent
`
`member is depressed (paragraph [0105], Tan discloses a method of determining a physical contact with
`
`a touch screen not being interrupted. For example, while a user may be swiping across the touch screen,
`
`he temporarily lifts his finger from the touch screen. The method may determine that the lift is inadvertent
`
`based on the lift from the touch screen lasting for less than a threshold amount of time. The time
`
`threshold may be 0.1 seconds. Therefore, Tan further teaches continuously detecting two dimensional
`
`coordinates corresponding to a changing position of a finger regardless of whether the touch screen is
`
`physically touched by the finger).
`
At the time the invention was filed, it would have been obvious to one of ordinary skill in the art to

modify the system of Baard and Honda to include the method of Tan of continuously detecting a
`
`changing position of a finger regardless of whether a touch screen is physically touched by the finger.
`
`Accordingly, the touch input is indicated for at least a predetermined time period after the touch input has
`
`been determined to be indicated. The suggestion/motivation would have been in order to avoid
`
inadvertent inputs (e.g., inadvertently lifting the finger from the touch screen).
`
`Regarding Claim 2; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Baard, Honda, and Tan further teach the state determiner determines that the hover input is
`
`indicated when the detected distance becomes longer than the second distance before the
`
`predetermined time period passes after the touch input has been determined to be indicated in
`
response to detecting that the first transparent member is depressed (referring to the analysis of
`
`Claim 1 above, the pause threshold period is applied to the touch event when the user’s skin is between
`
`the first distance and the second distance. Baard further discloses that an input action is a hover action
`
`when a distance between the user's skin is from 32 to 34 (Fig.8, par.[0074]). Thus, as long as the
`
`distance is greater than the distance 32, the input action is judged as a hover input no matter how long
`
`the finger is released after a pressure is applied. Thus, the hover input can be detected before the pause
`
`threshold period ends).
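
For illustration only, and not drawn from any cited reference, the behavior discussed for Claims 1 and 2 above, namely that a touch input continues to be indicated for a predetermined time period after depression is no longer detected unless the detected distance becomes longer than the second distance, can be sketched as follows. All names are hypothetical, and the 0.1-second value merely echoes the threshold noted in the discussion of Tan.

import time

class StateDeterminer:
    """Illustrative sketch: keep indicating a touch input for a predetermined
    period after the last detected depression, so that a brief inadvertent lift
    does not end the touch; indicate a hover input immediately if the detected
    distance exceeds the second distance."""

    def __init__(self, hold_period=0.1):
        self.hold_period = hold_period            # predetermined time period, in seconds (hypothetical value)
        self.last_depression_time = None

    def update(self, depressed, distance, second_distance, now=None):
        now = time.monotonic() if now is None else now
        if depressed:
            self.last_depression_time = now
            return "touch"                        # (a) depression detected
        if distance > second_distance:
            self.last_depression_time = None
            return "hover"                        # distance beyond the second distance indicates a hover input
        if (self.last_depression_time is not None
                and now - self.last_depression_time < self.hold_period):
            return "touch"                        # (b) touch persists during the hold period after an inadvertent lift
        return "hover"                            # (c) otherwise a hover input

# Example: a depression at t = 0.00 s followed by a brief lift at t = 0.05 s is still a touch.
sd = StateDeterminer()
print(sd.update(depressed=True,  distance=0, second_distance=34, now=0.00))  # touch
print(sd.update(depressed=False, distance=5, second_distance=34, now=0.05))  # touch (within the 0.1 s hold period)
print(sd.update(depressed=False, distance=5, second_distance=34, now=0.30))  # hover (hold period elapsed)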
`
`
`
`
`Regarding Claim 4; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Baard further teaches the first distance is 0 (Fig.9, a non-gloved finger contacts with the touch sensor
`
`panel. Thus, the distance between the user’s skin and the touch sensor panel is 0).
`
`Regarding Claim 5; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches a housing (casing 1, Fig.1), wherein at least part of the first transparent
`
`member is exposed from the housing (as shown in Fig.1).
`
`Regarding Claim 6; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the first transparent member (display cover 5, Fig.1) and the touch panel layer
`
`(touch panel 2, Fig.1) are integrated into one piece (as shown in Fig.1).
`
`Regarding Claim 7; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the display section has a rectangle shape (Fig.2, display panel 7 is a
`
`rectangle); and the depression detecting section is disposed along at least one side of the
`
`rectangle shape (Fig.2, paragraph [0095], the pressure-sensitive sensor 3 includes a first electrode 10
`
`that is formed annularly and a second electrode 11 that is also formed annularly. Thus, the pressure-
`
sensitive sensor is disposed along four sides of the rectangle).
`
`Regarding Claim 8; Baard, Honda, and Tan teach the electronic device of Claim 7 as described above.
`
`Honda further teaches the depression detecting section is disposed along at least one of short
`
sides of the rectangle shape (Fig.2, paragraph [0095], the pressure-sensitive sensor 3 includes a first
`
`electrode 10 that is formed annularly and a second electrode 11 that is also formed annularly. Thus, the
`
`pressure-sensitive sensor is disposed at least along one of short sides of the rectangle).
`
`Regarding Claim 10; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the depression detecting section is disposed while at least part of the
`
`depression detecting section is overlapped with the touch panel layer (For example, Fig.24, the
`
pressure-sensitive layer comprising electrodes 10 and 11 is overlapped with the touch panel 2).
`
`Regarding Claim 11; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the depression detecting section is disposed on at least the transparent
`
`member (Fig.1, the pressure-sensitive sensor 3 is disposed on the display cover 5).
`
`
`
`
`Regarding Claim 12; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the depression detecting section is disposed on at least the touch panel
`
`layer (Figs.24,27; the pressure-sensitive sensor 3 comprising a first electrode 10 and a second electrode
`
`11 is disposed on the touch panel 2).
`
`Regarding Claim 13; Baard, Honda, and Tan teach the electronic device of Claim 1 as described above.
`
`Honda further teaches the depression detecting section is disposed on at least the display section
`
`(Fig.20, the pressure-sensitive sensor 703 is disposed on the display panel 7).
`
`Regarding Claim 16, Baard teaches an input processing method useable for an electronic device (a
`
`processing method for a portable electronic device 1, Figs.1-4) that includes a display section (display
`
11, Figs.2-3), a processor (a central processing unit 6, Fig.2) and a touch panel layer overlapping the

display section (a touch sensor panel 12 overlapping the display 11, Figs.2-3), the input processing
`
`method comprising:
`
`detecting whether the touch sensor panel is depressed by an indicator (paragraphs
`
`[0022,0037], the force sensor senses a force applied onto the touch sensor panel 12);
`
`continuously detecting a distance between the indicator and the touch panel layer, and
`
`coordinates along a surface of the display section corresponding to a position of the indicator
`
`regardless of whether the first transparent member is detected to be depressed by the indicator
`
`(paragraph [0006], the proximity-sensing user interface is configured to capture position information and
`
`distance information between a user's skin and a surface of the proximity-sensing user interface.
`
`Paragraphs [0020, 0059, 0062], a proximity-sensing user interface provides position information indicative
`
`of a position of the user interface contact by the user's skin or above which the user's finger is hovering);
`
`and
`
`determining, when the detected distance is longer than a first distance and shorter than a
`
`second distance (Figs.4-10 illustrate situations in which a finger (i.e., gloved finger or non-gloved finger)
`
`is operating the touch sensor such as performing a touch action or a hover action. A distance between a
`
`user’s skin and the touch sensor panel is from 0 (i.e., when a non-gloved finger contacts with the touch
`
`sensor panel) to an upper threshold distance (e.g., upper threshold 34 or 37) when a gloved finger or a
`
`
`
`
`non-gloved finger hovers over the touch sensor panel. More specifically, Figs.7-8, the user is operating
`
`the touch sensor panel in a first input mode (e.g., gloved finger). An input action is a touch action when a
`
distance between the user's skin and the touch sensor panel is from 0 to 32 (Fig.7). An input action is a hover action when a

distance between the user's skin and the touch sensor panel is from 32 to 34 (Fig.8). Similarly, it is determined that a non-gloved finger is
`
`operating the touch sensor (i.e., in a second input mode) in response to detecting that the user's skin
`
`contacts with the touch sensor panel (Fig.9). An input action is a hover action when the user does not
`
`wear a glove and a distance between the user's skin and the touch sensor panel being from 36 to 37
`
(Fig.10)), that:
`
`(a) a touch input is indicated when the first transparent member is detected to be
`
depressed (paragraph [0106], an output signal of a force sensor may be utilized to discriminate
`
`touch actions from hover actions in the first input mode (e.g., gloved finger) or the second input
`
`mode (e.g., a non-gloved finger). Accordingly, a skilled person in the art would have appreciated
`
`that if it is determined that a force is applied to the touch sensor panel; the input action is a touch
`
`action. Reversely, if it is determined that a force is not applied to the touch sensor panel, the input
`
action is a hover action), and
`
`(c) a hover input is indicated if the first transparent member is not detected to be
`
`depressed (as described above, if it is determined that a force is not applied onto the touch
`
`sensor panel, the input action would be a hover action);
`
`performing, by the processor:
`
`when the state determiner determines the touch input is indicated, a processing
`
`associated with the touch input at the detected coordinates, and when the hover input is
`
`indicated, a processing associated with the hover input at the detected coordinates, the
`
`processing associated with the hover input being different from the processing associated with
`
`the touch input (Paragraph [0066], in both input modes, touch actions and hover actions are used to
`
`control the portable electronic device. The controller may perform different functions depending on
`
`whether an input action at a given position is identified to be a touch action or a hover action).
`
`Baard does not teach a first transparent member overlapped with the display section.
`
`
`
`
Honda teaches a first transparent member overlapped with the display section (Fig.1, a

display cover 5 is disposed while being overlapped with a display 7. The display cover 5 should be

transparent so that a user can see an image displayed on the display 7).
`
At the time the invention was filed, it would have been obvious to one of ordinary skill in the art to
`
`modify the touch screen of Baard to include the display cover of Honda placed on top of the touch sensor
`
`panel. Accordingly, a force applied onto the display cover would be detected by the force sensor so as to
`
`discriminate a touch action from a hover action. Such a modification is a result of combining prior art
`
`elements according to known methods to yield predictable results. More specifically, the portable
`
`electronic device of Baard as modified by Honda is known to yield a predictable result of providing a
`
`protection for the touch display screen. Thus, a person of ordinary skill would have appreciated including
`
`in the portable electronic device of Baard the ability to include the display cover of Honda since the
`
`claimed invention is merely a combination of old elements, and in the combination each element merely
`
`would have performed the same function as it did separately, and one of ordinary skill in the art would
`
`have recognized that the results of the combination were predictable.
`
Baard and Honda do not teach that the coordinate detecting section detects two dimensional
`
`coordinates along a surface of the display section corresponding to a changing position of the indicator in
`
`a vertical or horizontal direction regardless of whether the first transparent member is detected to be
`
`depressed by the indicator, determining that (b) the touch input is indicated, regardless of whether
`
`the first transparent member is detected to be depressed, for at least a predetermined time period
`
`after the touch input has been determined to be indicated in response to detecting that the first
`
`transparent member is depressed.
`
`Tan teaches the coordinate detecting section detects two dimensional coordinates along a
`
`surface of the display section corresponding to a changing position of the indicator in a vertical or
`
`horizontal direction regardless of whether the first transparent member is detected to be depressed
`
`by the indicator, determining that (b) the touch input is indicated, regardless of whether the first
`
`transparent member is detected to be depressed, for at least a predetermined time period after the
`
`touch input has been determined to be indicated in response to detecting that the first transparent
`
`
`
`
`member is depressed (paragraph [0105], Tan discloses a method of determining a physical contact with
`
`a touch screen not being interrupted. For example, while a user may be swiping across the touch screen,
`
`he temporarily lifts his finger from the touch screen. The method may determine that the lift is inadvertent
`
`based on the lift from the touch screen lasting for less than a threshold amount of time. The time
`
`threshold may be 0.1 seconds. Therefore, Tan further teaches continuously detecting two dimensional
`
`coordinates corresponding to a changing position of a finger regardless of whether the touch screen is
`
`physically touched by the finger).
`
At the time the invention was filed, it would have been obvious to one of ordinary skill in the art to

modify the system of Baard and Honda to include the method of Tan of continuously detecting a
`
`changing position of a finger regardless of whether a touch screen is physically touched by the finger.
`
`Accordingly, the touch input is indicated for at least a predetermined time period after the touch input has
`
`been determined to be indicated. The suggestion/motivation would have been in order to avoid
`
inadvertent inputs (e.g., inadvertently lifting the finger from the touch screen).
`
`Regarding Claim 17, Baard teaches an input processing program for causing a computer to
`
`execute a processing for an electronic device (paragraph [0057], an electronic device comprising a
`
`memory storing instruction code for a central processing unit 6) that includes a display section (display
`
11, Figs.2-3), and a touch panel layer overlapping the display section (Fig.3, a touch sensor panel 12
`
`overlaps the display 11), the input processing program causing the computer to execute the
`
`processing comprising: continuously detecting whether the touch panel is depressed by an
`
`indicator (paragraphs [0022,0037], the force sensor senses a force applied onto the touch sensor panel
`
`12); detecting a distance between the indicator and the touch panel layer; and coordinates along a
`
`surface of the display section corresponding to a position of the indicator regardless of whether
`
`the first transparent member is detected to be depressed by the indicator (paragraph [0006], the
`
`proximity-sensing user interface is configured to capture position information and distance information
`
`between a user's skin and a surface of the proximity-sensing user interface. Paragraphs [0020, 0059,
`
`0062], a proximity-sensing user interface provides position information indicative of a position of the user
`
`interface contact by the user's skin or above which the user's finger is hovering); determining, when the
`
`
`
`
`detected distance is longer than a first distance and shorter than a second distance (Figs.4-10
`
`illustrate situations in which a finger (i.e., gloved finger or non-gloved finger) is operating the touch sensor
`
`such as performing a touch action or a hover action. A distance between a user’s skin and the touch
`
sensor panel is from 0 (i.e., when a non-gloved finger contacts with the touch sensor panel) to an upper
`
`threshold distance (e.g., upper threshold 34 or 37) when a gloved finger or a non-gloved finger hovers
`
`over the touch sensor panel. More specifically, Figs.7-8, the user is operating the touch sensor panel in a
`
`first input mode (e.g., gloved finger). An input action is a touch action when a distance between the user's
`
skin and the touch sensor panel is from 0 to 32 (Fig.7). An input action is a hover action when a distance between the user's skin

and the touch sensor panel is from 32 to 34 (Fig.8). Similarly, it is determined that a non-gloved finger is operating the touch sensor
`
`(i.e., in a second input mode) in response to detecting that the user's skin contacts with the touch sensor
`
`panel (Fig.9). An input action is a hover action when the user does not wear a glove and a distance
`
`between the user's skin and the touch sensor panel being from 36 to 37 (Fig.10)), that:
`
`(a) a touch input is indicated when the first transparent member is detected to be
`
depressed (paragraph [0106], an output signal of a force sensor may be utilized to discriminate touch

actions from hover actions in the first input mode (e.g., gloved finger) or the second input mode (e.g., a
`
`non-gloved finger). Accordingly, a skilled person in the art would have appreciated that if it is determined
`
`that a force is applied to the touch sensor panel; the input action is a touch action. Reversely, if it is
`
`determined that a force is not applied to the touch sensor panel, the input action is a hover action), and
`
`(c) a hover input is indicated if the first transparent member is not detected to be
`
`depressed (as described above, if it is determined that a force is not applied onto the touch sensor panel,
`
`the input action would be a hover action); and
`
`performing:
`
`when the state determiner determines the touch input is indicated, a processing
`
`associated with the touch input at the detected coordinates, and when the hover input is
`
`indicated, a processing associated with the hover input at the detected coordinates, the
`
`processing associated with the hover input being different from the processing associated with
`
`the touch input (Paragraph [0066], in both input modes, touch actions and hover actions are used to
`
`
`
`
`cont