UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 16/814,666
FILING DATE: 03/10/2020
FIRST NAMED INVENTOR: Hiroki KASUGAI
ATTORNEY DOCKET NO.: AOYA.16PUS01
CONFIRMATION NO.: 3955

MARK D. SARALINO (PAN)
RENNER, OTTO, BOISSELLE & SKLAR, LLP
1621 EUCLID AVENUE, 19TH FLOOR
CLEVELAND, OH 44115

EXAMINER: NAZRUL, SHAHBAZ
ART UNIT: 2697
NOTIFICATION DATE: 06/24/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):

ipdocket@rennerotto.com

PTOL-90A (Rev. 04/07)
Office Action Summary

Application No.: 16/814,666
Applicant(s): KASUGAI, Hiroki
Examiner: SHAHBAZ NAZRUL
Art Unit: 2697
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 2/8/2021.
       [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1-12 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-12 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) The drawing(s) filed on 3/10/2020 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)          Office Action Summary          Part of Paper No./Mail Date 20210619
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

    A person shall be entitled to a patent unless —

    (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

    (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 5-7 are rejected under 35 U.S.C. 102(a)(1) and/or 102(a)(2) as being anticipated by Hirose et al. (WO 2013/146893; hereinafter Hirose). The reference is part of an IDS submitted by Applicant; the Examiner cites the main reference and the translation provided by Applicant.
Regarding claim 1, Hirose discloses an imaging apparatus (fig. 1) comprising:

an imager configured to capture a subject image to generate image data (fig. 1, units 11-12);

an audio input device configured to receive audio data indicating sound during shooting using the imager (fig. 1, unit 15);

a detector (CPU 14) configured to detect a subject and a type of the subject based on the image data generated by the imager (fig. 2, step S1: the scene determination processing unit 14b of the CPU 14 performs a scene determination process of determining a photographic scene at predetermined intervals (for example, several frames) based on the through image, and then proceeds to step S2. The shooting scenes determined in this scene judgment processing include, for example, a shooting scene "Portrait" where the subject is a person, a shooting scene "Child" where the subject is a child, and a shooting scene "Landscape" where the subject is a landscape);

an audio processor (CPU 14, fig. 1) configured to process the audio data received by the audio input device based on the type of subject detected by the detector (fig. 2, step S3: switches the frequency characteristic to be changed in the frequency characteristic changing unit 14d according to the scene determination result in step S1, and proceeds to step S4. In the present embodiment, the frequency characteristics of the sound are predetermined for each photographic scene so as to obtain sound suitable for the photographic scene (subject). When the shooting scene is "child", the frequency characteristic changing unit 14d changes the frequency characteristic of the voice so as to emphasize the voice of the child, for example by leaving only a frequency band of 300 Hz above and below 1 kHz, which is the typical frequency of a child's voice, and cutting off other frequency bands. When the photographic scene is "portrait", the frequency characteristic changing unit 14d changes the frequency characteristic of the sound so as to emphasize the voice of the person, for example by leaving only 0.15 to 13 kHz, which is a frequency band including the voices of people in general, and cutting off other frequency bands. When the shooting scene is "landscape", the frequency characteristic changing section 14d changes the frequency characteristic of the sound so as to cut the wind noise and high-frequency sound); and

an operation member (operation member 16, fig. 1) configured to set a target type among a plurality of types based on a user operation on the imaging apparatus, the target type indicating a type to be processed by the audio processor, the plurality of types including a first type and a second type different from the first type (see fig. 2, step S1),

wherein the audio processor is configured to process the audio data to emphasize or suppress specific sound corresponding to the target type in audio data received when a subject of the target type is detected in the image data (see fig. 2, step S3).
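For illustration only (this sketch is not part of the Office action or of Hirose), the scene-dependent frequency-characteristic switching described in the passage above can be approximated as a band-pass filter selected by the detected subject type. The band edges follow the values quoted from the translation (roughly 1 kHz plus or minus 300 Hz for a child's voice, 0.15 to 13 kHz for a portrait scene, and a cut of wind rumble and high frequencies for a landscape scene); the sample rate, filter order, and all function and variable names are assumptions.

```python
# Hypothetical sketch (not from the application or Hirose): select a band-pass
# characteristic according to the detected subject type, using the band edges
# quoted in the rejection. Sample rate, filter order, and names are assumed.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000  # assumed sample rate (Hz)

# Pass-band per detected scene / subject type (Hz)
SCENE_BANDS = {
    "child":     (700.0, 1_300.0),    # about 1 kHz +/- 300 Hz
    "portrait":  (150.0, 13_000.0),   # general speech band
    "landscape": (200.0, 8_000.0),    # assumed band cutting wind rumble and high-frequency noise
}

def filter_for_scene(audio: np.ndarray, scene: str, fs: int = FS) -> np.ndarray:
    """Apply the band-pass characteristic associated with the detected scene."""
    low, high = SCENE_BANDS[scene]
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfilt(sos, audio)

# Example: one second of noise processed as if a "child" scene were detected.
if __name__ == "__main__":
    processed = filter_for_scene(np.random.randn(FS), "child")
```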
Regarding claim 5, Hirose discloses the imaging apparatus according to claim 1, wherein each of the first and second types is a type regarding any one of a person, an animal other than a person, and an object having a background sound ("it is determined whether or not the photographic scene is "child", "portrait", or "landscape", but a photographic scene "pet" may be added" — ¶0043).
Regarding claim 6, Hirose discloses the imaging apparatus according to claim 1, wherein the audio processor is configured to gradually increase an amplification factor from a timing when the detector detects the subject of the target type, the amplification factor emphasizing the specific sound corresponding to the target type (the frequency characteristic change unit 14d sequentially changes the frequency characteristic of the sound based on the determination result — ¶0034. The frequency characteristic changing unit 14d may change gradually from the current frequency characteristic to the frequency characteristic according to the shooting scene determined in step S6. Thereby, it is possible to prevent the frequency characteristic of the sound of the moving picture from suddenly changing between frames and giving the viewer a feeling of strangeness at the time of reproduction — ¶0040).
Regarding claim 7, Hirose discloses the imaging apparatus according to claim 6, wherein the audio processor is configured to gradually decrease the amplification factor when the subject of the target type is no longer detected after the subject of the target type is detected by the detector (the frequency characteristic change unit 14d sequentially changes the frequency characteristic of the sound based on the determination result — ¶0034. The frequency characteristic changing unit 14d may change gradually from the current frequency characteristic to the frequency characteristic according to the shooting scene determined in step S6. Thereby, it is possible to prevent the frequency characteristic of the sound of the moving picture from suddenly changing between frames and giving the viewer a feeling of strangeness at the time of reproduction — ¶0040).
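As an illustrative aside (not part of the Office action), the gradual increase and decrease of the amplification factor recited in claims 6 and 7, which the rejection maps to Hirose's gradual change of the frequency characteristic, can be sketched as a per-frame gain ramp; the step size, maximum gain, and names are assumptions.

```python
# Hypothetical per-frame gain ramp: emphasis fades in while a subject of the
# target type is detected (cf. claim 6) and fades back out once it is no
# longer detected (cf. claim 7). Step size, limits, and names are assumed.
def update_emphasis_gain(gain: float, target_detected: bool,
                         gain_max: float = 4.0, step: float = 0.05) -> float:
    """Move the amplification factor one small step per frame toward its target."""
    if target_detected:
        return min(gain + step, gain_max)  # gradual increase after detection
    return max(gain - step, 1.0)           # gradual decrease after the subject is lost

# Example: ramp the gain over a short sequence of per-frame detection results.
gain = 1.0
for detected in (False, True, True, True, False, False):
    gain = update_emphasis_gain(gain, detected)
```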
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2 and 3 are rejected under 35 U.S.C. 103 as being unpatentable over Hirose in view of Arrasvuori et al. (US 2014/0369506; hereinafter Arrasvuori). The reference is part of the IDS.

Regarding claim 2, Hirose discloses the imaging apparatus according to claim 1, except for further comprising a display configured to display an image indicated by the image data, and target type information indicating the target type.

However, Arrasvuori discloses, in figs. 4-8 and the corresponding parts of the specification, a display 410 configured to display an image indicated by the image data (e.g. 530, 540, etc.), and target type information indicating the target type (John Doe, a dog, etc.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Hirose to include the teaching of Arrasvuori, to obtain an apparatus further comprising a display configured to display an image indicated by the image data, and target type information indicating the target type, because combining prior art elements according to known methods ready for improvement to yield predictable results is obvious. Furthermore, such a combination would enhance the overall configurability and versatility of the system by providing information to the user about the currently focused target type(s).
Regarding claim 3, Hirose in view of Arrasvuori discloses the imaging apparatus according to claim 2, wherein the display is configured to further display information indicating a detection result of the subject by the detector (e.g. John Doe and a dog, as shown in figs. 6-8), and the operation member is configured to set the target type in accordance with a user operation of specifying a subject to be focused in the imaging apparatus based on the information displayed on the display (The user interface controller 320 may be configured to receive an indication of a user action associated with an item representing a sound source of the composite audio signal. The sound source subjected to a user action may be referred to as a selected sound source. The user interface controller 320 may be further configured to determine an indication of a user-selected modification of the audio image on the basis of the received indication of the user action associated with the item representing the selected sound source of the composite audio signal. The user selection may involve the user using a mouse, a touchpad or a corresponding arrangement together with a respective action button to make the selection. Additionally or alternatively, the user selection may involve the user using a finger, a stylus or a corresponding pointing device to perform an action on a touchscreen to make the selection — ¶¶0079-0080).
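As an illustrative aside (not code from Hirose or Arrasvuori), the user operation addressed for claim 3, specifying a displayed subject so that its type becomes the target type for the audio processor, might be glued together roughly as follows; the data structures and all names are hypothetical.

```python
# Hypothetical glue between the detector's results and the operation member:
# the user taps a detected subject shown on the display, and that subject's
# type becomes the target type for the audio processor. Names are illustrative.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DetectedSubject:
    label: str                       # e.g. "person", "dog"
    box: Tuple[int, int, int, int]   # (x, y, w, h) in display coordinates

def target_type_from_tap(detections: List[DetectedSubject],
                         tap_xy: Tuple[int, int]) -> Optional[str]:
    """Return the type of the detected subject whose bounding box contains the tap."""
    x, y = tap_xy
    for det in detections:
        bx, by, bw, bh = det.box
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return det.label
    return None
```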
Claims 4, 8, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Hirose in view of Lee et al. (US 9,332,211; hereinafter Lee).

Regarding claim 4, Hirose discloses the imaging apparatus according to claim 1, except wherein the imaging apparatus has operation modes including a specific operation mode preset one of the plurality of types, and the operation member is configured to set the target type according to a user operation of selecting the specific operation mode among the operation modes of the imaging apparatus.

However, Lee discloses various capture modes and sound capture modes. In a camera mode the audio zooming method is implemented. The capture modes include, e.g., a basic mode, a landscape mode, a portrait mode, a street mode, etc. Fig. 5 also shows that the sound capture mode is a sub-mode selected and configured by a user in the capture mode, and includes a screen zoom link mode, a gun mode, a sound zoom variation mode, a gun location change mode, a stereo mode, a hearing aid mode, and the like (fig. 5, Col. 12, line 60 - Col. 13, line 6).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Hirose to include the teaching of Lee of a user-settable capture mode and sound capture mode, to obtain an imaging apparatus that has operation modes including a specific operation mode preset one of the plurality of types, where the operation member is configured to set the target type according to a user operation of selecting the specific operation mode among the operation modes of the imaging apparatus, because combining prior art elements according to known methods ready for improvement to yield predictable results is obvious. Furthermore, such a combination would enhance the configurability and versatility of the overall system by giving the user the ability to operate the camera in an additional mode based on audio and video capture and synchronization.
Regarding claim 8, Hirose discloses the imaging apparatus according to claim 1, except for further comprising a sound collector configured to collect sound, wherein the audio input device is configured to receive audio data indicating sound collected by the sound collector.

However, Lee discloses, with respect to figs. 7A and 7B, that when a general capture mode or landscape mode is selected, the sound capture angle is set to 180 degrees to capture sounds in a broad range so as to contain surrounding sounds to the maximum. At this time, the sound capture range 50 is set to the maximum; and when a portrait mode is selected, the controller 180 configures the sound capture angle to be small so as to capture only a specific user's voice, and displays the sound capture range 50 in a narrow range on the automatically recognized face (figs. 7A, 7B, Col. 13, line 60 - Col. 14, line 12).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Hirose to include the teaching of setting the capture mode based on a varying sound capture angle, to obtain an apparatus further comprising a sound collector configured to collect sound, wherein the audio input device is configured to receive audio data indicating sound collected by the sound collector, because combining prior art elements according to known methods ready for improvement to yield predictable results is obvious. Furthermore, such a combination would enhance the configurability and versatility of the overall system by allowing a user to select his or her desired capture mode according to the environmental situation and the kind of captured subject to control the sound capture angle, thereby automatically controlling the sound capture range (Lee, Col. 14, lines 8-12).

Regarding claim 9, Hirose in view of Lee discloses the imaging apparatus according to claim 8, further comprising a beam former configured to change a range in which the sound collector collects sound in accordance with a detection result of the detector (figs. 7A, 7B, Col. 13, line 60 - Col. 14, line 12).
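As an illustrative aside (not code from Lee or the application), the idea of narrowing the sound-capture range toward a detected subject, as in the cited passages of Lee and the beam former of claim 9, can be sketched as a two-microphone delay-and-sum; the microphone spacing, sample rate, and names are assumptions, and the integer-sample circular shift is a simplification.

```python
# Hypothetical two-microphone delay-and-sum sketch: steer the pair toward the
# detected subject's bearing so sound from that direction adds in phase,
# narrowing the effective capture range. Spacing, rate, and names are assumed.
import numpy as np

FS = 48_000             # assumed sample rate (Hz)
MIC_SPACING = 0.02      # assumed distance between the two microphones (m)
SPEED_OF_SOUND = 343.0  # m/s

def steer_two_mics(left: np.ndarray, right: np.ndarray, bearing_deg: float) -> np.ndarray:
    """Delay-and-sum the two channels toward `bearing_deg` (0 = straight ahead)."""
    delay_s = MIC_SPACING * np.sin(np.radians(bearing_deg)) / SPEED_OF_SOUND
    delay_n = int(round(delay_s * FS))    # integer-sample approximation
    if delay_n >= 0:
        right = np.roll(right, delay_n)   # circular shift used for brevity
    else:
        left = np.roll(left, -delay_n)
    return 0.5 * (left + right)
```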
Claims 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over Hirose in view of Lee and Arrasvuori.

Regarding claim 10, Hirose discloses an imaging apparatus (fig. 1) comprising:

an imager configured to capture a subject image to generate image data (fig. 1, units 11-12);

an audio input device configured to receive audio data indicating sound during shooting using the imager (fig. 1, unit 15);

a detector configured to detect a subject and a type of the subject based on the image data generated by the imager (fig. 2, step S1: the scene determination processing unit 14b of the CPU 14 performs a scene determination process of determining a photographic scene at predetermined intervals (for example, several frames) based on the through image, and then proceeds to step S2. The shooting scenes determined in this scene judgment processing include, for example, a shooting scene "Portrait" where the subject is a person, a shooting scene "Child" where the subject is a child, and a shooting scene "Landscape" where the subject is a landscape);

a display configured to display an image indicated by the image data (¶0016: displayed on an LCD monitor (not shown));

an operation member configured to select a focus subject in the image from subjects detected by the detector (¶0018: AF (auto focus) setting ... the CPU 14 switches the photographing setting so as to focus on the child's face recognized by the face recognition processing); and

an audio processor configured to process the audio data received by the audio input device based on a type of the subject selected by the operation member (fig. 2, step S3: switches the frequency characteristic to be changed in the frequency characteristic changing unit 14d according to the scene determination result in step S1, and proceeds to step S4. In the present embodiment, the frequency characteristics of the sound are predetermined for each photographic scene so as to obtain sound suitable for the photographic scene (subject). When the shooting scene is "child", the frequency characteristic changing unit 14d changes the frequency characteristic of the voice so as to emphasize the voice of the child, for example by leaving only a frequency band of 300 Hz above and below 1 kHz, which is the typical frequency of a child's voice, and cutting off other frequency bands. When the photographic scene is "portrait", the frequency characteristic changing unit 14d changes the frequency characteristic of the sound so as to emphasize the voice of the person, for example by leaving only 0.15 to 13 kHz, which is a frequency band including the voices of people in general, and cutting off other frequency bands. When the shooting scene is "landscape", the frequency characteristic changing section 14d changes the frequency characteristic of the sound so as to cut the wind noise and high-frequency sound).
The subject-matter of claim 10 therefore differs from this known imaging apparatus in that: (1) select a focus subject ... based on a user operation on the imaging apparatus; and (2) a processor configured to cause the display to display target type information indicating the type of the focus subject as a target type to be processed by the audio processor.

There is no apparent synergy between features (1) and (2). The subject-matter of claim 10, therefore, consists merely in the juxtaposition or association of both known and common measures functioning in their normal way and not producing any non-obvious working interrelationship. The combined features do not mutually support each other in their effects to such an extent that any unexpected or surprisingly advantageous result is achieved.
Lee discloses feature (1) in Col. 1, lines 43-48, as a user-inputted AF demand selected on the subject to be focused.

Arrasvuori discloses feature (2) in figures 4-8 and the corresponding parts of the description: a display 410 configured to display an image indicated by the image data (e.g. 530, 540, etc.), and target type information indicating the target type (John Doe, a dog, etc.).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Hirose to include the teachings of Lee and Arrasvuori, to obtain: select a focus subject ... based on a user operation on the imaging apparatus; and a processor configured to cause the display to display target type information indicating the type of the focus subject as a target type to be processed by the audio processor, because combining prior art elements according to known methods ready for improvement to yield predictable results is obvious. Furthermore, such a combination would enhance the overall configurability and versatility of the system by providing information to the user about the currently focused target type(s). The combination would also allow the user to apply the automatic focusing feature to a desired subject of interest.
Regarding claim 11, Hirose in view of Lee and Arrasvuori discloses the imaging apparatus according to claim 10, wherein the processor is configured to further cause the display to display emphasis level information indicating a level at which the audio processor emphasizes or suppresses specific sound of the selected subject (Hirose: fig. 2, step S3. In step S3, the frequency characteristic change unit 14d of the CPU 14 switches the frequency characteristic of the sound to be recorded in accordance with the determined voice (cry) of the pet (animal). For example, when the type of pet (animal) is a dog, the frequency band of the sound to be recorded is switched to 500 to 1000 Hz so that the dog's bark can be emphasized. When the type of pet (animal) is a cat, the frequency band of the sound to be recorded is switched to 750 to 1500 Hz so that the cry of the cat can be emphasized. That is, the frequency characteristic of the sound to be recorded is set in accordance with the frequency characteristic of the voice (cry) of the pet (animal) — cited from the translation provided, page 8, ¶3).
Regarding claim 12, Hirose in view of Lee and Arrasvuori discloses the imaging apparatus according to claim 10, wherein the processor is configured to, in response to a change of the focus subject with a type of the focus subject after the change being different from the target type before the change, update the target type information to indicate the type after the change as the target type, and cause the display to display the updated target type information (Hirose: even when the subject changes in the middle of the moving image, it is possible to make the sound of the moving image suitable for the subject — ¶¶0033-0038).
Conclusion

The prior and/or pertinent art made of record and not relied upon is considered pertinent to applicant's disclosure: Ozawa (US 2005/0140810), Inagaki (US 5,999,214), Matsumoto et al. (US 10,909,384), Kelly et al. (US 10,778,900), and Kim et al. (US 10,051,364), which disclose different AV systems of interest.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAHBAZ NAZRUL, whose telephone number is (571) 270-1467. The examiner can normally be reached M-Th: 9:30 am-3 pm and 6:30 pm-9 pm; F: 9:30 am-1:30 pm and 4 pm-8 pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lin Ye, can be reached at 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHAHBAZ NAZRUL/
Primary Examiner, Art Unit 2697