UNITED STATES PATENT AND TRADEMARK OFFICE

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
www.uspto.gov

Application No.: 15/182,488
Filing Date: 06/14/2016
First Named Inventor: Hiroyuki Matsumoto
Attorney Docket No.: 731156534
Confirmation No.: 4201

Correspondence Address:
Seed IP Law Group LLP/Panasonic
701 Fifth Avenue, Suite 5400
Seattle, WA 98104

Examiner: KARWAN, SIHAR A
Art Unit: 2422

Notification Date: 07/29/2019
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):
USPTOeAction@SeedIP.com
pairlinkdktg@seedip.com

PTOL-90A (Rev. 04/07)

Office Action Summary

Application No.: 15/182,488
Applicant(s): Matsumoto et al.
Examiner: SIHAR A KARWAN
Art Unit: 2422
AIA (FITF) Status: Yes

- The MAILING DATE of this communication appears on the cover sheet with the correspondence address -

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 7/1/2019.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.    2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1-6 and 13-18 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1-6 and 13-18 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [ ] The drawing(s) filed on ___ is/are: a) [ ] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All    b) [ ] Some**    c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
    ** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) Notice of References Cited (PTO-892)
2) Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20190716
Application/Control Number: 15/182,488
Art Unit: 2422
DETAILED ACTION

This is a non-final Office action responsive to the communication filed on 7/1/2019.

Claims 1-6 and 13-18 are pending. Claims 1-6 and 13-18 are rejected. Claims 7-12 are canceled. The amendments to the claims have been entered.

Response to Arguments

Applicant's arguments filed on 5/16/2019 have been fully considered but they are not persuasive.

Regarding Applicant's arguments:

Regarding the double patenting rejections, the claims presented are under prosecution, and as such the double patenting rejections will be held in abeyance until such time as allowable subject matter has been found.

Applicant's arguments directed to the § 102 rejections are based on the newly presented amendments, which are addressed in the rejections provided below.
Nonstatutory Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement.

Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).

Claims 1-6 and 13-18 are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over the claims of the copending application(s) listed below. Although the conflicting claims are not identical, they are not patentably distinct from each other.
This is an obviousness-type double patenting rejection because the conflicting claims have been patented.

The double patenting rejections will not be revisited and will be held in abeyance until allowable subject matter is found.
The conflicting claims are compared below. The paired claims of each side-by-side comparison are reproduced in sequence.

First comparison:

1. A sound processing system comprising: at least one camera that captures a video image; a sound collector that includes a plurality of microphones and collects sound by using the plurality of microphones; a processor that displays the video image captured by the camera on a display, and outputs, from a speaker, sound collected by the sound collector; and an input device that receives designation of at least one designated location in the video image displayed on the display, wherein the camera and the sound collector are disposed on an indoor ceiling, when a first designated location is received while the video image in a first coordinate system is being displayed on the display, the processor converts a predetermined region of the video image including the first designated location, into an image in a second coordinate system, and displays the converted image in the second coordinate system on the display, and the processor outputs, from the speaker, first emphasized sound in which sound originated in the predetermined region is emphasized, when a second designated location is received while the converted image in the second coordinate system is being displayed on the display, the processor outputs, from the speaker, second emphasized sound in which sound in a direction directed toward a position corresponding to the second designated location from the sound collector is emphasized, wherein, when the at least one designated location comprises a plurality of different designated locations in the video image displayed on the display, the processor displays different identification shapes around the respective designated locations in the video image.

1. A monitoring system that monitors a sound, comprising: a camera that images an imaging area; a microphone array that collects a voice in the imaging area; a display that displays image data of the imaging area imaged by the camera; and a signal processor that derives a sound parameter specifying a level of a sound in the imaging area for each predetermined unit of pixel configuring the image data of the imaging area using voice data collected by the microphone array, wherein the signal processor causes sound source image information in which the sound parameter is converted into different visual information in step wise according to the comparison between the derived sound parameter and a plurality of threshold values relating to the level of the sound, to be displayed on the display in a superimposed manner for each predetermined unit of pixel configuring the image data of the imaging area.

Second comparison:

1. A monitoring system comprising: a camera which images an imaging area; a microphone array which collects audio of the imaging area; a monitor which displays a captured image of the imaging area which is captured by the camera; a processor; and a memory including instructions that, when executed by the processor, cause the processor to perform operations including: setting a masking area to be excluded from detection of a pilotless flying object which appears in the captured image of the imaging area, based on the audio collected by the microphone array; detecting the pilotless flying object based on the audio collected by the microphone array and the masking area set by the masking area setter; and superimposing a sound source visual image, which indicates the volume of a sound at a sound source position, at the sound source position of the pilotless flying object in the captured image and displays the result on the monitor in a case where the pilotless flying object is detected in an area other than the masking area.

1. A monitoring system that monitors a sound, comprising: a camera that images an imaging area; a microphone array that collects a voice in the imaging area; ... (same claim text as reproduced in the first comparison).

Third comparison:

1. An object detection system comprising: a first camera; a microphone array; a display; a processor; and a memory including instructions that, when executed by the processor, cause the processor to perform operations including: imaging an imaging area by the first camera, displaying, on a display, a captured image of the imaging area imaged by the first camera, acquiring audio of the imaging area by the microphone array, detecting an audio source of an object which appears in the imaging area from data of the audio acquired by the microphone array, calculating position information of the audio source of the object on the captured image of the imaging area, converting the position information of the object into first visual identification information of the object in the captured image of the imaging area, and superimposing the first visual identification information of the object on the captured image of the imaging area.

1. A monitoring system that monitors a sound, comprising: a camera that images an imaging area; a microphone array that collects a voice in the imaging area; ... (same claim text as reproduced in the first comparison).

Copending 20150350621 claim 1 / Instant claim 1 and 4:

1. A monitoring system that monitors a sound, comprising: a camera that images an imaging area; a microphone array that collects a voice in the imaging area; a display that displays image data of the imaging area imaged by the camera; and a signal processor that derives a sound parameter specifying a level of a sound in the imaging area for each predetermined unit of pixel configuring the image data of the imaging area using voice data collected by the microphone array, wherein the signal processor causes sound source image information in which the sound parameter is converted into different visual information in step wise according to the comparison between the derived sound parameter and a plurality of threshold values relating to the level of the sound, to be displayed on the display in a superimposed manner for each predetermined unit of pixel configuring the image data of the imaging area.
Claim Rejections - 35 USC § 102

1. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

2. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless -

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

3. Claims 1-6 are rejected under 35 U.S.C. 102(a)(2) as being taught by Wu (US 2012/0093339).
Regarding claim 1, Wu teaches a monitoring system, comprising: a camera which, in operation, images a monitoring area (Wu paras. 3-7: camera and array of microphones locating and monitoring multiple noise sources); a microphone array which, in operation, collects sound in the monitoring area (Wu para. 7: an array of microphones, locating and monitoring multiple noises); a display which, in operation, displays image data of the monitoring area imaged by the camera (Wu para. 3: "then overlay the high sound pressure spots on the image of a test object captured by the camera to indicate the locations from which sounds are emitted"; para. 17: results may be displayed on a display); and a signal processor that derives a sound parameter specifying a level of a sound in the monitoring area for each predetermined unit of pixels configuring the image data of the monitoring area using sound data collected by the microphone array, generates sound source image information in which the sound parameter is converted into different visual information in step wise according to a comparison between the sound parameter and a plurality of threshold values relating to the level of the sound, and displays the sound source image information on the display using plural indicators and in a superimposed manner for each predetermined unit of pixels configuring the image data of the monitoring area (Wu para. 3: camera and an array of 30-60 microphones to measure the sound pressure, and then overlay the high sound pressure spots [plural indicators] on the image of a test object captured by the camera to indicate the locations from which sounds are emitted; para. 50: once the locations of sound sources are identified, the resultant sound field in 3D space can be visualized by superimposing contributions from all the individual sources; also, Table 1 shows different sound parameters for visual step wise based on threshold values relating to the level of sound to be displayed; para. 11: after locating sound sources, the sound pressures generated by these sources are calculated and the resultant sound pressure field in 3D space including the source surfaces is visualized; this 3D soundscaping produces direct and easy-to-understand pictures of sound pressure distribution in 3D space and how they change with time). The examiner notes that although Wu does not mention "pixel", such as "unit of pixel configuring the image data of the imaging area", a unit of pixel is a location on the display where the sound is located, and "spots on the image" are also "units of pixels" (see also Fig. 2). Wu further teaches wherein the plural indicators include a first indicator that is applied to sound parameters that are within a first range defined by a first threshold value of the plurality of threshold values, and a second indicator that is applied to sound parameters that are within a second range defined by a second threshold value of the plurality of threshold values (Wu Abstract: visualizing the resultant overall sound pressure distribution in 3D space [3D space is within a first, second, and third range of threshold values]; also Wu paras. 10-11 and Table 1: display of results in 3D space / on measurement surface only; display of source (x, y, z) coordinates / color map locations; spatial resolution very high / one wavelength of sound wave; discernable source level up to 20 dB / <5 dB), and the signal processor repeats the display operation from a detection of a start event to a detection of a stop event (Wu para. 7: and displaying the resultant sound field in 3D space in real time). The examiner notes that the detection of a start event to a detection of a stop event is only displayed; the signal processor is not triggered to display based on a detection of a start event to a detection of a stop event as the claims are presented.
Regarding claim 2, Wu teaches all of the limitations of claim 1 and further teaches wherein the sound parameter is a sound pressure (Wu para. 3: "and then overlay the high sound pressure spots on the image of a test object").

Regarding claim 3, Wu teaches all of the limitations of claim 1 and further teaches further comprising: a sensor which, in operation, detects the start event (Wu para. 3, camera).

Regarding claim 4, Wu teaches all of the limitations of claim 1 and further teaches further comprising: a sound output which, in operation, reproduces the sound data collected by the microphone array, wherein, in a case where the sound source image information displayed on the display is designated, the signal processor forms directivity to a direction from the microphone array toward a position of the image data of the monitoring area corresponding to the designated sound source image information, and causes the sound output to output the sound data (Wu para. 7: and displaying the resultant sound field in 3D space in real time).

Regarding claim 5, Wu teaches all of the limitations of claim 1 and further teaches wherein the camera and the microphone array are coaxially disposed. The examiner gives no patentable weight to "wherein the camera and the microphone array are coaxially disposed," particularly when coaxial cable is specifically designed to propagate audio and video signals. As such, the coaxial disposition is a well-understood, routine, and conventional arrangement previously known to the industry.

Claim 6 is rejected using the same rejections as made to claim 1.

7-12. Canceled.
Claim Rejections - 35 USC § 103

1. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

2. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
3. Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Wu as applied to the claims above, and further in view of Lee (US 2015/0095818).

Regarding claim 13, Wu teaches all of the limitations of claim 3 but does not teach wherein the sensor detects the start event in response to at least one of a brightness greater than a predetermined value, a temperature greater than a predetermined temperature, and a human being detected by the sensor. However, the examiner maintains that this was well known in the art at the time the invention was filed, as taught by Lee. Regarding wherein the sensor detects the start event in response to at least one of a brightness greater than a predetermined value, a temperature greater than a predetermined temperature, and a human being detected by the sensor, Lee teaches (Lee para. 102) that an event-based vision sensor is used to detect the change in brightness. The examiner notes that "event-based" means based on or triggered by an event, and all events have a start and an end. Therefore it would have been obvious to one of ordinary skill in the art to combine the teachings of Wu in view of Lee such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention, for the purpose of monitoring events.
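Purely as an illustration of the kind of start-event gating recited in claims 3 and 13 (and the start/stop-event-bounded display operation recited in claim 1), the following hypothetical Python sketch uses invented names, units, and threshold values; it is not taken from the claims, from Wu, or from Lee, and the treatment of the stop event is an assumption made only for the example.

from dataclasses import dataclass

@dataclass
class SensorReading:
    brightness: float        # hypothetical brightness value
    temperature: float       # hypothetical temperature value
    human_detected: bool

def start_event(r: SensorReading,
                brightness_limit: float = 500.0,
                temperature_limit: float = 30.0) -> bool:
    """Start event: at least one of the recited conditions is met."""
    return (r.brightness > brightness_limit
            or r.temperature > temperature_limit
            or r.human_detected)

def monitor(readings, display_frame):
    """Run the display operation only between a start event and a stop event."""
    displaying = False
    for r in readings:
        if not displaying and start_event(r):
            displaying = True                 # detection of a start event
        elif displaying and not start_event(r):
            displaying = False                # treated here as the stop event (assumption)
        if displaying:
            display_frame(r)                  # repeat the display operation

# Example usage with stub readings and a stub display callback.
readings = [SensorReading(100, 20, False),
            SensorReading(600, 20, False),    # brightness exceeds the limit -> start
            SensorReading(620, 21, False),
            SensorReading(100, 20, False)]    # conditions no longer met -> stop
monitor(readings, lambda r: print("display", r))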
Regarding claim 15, Wu teaches all of the limitations of claim 1 and further teaches receiving, in operation, a designation of one of the predetermined units of pixels in which the sound source image information is superimposed, and the signal processor outputs, in response to the designation being received, sound data collected by the microphone array (Wu Fig. 2: the sound image is superimposed on the units of pixels of the display in a 3D soundscape from the microphone array 1-4 of Fig. 1) [different colors for different spatial resolution].

Regarding claim 16, Wu teaches all of the limitations of claim 1 and further teaches wherein the first indicator includes a first color different from a second color included by the second indicator (Wu paras. 10-11: display of source (x, y, z) coordinates, color map locations, spatial resolution).

Regarding claim 17, Wu teaches all of the limitations of claim 6 and further teaches receiving a designation of one of the predetermined units of pixels in which the sound source information is superimposed, and outputting, in response to the designation being received, sound data collected by the microphone array (Wu Fig. 2: the sound image is superimposed on the units of pixels of the display in a 3D soundscape from the microphone array 1-4 of Fig. 1) [different colors for different spatial resolution].

Regarding claim 18, Wu teaches all of the limitations of claim 6 and further teaches wherein the first indicator includes a first color different from a second color included by the second indicator (Wu paras. 10-11: display of source (x, y, z) coordinates, color map locations, spatial resolution).
1. Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Wu as applied to the claims above, and further in view of Natsumoto (US 2017/0132474).

Regarding claim 14, Wu teaches all of the limitations of claim 1 but does not teach wherein the predetermined unit of pixels is a pixel block including a plurality of pixels, and a sound parameter is an average value of sound pressure values corresponding to the plurality of pixels of the pixel block. However, the examiner maintains that this was well known in the art at the time the invention was filed, as taught by Natsumoto. Regarding wherein the predetermined unit of pixels is a pixel block including a plurality of pixels, and a sound parameter is an average value of sound pressure values corresponding to the plurality of pixels of the pixel block, Natsumoto teaches (Natsumoto para. 80) the average value of the sound pressure values in pixel block units. Therefore it would have been obvious to one of ordinary skill in the art to combine the teachings of Wu in view of Natsumoto such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention, for the purpose of monitoring events.
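For readers unfamiliar with pixel-block averaging of the kind recited in claim 14, the following is a minimal hypothetical Python/NumPy sketch (the array shapes, the block size, and all names are invented for illustration; the code is not taken from the claims or from Natsumoto).

import numpy as np

def block_average(sound_pressure, block=4):
    """Average per-pixel sound pressure values over non-overlapping pixel blocks.

    sound_pressure : (H, W) array; H and W are assumed divisible by `block`.
    Returns an (H // block, W // block) array of per-block sound parameters,
    i.e., the mean sound pressure of the pixels in each pixel block.
    """
    h, w = sound_pressure.shape
    tiles = sound_pressure.reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))

# Example: an 8x8 per-pixel sound pressure map reduced to a 2x2 grid of
# block-averaged sound parameters (one value per 4x4 pixel block).
pressure = np.arange(64, dtype=float).reshape(8, 8)
params = block_average(pressure, block=4)   # shape (2, 2)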
Pertinent Art

Applicant is requested to review the art of Sakai (US 2012/0162259), also as pertinent art, particularly para. 10: "the display data is generated to display level information of sounds that are output from a sound source using the size of a predetermined shape such as a circle. In such a case, it becomes possible to determine a sense of perspective. For example, it is possible to determine that the sound source is approaching from a gradually enlarging circle. Further, for example, the display data is generated to display the frequency information of sounds that are output from a sound source with a color that is applied to the predetermined shape such as the circle. In such a case, it is possible to find a specific sound source based on the color." This art is pertinent as it relates to claims 1, 6, and 14-18.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SIHAR A KARWAN whose telephone number is (571) 272-2747. The examiner can normally be reached M-F, 11-7.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jefferey Harold, can be reached at (571) 272-7519. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SIHAR A KARWAN/
Examiner, Art Unit 2422

/JEFFEREY F HAROLD/
Supervisory Patent Examiner, Art Unit 2422
`
