UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

Application No.: 16/664,084
Filing Date: 10/25/2019
First Named Inventor: Takahiro NISHI
Attorney Docket No.: 2019-1804A
Confirmation No.: 7418

Wenderoth, Lind & Ponack, L.L.P.
1025 Connecticut Avenue, NW
Suite 500
Washington, DC 20036

Examiner: KALAPODAS, DRAMOS
Art Unit: 2487
Notification Date: 07/21/2020
Delivery Mode: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):
eoa@wenderoth.com
kmiller@wenderoth.com

PTOL-90A (Rev. 04/07)

Disposition of Claims*
5) Claim(s) 1-24 is/are pending in the application.
5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) Claim(s) ___ is/are allowed.
7) Claim(s) 1-24 is/are rejected.
8) Claim(s) ___ is/are objected to.
9) Claim(s) ___ are subject to restriction and/or election requirement.
* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers
10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 10/25/2019 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119
12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
Certified copies:
a) [ ] All   b) [ ] Some**   c) [ ] None of the:
1. [ ] Certified copies of the priority documents have been received.
2. [ ] Certified copies of the priority documents have been received in Application No. ___.
3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)
1) [X] Notice of References Cited (PTO-892)
2) [X] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)

Office Action Summary (Part of Paper No./Mail Date 20200715)

Application No.: 16/664,084
Applicant(s): NISHI et al.
Examiner: DRAMOS KALAPODAS
Art Unit: 2487
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 10/25/2019.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [ ] This action is FINAL.   2b) [X] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Application/Control Number: 16/664,084
Art Unit: 2487
Page 2

DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

2. The information disclosure statement (IDS) was submitted on 10/25/2019. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

3. Claims 7 and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention.

The term "different" in claims 7 and 18 is a relative term which renders the claim indefinite. The term "wherein the reference image is included in a processed picture different from a picture that includes the input image, ..." is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. An analysis of the specification finds a similar recited matter which does not identify any parametric indicia by which the reference image could be "different" from the picture comprising the input image, as similarly recited at Pars. [0071], [0087], [0338], [0347].

(i) In this regard, Examiner is unable to establish a metric for differentiating the reference image from a processed picture as being different from the picture including the encoded image. However, it may be presumed that the different reference image could be part of an inter-prediction mode relying on an inter-frame predicted and reconstructed reference picture not containing the original "I" frame used as the input image. In this regard, the inter-prediction mode relying on a different reference picture is disclosed in Terada at least at Par.[0118]-[0120] by selecting the inter-prediction mode.

(ii) Another interpretation is suggestively taught by Terada at Par.[0178], where a reference pixel is connected at a second NN layer directly or to another succeeding layer, representing a different prediction than the picture including the input image, by which a second predicted image is generated.

(iii) The first and second intra-predicted images are generated differently by selecting at the intra-prediction switch "intra_pred_type", for fixed intra prediction in the 110b position or for using the NN intra-prediction generator at position 110a in Fig.1, as processed by the syntax code at Fig.22.

3-1. The dependent claims 8-10 and 19-22, respectively, are also rejected for their dependency on claims 7 and 18.

Clarification is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless —

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

4. Claims 1, 3-12 and 14-24 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kengo Terada et al. (hereinafter Terada) (US 2018/0184123).

No common inventor or assignee has been identified in regard to this art of reference.

Examiner's Note

In conformance with the MPEP 2152.02(b) provisions for anticipatory rejections, Examiner contends that all the prior art referenced to Terada is mapped from paragraphs identified from a single Embodiment 1. However, future references to the Summary of Embodiments 1 and 6, having a common description in the prior art specification at Par.[0362]-[0514], may be addressed and used interchangeably between Embodiments 1 and 6 in a subsequent Office Action as warranted by amendment.

Re Claim 1. Terada discloses an encoder, comprising (an encoder 163 in Fig.35 or encoder 10 in Fig.60A):

processing circuitry (an encoding processor 100 in Fig.1); and

memory, wherein using the memory, the processing circuitry (frame memory 112 in Fig.1 and step S165 in Fig.6, Par.[0728]):

generates a predicted image of an input image that is a current image to be encoded (generating an intra/inter predicted image, steps S121, S122, Fig.3, by the intra/inter prediction generator, Par.[0119], of the current image to be encoded at apparatus 100 in Fig.2, where the predicted image is generated by the NN in intra mode per Fig.4, inter mode in Fig.5), based on generated data output from a generator network (and based on the data output from the mode network generators for intra/inter prediction at Fig.4 and 5 respectively, the encoding being based on a neural network, NN, e.g., as generated by a function outputting the prediction mode, per Fig.7, Par.[0136], according to a parameter determiner 109 setting different prediction and coding NN modes, Par.[0112]) in response to a reference image being input to the generator network, the generator network being a neural network (the image prediction being based on the output data generated by the neural network, NN generator, in response to a reference image extracted at S181, Fig.8, and inputted to the NN at Fig.13, Par.[0134], and on the program code at Fig.19, Par.[0152], to generate the predicted pixel, e.g., in "nn_intra" mode, and Fig.14, Par.[0145], and code at Fig.20 for generating the prediction image in "nn_inter" mode, Par.[0153] and/or [0143]);

calculates a prediction error by subtracting the predicted image from the input image (a prediction error is computed by subtracting the predicted image from the input current image, Par.[0104], [0105], [0120], [0124], [0136], [0158], Fig.7); and

generates an encoded image (generating the encoded image at 105, Fig.1, Par.[0136], based on prediction differences, Fig.7, Par.[0104]) by at least transforming the prediction error (encoder 100 generates the image stream at 105 by applying a frequency transform at 103, step S126, on the prediction error, i.e., as computed at the difference block S125, at subtractor 102, wherein the prediction error is generated by subtracting the prediction block from the current block, Par.[0104], [0120]).

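The predict/subtract/transform pipeline recited in claim 1 can be illustrated with a minimal toy sketch. Everything here is invented for illustration (a neighbour-averaging predictor standing in for the NN, 2x2 blocks, and a per-row Haar pair as the "transform"); it is not Terada's or the applicant's implementation:

```python
def predict_intra(left, top):
    # Toy intra predictor: average of the left/top neighbouring reference
    # pixels (a stand-in for the NN-based predictor the claim recites).
    return [[(l + t) // 2 for t in top] for l in left]

def residual(block, pred):
    # Prediction error: input block minus predicted block, element-wise.
    return [[b - p for b, p in zip(br, pr)] for br, pr in zip(block, pred)]

def transform_rows(err):
    # Minimal "transform" stage: per-row Haar pair (sum, difference).
    return [[r[0] + r[1], r[0] - r[1]] for r in err]

block = [[10, 12], [14, 16]]              # current 2x2 input block
pred = predict_intra([10, 14], [10, 12])  # -> [[10, 11], [12, 13]]
err = residual(block, pred)               # -> [[0, 1], [2, 3]]
coeff = transform_rows(err)               # -> [[1, -1], [5, -1]]
```

Only the residual (and the mode decision) needs to be coded, which is the point of the subtraction step the claim recites.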
Re Claim 3. Terada discloses the encoder according to claim 1, wherein the reference image is a processed image included in a picture, the picture including the input image, and in generating the predicted image, the processing circuitry generates a first intra-predicted image as the predicted image, based on the generated data (the neighboring pixels represent the processed reference image included in the picture that includes the input image, by which the first intra-predicted data is generated, Fig.7 and 14, per block 110 in Fig.1, Par.[0108]-[0109], as indicated by the intra-prediction switch for "Fixed Intra-Predictor", i.e., regular intra mode, at block 110b in Fig.4, Par.[0121]-[0123]).

Re Claim 4. Terada discloses the encoder according to claim 3, wherein the processing circuitry further:

generates a second intra-predicted image of the input image by intra prediction based on the reference image (generating the second intra-predicted image by switching from 110b to the 110a NN prediction mode, Par.[0124]);

selects an image from among the first intra-predicted image and the second intra-predicted image (selecting one of the plurality of indicated intra-predicted input images, Fig.22, e.g., a second predicted image block, by applying the instruction code for fixed "intra_pred_type" based on the neighboring references at 110b, or indicating which one of the NN intra-predictions is used, Par.[0154], according to the accuracy level determined at Par.[0124], as depicted at switching block 110a in Fig.1 and Par.[0124]); and

when the processing circuitry selects the second intra-predicted image in selecting the image, calculates the prediction error by subtracting the second intra-predicted image from the input image in calculating the prediction error (the selected second intra-predicted block at 110 in Fig.1, 110a, i.e., the second intra-predicted image block, encodes the prediction error according to the NN, Par.[0120]).

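The fixed-versus-NN selection described for claim 4 amounts to picking whichever candidate predictor best matches the input block. A hedged sketch (the SAD cost, the block values, and the mode names are invented for illustration, not taken from Terada):

```python
def sad(block, pred):
    # Sum of absolute differences: a common prediction-accuracy measure.
    return sum(abs(b - p)
               for br, pr in zip(block, pred)
               for b, p in zip(br, pr))

def select_mode(block, candidates):
    # Pick the candidate predictor whose output is closest to the input
    # block, mirroring the fixed-vs-NN intra switch described above.
    return min(candidates, key=lambda c: sad(block, c[1]))

block = [[10, 12], [14, 16]]
fixed_pred = [[11, 11], [11, 11]]   # hypothetical fixed intra output
nn_pred = [[10, 12], [13, 15]]      # hypothetical NN intra output
mode, pred = select_mode(block, [("fixed_intra", fixed_pred),
                                 ("nn_intra", nn_pred)])
# nn_intra wins here: its SAD is 2 against 10 for the fixed predictor.
```

The chosen mode indicator is what a bitstream syntax element such as the "intra_pred_type" switch discussed above would then signal to the decoder.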
Re Claim 5. Terada discloses the encoder according to claim 3, wherein in generating the predicted image, the processing circuitry generates, as the first intra-predicted image, the generated data output from the generator network in response to the reference image being input to the generator network (processing, i.e., generating, the first selected intra-predicted block, i.e., the first intra-predicted image block, encodes the prediction error in regard to the selected reference image inputted to the NN generator, Par.[0120], Fig.33).

Re Claim 6. Terada discloses the encoder according to claim 3, wherein in generating the predicted image, the processing circuitry: obtains, as an intra prediction parameter, the generated data output from the generator network in response to the reference image being input to the generator network, and generates the first intra-predicted image by intra prediction based on the reference image and the intra prediction parameter (see Fig.7, obtaining, i.e., determining, the intra-prediction parameter, based on which the first intra-predicted image is generated, Par.[0114] and Par.[0130], [0131], and Fig.13, Par.[0134]).

Re Claim 7. Terada discloses the encoder according to claim 1, wherein the reference image is included in a processed picture different from a picture that includes the input image, and in generating the predicted image, the processing circuitry generates a first inter-predicted image as the predicted image, based on the generated data (switching to inter-prediction mode at block 111 in Fig.1 and relying on a different reference picture is disclosed in Terada at least at Par.[0118]-[0120], by which a first predicted image is generated in inter-prediction mode based on the reference image, Par.[0137]-[0139]).

(This claim is rejected under 35 U.S.C. 112(b) and requires clarification prior to assessing a proper examination.)

Re Claim 8. Terada discloses the encoder according to claim 7, wherein the processing circuitry further:

generates a second inter-predicted image of the input image by inter prediction based on the reference image (switching between "fixed inter-prediction" mode 111b in Fig.1, and generating a second predicted image in inter-prediction mode based on the reference image per Par.[0137]-[0138], according to conditions set at Par.[0178]);

selects an image from among the first inter-predicted image and the second inter-predicted image (selecting from first and second inter-prediction modes 111b and 111a, or using the second layer, to which a reference pixel is connected at a node, as a second inter-predicted pixel, Par.[0178]); and

when the processing circuitry selects the second inter-predicted image in selecting the image, calculates the prediction error by subtracting the second inter-predicted image from the input image in calculating the prediction error (upon selecting the second inter-predicted image, the calculation of the prediction error is performed as previously established at Par.[0170]). (However, this claim is also rejected under 35 U.S.C. 112(b) for depending from claim 7 and requires clarification prior to assessing a proper examination.)

Re Claim 9. Terada discloses the encoder according to claim 7, wherein in generating the predicted image, the processing circuitry generates, as the first inter-predicted image, the generated data output from the generator network in response to the reference image being input to the generator network (this limitation follows the specifics of the inter-prediction mode selected at the NN position 111a in Fig.1, block 111, and is disclosed at Par.[0119], [0120] and executed per the NN in Fig.14, Par.[0143], [0145]). (However, this claim is also rejected under 35 U.S.C. 112(b) for depending from claim 7 and requires clarification prior to assessing a proper examination.)

Re Claim 10. Terada discloses the encoder according to claim 7, wherein in generating the predicted image, the processing circuitry:

obtains, as an inter prediction parameter (per the syntax of the parameter in the NN inter-prediction code at Fig.16 and 26), the generated data output from the generator network in response to the reference image being input to the generator network (obtaining the inter-prediction parameter, the data generated at the output of the NN according to the inputted image, Fig.14, Par.[0143]); and

generates the first inter-predicted image by inter prediction based on the reference image and per the inter prediction parameter (generating the first inter-predicted image based on the reference and the NN inter-prediction parameter, Par.[0137]-[0139] and Fig.8). (However, this claim is also rejected under 35 U.S.C. 112(b) for depending from claim 7 and requires clarification prior to assessing a proper examination.)

Re Claim 11. Terada discloses the encoder according to claim 1, wherein the generator network is a hierarchical network that includes an input layer, a hidden layer, and an output layer (the generator NN is of a hierarchical type, per Fig.13, 14).

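The input/hidden/output structure that claim 11 calls "hierarchical" is an ordinary feed-forward layering. A minimal sketch, with invented weights and a sigmoid hidden layer (the specific activation and sizes are illustrative assumptions, not taken from the claim or from Terada):

```python
import math

def forward(x, w_hidden, w_out):
    # Input layer -> one sigmoid hidden layer -> linear output layer:
    # the three-tier structure recited in claim 11.
    hidden = [1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
              for w in w_hidden]
    return sum(wo * h for wo, h in zip(w_out, hidden))

# Hypothetical weights: 2 inputs -> 2 hidden units -> 1 output.
w_hidden = [[0.5, -0.25], [0.1, 0.4]]
w_out = [1.0, -1.0]
y = forward([1.0, 2.0], w_hidden, w_out)
```

In the claimed setting the inputs would be reference pixels and the output a predicted pixel; here they are plain numbers to keep the sketch self-contained.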
Re Claim 12. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 1, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 14. This claim represents the decoding part of the prediction loop made part of the encoder represented in a limitation of claim 3, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 15. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 4, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 16. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 5, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 17. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 6, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 18. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 7, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 19. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 8, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 20. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 9, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 21. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 10, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 22. This claim represents the decoding part of the prediction loop made part of the encoder represented in claim 11, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 23. This claim represents the encoding method being implemented at each limiting step by the encoding apparatus of claim 1, performing a similar process; hence it is rejected on the same evidentiary premise, mutatis mutandis.

Re Claim 24. This claim represents the decoding method being implemented at each limiting step by the decoding apparatus of claim 12, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

5. Claims 2 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Terada in view of Oren Rippel et al. (hereinafter Rippel) (US 2018/0174052) in lieu of Prov. App. 62/434,600, 62/434,602, 62/434,603 and 62/458,749.

Re Claim 2. Terada discloses the encoder according to claim 1, but does not expressly teach providing feedback to the NN generator from a discriminator network, wherein the processing circuitry further:

Rippel teaches about the: feeds back, to the generator network, a probability that the predicted image matches the input image by inputting the input image and the predicted image to a discriminator network, the discriminator network being a neural network and constituting a generative adversarial network (GAN) with the generator network (a generative adversarial network, Title, Abstract); and

updates the generator network and the discriminator network to reduce a difference between the input image and the predicted image and increase accuracy of discriminating between the input image and the predicted image (updating the discriminator and the encoder to minimize the loss, Par.[0014], i.e., to reduce the difference between the original image and the reconstructed content, as earlier taught at Par.[0019] per discriminators 704 and the feedback output 734 in Fig.7, Par.[0071], in order to increase the prediction accuracy by reducing the error corresponding with the encoder loss, Par.[0075]-[0077]).

In consideration of the prediction process using neural networks identified in Terada, referencing reducing the prediction error (Par.[0136]) in order to improve the coding efficiency (Par.[0158]) by switching the encoding modes for the best image quality (Par.[0445], [0446], Fig.22) and respectively reducing the image distortion, while not teaching about an image matching probability through application of feedback at a discriminator, one of ordinary skill in the art would have had the incentive, before the effective filing date of the application, to search for similar NN prediction modes seeking to reduce the difference between the current image and the predicted value, as identified in Rippel, which discloses a NN coding method and reduces the prediction error by using a discriminator error feedback (Par.[0076], [0077]), thus deeming the combination predictable, hence obviating the claim.

The rationale to combine finds support in the Graham factual inquiries necessary to substantiate the above combination, in view of the instant fact case under consideration and in accordance with explaining the conclusion of obviousness in view of the provisions stipulated in MPEP 2143: Basic Requirements of a Prima Facie Case of Obviousness, EXEMPLARY RATIONALES (A), (D) and (G) that may support a conclusion of obviousness as evidenced above, including:

(A) Combining prior art elements according to known methods to yield predictable results: improving the prediction error processing by applying a feedback mechanism to the NN prediction method as disclosed in Rippel (Par.[0076], [0077]), by combining with Terada's suggestions for error reduction (Par.[0445], [0446], Fig.22);

(D) Applying a known technique to a known device (method, or product) ready for improvement to yield predictable results (where the method in Rippel is known and adapted in the art per the preliminary provisional teachings);

(G) Some teaching, suggestion, or motivation in the prior art that would have led one of ordinary skill to modify the prior art reference or to combine prior art reference teachings to arrive at the claimed invention (where the suggestion to combine relies on the common interest to reduce the prediction error identified in both arts).

See precedent in: "The Federal Circuit recognized Agrizap as 'a textbook case of when the asserted claims involve a combination of familiar elements according to known methods that does no more than yield predictable results.' Id. Agrizap exemplifies a strong case of obviousness based on simple substitution that was not overcome by the objective evidence of nonobviousness offered. It also demonstrates that analogous art is not limited to the field of applicant's endeavor, in that one of the references that used an animal body as a resistive switch to complete a circuit for the generation of an electric charge was not in the field of pest control."

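The GAN training loop attributed to Rippel (a generator updated against a discriminator's fed-back match probability, while the discriminator is updated to tell real from predicted) can be sketched at toy scale. Everything here is invented for illustration: one-pixel "images", a logistic discriminator, and hand-derived gradients of the usual cross-entropy GAN losses; it is not Rippel's method or code:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy GAN: the generator's "predicted image" is one parameter g; the
# discriminator D(v) = sigmoid(w*v + b) returns the probability that a
# pixel v is the real one, and that probability is fed back to g.
g, w, b, lr, real = 0.0, 0.1, 0.0, 0.2, 1.0
for _ in range(300):
    # Discriminator step: raise D(real), lower D(g) (manual gradients
    # of -log D(real) - log(1 - D(g)) with respect to w and b).
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * g + b)
    w += lr * ((1 - d_real) * real - d_fake * g)
    b += lr * ((1 - d_real) - d_fake)
    # Generator step: gradient of -log D(g) moves g toward whatever the
    # discriminator currently scores as real.
    d_fake = sigmoid(w * g + b)
    g += lr * (1 - d_fake) * w
# g drifts from 0.0 toward the "real" pixel value 1.0.
```

The two alternating updates mirror the claimed scheme: the discriminator improves at telling input from predicted images, and the fed-back probability drives the generator to reduce the difference between them.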
Re Claim 13. This claim represents the decoding prediction loop being an intrinsic part of the encoder represented in a limitation of claim 2, hence performing a similar prediction process in the decoding of the reconstructed picture by following the same limiting steps and in similar order; thus it is rejected on the same evidentiary premise, mutatis mutandis.

Conclusion

6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See the PTO-892 form. Applicant is required under 37 C.F.R. 1.111(c) to consider these references when responding to this action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DRAMOS KALAPODAS, whose telephone number is (571) 272-4622. The examiner can normally be reached Monday-Friday, 8am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Czekaj, can be reached at 571-272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

DRAMOS KALAPODAS
Primary Examiner
Art Unit 2487

/DRAMOS KALAPODAS/