UNITED STATES PATENT AND TRADEMARK OFFICE
www.uspto.gov

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450

APPLICATION NO.: 16/664,084
FILING DATE: 10/25/2019
FIRST NAMED INVENTOR: Takahiro NISHI
ATTORNEY DOCKET NO.: 2019-1804A
CONFIRMATION NO.: 7418

Wenderoth, Lind & Ponack, L.L.P.
1025 Connecticut Avenue, NW
Suite 500
Washington, DC 20036

EXAMINER: KALAPODAS, DRAMOS
ART UNIT (PAPER NUMBER): 2487
NOTIFICATION DATE: 03/03/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding.

The time period for reply, if any, is set in the attached communication.

Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es):
eoa@wenderoth.com
kmiller@wenderoth.com

PTOL-90A (Rev. 04/07)
Office Action Summary

Application No.: 16/664,084
Applicant(s): NISHI et al.
Examiner: DRAMOS KALAPODAS
Art Unit: 2487
AIA (FITF) Status: Yes

-- The MAILING DATE of this communication appears on the cover sheet with the correspondence address --

Period for Reply

A SHORTENED STATUTORY PERIOD FOR REPLY IS SET TO EXPIRE 3 MONTHS FROM THE MAILING DATE OF THIS COMMUNICATION.
- Extensions of time may be available under the provisions of 37 CFR 1.136(a). In no event, however, may a reply be timely filed after SIX (6) MONTHS from the mailing date of this communication.
- If NO period for reply is specified above, the maximum statutory period will apply and will expire SIX (6) MONTHS from the mailing date of this communication.
- Failure to reply within the set or extended period for reply will, by statute, cause the application to become ABANDONED (35 U.S.C. § 133).
- Any reply received by the Office later than three months after the mailing date of this communication, even if timely filed, may reduce any earned patent term adjustment. See 37 CFR 1.704(b).

Status

1) [X] Responsive to communication(s) filed on 12/18/2020.
   [ ] A declaration(s)/affidavit(s) under 37 CFR 1.130(b) was/were filed on ___.
2a) [X] This action is FINAL.    2b) [ ] This action is non-final.
3) [ ] An election was made by the applicant in response to a restriction requirement set forth during the interview on ___; the restriction requirement and election have been incorporated into this action.
4) [ ] Since this application is in condition for allowance except for formal matters, prosecution as to the merits is closed in accordance with the practice under Ex parte Quayle, 1935 C.D. 11, 453 O.G. 213.

Disposition of Claims*

5) [X] Claim(s) 1, 3-12 and 14-24 is/are pending in the application.
   5a) Of the above claim(s) ___ is/are withdrawn from consideration.
6) [ ] Claim(s) ___ is/are allowed.
7) [X] Claim(s) 1, 3-12 and 14-24 is/are rejected.
8) [ ] Claim(s) ___ is/are objected to.
9) [ ] Claim(s) ___ are subject to restriction and/or election requirement.

* If any claims have been determined allowable, you may be eligible to benefit from the Patent Prosecution Highway program at a participating intellectual property office for the corresponding application. For more information, please see http://www.uspto.gov/patents/init_events/pph/index.jsp or send an inquiry to PPHfeedback@uspto.gov.

Application Papers

10) [ ] The specification is objected to by the Examiner.
11) [X] The drawing(s) filed on 10/25/2019 is/are: a) [X] accepted or b) [ ] objected to by the Examiner.
    Applicant may not request that any objection to the drawing(s) be held in abeyance. See 37 CFR 1.85(a).
    Replacement drawing sheet(s) including the correction is required if the drawing(s) is objected to. See 37 CFR 1.121(d).

Priority under 35 U.S.C. § 119

12) [X] Acknowledgment is made of a claim for foreign priority under 35 U.S.C. § 119(a)-(d) or (f).
    Certified copies:
    a) [X] All   b) [ ] Some**   c) [ ] None of the:
    1. [X] Certified copies of the priority documents have been received.
    2. [ ] Certified copies of the priority documents have been received in Application No. ___.
    3. [ ] Copies of the certified copies of the priority documents have been received in this National Stage application from the International Bureau (PCT Rule 17.2(a)).
** See the attached detailed Office action for a list of the certified copies not received.

Attachment(s)

1) [X] Notice of References Cited (PTO-892)
2) [ ] Information Disclosure Statement(s) (PTO/SB/08a and/or PTO/SB/08b), Paper No(s)/Mail Date ___
3) [ ] Interview Summary (PTO-413), Paper No(s)/Mail Date ___
4) [ ] Other: ___

U.S. Patent and Trademark Office
PTOL-326 (Rev. 11-13)    Office Action Summary    Part of Paper No./Mail Date 20210226
Application/Control Number: 16/664,084
Art Unit: 2487

DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

2. Claims 1, 3-12 and 14-24 are currently pending.

Claims 2 and 13 have been cancelled.

The rejection of claims 7 and 18 under 35 U.S.C. 112(b) is withdrawn upon amendment.

Response to Arguments

3. Applicant's arguments with respect to the rejection(s) of claims 1, 3-12 and 14-24 have been fully considered and are unpersuasive.

3-1. Applicant's Argument

i) Applicant alleges that "Rippel discloses technology of improving the accuracy achieved by an encoder and a discriminator by a GAN for image encoding technology. Although Fig. 3 of Rippel illustrates that Prediction 318 is output from Discriminator 304, Applicant notes that Prediction 318 indicates whether the input to Discriminator 304 is an original image or a reconstructed image (see [0006]). Accordingly, it is respectfully submitted that Prediction 318 is not a predicted image."

ii) Therefore, Applicant notes that Discriminator 304 of Rippel discriminates whether the input is an original image or a reconstructed image, and as such, it is respectfully submitted that Rippel fails to teach that Discriminator 304 outputs a probability that a predicted image, which is neither an original image nor a reconstructed image, matches an original image to feed back the probability to the generator network.
Applicant respectfully submits that any combination of Terada and Rippel would, at best, teach encoding an original image using the image encoding apparatus of Terada, generating a reconstructed image by further decoding the encoded original image, and adjusting the neural networks of the predictors (i.e., the NN intra predictor and the NN inter predictor of Terada) of the image encoding apparatus based on a result of discriminating whether the input to the discriminator is an original image or a reconstructed image, as taught by Rippel.

iii) However, it is respectfully submitted that any combination of Terada and Rippel fails to teach that a predicted image is generated using a generator network, and that a probability that the predicted image matches an original image (e.g., the input image required by claim 1) is fed back to the generator network using a discriminator network in order to update the generator network. Accordingly, any combination of Terada and Rippel necessarily fails to teach "feeds back, to the generator network, a probability that the predicted image matches the input image by inputting the input image and the predicted image to a discriminator network, the discriminator network being a neural network and constituting a generative adversarial network (GAN) with the generator network" and "updates the generator network and the discriminator network to reduce difference between the input image and the predicted image and increase accuracy of discriminating between the input image and the predicted image," as required by the above-noted features of claim 1.
3-2. Examiner's Rebuttal

To point 3-1(i)

To this point, the Examiner contends that Applicant's arguments against the mapping of claim 2, regarded by the Office as referencing the paragraphs (e.g., Par. [0006]), the figures (e.g., Fig. 3), or the discriminator 304 and its output 318 allegedly introduced herein, are directed in error to portions captured from Rippel, by which Applicant advances the rationale that the Office improperly interpreted Rippel as considering the output (id., 318) of the discriminator (id., 304) to represent a predicted image, in a different embodiment that the Office views as irrelevant.

The argued paragraphs and figures are not part of the first Action on the merits, hence the response to the Office Action of 07/21/2020, and at this point the argument is considered moot for failing to address the subject matter referenced in the 35 U.S.C. 103 rejection.

Please refer to MPEP 707.07, Unpersuasive Argument: Applicant Obtains Result Not Contemplated by Prior Art:

In response to applicant's argument that [the recognized advantages over paragraphs not used or referenced by the Office in the rejection, as stated at point 3-1(i) and repeated at point 3-1(ii)], the fact that applicant has recognized another advantage which would flow naturally from following the suggestion of the prior art cannot be the basis for patentability when the differences would otherwise be obvious. See Ex parte Obiaya, 227 USPQ 58, 60 (Bd. Pat. App. & Inter. 1985).
To point 3-1(ii)

In rebuttal, it is remarked that in rejecting claim 2, herein addressed by amendment, the Examiner did not reference or map the claimed matter to the discriminator element 304, nor misinterpret the probability error output by unit 304 as being a predicted image, as alleged herein. Hence, the presumptive logic introduced by Applicant in argument, citing "any combination of Terada and Rippel necessarily fails to teach...", may not be considered relevant, as it references a different embodiment of the art in response to the Office action rejection of claim 2. See the evidence mapped at the respective claim.

Thus, Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the combination of art distinguishes from the claim.

Furthermore, Applicant's specific analysis of Rippel, being directed to different portions identified from the respective description, was not relied upon in the Office Action (O.A.) rejection. See the rebuttal at points 3-1(i) and (ii), addressing the combined references of Terada and Rippel, while the argument fails to provide an evidentiary-based response to the 35 U.S.C. 103 rejection beyond counsel's legal argument.
Please refer to MPEP 700, 707.07(f), Answer All Material Traversed; Unpersuasive Argument: No Teaching, Suggestion, or Motivation To Combine:

In response to applicant's argument that there is no teaching, suggestion, or motivation to combine the references, the examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988); In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992); and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). In this case, [in support of the combined arts of Terada and Rippel, it is found that, based on the suggestions in Rippel, the art of Terada teaches the claimed matter of applying the input image and the predicted image to the NN prediction parameter determiner 109 in Fig. 1, which outputs the parameter to the intra/inter predictor of the encoder 300 in Fig. 1 and Fig. 35, and which is fed back to the intra/inter NN predictors 110 and 111, respectively; Par. [0243]-[0246]].
To point 3-1(iii)

The Examiner reiterates the evidentiary probe mapped at claim 2, presently embedded into claim 1 by amendment, and remarks that the claim rejection has been traversed solely on counsel argument corresponding to the claim language. The response is directed to the provisions of MPEP 2145, stipulating: "However, arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984)."

Conclusion as to Finality

In view of the above rebuttals to Applicant's arguments, which refer to other portions of the reference, submitting new portions to argue that the combined arts of Terada and Rippel would in fact teach away from the disclosure relied upon by the Office, the Examiner maintains the finality of the rejection, supplementing the original rejection in responding to the arguments provided.

Applicant's representative is encouraged to contact the Examiner with any matter deemed to advance prosecution.
Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

4. Claims 1, 3-12 and 14-24 are rejected under 35 U.S.C. 103 as being unpatentable over Kengo Terada et al. (hereinafter Terada) (US 2018/0184123) in view of Oren Rippel et al. (hereinafter Rippel) (US 2018/0174052) in lieu of Prov. App. 62/434,600, 62/434,602, 62/434,603 and 62/458,749.
Re Claim 1. (Currently Amended) Terada discloses an encoder, comprising (an encoder 105 in Fig. 35 or encoder 10 in Fig. 69A):

processing circuitry (an encoding processor 100 in Fig. 1); and

memory, wherein using the memory, the processing circuitry (frame memory 112 in Fig. 1 and step S165 in Fig. 6, Par. [0128]):

generates a predicted image of an input image that is a current image to be encoded (generating an intra/inter predicted image, steps S121, S122 in Fig. 3, by the intra/inter prediction generator, Par. [0119], of the current image to be encoded at apparatus 100 in Fig. 2, where the predicted image is generated by the NN in intra mode per Fig. 4 or in inter mode per Fig. 5), based on generated data output from a generator network (and being based on the data output from the mode network generators for intra/inter prediction at Figs. 4 and 5, respectively, the encoding being based on a neural network, NN, e.g., as generated by a function f outputting the prediction mode, per Fig. 7, Par. [0136], according to a parameter determiner 109 setting different prediction and coding NN modes, Par. [0112]) in response to a reference image being input to the generator network, the generator network being a neural network (the image prediction being based on the output data generated by the neural network, the NN generator, in response to a reference image extracted at S165, Fig. 6, and input to the NN at Fig. 13, Par. [0134], and on the program code at Fig. 19, Par. [0152], to generate the predicted pixel, e.g., in "nn_intra" mode, and Fig. 14, Par. [0145], and the code at Fig. 20 for generating the prediction image in "nn_inter" mode, both being neural networks, Par. [0153] and/or [0143]);

calculates a prediction error by subtracting the predicted image from the input image (a prediction error is computed by subtracting the predicted image from the input current image, Par. [0104], [0105], [0120]-[0124], [0136], [0138], Fig. 7); and

generates an encoded image by at least transforming the prediction error (and further generating the encoded image at 105, Fig. 1, Par. [0136], based on the prediction differences, Fig. 7, Par. [0104], at encoder 100 generating the image stream at 105 by applying a frequency transform at 103, step S126, on the prediction error, i.e., computed at the difference block S125 at subtractor 102, wherein the prediction error is generated by subtracting the prediction block from the current block, Par. [0104], [0120]);

feeds back, to the generator network, a probability that the predicted image matches the input image by inputting the input image and the predicted image to a discriminator network (feeding back to the intra/inter NN generator the feedback parameters from the NN by matching the input image to the predicted image, Fig. 7, Par. [0124], [0130]-[0132] or [0136], or based on highest correlation in the case of inter NN prediction, Par. [0139]-[0141]);

[Fig. 7 flowchart reproduced in the original action: input current block and neighboring pixels to the neural network, and determine the parameter for calculating the prediction]

updates the generator network and the discriminator network to reduce difference between the input image and the predicted image and increase accuracy of discriminating between the input image and the predicted image (updating the generator network, the "NN intra/inter Predictor" 110 and 111, and the discriminator network 109A with data from the input image and data from the predicted image, i.e., the prediction block, in Fig. 35, to increase the accuracy of the predicted image by reducing the difference error, or discriminating between the input and predicted image to be small, per step S172 in Fig. 7, and Par. [0243]-[0246]).

[FIG. 36 reproduced in the original action with highlights for brevity: difference block, subtractor, and bitstream output]
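The encoder pipeline mapped above (generate a predicted image, subtract it from the input to obtain the prediction error, then frequency-transform the error before coding) can be sketched numerically. This is an illustrative simplification, not code from Terada: the mean-valued predictor, the 4x4 block size, and the function names are all assumptions.

```python
import numpy as np

def predict_block(reference: np.ndarray) -> np.ndarray:
    """Stand-in for the NN intra/inter predictor: predicts every pixel
    as the mean of the reference (neighboring) pixels."""
    return np.full_like(reference, reference.mean())

def dct_2d(block: np.ndarray) -> np.ndarray:
    """Orthonormal separable 2-D DCT-II: the frequency transform
    applied to the prediction error before entropy coding."""
    n = block.shape[0]
    k = np.arange(n)
    basis = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    basis[0, :] = np.sqrt(1.0 / n)          # DC row
    return basis @ block @ basis.T

current = np.arange(16, dtype=float).reshape(4, 4)  # block to encode
reference = np.ones((4, 4)) * 7.5                   # processed reference pixels
predicted = predict_block(reference)                # predicted image
residual = current - predicted                      # prediction error
coeffs = dct_2d(residual)                           # transformed error
```

Because the predictor here happens to match the block mean, the DC coefficient of the transformed residual is zero, which is the usual point of subtracting a prediction before transforming.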
Within the same neural network prediction and coding apparatus and method, based on neural network error feedback determined between the input and the predicted image as described in Terada, the art of Rippel expressly teaches using a generative adversarial network (GAN) with the generator network: feeds back, to the generator network, a probability that the predicted image matches the input image by inputting the input image and the predicted image to a discriminator network, the discriminator network being a neural network and constituting a generative adversarial network (GAN) with the generator network (a generative adversarial network, Title, Abstract); and

updates the generator network and the discriminator network to reduce difference between the input image and the predicted image and increase accuracy of discriminating between the input image and the predicted image (updating the discriminator and the encoder to minimize the loss, Par. [0014], i.e., to reduce the difference between the original image and the reconstructed content, as earlier taught at Par. [0010], per the discriminators 704 and the feedback output 734 in Fig. 7, Par. [0073], in order to increase the prediction accuracy by reducing the error corresponding with the encoder loss, Par. [0075]-[0077]).

[Rippel Fig. 7 reproduced in the original action: autoencoder blocks with discriminator feedback]
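The GAN arrangement Rippel is cited for (a discriminator that outputs the probability its input is the original image, fed back so that both the generator and the discriminator are updated) can be illustrated with a deliberately tiny numeric sketch. Everything here is a hypothetical simplification, not code from Terada or Rippel: the "images" are single scalars, and the parameter names, learning rate, and distortion-plus-adversarial weighting are assumptions.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

original = 2.0         # the input image, reduced to one "pixel"
g_w = 0.0              # generator parameter; its predicted image is g_w
d_w, d_b = 1.0, 0.0    # discriminator: p(original) = sigmoid(d_w*x + d_b)
lr, adv = 0.1, 0.1     # learning rate; weight of the adversarial term

for _ in range(200):
    predicted = g_w
    p_real = sigmoid(d_w * original + d_b)
    p_fake = sigmoid(d_w * predicted + d_b)   # probability fed back
    # Discriminator update: ascend log p_real + log(1 - p_fake),
    # increasing the accuracy of discriminating the two inputs.
    d_w += lr * ((1.0 - p_real) * original - p_fake * predicted)
    d_b += lr * ((1.0 - p_real) - p_fake)
    # Generator update: reduce the difference from the input image
    # (distortion term) plus the adversarial term from the feedback.
    g_w += lr * ((original - predicted) + adv * (1.0 - p_fake) * d_w)
```

The loop shows both halves of the claimed update: the generator's predicted image is pulled toward the input image while the discriminator's probability is fed back, mirroring the distortion-plus-adversarial loss structure of GAN-based compression.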
In consideration of the prediction process using neural networks identified in Terada, applied to reduce the prediction error (Par. [0136]) in order to improve the coding efficiency (Par. [0158]) by switching the encoding modes for the best image quality (Par. [0445], [0446], Fig. 22) and thereby reducing the image distortion, indicating the presence of the error feedback determined at the NN prediction 109A and fed back to the NN inter/intra predictors 110 and 111, e.g., viewed as the NN prediction discriminator, one of ordinary skill in the art would have had the incentive, before the effective filing date of the application, to search for similar NN prediction modes seeking to reduce the difference between the current image and the predicted value, as identified in Rippel, which discloses a NN coding method and reduces the prediction error by using discriminator error feedback (Par. [0076], [0077]), thus rendering the combination predictable and obviating the claim.

The rationale to combine finds support in the Graham factual inquiries necessary to substantiate the above combination, in view of the instant fact case under consideration and in accordance with explaining the conclusion of obviousness per the provisions stipulated in MPEP 2143, Basic Requirements of a Prima Facie Case of Obviousness, I. EXEMPLARY RATIONALES (A), (D) and (G), that may support a conclusion of obviousness as evidenced above, including:

(A) Combining prior art elements according to known methods to yield predictable results: improving the prediction error processing by applying the feedback mechanism to the NN prediction method as disclosed in Rippel (Par. [0076], [0077]) in combination with Terada's suggestions for error reduction (Par. [0445], [0446], Fig. 22);

(D) Applying a known technique to a known device (method, or product) ready for improvement to yield predictable results (where the method in Rippel is known and adopted in the art per the preliminary provisional teachings);

(G) Some teaching, suggestion, or motivation in the prior art that would have led one of ordinary skill to modify the prior art reference or to combine prior art reference teachings to arrive at the claimed invention (where the suggestion to combine relies on the common interest in reducing the prediction error identified in both arts).

See precedent: "The Federal Circuit recognized Agrizap as 'a textbook case of when the asserted claims involve a combination of familiar elements according to known methods that does no more than yield predictable results.' Id. Agrizap exemplifies a strong case of obviousness based on simple substitution that was not overcome by the objective evidence of nonobviousness offered. It also demonstrates that analogous art is not limited to the field of applicant's endeavor, in that one of the references that used an animal body as a resistive switch to complete a circuit for the generation of an electric charge was not in the field of pest control."
2. (Cancelled)

Re Claim 3. (Original) Terada and Rippel disclose the encoder according to claim 1, wherein

Terada teaches that the reference image is a processed image included in a picture, the picture including the input image, and

in generating the predicted image, the processing circuitry generates a first intra-predicted image as the predicted image, based on the generated data (the neighboring pixels represent the processed reference image included in the picture that includes the input image, by which the first intra-predicted data is generated, Figs. 7 and 14, per block 110 in Fig. 1, Par. [0108]-[0109], e.g., as indicated by the intra-prediction switch for the "Fixed Intra Predictor", i.e., regular intra mode, at block 110b in Fig. 1, Par. [0121]-[0123]).

Re Claim 4. (Original) Terada and Rippel disclose the encoder according to claim 3, wherein the processing circuitry further:

Terada teaches that the processor generates a second intra-predicted image of the input image by intra prediction based on the reference image (the processor generates the second intra-predicted image by switching from 110b to 110a, the NN prediction mode, Par. [0124]);

selects an image from among the first intra-predicted image and the second intra-predicted image (selecting one of the plurality of indicated intra-predicted input images, Fig. 22, e.g., a second predicted image block, by applying the instruction code for fixed "intra_pred_type" based on the neighboring references at 110b, or indicating which one of the NN intra-predictions is used, Par. [0154], according to the accuracy level determined at Par. [0124], as depicted at switching block 110a in Fig. 1 and Par. [0124]); and

when the processing circuitry selects the second intra-predicted image in selecting the image, calculates the prediction error by subtracting the second intra-predicted image from the input image in calculating the prediction error (the selected second intra-predicted block at 110 in Fig. 1, 110a, i.e., the second intra-predicted image block, encodes the prediction error according to the NN, Par. [0120]).
Re Claim 5. (Original) Terada and Rippel disclose the encoder according to claim 3, wherein

Terada teaches wherein, in generating the predicted image, the processing circuitry generates, as the first intra-predicted image, the generated data output from the generator network in response to the reference image being input to the generator network (processing, i.e., generating, the first selected intra-predicted block, i.e., the first intra-predicted image block, encodes the prediction error with regard to the selected reference image input to the NN generator, Par. [0120], Fig. 13).

Re Claim 6. (Original) Terada and Rippel disclose the encoder according to claim 3, wherein

Terada teaches wherein, in generating the predicted image, the processing circuitry: obtains, as an intra prediction parameter, the generated data output from the generator network in response to the reference image being input to the generator network; and generates the first intra-predicted image by intra prediction based on the reference image and the intra prediction parameter (see Fig. 7, obtaining, i.e., determining, the intra-prediction parameter, based on which the first intra-predicted image is generated, Par. [0114], and Fig. 13, Par. [0134]).

Re Claim 7. (Currently Amended) Terada and Rippel disclose the encoder according to claim 1, wherein

Terada teaches that the reference image is included in a processed picture that is different from a picture that includes the input image, the processed picture being a picture on which the encoder has already performed processing related to encoding, and

in generating the predicted image, the processing circuitry generates a first inter-predicted image as the predicted image, based on the generated data (an inter-prediction mode at block 111 relies on a reference picture that has passed through the encoding process at units 103-104 and then the prediction loop at 107-108 as a reference picture applied to the NN inter-predictor 111a in Fig. 1, Par. [0116]-[0120], by which a first predicted image is generated in inter-prediction mode based on the generated reference image data, Par. [0137]-[0139]).
Re Claim 8. (Original) Terada and Rippel disclose the encoder according to claim 7, wherein the processing circuitry further:

Terada teaches that the processor generates a second inter-predicted image of the input image by inter prediction based on the reference image (generating a second prediction by switching to the "fixed inter-prediction" mode 111b in Fig. 1, where a second predicted image is generated in inter-prediction mode based on the reference image per Par. [0137]-[0139], according to conditions set at Par. [0178]);

selects an image from among the first inter-predicted image and the second inter-predicted image (selecting from the first and second inter-prediction modes 111b and 111a, or using the second layer, to which a reference pixel is connected at a node, as a second inter-predicted pixel, Par. [0178]); and

when the processing circuitry selects the second inter-predicted image in selecting the image, calculates the prediction error by subtracting the second inter-predicted image from the input image in calculating the prediction error (upon selecting the second inter-predicted image, performing the calculation of the prediction error as previously established at Par. [0170]). (However, this claim is also rejected under 35 U.S.C. 112(b) for depending from claim 7, and requires clarification prior to assessing a proper examination.)
`Application/Control Number: 16/664,084
`Art Unit: 2487
`
`Page 16
`
`Re Claim 9. (Orginal Terada anc Rippel disclose, the encocler according to
`
`clair 7, wherein
`
`Terada icaches about, in generating the predicied image, ihe processing
`
`circuilry generates, as the first inter-~ predicted image, the generated data cutout from
`
`ihe generaior nelwark in response to the reference image being inpul to the generaior
`
`network (he infer-prediction mode selected al the NN position Tita in Fig.d block
`
`ti and is disclosed at Par fO11S], [0120] and executed per NN generator block in
`
`Fig.i4 Par (6143), (0145).
`
Re Claim 10. (Original) Terada and Rippel disclose the encoder according to
claim 7, wherein in generating the predicted image, the processing circuitry:

Terada teaches about, obtains, as an inter prediction parameter (per syntax of
parameter in NN inter-prediction code at Fig.15 and 20), the generated data output
from the generator network in response to the reference image being input to the
generator network (obtaining the inter-prediction parameter, the data generated at
the output of the NN according to the inputted image, Fig.14, Par. [0143]); and

generates the first inter-predicted image by inter prediction based on the
reference image and the inter prediction parameter (generating the first inter-
predicted image based on the reference and the NN inter-prediction parameter,
Par. [0137]-[0139] and Fig.8).
Re Claim 11. (Original) Terada and Rippel disclose the encoder according to
claim 1, wherein

Terada teaches about, the generator network is a hierarchical network that
includes an input layer, a hidden layer, and an output layer (the NN generator is of a
hierarchical type, per Fig.13, 14).
Re Claim 12. (Currently Amended) This claim represents the decoding part of the
prediction loop made part of the encoder represented in claim 1, hence performing a
similar prediction process in the decoding of the reconstructed picture by following the
same limiting steps and in similar order, thus it is rejected on the same evidentiary
premise, mutatis mutandis.
13. (Cancelled)
Re Claim 14. (Original) This claim represents the decoding part of the prediction
loop made part of the encoder represented in a limitation of claim 3, hence performing a
similar prediction process in the decoding of the reconstructed picture by following the
same limiting steps and in similar order, thus it is rejected on the same evidentiary
premise, mutatis mutandis.
Re Claim 15. (Original) This claim represents the decoding part of the prediction
loop made part of the encoder represented in claim 4, hence performing a similar
prediction process in the decoding of the reconstructed picture by following the same
limiting steps and in similar order, thus it is rejected on the same evidentiary premise,
mutatis mutandis.
Re Claim 16. (Original) This c
