DEVICE FOR OVERALL MACHINE TOOL MONITORING
Field of the Invention

The present invention relates to a device for overall machine tool monitoring and, more particularly, to a device for monitoring, prior to and during a machining operation, whether an anomaly exists in the machine tool, and further for detecting a fault in the machine tool.
Background of the Invention

Conventionally, there has been known a technique for detecting vibrations generated while a machine tool is machining, and monitoring chatter vibrations, unbalance of a grinding stone, and the like during machining has been considered on that basis. In order to detect the vibrations, an acceleration or an acoustic emission is monitored (see, e.g., Japanese Patent Laid-open Application No. H8-261818). This patent reference discloses a technique for determining whether chatter vibrations, unbalance of a grinding stone, or the like exist by monitoring a frequency spectrum. However, it is impossible for a person to monitor the frequency spectrum all the time, so the technique is not practical for actual use in a machine tool.
Automation of the determination is required for actual use in a machine tool, and a neural network or fuzzy logic may be used for the determination. A neural network must learn various states in order to determine various situations, but collecting training samples for situations that rarely occur is difficult. The neural network therefore has a problem in that it takes a long time to train. Further, fuzzy logic has a problem in that it requires time to set up a membership function.

In order to solve such problems, it could be considered that the neural network learns only the normal states of the machine tool and then determines states other than the normal states to be abnormal. However, the machine tool has totally different normal states depending on whether it is prior to performing a machining operation or is performing a machining operation. Moreover, an anomaly can also be caused by a fault in the machine tool as well as by an abnormal state of tool attachment or of contact between the tool and a workpiece. Therefore, classification is required to distinguish these states. If all states other than the normal states are lumped together as a single abnormal state, such classification is impossible.
Summary of the Invention

In view of the above, the present invention provides a device for overall machine tool monitoring which is capable of distinguishing anomalies occurring prior to a machining operation from those occurring during the machining operation and, moreover, is capable of detecting a fault in the machine tool, even though the neural networks learn only the normal states of the machine tool.
In this configuration, the device includes a first neural network for classifying the racing operation prior to machining into a normal state and an abnormal state, so that whether the attachment state of a tool is normal or not can be determined. That is, unbalance in the attachment state of the tool or a fault in the tool can be detected by determining an anomaly in the tool. Further, the device includes a second neural network for classifying the operation during the machining operation into a normal state and an abnormal state, so that an anomaly in the contact state of the tool with the workpiece can be detected by the second neural network. In other words, it is possible to detect anomalies such as self-induced vibrations or chatter vibrations generated depending on the relative position between the workpiece and the tool. Further, since a deviation history is obtained from the first and the second neural networks, a tendency toward deteriorating performance of the machine tool or the tool can be obtained and, moreover, it is possible to determine a fault in the machine tool or a tool breakdown when the deviation departs from the tendency toward deteriorating performance.
As mentioned above, an anomaly existing prior to or during the machining operation, as well as a fault in the machine tool, can be detected without depending on a person, while neural networks that learn only the normal categories are used, so that training becomes easier. Therefore, the time required before actual operation can be reduced, and results for the anomalies requiring classification can be obtained for each respective class. Further, since a plurality of neural networks is used to classify a plurality of anomalies while a common signal input unit is used, the signal input unit does not need to be provided for every kind of anomaly, and a simpler configuration for implementing the device becomes possible.
In this configuration, since the vibrations from the machine tool are used to monitor whether an anomaly exists or not, even existing machine tools only need a vibration sensor attached thereto.
In this configuration, a fault in the tool as well as a tilt in the attachment position of the tool can be detected by using frequency components of the target signal as information on the state prior to the machining operation. Further, since frequency components of the envelope of the target signal are used as information during the machining operation, noise components such as acoustic emissions generated during the machining operation are removed. As a result, the positional relation between the tool and the workpiece can be easily obtained.

Since competitive learning neural networks are used in this embodiment, a simple configuration is possible and, moreover, training can be simply carried out by collecting training samples for every category and assigning the training samples to the respective categories.
Brief Description of the Drawings

The objects and features of the present invention will become apparent from the following description of embodiments given in conjunction with the accompanying drawings, in which:

Fig. 1 is a block diagram of an embodiment of the present invention; and

Fig. 2 illustrates a schematic configuration of a neural network used in the embodiment of Fig. 1.
Detailed Description of the Embodiments

Embodiments of the present invention will now be described with reference to the accompanying drawings which form a part hereof.

A machine tool exemplified in an embodiment described below has a tool rotatably driven by a driving unit. There are various kinds of machine tools for machining operations such as cutting or polishing. Any driving source using a motor can serve as the driving unit, and a proper power transmission unit such as a gearbox or a belt can be provided between the driving source and the tool. Hereinafter, a spindle with a housing is exemplified as the driving unit.
As shown in Fig. 1, the device for overall machine tool monitoring described in the present embodiment uses, e.g., unsupervised competitive learning neural networks 1a and 1b (hereinafter simply referred to as neural networks unless the distinction is necessary). Supervised back-propagation neural networks can also be used, but the unsupervised competitive learning neural networks are more appropriate for this purpose, since they have a simpler configuration than the supervised back-propagation type and their training can be carried out only once by using training samples of every category, or can be enhanced further by performing additional training.
As shown in Fig. 2, each of the neural networks 1a and 1b has two layers, i.e., an input layer 11 and an output layer 12, and is configured such that every neuron N2 of the output layer 12 is connected to all neurons N1 of the input layer 11. In the embodiment, the neural networks 1a and 1b may be executed by an application program running on a sequential processing type computer, but a dedicated neuro-computer may also be used.
Each of the neural networks 1a and 1b has two modes of operation, i.e., a training mode and a checking mode. After learning through proper training samples in the training mode, an amount of characteristics (check data), formed as a plurality of parameters generated from an actual target signal, is classified into a category in the checking mode.
The coupling degree (weight coefficients) of the neurons N1 of the input layer 11 with the neurons N2 of the output layer 12 is variable. In the training mode, the neural networks 1a and 1b are trained by inputting training samples to them so that the respective weight coefficients between the neurons N1 of the input layer 11 and the neurons N2 of the output layer 12 are decided. In other words, every neuron N2 of the output layer 12 is assigned a weight vector having, as its elements, the weight coefficients associated with all the neurons N1 of the input layer 11. Therefore, the weight vector has the same number of elements as the number of neurons N1 in the input layer 11, and the number of parameters of the amount of characteristics inputted to the input layer 11 is equal to the number of elements of the weight vector.
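For illustration only, the following sketch (in Python with NumPy, and not part of the claimed configuration) shows one way to hold one weight vector per output-layer neuron N2, with as many elements as there are input-layer neurons N1. The winner-take-all update in train_step is an assumption made for the sketch; the text above only states that the weight coefficients are decided in the training mode.

    import numpy as np

    class CompetitiveLearningNetwork:
        # Minimal, illustrative sketch of the two-layer network of Fig. 2.
        def __init__(self, n_inputs, n_outputs, seed=0):
            rng = np.random.default_rng(seed)
            # weights[j] is the weight vector of output-layer neuron N2 number j;
            # its length equals the number of input-layer neurons N1.
            self.weights = rng.normal(size=(n_outputs, n_inputs))

        def classify(self, data):
            # Checking mode: excite the output neuron whose weight vector has
            # the shortest Euclidean distance to the check data.
            distances = np.linalg.norm(self.weights - data, axis=1)
            return int(np.argmin(distances))

        def train_step(self, sample, learning_rate=0.1):
            # Winner-take-all update (assumed rule, not specified in the text):
            # move the winning weight vector toward the training sample.
            winner = self.classify(sample)
            self.weights[winner] += learning_rate * (sample - self.weights[winner])
            return winner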

Meanwhile, in the checking mode, when check data whose category needs to be decided are given to the input layer 11 of the neural network 1a or 1b, the neuron having the shortest Euclidean distance between its weight vector and the check data is excited among the neurons N2 of the output layer 12. If categories are assigned to the neurons N2 of the output layer 12 in the training mode, the category of the check data can be recognized through the category of the location of the excited neuron N2.
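Continuing the illustrative sketch above, the checking mode can be pictured as follows; every name and number is a stand-in (36 output neurons corresponding to the 6 x 6 zones described next), and the category map is filled when categories are assigned in the training mode.

    import numpy as np

    net = CompetitiveLearningNetwork(n_inputs=64, n_outputs=36)   # 36 = 6 x 6 zones
    category_of_neuron = {}          # neuron index -> category, set during training

    check_data = np.zeros(64)        # stand-in for a real amount of characteristics
    winner = net.classify(check_data)
    category = category_of_neuron.get(winner, "unlabeled")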
The neurons N2 of the output layer 12 are associated, in one-to-one correspondence, with the zones of the respective two-dimensional cluster determination units 4a and 4b, each having, e.g., 6 x 6 zones. Therefore, if the categories of the training samples are associated with the zones of the cluster determination units 4a and 4b, the category corresponding to a neuron N2 excited by check data can be recognized through the cluster determination units 4a and 4b. Thus, the cluster determination units 4a and 4b can function as an output unit for outputting the classified result. Here, the cluster determination units 4a and 4b may be visualized by using a map.
When associating categories with each of the zones of the cluster determination units 4a and 4b (actually, with each of the neurons N2 of the output layer 12), the trained neural networks 1a and 1b are operated in the reverse direction, from the output layers 12 to the input layers 11, to estimate the data assigned to the input layers 11 for every neuron N2 of the output layers 12. The category of the training sample having the shortest Euclidean distance with respect to the estimated data is used as the category of the corresponding neuron N2 in the output layer 12. In other words, the category of the training sample having the shortest Euclidean distance with respect to the weight vector of a neuron N2 is used as the category of the corresponding neuron N2 of the output layer 12. As a result, the categories of the training samples are reflected in the categories of the neurons N2 of the output layer 12.
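For illustration only, the labeling step just described can be sketched as the following hypothetical helper, which gives each output neuron the category of the training sample closest, in Euclidean distance, to its weight vector; array shapes and names are assumptions.

    import numpy as np

    def label_output_neurons(weights, training_samples, training_categories):
        # weights:             (n_outputs, n_inputs) weight vectors of the neurons N2
        # training_samples:    (n_samples, n_inputs) stored training samples
        # training_categories: n_samples category labels
        labels = {}
        for j, w in enumerate(weights):
            nearest = int(np.argmin(np.linalg.norm(training_samples - w, axis=1)))
            labels[j] = training_categories[nearest]
        return labels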
A large number of training samples (for example, 150 samples) is employed for each of the categories so that categories having similar attributes are arranged close together in the cluster determination units 4a and 4b. In other words, the neurons N2 excited in response to training samples belonging to a like category form, among the neurons N2 of the output layer 12, a cluster consisting of a group of neurons N2 residing close together in the cluster determination units 4a and 4b.
Strictly speaking, the cluster determination units 4a and 4b are those in which clusters have been formed in association with categories after training, but in this embodiment the units before training are also called cluster determination units 4a and 4b, without distinguishing between the two. The training samples given to the neural networks 1a and 1b operating in the training mode are stored in the respective training sample storages 5a and 5b, and are retrieved therefrom to be used in the respective neural networks 1a and 1b when necessary.
The information to be detected by using the neural networks 1a and 1b is whether an anomaly exists in the racing operation before the machine tool X machines a workpiece, whether an anomaly exists in the operation while the machine tool X is machining a workpiece, and whether the machine tool X is out of order. Therefore, in order to classify anomalies before machining and during machining into categories, the two neural networks 1a and 1b are provided for use prior to the machining operation and during the machining operation, respectively. The neural network 1a, used prior to the machining operation, learns only a normal state by using training samples of the normal state prior to the machining operation. The neural network 1b, used during the machining operation, learns only a normal state by using training samples of the normal state during the machining operation.
Both of the neural networks 1a and 1b classify input data into categories according to whether the input data belong to the normal categories or not. The cluster determination units 4a and 4b correspond to the neural networks 1a and 1b, respectively; the cluster determination unit 4a produces an output concerning whether an anomaly exists prior to the machining operation, while the cluster determination unit 4b produces an output concerning whether an anomaly exists during the machining operation.
A history determination unit 4c, as well as the cluster determination units 4a and 4b, is provided in a determination unit 4. The history determination unit 4c computes, with respect to each of the neural networks 1a and 1b, a deviation which is equivalent to a Euclidean distance between the input data and the weight coefficients associated with the neurons N2 of the output layer 12 in each of the neural networks 1a and 1b, and stores a history of the computed deviation. The history determination unit 4c determines that an anomaly (mostly, a fault) exists in the machine tool X if the deviation is greater than a preset threshold. The outputs of the cluster determination units 4a and 4b and the history determination unit 4c are delivered through the output unit 6. The method for computing the deviation will be described later.
Electric signals representing vibrations generated by the machine tool X are used as target signals, and the amounts of characteristics to be assigned to the neural networks 1a and 1b are extracted from the target signals by the respective characteristics extracting units 3a and 3b. In this embodiment, a vibration sensor 2a employing an acceleration pick-up is used to output the electric signals representing the vibrations generated by the machine tool X. The output of the vibration sensor 2a is inputted to the signal input unit 2, and the target signal from which the amount of characteristics will be extracted is segmented by the signal input unit 2. A microphone or an acoustic emission sensor may also be used as the sensor for detecting the vibrations of the machine tool X.
The tool of the machine tool X exemplified in this embodiment is rotatably driven by a driving unit, so that the output of the vibration sensor 2a is periodic. An extracted amount of characteristics varies depending on the position, on the time axis, of the portion of the vibration sensor 2a output from which it is extracted. Therefore, prior to the extraction of the amounts of characteristics, the signal input unit 2 is required to regulate the positions at which the amounts of characteristics are extracted from the outputs of the vibration sensor 2a.
In the present embodiment, the positions where the amounts of characteristics are extracted are regulated by segmentation performed by the signal input unit 2, as described below. The signal input unit 2 performs the segmentation of the target signal produced through the vibration sensor 2a on the time axis, e.g., by using a timing signal (trigger signal) synchronous with the operation of the machine tool X or by using wave characteristics of the target signal (for example, a start point and an end point of an envelope of the target signal).
The signal input unit 2 has an A/D converter for converting the electric signals produced through the vibration sensor 2a into digital signals, and a buffer for temporarily storing the digital signals. The segmentation is performed on the signals stored in the buffer. Further, limitation of the frequency bandwidth or the like is performed in order to reduce noise when necessary. In the segmentation of the target signal, it is not necessary that only a single segmented signal be outputted from one period of the target signal; a plurality of segmented signals may be produced per proper unit time.
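As a rough, non-limiting illustration of this stage, the sketch below segments a buffered, digitized target signal at externally supplied trigger points and applies an optional band limitation; the function name, sampling rate, and band edges are assumptions, not values taken from the embodiment.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def segment_target_signal(buffered, trigger_indices, segment_length,
                              fs=10_000, band=(100.0, 2_000.0)):
        # Optional band limitation to reduce noise (assumed cut-off values).
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, buffered)
        # One segment per trigger point; several segments per unit time are allowed.
        return np.asarray([filtered[i:i + segment_length]
                           for i in trigger_indices
                           if i + segment_length <= len(filtered)])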
The target signals segmented by the signal input unit 2 are inputted to the characteristics extracting units 3a and 3b provided for the neural networks 1a and 1b, respectively. The characteristics extracting units 3a and 3b each extract one set of the amount of characteristics, including a plurality of parameters, from one segmented signal. The amounts of characteristics can be extracted adaptively according to the characteristics of interest in the target signal.
In the present embodiment, the characteristics extracting unit 3a, which extracts the amount of characteristics from vibrations prior to the machining operation, extracts frequency components over the whole frequency bandwidth detected through the vibration sensor 2a (the power in every frequency band) as the amount of characteristics, while the characteristics extracting unit 3b, which extracts the amount of characteristics from vibrations during the machining operation, extracts frequency components from an envelope of the electric signal detected through the vibration sensor 2a.
The characteristics extracting units 3a and 3b may use an FFT (Fast Fourier Transform) in order to extract the frequency components. Further, the characteristics extracting unit 3b performs equalization for extracting the envelope before extracting the frequency components. The frequency components to be used in the amount of characteristics are properly decided depending on the type of machine tool employed.
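The two feature paths can be illustrated, without limitation, by the sketch below: one function takes an FFT power spectrum of the raw segment (unit 3a), and the other takes an FFT of an envelope of the segment (unit 3b). Obtaining the envelope by rectification and low-pass smoothing is only an assumed stand-in for the equalization mentioned above, and the sampling rate and cut-off are illustrative.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def features_prior_to_machining(segment):
        # Unit 3a: power at every frequency component of the raw segment.
        return np.abs(np.fft.rfft(segment)) ** 2

    def features_during_machining(segment, fs=10_000, cutoff=500.0):
        # Unit 3b: frequency components of an envelope of the segment.
        # Envelope by rectification + low-pass filtering is an assumption.
        sos = butter(4, cutoff, btype="lowpass", fs=fs, output="sos")
        envelope = sosfiltfilt(sos, np.abs(segment))
        return np.abs(np.fft.rfft(envelope)) ** 2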
The amounts of characteristics obtained from the characteristics extracting units 3a and 3b are stored in the respective training sample storages 5a and 5b when training samples are collected prior to the training mode. In the checking mode, the amounts of characteristics are provided to the neural networks 1a and 1b whenever they are extracted; the amounts of characteristics serve as check data, and the neural networks 1a and 1b classify the check data into categories.

The data stored in the training sample storages 5a and 5b may be called a data set. It is clear from the above description that the training sample storage 5a corresponding to the neural network 1a stores the data set obtained when the machine tool X is racing normally before machining a workpiece, while the training sample storage 5b corresponding to the neural network 1b stores the data set obtained when the machine tool X is operating normally while machining the workpiece. The number of data forming the data set can be decided arbitrarily within the range of the capacity of each of the training sample storages 5a and 5b. However, it is preferable that about 150 data be used to train each of the neural networks 1a and 1b, as mentioned above.
Since only the set of data belonging to the normal categories is stored in the training sample storages 5a and 5b, the neural networks 1a and 1b learn only a normal state when they are trained, in the training mode, by using the data sets stored in the training sample storages 5a and 5b. In other words, since only the normal categories are associated with the zones of the cluster determination units 4a and 4b, the aforementioned operation in the reverse direction after learning, performed for setting categories, can be omitted.
If the neural networks 1a and 1b are trained as described above, every neuron N2 in the output layer 12 is assigned a weight vector having, as its elements, the weight coefficients associated with all the neurons N1 of the input layer 11. Therefore, when a training sample belonging to a category is assigned to the neural network 1a or 1b in the checking mode, a neuron N2 associated with that category is excited. However, since the training samples differ from one another even though they belong to the same category, it is not a single neuron N2 but a plurality of neurons N2 forming a cluster that are excited by the training samples (a data set) included in a single category.
When the check data extracted by the characteristics extracting units 3a and 3b are assigned to the respective neural networks 1a and 1b after the neural networks 1a and 1b have completed learning in the training mode, whether the machine tool X is abnormal or not can be determined. It is preferable that a switching unit be provided between the signal input unit 2 and the characteristics extracting units 3a and 3b to select the signal paths so that the check data obtained prior to the machining operation are assigned to the neural network 1a and the check data obtained during the machining operation are assigned to the neural network 1b. The switching unit may be configured with an analog switch or the like and synchronized with the operation of the machine tool X so as to select the signal paths according to the operation state, i.e., before the machining operation on a workpiece or during it.
By the above operation, the cluster determination unit 4a can detect an anomaly such as tool unbalance or tool loss prior to the machining operation. Further, the cluster determination unit 4b can detect an anomaly in the contact state between the tool and a workpiece during the machining operation. When the cluster determination unit 4a or 4b judges that an anomaly exists, it is preferable that the output unit 6 drive a proper notifying unit to let a user know of the anomaly. As for notifying the anomaly, blinking a lamp or generating an alarm sound may be preferable.
In the present embodiment, the history determination unit 4c is also provided in the determination unit 4. The history determination unit 4c stores the deviation with respect to each of the neural networks 1a and 1b, and judges that an anomaly exists in the machine tool X when the deviation with respect to one of the neural networks 1a and 1b is greater than the preset threshold. Mostly, the anomaly in the machine tool X means a fault in the machine tool X. The amount of data stored in the history determination unit 4c is preferably set by a time unit, e.g., per day or per week, but it may be determined by a specific number (e.g., 10,000) of check data.
The deviation is a normalized value of the magnitude of the difference vector between the amount of characteristics (characteristics vector) serving as the check data and the weight coefficients (weight vector) corresponding to each of the neurons N2 of the output layers 12 in the neural networks 1a and 1b. The deviation Y is defined as:

Y = ([X]/X - [Wwin]/Wwin)^T ([X]/X - [Wwin]/Wwin),

where [X] is the characteristics vector; [Wwin] is the weight vector of the neuron N2 corresponding to a category ([a] denotes that "a" is a vector); T represents the transpose; and X and Wwin, which are not bracketed, represent the norms of the respective vectors. The normalization is carried out by dividing the elements of each vector by its respective norm.
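For illustration only, the deviation Y and the threshold check performed by the history determination unit 4c can be sketched as follows; the threshold value and the sample numbers are assumptions, not values taken from the embodiment.

    import numpy as np

    def deviation(x, w_win):
        # Y = ([X]/X - [Wwin]/Wwin)^T ([X]/X - [Wwin]/Wwin):
        # divide each vector by its norm, then take the squared difference.
        x_n = x / np.linalg.norm(x)
        w_n = w_win / np.linalg.norm(w_win)
        diff = x_n - w_n
        return float(diff @ diff)

    # Hypothetical check by the history determination unit 4c.
    THRESHOLD = 0.05                 # illustrative value only
    history = []
    y = deviation(np.array([1.0, 2.0, 0.5]), np.array([0.9, 2.1, 0.4]))
    history.append(y)
    fault_suspected = y > THRESHOLD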
By employing the configuration of the present invention as described above, based on the output of the vibration sensor 2a, an anomaly in the attachment state of the tool (tool tilting or an attachment error) or an anomaly in the tool of the machine tool X is monitored prior to the machining operation, while the contact state of the tool with the workpiece of the machine tool X is monitored during the machining operation. Further, an anomaly such as a fault in the machine tool X can also be monitored based on the history of the deviation.
Though the output of the vibration sensor 2a serves as the target signal in the embodiment described above, a load current of a motor can be used as the target signal if the driving source of the machine tool X is a motor, and, if the motor is servo-controlled, an output of an encoder provided on the motor may be used as the target signal.
While the invention has been shown and described with respect to the embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
