

Application of temporal supervised learning algorithm to generation of natural language

Kamimura, 1990

Document ID
299267809870949874
Author
Kamimura R
Publication year
1990
Publication venue
1990 IJCNN International Joint Conference on Neural Networks

Snippet

An attempt is made to generate natural language by using a recurrent neural network with the temporal supervised learning algorithm (TSLA), developed by R.J. Williams and D. Zipser (1989). As TSLA uses an explicit representation of consecutive events, it can deal with time …
Continue reading at ieeexplore.ieee.org
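The TSLA cited in the snippet is Williams and Zipser's (1989) real-time recurrent learning (RTRL) rule for fully recurrent networks: alongside the activations, the network carries forward the sensitivity of every unit to every weight, so a gradient is available at each time step without unrolling the sequence. A minimal NumPy sketch of that update follows; the network size, sigmoid units, learning rate, and toy next-symbol task are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal real-time recurrent learning (RTRL) sketch. All sizes and the
# task below are illustrative assumptions, not details from Kamimura (1990).
rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

n_in, n_units = 4, 4                   # one-hot symbols, fully recurrent units
n_z = n_in + n_units                   # size of concatenated [input; state]
W = rng.normal(0.0, 0.3, (n_units, n_z))
y = np.zeros(n_units)                  # unit activations y(t)
p = np.zeros((n_units, n_units, n_z))  # sensitivities p[k,i,j] = dy_k/dW[i,j]
lr = 0.5

# Deliberately trivial "sentence": a repeating 4-symbol cycle. The point is
# the sensitivity update, not the task itself.
seq = [0, 1, 2, 3] * 200
eye = np.eye(n_in)

for t in range(len(seq) - 1):
    z = np.concatenate([eye[seq[t]], y])   # z(t) = [x(t); y(t)]
    y_next = sigmoid(W @ z)                # y(t+1) = f(W z(t))
    e = eye[seq[t + 1]] - y_next           # teacher signal: the next symbol

    # Propagate sensitivities one step forward in time:
    # p[k,i,j](t+1) = f'(s_k) * ( sum_l W_rec[k,l] p[l,i,j](t) + delta_{k,i} z_j )
    fprime = y_next * (1.0 - y_next)
    p = fprime[:, None, None] * (
        np.einsum('kl,lij->kij', W[:, n_in:], p)          # recurrent term
        + np.eye(n_units)[:, :, None] * z[None, None, :]  # delta_{k,i} z_j term
    )

    # Online gradient step on squared error: dW[i,j] += lr * sum_k e_k p[k,i,j]
    W += lr * np.einsum('k,kij->ij', e, p)
    y = y_next

# After training, the strongest activation should track the upcoming symbol.
print("prediction after last step:", int(np.argmax(y)), "expected:", seq[-1])
```

The sketch also makes the classic trade-off visible: training is fully online, with no stored history, but the sensitivity tensor has n²(m+n) entries and each step costs roughly O(n⁴) operations, which is why RTRL-style algorithms were practical only for the small networks typical of 1990-era experiments.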

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computer systems based on biological models
    • G06N3/02 Computer systems based on biological models using neural network models
    • G06N3/08 Learning methods
    • G06N3/082 Learning methods modifying the architecture, e.g. adding or deleting nodes or connections, pruning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computer systems based on biological models
    • G06N3/02 Computer systems based on biological models using neural network models
    • G06N3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G06N3/0635 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means using analogue means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computer systems based on biological models
    • G06N3/02 Computer systems based on biological models using neural network models
    • G06N3/04 Architectures, e.g. interconnection topology
    • G06N3/0472 Architectures, e.g. interconnection topology using probabilistic elements, e.g. p-rams, stochastic processors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computer systems based on biological models
    • G06N3/02 Computer systems based on biological models using neural network models
    • G06N3/04 Architectures, e.g. interconnection topology
    • G06N3/0454 Architectures, e.g. interconnection topology using a combination of multiple neural nets

Similar Documents

Publication / Title
Shi et al. A new approach of neuro-fuzzy learning algorithm for tuning fuzzy rules
Gallant Perceptron-based learning algorithms
Shi et al. Some considerations on conventional neuro-fuzzy learning algorithms by gradient descent method
US5050095A (en) Neural network auto-associative memory with two rules for varying the weights
Salerno Using the particle swarm optimization technique to train a recurrent neural model
Gopalsamy et al. Convergence under dynamical thresholds with delays
US5446828A (en) Nonlinear neural network oscillator
Kamimura Application of temporal supervised learning algorithm to generation of natural language
Simard et al. Analysis of recurrent backpropagation
Simard et al. Fixed point analysis for recurrent networks
Kamimura Experimental analysis of performance of temporal supervised learning algorithm, applied to a long and complex sequence
Lawrence et al. Can recurrent neural networks learn natural language grammars?
Hattori et al. Quick learning for multidirectional associative memories
Kak State generators and complex neural memories
Zhenjiang et al. An extended BAM neural network model
Gu et al. NLoPT: N-gram Enhanced Low-Rank Task Adaptive Pre-training for Efficient Language Model Adaption
Young et al. Improvements and extensions to the constructive algorithm carve
Bebis et al. Back-propagation: increasing rate of convergence by predictable pattern loading
Owens et al. A multi-output-layer perceptron
Sun et al. A novel design for a gated recurrent network with attentional memories
Cybulski et al. Determining word lexical categories with a multi-layer binary perceptron
Podolak Feedforward neural network's sensitivity to input data representation
Nakagawa A study of chaos neural network with a periodic activation function
Morita et al. Context-dependent sequential recall by a trajectory attractor network with selective desensitization
Van den Broeck Entropy and Learning