ICNN'97 FINAL PAPER LIST
LM: LEARNING & MEMORY
Oral Sessions: LM1, LM2, LM3, LM4, LM5 Poster Session: LMP2
Session Chairs:
LM1 Mon (AM) Jianchang Mao, IBM Almaden Research Center
LM2 Mon (PM) Irwin Sandberg, University of Texas Austin
LM3 Tues (AM) Francesco Palmieri, Univ. Degli Studi Federico II di Napoli
LM4 Tues (PM) Joachim Utans, London Business School
LM5 Wed (AM) Anthony Zaknich, Univ. of Western Australia
SESSION LM1: Monday, June 9, 1997; 10:50 - 12:30
ICNN97 Learning & Memory Session: LM1A Paper Number: 169 Oral
Uniform approximation and gamma networks
Irwin W. Sandberg and Lilian Xu
Keywords: gamma network Uniform approximation nonlinear input-output maps
_____
ICNN97 Learning & Memory Session: LM1B Paper Number: 419 Oral
Asymptotical analysis of modular neural network
Lin-Cheng Wang, Nasser M. Nasrabadi and Sandor Der
Keywords: modular neural network data representation asymptotical performance
_____
ICNN97 Learning & Memory Session: LM1C Paper Number: 422 Oral
Recognition algorithm using evolutionary learning on the random neural networks
Jose Aguilar and Adriana Colmenares
Keywords: evolutionary learning random neural networks pattern recognition
_____
ICNN97 Learning & Memory Session: LM1D Paper Number: 141 Oral
Dynamics of distance between patterns for higher order random neural networks
Hiromi Miyajima and Shuji Yatsuki
Keywords: higher order random neural network dynamical properties dynamics of distance
_____
ICNN97 Learning & Memory Session: LM1E Paper Number: 414 Oral
Principal components via cascades of block-layers
Francesco Palmieri and Michele Corvino
Keywords: principal components cascades of block-layers constrained connectivity
_____
SESSION LM2: Monday, June 9, 1997; 13:50 - 15:50
ICNN97 Learning & Memory Session: LM2A Paper Number: 383 Oral
Bayesian geometric theory of learning algorithms
Huaiyu Zhu
Keywords: bayesian geometric theory learning algorithms objective evaluation
_____
ICNN97 Learning & Memory Session: LM2B Paper Number: 251 Oral
Note on effective number of parameters in nonlinear learning systems
Jianchang Mao and Anil Jain
Keywords: nonlinear learning system effective number of parameters feedforward neural network
_____
ICNN97 Learning & Memory Session: LM2C Paper Number: 413 Oral
New bounds for correct generalization
D. Mattera and F. Palmieri
Keywords: correct generalization number of training examples neural network architecture
_____
ICNN97 Learning & Memory Session: LM2D Paper Number: 48 Oral
D-Entropy minimization
Ryotaro Kamimura
Keywords: D-entropy information maximization Renyi entropy Shannon entropy
_____
ICNN97 Learning & Memory Session: LM2E Paper Number: 438 Oral
Projection pursuit and the solvability condition applied to constructive learning
Fernando J. Von Zuben and Marcio L. de Andrade Netto
Keywords: projection pursuit solvability condition constructive learning
_____
ICNN97 Learning & Memory Session: LM2F Paper Number: 569 Oral
The generalization capabilities of ARTMAP
Gregory L. Heileman, Michael Georgiopoulos, Michael J. Healy and Stephen J. Verzi
Keywords: generalization capability ARTMAP number of training examples
_____
SESSION LM3: Tuesday, June 10, 1997; 10:00 - 12:00
ICNN97 Learning & Memory Session: LM3A Paper Number: 167 Oral
Automatic learning rate optimization by higher-order derivatives
Xiao-Hu Yu and Li-Qun Xu
Keywords: learning rate optimization higher-order derivatives backpropagation learning
_____
ICNN97 Learning & Memory Session: LM3B Paper Number: 406 Oral
Improved back propagation training algorithm using conic section functions
Tulay Yildirim and John S. Marsland
Keywords: back propagation conic section functions radial basis function
_____
ICNN97 Learning & Memory Session: LM3C Paper Number: 96 Oral
Comparing parameterless learning rate adaptation methods
M. Moreira and E. Fiesler
Keywords: parameterless learning rate backpropagation adaptive learning rate
_____
ICNN97 Learning & Memory Session: LM3D Paper Number: 339 Oral
Optimal stopped training via algebraic on-line estimation of the expected test-set error
Joachim Utans
Keywords: optimal stopped training algebraic on-line estimation expected test-set error
_____
ICNN97 Learning & Memory Session: LM3E Paper Number: 97 Oral
Weight evolution algorithm with dynamic offset range
S. C. Ng, S. H. Leung and A. Luk
Keywords: multi-layered neural network back propagation weight evolution
_____
ICNN97 Learning & Memory Session: LM3F Paper Number: 400 Oral
Incorporating state space constraints into a neural network
Daryl H. Graf
Keywords: state space constraints continuous neural network manifold learning
_____
SESSION LM4: Tuesday, June 10, 1997; 14:40 - 16:00
ICNN97 Learning & Memory Session: LM4A Paper Number: 551 Oral
A structural learning algorithm for multi-layered neural networks
Manabu Kotani, Akihiro Kajiki and Kenzo Akazawa
Keywords: structural learning multi-layered neural network pruning algorithm
_____
ICNN97 Learning & Memory Session: LM4B Paper Number: 618 Oral
An improved expand-and-truncate learning
A. Yamamoto and T. Saito
Keywords: expand-and-truncate learning binary-to-binary mapping number of hidden units
_____
ICNN97 Learning & Memory Session: LM4C Paper Number: 336 Oral
A vector quantisation reduction method for the probabilistic neural network
Anthony Zaknich
Keywords: vector quantisation probabilistic neural network regression
_____
SESSION LM5: Wednesday, June 11, 1997; 10:00 - 12:00
ICNN97 Learning & Memory Session: LM5A Paper Number: 636 Oral
Robust adaptive identification of dynamic systems by neural networks
James Ting-Ho Lo
Keywords: dynamic systems adaptive identification on-line weight adjustment
_____
ICNN97 Learning & Memory Session: LM5B Paper Number: 521 Oral
Fibre bundles and receptive neural fields
S. Puechmorel
Keywords: fiber bundles receptive neural fields membership function
_____
ICNN97 Learning & Memory Session: LM5C Paper Number: 105 Oral
Fast binary cellular neural networks
Iztok Fajfar
Keywords: cellular neural network analog transient parameter optimization
_____
ICNN97 Learning & Memory Session: LM5D Paper Number: 316 Oral
Associative memory of weakly connected oscillators
Frank Hoppensteadt and Eugene Izhikevich
Keywords: multiple andronov-hopf bifurcations weakly connected neural networks Cohen-Grossberg convergence limit cycle attractors
_____
ICNN97 Learning & Memory Session: LM5E Paper Number: 372 Oral
Input-to-state (ISS) analysis for dynamic neural networks
Edgar N. Sanchez and Jose P. Perez
Keywords: neural network dynamics intelligent control stability ISS analysis
_____
SESSION LMP2: Wednesday, June 11, 1997; 16:00 - 18:20
ICNN97 Learning & Memory Session: LMP2 Paper Number: 110 Poster
Simultaneous information maximization and minimization
Ryotaro Kamimura
Keywords: information maximization information minimization internal representation
_____
ICNN97 Learning & Memory Session: LMP2 Paper Number: 535 Poster
Hybrid learning algorithm with low input-to-output mapping sensitivity for iterated time-series prediction
So-Young Jeong, Minho Lee and Soo-Young Lee
Keywords: hybrid learning input-to-output mapping sensitivity iterated time-series prediction
_____
ICNN97 Learning & Memory Session: LMP2 Paper Number: 23 Poster
On viewing the transform performed by a hidden layer in a feedforward ANN as a complex Möbius mapping
Adriana Dumitras and Vasile Lazarescu
Keywords: hidden layer feedforward network Möbius mapping Hinton diagram
_____
ICNN97 Learning & Memory Session: LMP2 Paper Number: 21 Poster
An upper bound on the node complexity of depth-2 multilayer perceptrons
Masahiko Arai
Keywords: node complexity multilayer perceptron hidden units
_____
ICNN97 Learning & Memory Session: LMP2 Paper Number: 548 Poster
Improved sufficient convergence condition for the discrete-time cellular neural networks
Sungjun Park and Soo-Ik Chae
Keywords: convergence condition discrete-time cellular neural network positive semi-definite constraint
_____
ICNN97 Learning & Memory Session: LMP2 Paper Number: 455 Poster
Beyond weights adaptation: a new neuron model with trainable activation function and its supervised learning
Youshou Wu, Mingsheng Zhao and Xiaoqing Ding
Keywords: weights adaptation trainable activation functions supervised learning