An edition of Neuro-fuzzy and soft computing (1996)

Neuro-Fuzzy and Soft Computing

A Computational Approach to Learning and Machine Intelligence

xxvi, 614 p. ; 24 cm

Publish Date: September 26, 1996
Publisher: Prentice Hall
Language: English
Pages: 614


Edition Availability
Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence
  • September 26, 1996, Prentice Hall. Paperback, in English (US Ed edition)
  • 1996, Pearson Education, Limited. In English
  • September 26, 1996, Prentice Hall. In English


Book Details


Table of Contents

Foreword ... xv
Preface ... xix
1. Audience ... xix
2. Organization ... xx
3. Features ... xxii
4. Obtaining the Example Programs ... xxiii
5. Acknowledgments ... xxiv
6. How to Contact Us ... xxvi
1. Introduction to Neuro-Fuzzy and Soft Computing ... 1
1.1. Introduction ... 1
1.2. Soft Computing Constituents and Conventional Artificial Intelligence ... 1
1.2.1. From Conventional AI to Computational Intelligence ... 3
1.2.2. Neural Networks ... 6
1.2.3. Fuzzy Set Theory ... 6
1.2.4. Evolutionary Computation ... 7
1.3. Neuro-Fuzzy and Soft Computing Characteristics ... 7
I. Fuzzy Set Theory ... 11
2. Fuzzy Sets ... 13
2.1. Introduction ... 13
2.2. Basic Definitions and Terminology ... 14
2.3. Set-Theoretic Operations ... 21
2.4. MF Formulation and Parameterization ... 24
2.4.1. MFs of One Dimension ... 24
2.4.2. MFs of Two Dimensions ... 30
2.4.3. Derivatives of Parameterized MFs ... 34
2.5. More on Fuzzy Union, Intersection, and Complement* ... 35
2.5.1. Fuzzy Complement* ... 35
2.5.2. Fuzzy Intersection and Union* ... 36
2.5.3. Parameterized T-norm and T-conorm* ... 40
2.6. Summary ... 42
Exercises ... 42
3. Fuzzy Rules and Fuzzy Reasoning ... 47
3.1. Introduction ... 47
3.2. Extension Principle and Fuzzy Relations ... 47
3.2.1. Extension Principle ... 47
3.2.2. Fuzzy Relations ... 50
3.3. Fuzzy If-Then Rules ... 54
3.3.1. Linguistic Variables ... 54
3.3.2. Fuzzy If-Then Rules ... 59
3.4. Fuzzy Reasoning ... 62
3.4.1. Compositional Rule of Inference ... 63
3.4.2. Fuzzy Reasoning ... 64
3.5. Summary ... 70
Exercises ... 70
4. Fuzzy Inference Systems ... 73
4.1. Introduction ... 73
4.2. Mamdani Fuzzy Models ... 74
4.2.1. Other Variants ... 79
4.3. Sugeno Fuzzy Models ... 81
4.4. Tsukamoto Fuzzy Models ... 84
4.5. Other Considerations ... 85
4.5.1. Input Space Partitioning ... 86
4.5.2. Fuzzy Modeling ... 87
4.6. Summary ... 89
Exercises ... 90
II. Regression and Optimization ... 93
5. Least-Squares Methods for System Identification ... 95
5.1. System Identification: An Introduction ... 95
5.2. Basics of Matrix Manipulation and Calculus ... 97
5.3. Least-Squares Estimator ... 104
5.4. Geometric Interpretation of LSE ... 110
5.5. Recursive Least-Squares Estimator ... 113
5.6. Recursive LSE for Time-Varying Systems* ... 116
5.7. Statistical Properties and the Maximum Likelihood Estimator* ... 118
5.8. LSE for Nonlinear Models ... 122
5.9. Summary ... 125
Exercises ... 126
6. Derivative-Based Optimization ... 129
6.1. Introduction ... 129
6.2. Descent Methods ... 129
6.2.1. Gradient-Based Methods ... 130
6.3. The Method of Steepest Descent ... 133
6.4. Newton's Methods ... 134
6.4.1. Classical Newton's Method ... 134
6.4.2. Modified Newton's Methods* ... 136
6.4.3. Quasi-Newton Methods* ... 139
6.5. Step Size Determination ... 141
6.5.1. Initial Bracketing ... 141
6.5.2. Line Searches ... 142
6.5.3. Termination Rules* ... 146
6.6. Conjugate Gradient Methods* ... 148
6.6.1. Conjugate Directions* ... 148
6.6.2. From Orthogonality to Conjugacy* ... 149
6.6.3. Conjugate Gradient Algorithms* ... 152
6.7. Analysis of Quadratic Case ... 154
6.7.1. Descent Methods with Line Minimization ... 156
6.7.2. Steepest Descent Method without Line Minimization ... 156
6.8. Nonlinear Least-Squares Problems ... 160
6.8.1. Gauss-Newton Method ... 161
6.8.2. Levenberg-Marquardt Concepts ... 163
6.9. Incorporation of Stochastic Mechanisms ... 166
6.10. Summary ... 168
Exercises ... 168
7. Derivative-Free Optimization ... 173
7.1. Introduction ... 173
7.2. Genetic Algorithms ... 175
7.3. Simulated Annealing ... 181
7.4. Random Search ... 186
7.5. Downhill Simplex Search ... 189
7.6. Summary ... 193
Exercises ... 194
III. Neural Networks ... 197
8. Adaptive Networks ... 199
8.1. Introduction ... 199
8.2. Architecture ... 200
8.3. Backpropagation for Feedforward Networks ... 205
8.4. Extended Backpropagation for Recurrent Networks ... 210
8.4.1. Synchronously Operated Networks: BPTT and RTRL ... 212
8.4.2. Continuously Operated Networks: Mason's Gain Formula* ... 215
8.5. Hybrid Learning Rule: Combining Steepest Descent and LSE ... 219
8.5.1. Off-Line Learning (Batch Learning) ... 220
8.5.2. On-Line Learning (Pattern-by-Pattern Learning) ... 222
8.5.3. Different Ways of Combining Steepest Descent and LSE ... 222
8.6. Summary ... 223
Exercises ... 223
9. Supervised Learning Neural Networks ... 226
9.1. Introduction ... 226
9.2. Perceptrons ... 227
9.2.1. Architecture and Learning Rule ... 227
9.2.2. Exclusive-OR Problem ... 229
9.3. Adaline ... 230
9.4. Backpropagation Multilayer Perceptrons ... 233
9.4.1. Backpropagation Learning Rule ... 234
9.4.2. Methods of Speeding Up MLP Training ... 236
9.4.3. MLP's Approximation Power ... 238
9.5. Radial Basis Function Networks ... 238
9.5.1. Architectures and Learning Methods ... 238
9.5.2. Functional Equivalence to FIS ... 241
9.5.3. Interpolation and Approximation RBFNs ... 242
9.5.4. Examples ... 244
9.6. Modular Networks ... 246
9.7. Summary ... 250
Exercises ... 251
10. Learning from Reinforcement ... 258
10.1. Introduction ... 258
10.2. Failure Is the Surest Path to Success ... 259
10.2.1. Jackpot Journey ... 259
10.2.2. Credit Assignment Problem ... 262
10.2.3. Evaluation Functions ... 263
10.3. Temporal Difference Learning ... 264
10.3.1. TD Formulation ... 265
10.3.2. Expected Jackpot ... 266
10.3.3. Predicting Cumulative Outcomes ... 268
10.4. The Art of Dynamic Programming ... 270
10.4.1. Formulation of Classical Dynamic Programming ... 270
10.4.2. Incremental Dynamic Programming ... 272
10.5. Adaptive Heuristic Critic ... 273
10.5.1. Neuron-like Critic ... 273
10.5.2. An Adaptive Neural Critic Algorithm ... 274
10.5.3. Exploration and Action Selection ... 277
10.6. Q-Learning ... 278
10.6.1. Basic Concept ... 278
10.6.2. Implementation ... 279
10.7. A Cost Path Problem ... 281
10.7.1. Expected Cost Path Problem by TD Methods ... 282
10.7.2. Finding an Optimal Path in a Deterministic Minimum Cost Path Problem ... 283
10.7.3. State Representations for Generalization ... 287
10.8. World Modeling* ... 288
10.8.1. Model-Free and Model-Based Learning* ... 288
10.8.2. Distal Teacher* ... 289
10.8.3. Learning Speed* ... 289
10.9. Other Network Configurations* ... 290
10.9.1. Divide-and-Conquer Methodology* ... 290
10.9.2. Recurrent Networks* ... 291
10.10. Reinforcement Learning by Evolutionary Computation* ... 292
10.10.1. Bucket Brigade* ... 292
10.10.2. Genetic Reinforcers* ... 292
10.10.3. Immune Modeling* ... 293
10.11. Summary ... 293
Exercises ... 294
11. Unsupervised Learning and Other Neural Networks ... 301
11.1. Introduction ... 301
11.2. Competitive Learning Networks ... 302
11.3. Kohonen Self-Organizing Networks ... 305
11.4. Learning Vector Quantization ... 308
11.5. Hebbian Learning ... 310
11.6. Principal Component Networks ... 312
11.6.1. Principal Component Analysis ... 312
11.6.2. Oja's Modified Hebbian Rule ... 316
11.7. The Hopfield Network ... 316
11.7.1. Content-Addressable Nature ... 317
11.7.2. Binary Hopfield Networks ... 318
11.7.3. Continuous-Valued Hopfield Networks ... 321
11.7.4. Traveling Salesperson Problem ... 324
11.7.5. The Boltzmann Machine ... 326
11.8. Summary ... 327
Exercises ... 328
IV. Neuro-Fuzzy Modeling ... 333
12. ANFIS: Adaptive Neuro-Fuzzy Inference Systems ... 335
12.1. Introduction ... 335
12.2. ANFIS Architecture ... 336
12.3. Hybrid Learning Algorithm ... 340
12.4. Learning Methods That Cross-Fertilize ANFIS and RBFN ... 341
12.5. ANFIS as a Universal Approximator* ... 342
12.6. Simulation Examples ... 345
12.6.1. Practical Considerations ... 345
12.6.2. Example 1: Modeling a Two-Input Sinc Function ... 346
12.6.3. Example 2: Modeling a Three-Input Nonlinear Function ... 348
12.6.4. Example 3: On-Line Identification in Control Systems ... 351
12.6.5. Example 4: Predicting Chaotic Time Series ... 353
12.7. Extensions and Advanced Topics ... 360
Exercises ... 363
13. Coactive Neuro-Fuzzy Modeling: Towards Generalized ANFIS ... 369
13.1. Introduction ... 369
13.2. Framework ... 370
13.2.1. Toward Multiple Inputs/Outputs Systems ... 370
13.2.2. Architectural Comparisons ... 370
13.3. Neuron Functions for Adaptive Networks ... 372
13.3.1. Fuzzy Membership Functions versus Receptive Field Units ... 373
13.3.2. Nonlinear Rule ... 376
13.3.3. Modified Sigmoidal and Truncation Filter Functions ... 380
13.4. Neuro-Fuzzy Spectrum ... 382
13.5. Analysis of Adaptive Learning Capability ... 385
13.5.1. Convergence Based on the Steepest Descent Method Alone ... 385
13.5.2. Interpretability Spectrum ... 386
13.5.3. Evolution of Antecedents (MFs) ... 387
13.5.4. Evolution of Consequents (Rules) ... 389
13.5.5. Evolving Partitions ... 390
13.6. Summary ... 393
Exercises ... 395
V. Advanced Neuro-Fuzzy Modeling ... 401
14. Classification and Regression Trees ... 403
14.1. Introduction ... 403
14.2. Decision Trees ... 404
14.3. CART Algorithm for Tree Induction ... 406
14.3.1. Tree Growing ... 407
14.3.2. Tree Pruning ... 413
14.4. Using CART for Structure Identification in ANFIS ... 416
14.5. Summary ... 421
Exercises ... 421
15. Data Clustering Algorithms ... 423
15.1. Introduction ... 423
15.2. K-Means Clustering ... 424
15.3. Fuzzy C-Means Clustering ... 425
15.4. Mountain Clustering Method ... 427
15.5. Subtractive Clustering ... 431
15.6. Summary ... 432
Exercises ... 432
16. Rulebase Structure Identification ... 434
16.1. Introduction ... 434
16.2. Input Selection ... 435
16.3. Input Space Partitioning ... 436
16.4. Rulebase Organization ... 441
16.5. Focus Set-Based Rule Combination ... 446
16.6. Summary ... 447
Exercises ... 448
VI. Neuro-Fuzzy Control ... 451
17. Neuro-Fuzzy Control I ... 453
17.1. Introduction ... 453
17.2. Feedback Control Systems and Neuro-Fuzzy Control: An Overview ... 454
17.2.1. Feedback Control Systems ... 454
17.2.2. Neuro-Fuzzy Control ... 458
17.3. Expert Control: Mimicking an Expert ... 458
17.4. Inverse Learning ... 460
17.4.1. Fundamentals ... 460
17.4.2. Case Studies ... 463
17.5. Specialized Learning ... 465
17.6. Backpropagation Through Time and Real-Time Recurrent Learning ... 469
17.6.1. Fundamentals ... 469
17.6.2. Case Studies: The Inverted Pendulum System ... 470
17.7. Summary ... 476
Exercises ... 477
18. Neuro-Fuzzy Control II ... 480
18.1. Introduction ... 480
18.2. Reinforcement Learning Control ... 480
18.2.1. Control Environment ... 480
18.2.2. Neuro-Fuzzy Reinforcement Controllers ... 481
18.3. Gradient-Free Optimization ... 483
18.3.1. GAs: Coding and Genetic Operators ... 484
18.3.2. GAs: Formulating Objective Functions ... 488
18.4. Gain Scheduling ... 489
18.4.1. Fundamentals ... 489
18.4.2. Case Studies ... 491
18.5. Feedback Linearization and Sliding Control ... 493
18.6. Summary ... 496
Exercises ... 497
VII. Advanced Applications ... 501
19. ANFIS Applications ... 503
19.1. Introduction ... 503
19.2. Printed Character Recognition ... 503
19.3. Inverse Kinematics Problems ... 507
19.4. Automobile MPG Prediction ... 510
19.5. Nonlinear System Identification ... 514
19.6. Channel Equalization ... 516
19.7. Adaptive Noise Cancellation ... 523
20. Fuzzy-Filtered Neural Networks ... 535
20.1. Introduction ... 535
20.2. Fuzzy-Filtered Neural Networks ... 536
20.3. Application 1: Plasma Spectrum Analysis ... 538
20.3.1. Multilayer Perceptron Approach ... 538
20.3.2. Fuzzy-Filtered Neural Network Approach ... 539
20.4. Application 2: Hand-Written Numeral Recognition ... 540
20.4.1. One-Dimensional Fuzzy Filters ... 541
20.4.2. Two-Dimensional Fuzzy Filters ... 542
20.5. Genetic Algorithm-Based Fuzzy Filters ... 543
20.5.1. A General Model ... 543
20.5.2. Variations and Discussion ... 545
20.6. Summary ... 549
21. Fuzzy Sets and Genetic Algorithms in Game Playing ... 551
21.1. Introduction ... 551
21.2. Variants of Genetic Algorithms ... 551
21.3. Using Genetic Algorithms in Game Playing ... 553
21.4. Simulation Results of the Basic Model ... 556
21.5. Using Fuzzily Characterized Features ... 559
21.6. Using Polyploid GA in Game Playing ... 560
21.7. Summary ... 564
22. Soft Computing for Color Recipe Prediction ... 568
22.1. Introduction ... 568
22.2. Color Recipe Prediction ... 569
22.3. Single MLP Approaches ... 569
22.4. CANFIS Modeling for Color Recipe Prediction ... 571
22.4.1. Fuzzy Partitionings ... 572
22.4.2. CANFIS Architectures ... 573
22.4.3. Knowledge-Embedded Structures ... 576
22.4.4. CANFIS Simulation ... 576
22.5. Color Paint Manufacturing Intelligence ... 577
22.5.1. Manufacturing Intelligence Architecture ... 579
22.5.2. Knowledge Base ... 580
22.5.3. Multi-Elites Generator ... 581
22.5.4. Fuzzy Population Generator ... 581
22.5.5. Fitness Function ... 582
22.5.6. Genetic Strategies ... 584
22.6. Experimental Evaluation ... 587
22.7. Discussion ... 589
22.8. Concluding Remarks and Future Directions ... 591
A. Hints to Selected Exercises ... 595
B. List of Internet Resources ... 598
C. List of MATLAB Programs ... 601
D. List of Acronyms ... 604
Index ... 607

Classifications

Library of Congress: QA76.9.S63 J36 1997

Edition Identifiers

Open Library: OL7340190M
Internet Archive: neurofuzzysoftco0000jang
ISBN 10: 0132610663
ISBN 13: 9780132610667
LCCN: 96029050
OCLC/WorldCat: 35029665
LibraryThing: 561565
Goodreads: 256400

Work Identifiers

Work ID: OL3285960W
