A description of competing fusion systems
Information Fusion
A Replica Replacement Algorithm Based on Value-Cost Prediction
Computer Supported Cooperative Work in Design IV
A VO-based two-stage replica replacement algorithm
NPC'07 Proceedings of the 2007 IFIP international conference on Network and parallel computing
The effect of intermittent signal on the performance of code tracking loop in GNSS receivers
Journal of Electrical and Computer Engineering
Efficient liver segmentation based on the spine
ADVIS'04 Proceedings of the Third international conference on Advances in Information Systems
Improved fully automatic liver segmentation using histogram tail threshold algorithms
ICCS'05 Proceedings of the 5th international conference on Computational Science - Volume Part III
Automatic hepatic tumor segmentation using composite hypotheses
ICIAR'05 Proceedings of the Second international conference on Image Analysis and Recognition
Automatic hepatic tumor segmentation using statistical optimal threshold
ICCS'05 Proceedings of the 5th international conference on Computational Science - Volume Part I
Automatic boundary tumor segmentation of a liver
ICCSA'05 Proceedings of the 2005 international conference on Computational Science and Its Applications - Volume Part IV
From the Publisher: An understanding of random processes is crucial to many engineering fields, including communication theory, computer vision, and digital signal processing in electrical and computer engineering, and vibrational theory and stress analysis in mechanical engineering. The filtering, estimation, and detection of random processes in noisy environments are critical tasks in the analysis and design of new communications systems and useful signal processing algorithms. Random Processes: Filtering, Estimation, and Detection clearly explains the basics of probability and random processes and details modern detection and estimation theory to accomplish these tasks.

In this book, Lonnie Ludeman, an award-winning authority in digital signal processing, joins the fundamentals of random processes with the standard techniques of linear and nonlinear systems analysis and hypothesis testing to give signal estimation techniques, specify optimum estimation procedures, provide optimum decision rules for classification purposes, and describe performance evaluation definitions and procedures for the resulting methods. The text covers four main, interrelated topics:

- Probability and characterizations of random variables and random processes
- Linear and nonlinear systems with random excitations
- Optimum estimation theory, including both the Wiener and Kalman filters
- Detection theory for both discrete- and continuous-time measurements

Lucid, thorough, and well stocked with numerous examples and practice problems that emphasize the concepts discussed, Random Processes: Filtering, Estimation, and Detection is an understandable and useful text, ideal both as a self-study guide for professionals in the field and as a core text for graduate students.

About the Author: Lonnie C. Ludeman received his doctorate from Arizona State University and is Professor Emeritus of electrical and computer engineering at New Mexico State University.
In 1993, he was a Fulbright Scholar at the Aristotle University in Thessaloniki, Greece. He is the author of Fundamentals of Digital Signal Processing, which won Choice magazine's award for Outstanding Engineering Book of the Year.