
Vaseghi Saeed V. - Advanced Signal Processing and Digital Noise Reduction

Advanced Signal Processing and Digital Noise Reduction




Availability: Normally available within 10 days


PRICE: €65.98
NICEPRICE: €62.68
DISCOUNT: 5%



This product qualifies for FREE SHIPPING when the Corriere Veloce option is selected at checkout.


Also payable with Carta della cultura giovani e del merito, 18App Bonus Cultura and Carta del Docente.




Free shipping

Details

Genre: Book
Language: German
Publication: 05/2012
Edition: Softcover reprint of the original 1st ed. 1996





Table of Contents

1 Introduction
1.1 Signals and Information. 1.2 Signal Processing Methods. 1.2.1 Non-parametric Signal Processing. 1.2.2 Model-based Signal Processing. 1.2.3 Bayesian Statistical Signal Processing. 1.2.4 Neural Networks. 1.3 Applications of Digital Signal Processing. 1.3.1 Adaptive Noise Cancellation and Noise Reduction. 1.3.2 Blind Channel Equalisation. 1.3.3 Signal Classification and Pattern Recognition. 1.3.4 Linear Prediction Modelling of Speech. 1.3.5 Digital Coding of Audio Signals. 1.3.6 Detection of Signals in Noise. 1.3.7 Directional Reception of Waves: Beamforming. 1.4 Sampling and Analog to Digital Conversion. 1.4.1 Time-Domain Sampling and Reconstruction of Analog Signals. 1.4.2 Quantisation.

2 Stochastic Processes
2.1 Random Signals and Stochastic Processes. 2.1.1 Stochastic Processes. 2.1.2 The Space or Ensemble of a Random Process. 2.2 Probabilistic Models of a Random Process. 2.3 Stationary and Nonstationary Random Processes. 2.3.1 Strict Sense Stationary Processes. 2.3.2 Wide Sense Stationary Processes. 2.3.3 Nonstationary Processes. 2.4 Expected Values of a Stochastic Process. 2.4.1 The Mean Value. 2.4.2 Autocorrelation. 2.4.3 Autocovariance. 2.4.4 Power Spectral Density. 2.4.5 Joint Statistical Averages of Two Random Processes. 2.4.6 Cross Correlation and Cross Covariance. 2.4.7 Cross Power Spectral Density and Coherence. 2.4.8 Ergodic Processes and Time-averaged Statistics. 2.4.9 Mean-ergodic Processes. 2.4.10 Correlation-ergodic Processes. 2.5 Some Useful Classes of Random Processes. 2.5.1 Gaussian (Normal) Process. 2.5.2 Multi-variate Gaussian Process. 2.5.3 Mixture Gaussian Process. 2.5.4 A Binary-state Gaussian Process. 2.5.5 Poisson Process. 2.5.6 Shot Noise. 2.5.7 Poisson-Gaussian Model for Clutters and Impulsive Noise. 2.5.8 Markov Processes. 2.6 Transformation of a Random Process. 2.6.1 Monotonic Transformation of Random Signals. 2.6.2 Many-to-one Mapping of Random Signals. Summary.

3 Bayesian Estimation and Classification
3.1 Estimation Theory: Basic Definitions. 3.1.1 Predictive and Statistical Models in Estimation. 3.1.2 Parameter Space. 3.1.3 Parameter Estimation and Signal Restoration. 3.1.4 Performance Measures. 3.1.5 Prior and Posterior Spaces and Distributions. 3.2 Bayesian Estimation. 3.2.1 Maximum a Posterior Estimation. 3.2.2 Maximum Likelihood Estimation. 3.2.3 Minimum Mean Squared Error Estimation. 3.2.4 Minimum Mean Absolute Value of Error Estimation. 3.2.5 Equivalence of MAP, ML, MMSE and MAVE. 3.2.6 Influence of the Prior on Estimation Bias and Variance. 3.2.7 The Relative Importance of the Prior and the Observation. 3.3 Estimate-Maximise (EM) Method. 3.3.1 Convergence of the EM Algorithm. 3.4 Cramer-Rao Bound on the Minimum Estimator Variance. 3.4.1 Cramer-Rao Bound for Random Parameters. 3.4.2 Cramer-Rao Bound for a Vector Parameter. 3.5 Bayesian Classification. 3.5.1 Classification of Discrete-valued Parameters. 3.5.2 Maximum a Posterior Classification. 3.5.3 Maximum Likelihood Classification. 3.5.4 Minimum Mean Squared Error Classification. 3.5.5 Bayesian Classification of Finite State Processes. 3.5.6 Bayesian Estimation of the Most Likely State Sequence. 3.6 Modelling the Space of a Random Signal. 3.6.1 Vector Quantisation of a Random Process. 3.6.2 Design of a Vector Quantiser: K-Means Algorithm. 3.6.3 Design of a Mixture Gaussian Model. 3.6.4 The EM Algorithm for Estimation of Mixture Gaussian Densities. Summary.

4 Hidden Markov Models
4.1 Statistical Models for Nonstationary Processes. 4.2 Hidden Markov Models. 4.2.1 A Physical Interpretation of Hidden Markov Models. 4.2.2 Hidden Markov Model As a Bayesian Method. 4.2.3 Parameters of a Hidden Markov Model. 4.2.4 State Observation Models. 4.2.5 State Transition Probabilities. 4.2.6 State-Time Trellis Diagram. 4.3 Training Hidden Markov Models. 4.3.1 Forward-Backward Probability Computation. 4.3.2 Baum-Welch Model Re-Estimation. 4.3.3 Training Discrete Observation Density HMMs. 4.3.4 HMMs with Continuous Observation PDFs. 4.3.5 HMMs with Mixture Gaussian pdfs. 4.4 Decoding of Signals Using Hidden Markov Models. 4.4.1 Viterbi Decoding Algorithm. 4.5 HMM-based Estimation of Signals in Noise. 4.5.1 HMM-based Wiener Filters. 4.5.2 Modelling Noise Characteristics. Summary.

5 Wiener Filters
5.1 Wiener Filters: Least Squared Error Estimation. 5.2 Block-data Formulation of the Wiener Filter. 5.3 Vector Space Interpretation of Wiener Filters. 5.4 Analysis of the Least Mean Squared Error Signal. 5.5 Formulation of Wiener Filter in Frequency Domain. 5.6 Some Applications of Wiener Filters. 5.6.1 Wiener Filter for Additive Noise Reduction. 5.6.2 Wiener Filter and Separability of Signal and Noise. 5.6.3 Squared Root Wiener Filter. 5.6.4 Wiener Channel Equaliser. 5.6.5 Time-alignment of Signals. 5.6.6 Implementation of Wiener Filters. Summary.

6 Kalman and Adaptive Least Squared Error Filters
6.1 State-space Kalman Filters. 6.2 Sample Adaptive Filters. 6.3 Recursive Least Squares (RLS) Adaptive Filters. 6.4 The Steepest Descent Method. 6.5 The LMS Adaptation Method. Summary.

7 Linear Prediction Models
7.1 Linear Prediction Coding. 7.1.1 Least Mean Squared Error Predictor. 7.1.2 The Inverse Filter: Spectral Whitening. 7.1.3 The Prediction Error Signal. 7.2 Forward, Backward and Lattice Predictors. 7.2.1 Augmented Equations for Forward and Backward Predictors. 7.2.2 Levinson-Durbin Recursive Solution. 7.2.3 Lattice Predictors. 7.2.4 Alternative Formulations of Least Squared Error Predictors. 7.2.5 Model Order Selection. 7.3 Short-term and Long-term Predictors. 7.4 MAP Estimation of Predictor Coefficients. 7.5 Signal Restoration Using Linear Prediction Models. 7.5.1 Frequency Domain Signal Restoration. Summary.

8 Power Spectrum Estimation
8.1 Fourier Transform, Power Spectrum and Correlation. 8.1.1 Fourier Transform. 8.1.2 Discrete Fourier Transform (DFT). 8.1.3 Frequency Resolution and Spectral Smoothing. 8.1.4 Energy Spectral Density and Power Spectral Density. 8.2 Non-parametric Power Spectrum Estimation. 8.2.1 The Mean and Variance of Periodograms. 8.2.2 Averaging Periodograms (Bartlett Method). 8.2.3 Welch Method: Averaging Periodograms from Overlapped and Windowed Segments. 8.2.4 Blackman-Tukey Method. 8.2.5 Power Spectrum Estimation from Autocorrelation of Overlapped Segments. 8.3 Model-based Power Spectrum Estimation. 8.3.1 Maximum Entropy Spectral Estimation. 8.3.2 Autoregressive Power Spectrum Estimation. 8.3.3 Moving Average Power Spectral Estimation. 8.3.4 Autoregressive Moving Average Power Spectral Estimation. 8.4 High Resolution Spectral Estimation Based on Subspace Eigen Analysis. 8.4.1 Pisarenko Harmonic Decomposition. 8.4.2 Multiple Signal Classification (MUSIC) Spectral Estimation. 8.4.3 Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT). Summary.

9 Spectral Subtraction
9.1 Spectral Subtraction. 9.1.1 Power Spectrum Subtraction. 9.1.2 Magnitude Spectrum Subtraction. 9.1.3 Spectral Subtraction Filter: Relation to Wiener Filters. 9.2 Processing Distortions. 9.2.1 Effect of Spectral Subtraction on Signal Distribution. 9.2.2 Reducing the Noise Variance. 9.2.3 Filtering Out the Processing Distortions. 9.3 Non-linear Spectral Subtraction. 9.4 Implementation of Spectral Subtraction. 9.4.1 Application to Speech Restoration and Recognition. Summary.

10 Interpolation
10.1 Introduction. 10.1.1 Interpolation of a Sampled Signal. 10.1.2 Digital Interpolation by a Factor of I. 10.1.3 Interpolation of a Sequence of Lost Samples. 10.1.4 Factors that Affect Interpolation. 10.2 Polynomial Interpolation. 10.2.1 Lagrange Polynomial Interpolation. 10.2.2 Newton Interpolation Polynomial. 10.2.3 Hermite Interpolation Polynomials. 10.2.4 Cubic Spline Interpolation. 10.3 Statistical Interpolation. 10.3.1 Maximum a Posterior Interpolation. 10.3.2 L…




BOOKS OF INTEREST TO PEOPLE WITH YOUR TASTES

Musica elettronica e sound design. vol. 1: teoria e pratica con max 8
Musica elettronica e sound design. vol. 2: teoria e pratica con max 8
Real 3 - listening & speaking
English grammar in use with answers
English grammar in use with answers + e-book



BOOKS BOUGHT BY PEOPLE WITH YOUR TASTES

Giallo in vacanza 2
Giallo in vacanza 5
Historia de una gaviota y del gato que le enseñó a volar. nivel a1
The canterbury tales. level b1/b2
Peter pan. level starter a1 - ga



More Information

ISBN: 9783322927743
Condition: New
Dimensions: 244 x 170 mm; weight: 715 g
Format: Paperback
Illustration notes: XIII, 397 pages, 42 illustrations
Arabic-numbered pages: 397
Roman-numbered pages: xiii

