Hi,
I'm working on an ultrasonic simulation and I would like to compare the result to actual data from our experiments. The general impression seems to be correct, but in detail, the frequency bandwidth of the transducer used plays an important role.
As far as I know, the manual only addresses this issue briefly, and in this forum I found these two topics:
http://www.k-wave.org/forum/topic/simple-3d-simulation-received-signal-question
http://www.k-wave.org/forum/topic/sensor-specifications
The second suggests using gaussianFilter, which (as far as I can see) applies a frequency-domain filter. Although the resulting spectrum looks correct when compared to the transducer spectrum provided by the manufacturer, the filtered time-domain signals are clearly incorrect. Have a look here:
http://homepages.physik.uni-muenchen.de/~Sebastian.Lehrack/gaussian_filter_spectra_3.5MHz.png
http://homepages.physik.uni-muenchen.de/~Sebastian.Lehrack/compared_signals_filtered.png
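
For reference, this is roughly how I applied it (a minimal sketch; the centre frequency and fractional bandwidth are placeholder values based on the 3.5 MHz transducer, and sensor_data and kgrid.dt come from my simulation):

% apply k-Wave's gaussianFilter to the recorded sensor data
% (fc and bw are placeholders, to be replaced by the manufacturer's values)
Fs = 1 / kgrid.dt;        % sampling frequency [Hz]
fc = 3.5e6;               % transducer centre frequency [Hz]
bw = 60;                  % fractional bandwidth [%]
sensor_data_filtered = gaussianFilter(sensor_data, Fs, fc, bw);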
The Gaussian filter produces a signal that differs from the data we measured, and it also introduces a precursor that appears before the actual signal. In a second step, we applied MATLAB's standard Butterworth low-pass and high-pass filters, and the result agrees much better with the measured data, though still not perfectly. It seems to me that gaussianFilter is too simple and too strict with respect to higher frequencies. And although it is not indicated in the manufacturer's data, a transducer can, at least theoretically, have higher resonances.
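
For comparison, the Butterworth band-limiting was done along these lines (again only a sketch; the cut-off frequencies and filter order are placeholders, not the manufacturer's values, and filtfilt is used for zero-phase filtering so the signal is not shifted in time):

% band-limit a single simulated time trace with Butterworth filters
Fs    = 1 / kgrid.dt;                       % sampling frequency [Hz]
f_hp  = 1.5e6;                              % high-pass cut-off [Hz] (placeholder)
f_lp  = 6.0e6;                              % low-pass cut-off [Hz] (placeholder)
order = 4;                                  % filter order (placeholder)
[b_hp, a_hp] = butter(order, f_hp / (Fs/2), 'high');
[b_lp, a_lp] = butter(order, f_lp / (Fs/2), 'low');
signal_filtered = filtfilt(b_hp, a_hp, double(sensor_data));
signal_filtered = filtfilt(b_lp, a_lp, signal_filtered);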
Given this, what is the intended or recommended way to simulate the detector bandwidth?
Greetings,
Sebastian