Hi
I've set up a simple 3D k-Wave simulation with a single grid point source and a full plane of receivers for an ultrasound (US) tomography problem. Before adding any extra complexity to the simulation, I would like to treat the medium as homogeneous water with the sound speed of water (c = 1500 m/s) and without dispersion (medium.alpha_mode = 'no_dispersion'), and detect the time of flight (TOF) from the source position to each point in the receive plane. However, when I try to flatten this response for water using the ideal TOF computed from the known distances and the speed of sound, the detected TOFs do not match. How would I minimise the effect of absorption and simplify the simulation to prevent this?
TOF mismatch: https://i.imgur.com/UEHTUI4.png
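In case it helps, here is a stripped-down sketch of the kind of script I mean. The grid size, source position, attenuation values, burst parameters, and the 10% threshold first-arrival picker are placeholders for illustration, not my exact values:

```matlab
% --- computational grid (sizes are placeholders) ---
Nx = 64; Ny = 64; Nz = 64;
dx = 0.5e-3;                                   % grid spacing [m]
kgrid = kWaveGrid(Nx, dx, Ny, dx, Nz, dx);

% --- homogeneous water medium, dispersion switched off ---
medium.sound_speed = 1500;                     % [m/s]
medium.density     = 1000;                     % [kg/m^3]
medium.alpha_coeff = 0.0022;                   % approx. water [dB/(MHz^y cm)]
medium.alpha_power = 2;
medium.alpha_mode  = 'no_dispersion';
kgrid.makeTime(medium.sound_speed);

% --- single grid point pressure source ---
src = [Nx/2, Ny/2, 8];
source.p_mask = zeros(Nx, Ny, Nz);
source.p_mask(src(1), src(2), src(3)) = 1;
source.p = toneBurst(1/kgrid.dt, 1e6, 3);      % 3-cycle 1 MHz burst (illustrative)

% --- full plane of receivers ---
sensor.mask = zeros(Nx, Ny, Nz);
sensor.mask(:, :, Nz - 8) = 1;

sensor_data = kspaceFirstOrder3D(kgrid, medium, source, sensor);

% --- ideal TOF from the known geometry and sound speed ---
idx = find(sensor.mask);                       % same ordering as sensor_data rows
d = sqrt( (kgrid.x(idx) - kgrid.x(src(1), src(2), src(3))).^2 ...
        + (kgrid.y(idx) - kgrid.y(src(1), src(2), src(3))).^2 ...
        + (kgrid.z(idx) - kgrid.z(src(1), src(2), src(3))).^2 );
tof_ideal = d / medium.sound_speed;

% --- detected TOF via a simple first-arrival threshold pick ---
tof_det = zeros(size(idx));
for s = 1:numel(idx)
    trace = abs(sensor_data(s, :));
    tof_det(s) = kgrid.t_array(find(trace > 0.1 * max(trace), 1, 'first'));
end

tof_error = tof_det - tof_ideal;               % this is what I can't drive to zero
```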
All the best
Tom