Damping a Time Domain Signal to Mimic Signal Decay
Posted: Wed Mar 18, 2015 10:16 am
I'm currently simulating some 14N ESEEM data, and I was wondering if there is a way to include a damping function in a fitting routine. In my script for simulating by hand, I can apply a damping function as in the code below.
However, when I afterwards want to let esfit fine-tune my handiwork, the damping isn't applied. That seems to cause problems when the R^2 value is calculated over the later part of my data, where the signal has simply decayed away. I've tried Sys.T1T2, but it seems to have no effect.
Code: Select all
Damping = 1.8;  % user-specified decay constant (same units as simt)

% Generate the time-domain signal
[simt, simy] = saffron(Sys, Exp, Opt);
simy = abs(simy);

% Subtract a biexponential baseline
[~, ~, simfit] = exponfit(simt, simy, 2);
simy = simy - simfit;

% Apply the damping function and renormalize
simy = simy .* exp(-simt(:)/Damping)';
simy = simy / max(simy);
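In case it helps, this is roughly the kind of wrapper I imagine esfit would need — a hypothetical sketch only. The function name dampedsaffron and the idea of smuggling the decay constant in through a custom Sys.Damping field are my own assumptions; I haven't verified that esfit will accept and vary a custom Sys field like that.

Code: Select all
function [t, y] = dampedsaffron(Sys, Exp, Opt)
% Hypothetical custom simulation function for esfit:
% saffron + biexponential baseline subtraction + damping.
[t, y] = saffron(Sys, Exp, Opt);
y = abs(y);
[~, ~, yfit] = exponfit(t, y, 2);   % subtract biexponential baseline
y = y - yfit;
y = y .* exp(-t(:)/Sys.Damping)';   % assumes Sys.Damping carries the decay constant
y = y / max(y);

The idea would then be to hand this function to esfit in place of saffron itself, so that the damped (rather than raw) signal is compared against the experimental trace — but I don't know whether that's the intended way to do it.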
Any ideas?