Re: GLM fit or Cubic smoothing spline for categorical boundary data?? (Stuart Rosen)


Subject: Re: GLM fit or Cubic smoothing spline for categorical boundary data??
From:    Stuart Rosen  <s.rosen@xxxxxxxx>
Date:    Mon, 7 May 2012 11:31:38 +0100
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

You can't make sensible statistical inferences from a least-squares fit. A proper statistical approach (i.e., using maximum likelihood as in logistic regression) would enable you to answer questions like how many parameters are necessary for an adequate description of the data.

Yours - Stuart

On 07/05/2012 11:23, Pragati Rao wrote:
> Hi everyone,
>
> Thank you for the helpful replies. Based on some of the suggestions I tried a two-parameter logistic curve fit using lsqcurvefit(). The equation used was y(t) = 1/(1 + exp(-r(t - t0))). The results obtained for the same data are attached. I have a few more questions:
>
> 1. Would a four-parameter fit be better? And should I use y(t) = k1/(1 + exp(-r(t - t0))) + k2?
>
> 2. Treutwein and Strasburger (1999) suggest that maximum likelihood is better for fitting psychometric function data. Has anyone found results from a maximum likelihood fit better than those from a least-squares fit?
>
> Any opinions on this?
>
> Regards,
> Pragati
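[A minimal sketch of the maximum-likelihood approach Stuart describes, for readers following the thread. It uses MATLAB's glmfit/glmval from the Statistics Toolbox; the stimulus levels, response counts, and variable names (t, k, n) are made-up illustrations, not the poster's data. The binomial GLM with a logit link maximises the likelihood of the observed response counts rather than minimising squared error on the proportions.]

% Sketch: maximum-likelihood logistic fit of categorical boundary data,
% assuming that at each stimulus level t(i) you observed k(i) "category B"
% responses out of n(i) trials. Example numbers are invented.
t = (1:7)';                     % stimulus continuum steps
k = [0 1 2 8 14 15 16]';        % number of "B" responses at each step
n = 16 * ones(size(t));         % trials per step

% glmfit maximises the binomial likelihood with a logistic link,
% i.e. it fits P(B) = 1 / (1 + exp(-(b0 + b1*t))).
b = glmfit(t, [k n], 'binomial', 'link', 'logit');

% Recover the two parameters of y(t) = 1 / (1 + exp(-r*(t - t0))):
r  = b(2);          % slope
t0 = -b(1) / b(2);  % category boundary (50% point)

% Fitted curve plotted against the raw proportions k./n
tt = linspace(min(t), max(t), 200);
p  = glmval(b, tt, 'logit');
plot(t, k./n, 'o', tt, p, '-');
xlabel('stimulus step'); ylabel('proportion "B" responses');

[Because the fit is by maximum likelihood, nested models (e.g. with and without lapse/guess parameters k1, k2) can be compared with a likelihood-ratio test, which is one way to address Stuart's point about how many parameters the data actually support.]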
