Re: software too...
Yet another piece of software, not for auditory modeling but for running
psychoacoustical experiments, is "psylab".
Detection, discrimination, and matching experiments are currently
supported, and a number of paradigms such as N-AFC, n-up-m-down, and
weighted up-down are readily available. The user only needs to
implement the stimulus generation herself, in plain Matlab.
psylab can be found at www.hoertechnik-audiologie.de/psylab
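
As an illustration of the kind of adaptive paradigm involved (a generic
sketch in Python, not psylab's actual interface; the function and its
parameters are made up for the example), a 1-up-2-down track could be
outlined like this:

# Generic n-up, m-down adaptive track (illustrative only, not psylab code):
# the level is raised after n consecutive wrong answers and lowered after
# m consecutive correct ones; 1-up-2-down converges near 70.7% correct.

def staircase(respond, start_level, step, n_up=1, m_down=2, max_reversals=12):
    level = start_level
    correct_run, wrong_run = 0, 0
    reversals, direction = [], None
    while len(reversals) < max_reversals:
        if respond(level):              # True if the listener was correct
            correct_run += 1
            wrong_run = 0
            if correct_run == m_down:   # m correct in a row -> step down
                correct_run = 0
                if direction == "up":
                    reversals.append(level)
                direction = "down"
                level -= step
        else:
            wrong_run += 1
            correct_run = 0
            if wrong_run == n_up:       # n wrong in a row -> step up
                wrong_run = 0
                if direction == "down":
                    reversals.append(level)
                direction = "up"
                level += step
    last = reversals[-8:]               # average the last reversals
    return sum(last) / len(last)
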
Regards,
Martin
On 12.03.2012 21:35, Richard F. Lyon wrote:
> Speaking of auditory software...
>
> Various frameworks and toolboxes include cochlear models that I had
> something to do with, including in AIM-C, AIM-MAT, and Marsyas, as well
> as older ones like Slaney's Auditory Toolbox. But I'm not so happy with
> any of those models, and for the last year or so have been working with
> Tom Walters and others on a new open-source cochlear model project,
> evolved from those, but cleaner and more efficient, with
> mono/stereo/multichannel capability. We could use another volunteer or
> two to help finish it up.
>
> In 2010 we had a regional workshop about auditory tools and frameworks,
> and found there was a sort of "rift" between those who like to build and
> use frameworks, and those who prefer simpler library-level bits of code
> that they can incorporate into their systems. We'd like to support
> both. So our project will be an open-source library of simple code,
> with equivalent Matlab, C++, and Python versions, and we'll help anyone
> who wants to make wrappers to connect those into their favorite
> frameworks and toolboxes.
>
> The model we're doing is partially described in my recent (and only
> ever) JASA article:
> http://asadl.org/jasa/resource/1/jasman/v130/i6/p3893_s1 "Cascades of
> two-pole--two-zero asymmetric resonators are good models of peripheral
> auditory function" (I can provide a copy on request). But the details
> of how the model is turned into an efficient real-time sound analyzer
> aren't in there.
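>
> As a rough sketch of the general structure only (not the project's code;
> the coefficients are arbitrary placeholders, not the asymmetric-resonator
> designs the paper describes), a cascade of two-pole, two-zero sections
> tapped at every stage can be written in a few lines of Python:
>
> # Cascade of two-pole, two-zero (biquad) sections: each stage filters the
> # previous stage's output, and every stage's output is kept as one channel.
> # Coefficients here are placeholders, not an actual cochlear design.
> import numpy as np
> from scipy.signal import lfilter
>
> def cascade_filterbank(x, sections):
>     """x: 1-D signal; sections: list of (b, a) biquad coefficient pairs."""
>     channels = []
>     y = x
>     for b, a in sections:
>         y = lfilter(b, a, y)      # one two-pole, two-zero stage
>         channels.append(y)        # tap the output of every stage
>     return np.vstack(channels)    # shape: (num_stages, num_samples)
>
> The sketch only shows the cascade skeleton; the resonator design itself is
> what the paper is about.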
>
> If you'd like to help, either to finish up implementations and tests in
> Matlab, C++, or Python--or some other language of your choice--or to
> wrap it into one of the other systems, let me know. The code project
> sits as a subproject of http://code.google.com/p/aimc/ for now, but
> we'll probably separate it out soon.
>
> Dick
>
>
>
> At 9:45 AM +0000 3/9/12, Etienne Gaudrain wrote:
>> There's a repository of auditory models and other stuff here:
>>
>> http://soundsoftware.ac.uk/
>>
>> or, more precisely, here:
>>
>> http://code.soundsoftware.ac.uk/projects
>>
>> -Etienne
>>
>>
>>
>> On 09/03/2012 08:01, Lowel O'Mard wrote:
>>> There is also the development system for auditory modelling, DSAM:
>>> http://dsam.org.uk
>>>
>>> It is a library that contains many established auditory models, which
>>> can be used via a simple scripting interface. The library supports most
>>> sound file formats and threaded parallel processing, and it also
>>> contains many useful analysis and utility functions. There is a Java
>>> interface that can be used from most versions of Matlab (those which
>>> support Java), and a simple application that runs on all platforms,
>>> with executable installers available for Windows and Linux.
>>>
>>> Sincere regards,
>>>
>>>
>>> ...Lowel.
>>>
>>> On 8 March 2012 21:17, Ray Goldsworthy <raygold@xxxxxxxx> wrote:
>>>
>>> It was also suggested that we bring together open-source toolboxes
>>> for hearing research. I think this is a great idea. Here are the
>>> ones I know of:
>>>
>>> APEX :
>>> https://gilbert.med.kuleuven.be/web/index.php/Public:Software/APEX
>>> MLP : http://www.psy.unipd.it/~grassi/mlp.html
>>> Percept : http://www.sens.com/percept/
>>> Psycon and miscellaneous tools: http://auditorypro.com/download
>>> PsySound3 : http://psysound.wikidot.com/
>>>
>>> Please let me know if you have other general software utilities
>>> developed in your labs that you make available for research use,
>>> and I will try to summarize. ...Ray
>>>
>>> -- Ray Goldsworthy
>>> Research Scientist
>>> Sensimetrics Corporation
>>>
>>
>>
>> --
>> Etienne Gaudrain, PhD
>> MRC Cognition and Brain Sciences Unit
>> 15 Chaucer Road
>> Cambridge, CB2 7EF
>> UK
>> Phone: +44 1223 355 294, ext. 645
>> Fax (unit): +44 1223 359 062