[AUDITORY] Two Fully-Funded PhD Opportunities in Auditory Sensory Augmentation (Craig Jin)


Subject: [AUDITORY] Two Fully-Funded PhD Opportunities in Auditory Sensory Augmentation
From:    Craig Jin  <000001bc67790e0f-dmarc-request@xxxxxxxx>
Date:    Thu, 14 Jul 2022 09:39:54 +0000

Dear Auditory List,

I have two fully-funded (stipend and tuition) PhD opportunities available at the University of Sydney in the area of auditory sensory augmentation.

To apply, please send me an email with a cover letter, CV, and transcript (craig.jin (at) sydney.edu.au).

-----------------------Details Below------------------

The Computing and Audio Research Lab at the University of Sydney, Australia, has a fully-funded PhD position open in augmented reality for the visually impaired, in partnership with ARIA LLC. Applicants with a strong background in psychoacoustics, computer science, software engineering, or a similar discipline are encouraged to apply.

Project ARIA (Augmented Reality in Audio) seeks to endow the visually impaired with a richer sense of their surroundings using a wearable augmented reality device. Building on technologies from robotics, augmented reality, and spatialised audio display, ARIA will deliver next-generation auditory sensory augmentation with the potential to improve the quality of life for millions of people affected by vision impairment worldwide.

There are multiple PhD projects available within Project ARIA. This sensory-augmentation-focused project will advance the auditory sensory augmentation technologies required for the ARIA wearable device to perform reliably and efficiently across a breadth of usage scenarios.

We take a broad view of auditory sensory augmentation as comprising three parts: (1) sensors and machine intelligence extract information for a targeted objective; (2) this information is rendered as sound via the auditory channel; (3) efferent feedback control is enabled via hand/wrist or other sensors. Experiments are run using motion capture and the latest AR/VR/XR equipment. The challenge is to convey navigational, social, or symbolic information via the auditory channel in perceptually consistent and meaningful ways.
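To make this three-stage view concrete, the short Python sketch below (purely illustrative; all names and cue mappings are hypothetical assumptions, not Project ARIA code) reduces a toy depth scan to an obstacle percept, maps that percept to pitch, level, and panning cues, and lets a wrist gesture steer the sensing direction:

# Minimal sketch of the three-stage sensory augmentation loop described above.
#   (1) sense: extract task-relevant information (here, the nearest obstacle)
#   (2) sonify: render that information via the auditory channel
#   (3) feedback: let a hand/wrist gesture steer the sensing objective
# Hypothetical example only; not part of any ARIA codebase.
import math
from dataclasses import dataclass


@dataclass
class Percept:
    bearing_deg: float   # direction of the nearest obstacle relative to the head
    distance_m: float    # distance to that obstacle


def sense(depth_scan, gaze_offset_deg=0.0):
    """Stage 1: reduce a raw depth scan (bearing -> distance in metres) to one percept."""
    bearing, distance = min(depth_scan.items(), key=lambda kv: kv[1])
    return Percept(bearing_deg=bearing + gaze_offset_deg, distance_m=distance)


def sonify(p):
    """Stage 2: map the percept to simple auditory cues (pitch, level, panning)."""
    pitch_hz = 200.0 + 1800.0 / max(p.distance_m, 0.25)   # nearer obstacle -> higher pitch
    level_db = -6.0 * math.log2(max(p.distance_m, 0.25))  # nearer obstacle -> louder
    pan = max(-1.0, min(1.0, p.bearing_deg / 90.0))       # -1 = hard left, +1 = hard right
    return {"pitch_hz": pitch_hz, "level_db": level_db, "pan": pan}


def feedback(wrist_yaw_deg):
    """Stage 3: a wrist rotation re-aims the sensing direction (efferent control)."""
    return 0.5 * wrist_yaw_deg


if __name__ == "__main__":
    depth_scan = {-45.0: 3.2, 0.0: 1.1, 45.0: 2.4}   # toy scan: bearing (deg) -> metres
    gaze = feedback(wrist_yaw_deg=30.0)              # user gestures slightly to the right
    cues = sonify(sense(depth_scan, gaze_offset_deg=gaze))
    print(cues)

In the actual device the cues would be rendered as spatialised audio rather than printed, but the division of labour between the three stages is the same.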
The project's aims include:
- Establishing a psychoacoustic simulation environment to support rapid development and evaluation of novel auditory sensory augmentation paradigms
- Adapting computational imaging technologies to propose novel methods for auditory sensory augmentation
- Characterising and calibrating auditory sensory augmentation paradigms via simulation, controlled lab experiments, and user trials
- Developing auditory sensory augmentation algorithms that make the best use of incoming machine vision sensor information
- Supporting development of the high-level sensory augmentation algorithms and the full sensory augmentation pipeline

---Research Environment---
Embedded in the School of Electrical and Information Engineering, the Computing and Audio Research Lab (CARLab) focuses on auditory sensory augmentation, machine hearing, and morphoacoustics (morphoacoustics.org). We employ audio signal processing and machine learning techniques to develop new concepts and understanding that will provide new technologies for auditory sensory augmentation and machine hearing.

CARLab offers specialised acoustic facilities (anechoic chambers, a semi-anechoic chamber, loudspeaker arrays, linear and spherical microphone arrays, and AR/VR/XR platforms). You will have access to mechanical and electronics workshops and a pool of technical staff to help realise your research ambitions. You will also have the opportunity to work closely with Project ARIA's engineers and make use of their extensive development and testing facilities. The University of Sydney offers a rich academic setting in a world-class city, and CARLab has strong ties to a network of nearby and international academic and industrial collaborators.

---Offering---
A fully funded 3-year PhD scholarship covering tuition fees and a living-expenses stipend, with a possible extension to 3.5 years contingent on availability of funding.

---About You---
Successful candidates will have:
- A bachelor's degree in a relevant discipline
- An interest in developing novel auditory sensory augmentation systems and working with AR/VR/XR simulation environments
- Excellent communication and interpersonal skills
- Experience with one or more of: psychoacoustics, computer science, software engineering, audio signal processing, sound design and engineering
Hands-on experience with Python, Matlab, C++, and one or more video-game simulation environments would be an asset.

CRAIG JIN | Head of the Computing and Audio Research Laboratory
https://sydney.edu.au/engineering/people/craig.jin.php
https://www.linkedin.com/in/craig-jin-2a04b115/
Member of The University of Sydney Nano Institute
THE UNIVERSITY OF SYDNEY
School of Electrical and Information Engineering, J03
The University of Sydney | NSW | 2006
T +61 2 9351 7208 | M +61 414 817 265
E craig.jin@xxxxxxxx | W sydney.edu.au

