Subject: [AUDITORY] Invitation to Special Session on Wearable Technology for Sensing and Perception
From: Martin Skoglund <martin.skoglund@xxxxxxxx>
Date: Wed, 2 Feb 2022 08:16:02 +0000

Dear AUDITORY,

We are planning to organize a Special Session on "Wearable Technology for Sensing and Perception" at the 25th International Conference on Information Fusion (FUSION 2022, https://www.fusion2022.se/) in Linköping, Sweden, on July 4-7, 2022.

If you would like to contribute, please send a preliminary title, author list, and abstract of your paper as soon as you can (preferably before February 15, 2022) to Martin Skoglund (martin.skoglund@xxxxxxxx).

Further details about the Special Session are given below. Instructions for paper submission are available on the FUSION 2022 website (https://www.fusion2022.se/).

We will cover the conference registration fee for up to 5 papers (max 1 per author), awarded by a fair lottery among the papers accepted for this special session.
Feel free to forward this invitation to colleagues who may be interested.

Best regards,
Martin Skoglund and Jesper Jensen

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

Tentative TITLE: Wearable technology for sensing and perception

ORGANIZERS:
Martin Skoglund, Linköping University/Oticon A/S, and Jesper Jensen, Aalborg University/Oticon A/S.

ABSTRACT:
The quest for more ecologically valid research is motivated by the need to bridge the gap between laboratory-controlled experiments and real-life scenarios. Recent advances in sensor miniaturization and computational efficiency open up novel applications for wearable perception and assistive systems. Multimodal sensor integration enables scene- and motion-aware systems that are particularly powerful when combined with modern estimation techniques. These technologies are already essential in AR applications and could play an equally important role in future personalized and intention-controlled assistive devices.

In this special session the focus is on wearable/portable sensor technology, such as microphone arrays, cameras, inertial sensors, EEG, eye-tracking, pulse, and more. We are highly interested in research on applications, platforms, algorithms, theory, perception, assistive support, objective assessment, and paradigms that may be applicable in more ecologically valid scenarios but can also reach beyond traditional laboratory and clinical testing.

Topics of this special session may include, but are not limited to, the following aspects.
* Audio-visual applications
* Augmented reality
* Audio processing and perception
* Attention/intention decoding
* Beamforming and noise reduction
* Cognitive load/effort
* Ecologically valid experimental design and stimulus development
* Learning and classification
* Localization, mapping, and tracking

IMPORTANT DATES:
Papers due: March 1, 2022 (in previous years the deadline has been extended by approximately 2 weeks)
Notification of acceptance: May 1, 2022
Final paper submission: June 1, 2022