Subject: Re: [AUDITORY] Workshop announcement
From: Matt Flax <flatmax@xxxxxxxx>
Date: Thu, 7 Jun 2018 09:01:20 +1000
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Hi Volker,

LSL looks great. I can't make the workshop, but I would like to learn about the LSL protocol. Do you have a technical paper defining it, or even an academic paper?

thanks
Matt

On 06/06/18 19:23, Volker Hohmann wrote:
> Dear list,
>
> on behalf of M. Bleichner I attach the announcement of a Lab Streaming
> Layer (LSL) workshop, which might be interesting for people who want to
> integrate signals from different sensors in their lab environments.
>
> Kind regards,
>
> Volker
>
>
> *******************************************
>
> Dear colleagues,
>
> We are happy to announce the second international Lab Streaming Layer
> (LSL) workshop, which will take place September 27-28, 2018, at the
> Hanse Wissenschaftskolleg in Delmenhorst, Germany.
>
> You can now register for the event: http://www.h-w-k.de/index.php?id=2266.
>
> Registration is open until July 30, 2018. The number of participants
> is limited to 40 people.
> The workshop participation fee is
> 100,00 € (reduced rate for postdocs, PhDs, and students)
> 350,00 € (industrial partners and others)
> and includes meals. Participants need to cover their own travel expenses.
>
> LSL is an open-source project enabling the synchronized streaming of
> time series data coming from different devices, such as EEG amplifiers,
> audio, video, eye tracking, keyboards, etc. LSL features near real-time
> access to data streams, time synchronization, networking, and centralized
> collection.
>
> Key LSL developers and expert users have confirmed their attendance. The
> workshop will provide a general introduction to LSL. We will present how
> LSL can be used for a multitude of different experimental setups, using
> a variety of experimental software, hardware, and operating systems. In a
> hands-on session, participants can learn to stream data from their own
> hardware or experiment with hardware provided by us and external partners.
> We will name pitfalls and discuss how to test and ensure the best possible
> timing accuracy when recording multimodal data. We will also provide a
> best-practice guide and present different use cases of LSL, and will use
> the workshop to discuss future software and hardware developments.
>
> For further enquiries, please contact: martin.bleichner@xxxxxxxx
>
> Best,
>
> Martin Bleichner
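
For readers who want a concrete feel for the streaming and time-synchronization features described in the announcement, below is a minimal sketch using the Python bindings (pylsl). The stream name, source id, channel count, and sampling rate are illustrative placeholders, not part of the announcement, and error handling is omitted.

# Minimal pylsl sketch: one process publishes a synthetic 8-channel stream,
# another discovers it on the local network and pulls time-stamped samples.
# All names and parameters here are illustrative placeholders.
import random
import time

from pylsl import StreamInfo, StreamOutlet, StreamInlet, resolve_stream


def publish():
    # Describe the stream: name, content type, channel count, nominal
    # sampling rate, sample format, and a source id used for reconnection.
    info = StreamInfo('ExampleEEG', 'EEG', 8, 250, 'float32', 'example-uid-1234')
    outlet = StreamOutlet(info)
    while True:
        # push_sample() time-stamps each sample on the sender's LSL clock.
        outlet.push_sample([random.random() for _ in range(8)])
        time.sleep(1.0 / 250)


def receive():
    # Discover streams of content type 'EEG' advertised on the network.
    streams = resolve_stream('type', 'EEG')
    inlet = StreamInlet(streams[0])
    # time_correction() estimates the clock offset between sender and
    # receiver, which is the basis of LSL's time synchronization.
    offset = inlet.time_correction()
    while True:
        sample, timestamp = inlet.pull_sample()
        print(timestamp + offset, sample)

Running publish() in one process and receive() in another (on the same machine or on another machine in the same LAN) demonstrates the stream discovery, near real-time access, and clock-offset estimation mentioned above.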