Publications

Most recent publications:

[1]
O. Misgeld et al., "A case study of deep enculturation and sensorimotor synchronization to real music," in International Society for Music Information Retrieval Conference, ISMIR, 2021.
[2]
H. Lindetorp and K. Falkenberg, "Audio Parameter Mapping Made Explicit Using WebAudioXML," in Proceedings of the Sound and Music Computing Conference, 2021.
[3]
K. Falkenberg et al., "Auditory notification of customer actions in a virtual retail environment: Sound design, awareness and attention," in Proceedings of the International Conference on Auditory Display (ICAD 2021), 2021.
[4]
G. F. Arfvidsson et al., "Design considerations for short alerts and notification sounds in a retail environment," in Proceedings of the Sound and Music Computing Conference, 2021.
[5]
E. Frid and K. Falkenberg, "Designing and reporting research on sound design and music for health: Methods and frameworks for impact," in Doing Research in Sound Design, Michael Filimowicz Ed., Routledge, 2021.
[6]
S. Holmqvist et al., "Evaluating pleasure, arousal and customer satisfaction from sound notifications," in Nordic Retail and Wholesale Conference, 2021.
[7]
K. Falkenberg et al., "Ljud- och musikbehandling" [Sound and music processing], in En introduktion till medieteknik [An introduction to media technology], Pernilla Josefsson & Mikael Wiberg Ed., Studentlitteratur AB, 2021.
[9]
E. Frid et al., "On Designing Sounds to Reduce Shoplifting in Retail Environments," in Nordic Retail and Wholesale Conference, 2021.
[11]
H. Lindetorp and K. Falkenberg, "Putting Web Audio API to the test : Introducing WebAudioXML as a pedagogical platform," in Web Audio Conference 2021, 2021.
[12]
E. Frid and A. Ilsar, "Reimagining (Accessible) Digital Musical Instruments: A Survey on Electronic Music-Making Tools," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME) 2021, 2021.
[13]
R. Bresin et al., "Robust Non-Verbal Expression in Humanoid Robots: New Methods for Augmenting Expressive Movements with Sound," in Workshop on Sound in Human-Robot Interaction at HRI 2021, 2021.
[14]
H. Lindetorp and K. Falkenberg, "Sonification For Everyone Everywhere : Evaluating The WebAudioXML Sonification Toolkit For Browsers," in The 26th International Conference on Auditory Display (ICAD 2021), 2021.
[15]
S. Pauletto and R. Bresin, "Sonification Research and Emerging Topics," in Doing Research in Sound Design, Michael Filimowicz Ed., Routledge, 2021.
[16]
L. Elblaus and G. Eckel, "Acoustic modelling as a strategy for composing site-specific music," in ACM International Conference Proceeding Series : Proceedings of the 15th International Conference on Audio Mostly, 2020, pp. 69-76.
[17]
R. Hiraga and K. Falkenberg, "Computer-based music training with hearing impairments : Lessons from an experiment," in Proceedings of the 17th Sound and Music Computing Conference, 2020, pp. 426-433.
[19]
O. Misgeld, A. Holzapfel and S. Ahlbäck, "Exploring beat connections in Swedish Folk music and dance," in Proceedings of the ICTM Study Group on Sound, Movement, and the Sciences Symposium (SoMoS), 2020.
[20]
A. B. Latupeirissa, C. Panariello and R. Bresin, "Exploring emotion perception in sonic HRI," in 17th Sound and Music Computing Conference, 2020, pp. 434-441.
[21]
E. Frid and H. Lindetorp, "Haptic Music : Exploring Whole-Body Vibrations and Tactile Sound for a Multisensory Music Installation," in Proceedings of the Sound and Music Computing Conference (SMC) 2020, 2020, pp. 68-75.
[22]
I. Torre, A. B. Latupeirissa and C. McGinn, "How context shapes the appropriateness of a robot’s voice," in 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, 2020, pp. 215-222.
[23]
R. Bresin et al., "Looking for the soundscape of the future : preliminary results applying the design fiction method," in Sound and Music Computing Conference 2020, 2020.
[24]
E. Frid, C. Gomes and Z. Jin, "Music Creation by Example," (Manuscript).
[25]
E. Frid, Z. Jin and C. Gomes, "Music Creation by Example," in Proceedings of CHI '20: CHI Conference on Human Factors in Computing Systems, 2020, pp. 1-13.
[27]
C. Panariello, "Study in three phases : An Adaptive Sound Installation," Leonardo Music Journal, vol. 30, pp. 44-49, 2020.
[28]
A. B. Latupeirissa and R. Bresin, "Understanding non-verbal sound of humanoid robots in films," in Workshop on Mental Models of Robots at HRI 2020, Cambridge, UK, 2020.
[30]
L. Elblaus and G. Eckel, "Utruchirp : An impulse response measurement and auralisation tool developed for artistic practice," in ACM International Conference Proceeding Series, 2020, pp. 61-68.
[31]
H. Lindetorp and K. Falkenberg, "WebAudioXML : Proposing a new standard for structuring web audio," in Sound and Music Computing Conference, 2020, pp. 25-31.
[33]
V. Tsaknaki and L. Elblaus, "A wearable nebula : material investigations of implicit interaction," in TEI 2019 - Proceedings of the 13th International Conference on Tangible, Embedded, and Embodied Interaction, 2019, pp. 625-633.
[34]
E. Frid, "Accessible Digital Musical Instruments : A Review of Musical Interfaces in Inclusive Music Practice," Multimodal Technologies and Interaction, vol. 3, no. 3, 2019.
[35]
O. Misgeld, A. Holzapfel and S. Ahlbäck, "Dancing Dots - Investigating the Link between Dancer and Musician in Swedish Folk Dance," in Sound & Music Computing Conference, 2019.
[36]
T. Gulz, A. Holzapfel and A. Friberg, "Developing a Method for Identifying Improvisation Strategies in Jazz Duos," in Proceedings of the 14th International Symposium on Computer Music Multidisciplinary Research (CMMR), 2019, pp. 482-489.
[37]
E. Frid, "Diverse Sounds : Enabling Inclusive Sonic Interaction," Doctoral thesis Stockholm : KTH Royal Institute of Technology, TRITA-EECS-AVL, 2020:2, 2019.
[38]
C. Panariello et al., "From vocal sketching to sound models by means of a sound-based musical transcription system," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 167-173.
[39]
E. Frid, L. Elblaus and R. Bresin, "Interactive sonification of a fluid dance movement : an exploratory study," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 181-189, 2019.
[40]
J. Yang, T. Hermann and R. Bresin, "Introduction to the special issue on interactive sonification," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 151-153, 2019.
[41]
X. Han and R. Bresin, "Performance of piano trills: effects of hands, fingers, notes and emotions," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 9-15.
[42]
A. B. Latupeirissa, E. Frid and R. Bresin, "Sonic characteristics of robots in films," in Proceedings of the 16th Sound and Music Computing Conference, 2019, pp. 1-6.
[43]
K. F. Hansen, M. Ljungdahl Eriksson and R. Atienza, "Sound design through large audience interaction," in 16th Sound and Music Computing Conference (SMC2019), 2019, pp. 119-126.
[44]
E. Frid et al., "Sound Forest - Evaluation of an Accessible Multisensory Music Installation," in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019, pp. 1-12.
[45]
K. F. Hansen et al., "Student involvement in sound and music computing research: Current practices at KTH and KMH," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 36-42.
[46]
E. Frid, "Accessible Digital Musical Instruments : A Survey of Inclusive Instruments Presented at the NIME, SMC and ICMC Conferences," in Proceedings of the International Computer Music Conference 2018 : Daegu, South Korea, 2018, pp. 53-59.
[47]
L. Elblaus, "Crafting Experience : Designing Digital Musical Instruments for Long-Term Use in Artistic Practice," Doctoral thesis : KTH Royal Institute of Technology, TRITA-EECS-AVL, 2018.
[48]
E. Frid et al., "Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task," Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 279-290, 2018.
[49]
M. Ljungdahl Eriksson et al., "My Sound Space : An attentional shield for immersive redirection," in Audio Mostly 2018 : Sound in Immersion and Emotion, 2018.
[50]
S. Serafin et al., "NordicSMC : A nordic university hub on sound and music computing," in Proceedings of the 15th Sound and Music Computing Conference : Sonic Crossings, SMC 2018, 2018, pp. 124-128.
Full list in the KTH publications portal