
Publications

Most recent publications:

[1]
L. Elblaus and G. Eckel, "Acoustic modelling as a strategy for composing site-specific music," in ACM International Conference Proceeding Series : Proceedings of the 15th International Conference on Audio Mostly, 2020, pp. 69-76.
[2]
R. Hiraga and K. Falkenberg, "Computer-based music training with hearing impairments : Lessons from an experiment," in Proceedings of the 17th Sound and Music Computing Conference, 2020, pp. 426-433.
[4]
A. B. Latupeirissa, C. Panariello and R. Bresin, "Exploring emotion perception in sonic HRI," in 17th Sound and Music Computing Conference, 2020, pp. 434-441.
[5]
E. Frid and H. Lindetorp, "Haptic Music : Exploring Whole-Body Vibrations and Tactile Sound for a Multisensory Music Installation," in Proceedings of the Sound and Music Computing Conference (SMC) 2020, 2020, pp. 68-75.
[6]
I. Torre, A. B. Latupeirissa and C. McGinn, "How context shapes the appropriateness of a robot’s voice," in 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, 2020, pp. 215-222.
[7]
R. Bresin et al., "Looking for the soundscape of the future : preliminary results applying the design fiction method," in Sound and Music Computing Conference 2020, 2020.
[8]
E. Frid, C. Gomes and Z. Jin, "Music Creation by Example," (Manuscript).
[9]
E. Frid, Z. Jin and C. Gomes, "Music Creation by Example," in Proceedings of ACM Conference on Human Factors in Computing Systems (CHI), 2020, pp. 1-13.
[10]
M. Houel et al., "Perception of Emotions in Knocking Sounds : an Evaluation Study," in Sound and Music Computing Conference 2020, Torino, 24-26 June 2020.
[12]
C. Manolas, S. Pauletto and J. Jang, "Soundtrack Loudness as a Depth Cue in Stereoscopic 3D Media," Convergence : The International Journal of Research into New Media Technologies, 2020.
[13]
C. Panariello, "Study in three phases : An Adaptive Sound Installation," Leonardo Music Journal, vol. 30, pp. 44-49, 2020.
[14]
A. Barahona-Rios and S. Pauletto, "Synthesising Knocking Sound Effects Using Conditional WaveGAN," in SMC Sound and Music Computing Conference 2020, 2020.
[15]
A. B. Latupeirissa and R. Bresin, "Understanding non-verbal sound of humanoid robots in films," in Workshop on Mental Models of Robots, HRI 2020, Cambridge, UK, March 23, 2020.
[17]
L. Elblaus and G. Eckel, "Utruchirp : An impulse response measurement and auralisation tool developed for artistic practice," in ACM International Conference Proceeding Series, 2020, pp. 61-68.
[18]
H. Lindetorp and K. Falkenberg, "WebAudioXML : Proposing a new standard for structuring web audio," in Sound and Music Computing Conference, 2020, pp. 25-31.
[19]
V. Tsaknaki and L. Elblaus, "A wearable nebula material investigations of implicit interaction," in TEI 2019 - Proceedings of the 13th International Conference on Tangible, Embedded, and Embodied Interaction, 2019, pp. 625-633.
[20]
E. Frid, "Accessible Digital Musical Instruments : A Review of Musical Interfaces in Inclusive Music Practice," Multimodal Technologies and Interaction, vol. 3, no. 3, 2019.
[22]
A. Holzapfel and E. Benetos, "Automatic music transcription and ethnomusicology : A user study," in Proceedings of the 20th International Society for Music Information Retrieval Conference, ISMIR 2019, 2019, pp. 678-684.
[24]
O. Misgeld, A. Holzapfel and S. Ahlbäck, "Dancing dots - Investigating the link between dancer and musician in Swedish folk dance," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 519-524.
[25]
T. Gulz, A. Holzapfel and A. Friberg, "Developing a Method for Identifying Improvisation Strategies in Jazz Duos," in Proc. of the 14th International Symposium on CMMR, 2019, pp. 482-489.
[26]
E. Frid, "Diverse Sounds : Enabling Inclusive Sonic Interaction," Doctoral thesis, Stockholm : KTH Royal Institute of Technology, TRITA-EECS-AVL 2020:2, 2019.
[28]
F. Keenan and S. Pauletto, "Evaluating a continuous sonic interaction : Comparing a performable acoustic and digital everyday sound," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 127-134.
[29]
C. Panariello et al., "From vocal sketching to sound models by means of a sound-based musical transcription system," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 167-173.
[30]
R. Idrovo and S. Pauletto, "Immersive Point-of-Audition : Alfonso Cuarón’s Three-Dimensional Sound Design Approach," Music, Sound, and the Moving Image, 2019.
[31]
E. Frid, L. Elblaus and R. Bresin, "Interactive sonification of a fluid dance movement : an exploratory study," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 181-189, 2019.
[32]
J. Yang, T. Hermann and R. Bresin, "Introduction to the special issue on interactive sonification," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 151-153, 2019.
[33]
S. Pauletto, "Invisible Seams : the Role of Foley and Voice Postproduction Recordings in the Design of Cinematic Performances," in Foundations in Sound Design for Linear Media : A Multidisciplinary Approach, M. Filimowicz, Ed. Routledge, 2019.
[34]
A. Barahona and S. Pauletto, "Perceptual evaluation of modal synthesis for impact-based sounds," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 34-38.
[35]
X. Han and R. Bresin, "Performance of piano trills: effects of hands, fingers, notes and emotions," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 9-15.
[36]
L. Jap and A. Holzapfel, "Real-time Mapping of Periodic Dance Movements to Control Tempo in Electronic Dance Music," in 16th Sound & Music Computing Conference, Malaga, Spain, 28-31 May 2019, pp. 274-280.
[37]
A. B. Latupeirissa, E. Frid and R. Bresin, "Sonic characteristics of robots in films," in Proceedings of the 16th Sound and Music Computing Conference, 2019, pp. 1-6.
[38]
K. F. Hansen, M. Ljungdahl Eriksson and R. Atienza, "Sound design through large audience interaction," in 16th Sound and Music Computing Conference (SMC2019), 2019, pp. 119-126.
[39]
E. Frid et al., "Sound Forest - Evaluation of an Accessible Multisensory Music Installation," in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019, pp. 1-12.
[40]
K. F. Hansen et al., "Student involvement in sound and music computing research: Current practices at KTH and KMH," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 36-42.
[42]
Q. Lemaire and A. Holzapfel, "Temporal convolutional networks for speech and music detection in radio broadcast," in 20th International Society for Music Information Retrieval Conference, ISMIR 2019, 4-8 November 2019.
[43]
A. Holzapfel, "A case study of ethnography and computational analysis as complementary tools for analyzing dance tunes," in International Conference on Analytical Approaches to World Music (AAWM) 2018, 2018.
[44]
E. Frid, "Accessible Digital Musical Instruments : A Survey of Inclusive Instruments Presented at the NIME, SMC and ICMC Conferences," in Proceedings of the International Computer Music Conference 2018 : Daegu, South Korea, 2018, pp. 53-59.
[45]
L. Elblaus, "Crafting Experience : Designing Digital Musical Instruments for Long-Term Use in Artistic Practice," Doctoral thesis : KTH Royal Institute of Technology, TRITA-EECS-AVL, 2018.
[46]
A. Holzapfel, B. Sturm and M. Coeckelbergh, "Ethical Dimensions of Music Information Retrieval Technology," Transactions of the International Society for Music Information Retrieval, vol. 1, no. 1, pp. 44-55, 2018.
[47]
E. Frid et al., "Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task," Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 279-290, 2018.
[48]
M. Ljungdahl Eriksson et al., "My Sound Space : An attentional shield for immersive redirection," in Audio Mostly 2018 : Sound in Immersion and Emotion, 2018.
[49]
S. Serafin et al., "NordicSMC : A nordic university hub on sound and music computing," in Proceedings of the 15th Sound and Music Computing Conference : Sonic Crossings, SMC 2018, 2018, pp. 124-128.
[50]
L. Handberg et al., "Op 1254 : Music for neutrons, networks and solenoids using a restored organ in a nuclear reactor," in TEI 2018 - Proceedings of the 12th International Conference on Tangible, Embedded, and Embodied Interaction, 2018, pp. 537-541.
Full list in the KTH publications portal
Page responsible: Web editors at EECS
Belongs to: Media Technology and Interaction Design
Last changed: Jan 14, 2021