Publications

Most recent publications:

[1]
R. Hiraga and K. Falkenberg, "Computer-based music training with hearing impairments : Lessons from an experiment," in Proceedings of the 17th Sound and Music Computing Conference, 2020, pp. 426-433.
[3]
A. B. Latupeirissa, C. Panariello and R. Bresin, "Exploring emotion perception in sonic HRI," in 17th Sound and Music Computing Conference, 2020, pp. 434-441.
[4]
E. Frid and H. Lindetorp, "Haptic Music : Exploring Whole-Body Vibrations and Tactile Sound for a Multisensory Music Installation," in Proceedings of the Sound and Music Computing Conference (SMC) 2020, 2020, pp. 68-75.
[5]
I. Torre, A. B. Latupeirissa and C. McGinn, "How context shapes the appropriateness of a robot’s voice," in 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, 2020, pp. 215-222.
[6]
R. Bresin et al., "Looking for the soundscape of the future : preliminary results applying the design fiction method," in Sound and Music Computing Conference 2020, 2020.
[7]
E. Frid, C. Gomes and Z. Jin, "Music Creation by Example," (Manuscript).
[8]
E. Frid, Z. Jin and C. Gomes, "Music Creation by Example," in Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), 2020, pp. 1-13.
[9]
M. Houel et al., "Perception of Emotions in Knocking Sounds : an Evaluation Study," in Sound and Music Computing Conference 2020, Torino, 24-26 June 2020, 2020.
[11]
C. Manolas, S. Pauletto and J. Jang, "Soundtrack Loudness as a Depth Cue in Stereoscopic 3D Media," Convergence: The International Journal of Research into New Media Technologies, 2020.
[12]
C. Panariello, "Study in three phases : An Adaptive Sound Installation," Leonardo Music Journal, vol. 30, pp. 44-49, 2020.
[13]
A. Barahona-Rios and S. Pauletto, "Synthesising Knocking Sound Effects Using Conditional WaveGAN," in SMC Sound and Music Computing Conference 2020, 2020.
[14]
A. B. Latupeirissa and R. Bresin, "Understanding non-verbal sound of humanoid robots in films," in Workshop on Mental Models of Robots at HRI 2020, Cambridge, UK, 23 March 2020.
[16]
H. Lindetorp and K. Falkenberg, "WebAudioXML : Proposing a new standard for structuring web audio," in Sound and Music Computing Conference, 2020, pp. 25-31.
[17]
V. Tsaknaki and L. Elblaus, "A wearable nebula : material investigations of implicit interaction," in TEI 2019 - Proceedings of the 13th International Conference on Tangible, Embedded, and Embodied Interaction, 2019, pp. 625-633.
[18]
E. Frid, "Accessible Digital Musical Instruments : A Review of Musical Interfaces in Inclusive Music Practice," Multimodal Technologies and Interaction, vol. 3, no. 3, 2019.
[20]
A. Holzapfel and E. Benetos, "Automatic music transcription and ethnomusicology : A user study," in Proceedings of the 20th International Society for Music Information Retrieval Conference, ISMIR 2019, 2019, pp. 678-684.
[22]
O. Misgeld, A. Holzapfel and S. Ahlbäck, "Dancing dots - Investigating the link between dancer and musician in Swedish folk dance," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 519-524.
[23]
T. Gulz, A. Holzapfel and A. Friberg, "Developing a Method for Identifying Improvisation Strategies in Jazz Duos," in Proc. of the 14th International Symposium on CMMR, 2019, pp. 482-489.
[24]
E. Frid, "Diverse Sounds : Enabling Inclusive Sonic Interaction," Doctoral thesis, Stockholm: KTH Royal Institute of Technology, TRITA-EECS-AVL 2020:2, 2019.
[26]
F. Keenan and S. Pauletto, "Evaluating a continuous sonic interaction : Comparing a performable acoustic and digital everyday sound," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 127-134.
[27]
C. Panariello et al., "From vocal sketching to sound models by means of a sound-based musical transcription system," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 167-173.
[28]
R. Idrovo and S. Pauletto, "Immersive Point-of-Audition : Alfonso Cuarón’s Three-Dimensional Sound Design Approach," Music, Sound, and the Moving Image, 2019.
[29]
E. Frid, L. Elblaus and R. Bresin, "Interactive sonification of a fluid dance movement : an exploratory study," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 181-189, 2019.
[30]
J. Yang, T. Hermann and R. Bresin, "Introduction to the special issue on interactive sonification," Journal on Multimodal User Interfaces, vol. 13, no. 3, pp. 151-153, 2019.
[31]
S. Pauletto, "Invisible Seams : the Role of Foley and Voice Postproduction Recordings in the Design of Cinematic Performances," in Foundations in Sound Design for Linear Media : A Multidisciplinary Approach, Michael Filimowicz, Ed. Routledge, 2019.
[32]
A. Barahona and S. Pauletto, "Perceptual evaluation of modal synthesis for impact-based sounds," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 34-38.
[33]
X. Han and R. Bresin, "Performance of piano trills: effects of hands, fingers, notes and emotions," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 9-15.
[34]
L. Jap and A. Holzapfel, "Real-time Mapping of Periodic Dance Movements to Control Tempo in Electronic Dance Music," in 16th Sound & Music Computing Conference, Malaga, Spain, 28-31 May 2019, 2019, pp. 274-280.
[35]
A. B. Latupeirissa, E. Frid and R. Bresin, "Sonic characteristics of robots in films," in Proceedings of the 16th Sound and Music Computing Conference, 2019, pp. 1-6.
[36]
K. F. Hansen, M. Ljungdahl Eriksson and R. Atienza, "Sound design through large audience interaction," in 16th Sound and Music Computing Conference (SMC2019), 2019, pp. 119-126.
[37]
E. Frid et al., "Sound Forest - Evaluation of an Accessible Multisensory Music Installation," in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019, pp. 1-12.
[38]
K. F. Hansen et al., "Student involvement in sound and music computing research: Current practices at KTH and KMH," in Combined proceedings of the Nordic Sound and Music Computing Conference 2019 and the Interactive Sonification Workshop 2019, 2019, pp. 36-42.
[41]
Q. Lemaire and A. Holzapfel, "Temporal convolutional networks for speech and music detection in radio broadcast," in Proceedings of the 20th International Society for Music Information Retrieval Conference, ISMIR 2019, 2019, pp. 229-236.
[42]
A. Holzapfel, "A case study of ethnography and computational analysis as complementary tools for analyzing dance tunes," in International Conference on Analytical Approaches to World Music (AAWM) 2018, 2018.
[43]
E. Frid, "Accessible Digital Musical Instruments : A Survey of Inclusive Instruments Presented at the NIME, SMC and ICMC Conferences," in Proceedings of the International Computer Music Conference 2018 : Daegu, South Korea, 2018, pp. 53-59.
[44]
L. Elblaus, "Crafting Experience : Designing Digital Musical Instruments for Long-Term Use in Artistic Practice," Doctoral thesis, KTH Royal Institute of Technology, TRITA-EECS-AVL, 2018.
[45]
A. Holzapfel, B. Sturm and M. Coeckelbergh, "Ethical Dimensions of Music Information Retrieval Technology," Transactions of the International Society for Music Information Retrieval, vol. 1, no. 1, pp. 44-55, 2018.
[46]
E. Frid et al., "Haptic feedback combined with movement sonification using a friction sound improves task performance in a virtual throwing task," Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 279-290, 2018.
[47]
M. Ljungdahl Eriksson et al., "My Sound Space : An attentional shield for immersive redirection," in Audio Mostly 2018 : Sound in Immersion and Emotion, 2018.
[48]
S. Serafin et al., "NordicSMC : A Nordic university hub on sound and music computing," in Proceedings of the 15th Sound and Music Computing Conference : Sonic Crossings, SMC 2018, 2018, pp. 124-128.
[49]
L. Handberg et al., "Op 1254 : Music for neutrons, networks and solenoids using a restored organ in a nuclear reactor," in TEI 2018 - Proceedings of the 12th International Conference on Tangible, Embedded, and Embodied Interaction, 2018, pp. 537-541.
[50]
R. A. Mangkuto et al., "Optimisation of daylight admission based on modifications of light shelf design parameters," Journal of Building Engineering, vol. 18, pp. 195-209, 2018.
Full list in KTH's publication portal