HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,284,126
director@hcibib.org
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server 2015-05-12 and again 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Tanaka_A* Results: 40 Sorted by: Date
Records: 1 to 25 of 40
Haptic Wave: A Cross-Modal Interface for Visually Impaired Audio Producers Visual Impairment and Technology / Tanaka, Atau / Parkinson, Adam Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.2150-2161
ACM Digital Library Link
Summary: We present the Haptic Wave, a device that allows cross-modal mapping of digital audio to the haptic domain, intended for use by audio producers/engineers with visual impairments. We describe a series of participatory design activities adapted to non-sighted users, where the act of prototyping facilitates dialog. A series of workshops scoping user needs and testing a technology mock-up and low-fidelity prototype fed into the design of a final high-specification prototype. The Haptic Wave was tested in the laboratory, then deployed in real-world settings in recording studios and audio production facilities. The cross-modal mapping is kinesthetic and allows the direct manipulation of sound without the translation of an existing visual interface. The research gleans insight into working with users with visual impairments, and transforms our perspective to think of them as experts in non-visual interfaces for all users.

Machine Learning of Personal Gesture Variation in Music Conducting Gesture Elicitation and Interaction / Sarasua, Alvaro / Caramiaux, Baptiste / Tanaka, Atau Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.3428-3432
ACM Digital Library Link
Summary: This note presents a system that learns expressive and idiosyncratic gesture variations for gesture-based interaction. The system is used as an interaction technique in a music conducting scenario where gesture variations drive music articulation. A simple model based on Gaussian Mixture Modeling is used to allow the user to configure the system by providing variation examples. The system performance and the influence of user musical expertise are evaluated in a user study, which shows that the model is able to learn idiosyncratic variations that allow users to control articulation, with better performance for users with musical expertise.
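The per-style Gaussian modeling described in the abstract might be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the two-feature layout (speed, amplitude) and the articulation labels are assumptions for the sake of the example.

```python
import numpy as np

# Illustrative sketch (not the paper's code): model each articulation style
# the user demonstrates as a Gaussian over simple gesture features
# (here: speed, amplitude), then classify new gestures by likelihood.
rng = np.random.default_rng(0)
examples = {
    "legato":   rng.normal([0.3, 0.8], 0.05, size=(20, 2)),  # slow, wide
    "staccato": rng.normal([0.9, 0.2], 0.05, size=(20, 2)),  # fast, narrow
}

def fit_gaussian(x):
    """Mean and covariance of one style's variation examples."""
    return x.mean(axis=0), np.cov(x.T)

def log_likelihood(x, mean, cov):
    """Log-density of gesture features x under one style's Gaussian."""
    d = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d + logdet + len(x) * np.log(2 * np.pi))

models = {name: fit_gaussian(x) for name, x in examples.items()}

def classify(gesture):
    return max(models, key=lambda n: log_likelihood(gesture, *models[n]))

print(classify(np.array([0.85, 0.25])))  # resembles the staccato examples
```

Configuring by example, as the abstract describes, then amounts to refitting the per-style Gaussians each time the user records new variation demonstrations.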

Human-Centred Machine Learning Workshop Summaries / Gillies, Marco / Fiebrink, Rebecca / Tanaka, Atau / Garcia, Jérémie / Bevilacqua, Frédéric / Heloir, Alexis / Nunnari, Fabrizio / Mackay, Wendy / Amershi, Saleema / Lee, Bongshin / d'Alessandro, Nicolas / Tilmanne, Joëlle / Kulesza, Todd / Caramiaux, Baptiste Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.3558-3565
ACM Digital Library Link
Summary: Machine learning is one of the most important and successful techniques in contemporary computer science. It involves the statistical inference of models (such as classifiers) from data. It is often conceived in a very impersonal way, with algorithms working autonomously on passively collected data. However, this viewpoint hides considerable human work of tuning the algorithms, gathering the data, and even deciding what should be modeled in the first place. Examining machine learning from a human-centered perspective includes explicitly recognising this human work, as well as reframing machine learning workflows based on situated human working practices, and exploring the co-adaptation of humans and systems. A human-centered understanding of machine learning in human context can lead not only to more usable machine learning tools, but to new ways of framing learning computationally. This workshop will bring together researchers to discuss these issues and suggest future research questions aimed at creating a human-centered approach to machine learning.

The Haptic Wave: A Device for Feeling Sound Interactivity Demos / Parkinson, Adam / Tanaka, Atau Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.3750-3753
ACM Digital Library Link
Summary: We demonstrate the Haptic Wave, a device that allows audio engineers with visual impairments to "feel" the amplitude of sound, gaining salient information that sighted engineers get through visual waveforms. The demo will allow visitors, sighted or visually impaired, to sweep backwards and forwards through audio recordings (snippets of pop songs and voice recordings), feeling sound amplitude through haptic feedback delivered by a motorized fader. The result of Participatory Design, Workshopping, and Research through Design methods, the Haptic Wave has previously been exhibited at the Research Through Design Conference (RTD) and the Visually Impaired Musicians' Lives conference, and has been trialed in real-world settings in recording studios by users with visual impairments in the UK and USA. A detailed account of the research and design process of the Haptic Wave has been accepted as a full paper at CHI'16.

Form Follows Sound: Designing Interactions from Sonic Memories Speech & Auditory Interfaces / Caramiaux, Baptiste / Altavilla, Alessandro / Pobiner, Scott G. / Tanaka, Atau Proceedings of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.1 p.3943-3952
ACM Digital Library Link
Summary: Sonic interaction is the continuous relationship between user actions and sound, mediated by some technology. Because interaction with sound may be task oriented or experience-based, it is important to understand the nature of action-sound relationships in order to design rich sonic interactions. We propose a participatory approach to sonic interaction design that first considers the affordances of sounds in order to imagine embodied interaction, and, based on this, generates interaction models for interaction designers wishing to work with sound. We describe a series of workshops, called Form Follows Sound, where participants ideate imagined sonic interactions, and then realize working interactive sound prototypes. We introduce the Sonic Incident technique as a way to recall memorable sound experiences. We identified three interaction models for sonic interaction design: conducting; manipulating; substituting. These three interaction models offer interaction designers and developers a framework on which they can build richer sonic interactions.

Understanding Gesture Expressivity through Muscle Sensing Special Issue on Physiological Computing for Human-Computer Interaction / Caramiaux, Baptiste / Donnarumma, Marco / Tanaka, Atau ACM Transactions on Computer-Human Interaction 2015-01 v.21 n.6 p.31
ACM Digital Library Link
Summary: Expressivity is a visceral capacity of the human body. To understand what makes a gesture expressive, we need to consider not only its spatial placement and orientation but also its dynamics and the mechanisms enacting them. We start by defining gesture and gesture expressivity, and then we present fundamental aspects of muscle activity and ways to capture information through electromyography and mechanomyography. We present pilot studies that inspect the ability of users to control spatial and temporal variations of 2D shapes and that use muscle sensing to assess expressive information in gesture execution beyond space and time. This leads us to the design of a study that explores the notion of gesture power in terms of control and sensing. Results give insights to interaction designers to go beyond simplistic gestural interaction, towards the design of interactions that draw on nuances of expressive gesture.

Adaptive Gesture Recognition with Variation Estimation for Interactive Systems Special Issue on Activity Recognition for Interaction / Caramiaux, Baptiste / Montecchio, Nicola / Tanaka, Atau / Bevilacqua, Frédéric ACM Transactions on Interactive Intelligent Systems 2015-01 v.4 n.4 p.18
ACM Digital Library Link
Summary: This article presents a gesture recognition/adaptation system for human-computer interaction applications that goes beyond activity classification and that, as a complement to gesture labeling, characterizes the movement execution. We describe a template-based recognition method that simultaneously aligns the input gesture to the templates using a Sequential Monte Carlo inference technique. Contrary to standard template-based methods based on dynamic programming, such as Dynamic Time Warping, the algorithm has an adaptation process that tracks gesture variation in real time. The method continuously updates, during execution of the gesture, the estimated parameters and recognition results, which offers key advantages for continuous human-machine interaction. The technique is evaluated in several different ways: Recognition and early recognition are evaluated on 2D onscreen pen gestures; adaptation is assessed on synthetic data; and both early recognition and adaptation are evaluated in a user study involving 3D free-space gestures. The method is robust to noise, and successfully adapts to parameter variation. Moreover, it performs recognition as well as or better than nonadapting offline template-based methods.
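The general idea of Sequential Monte Carlo template alignment with online parameter adaptation can be sketched minimally as below. This is a toy reconstruction under assumed parameters (one 1-D template, particles tracking position plus a speed factor), not the published algorithm.

```python
import numpy as np

# Minimal sketch of Sequential Monte Carlo template alignment (not the
# authors' implementation): each particle hypothesises a position along a
# 1-D gesture template and a speed factor; weights reflect how well the
# template value at that position matches the incoming sample.
rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, np.pi, 100))   # stored template gesture
observed = np.sin(np.linspace(0, np.pi, 150))   # same gesture, played slower

n = 500
pos = np.zeros(n)                               # alignment index into template
speed = rng.normal(1.0, 0.3, n)                 # per-particle speed hypothesis

for sample in observed:
    # Propagate: advance each particle by its speed (with diffusion).
    speed += rng.normal(0, 0.02, n)
    pos = np.clip(pos + speed, 0, len(template) - 1)
    # Weight by agreement between the template and the observed sample.
    w = np.exp(-((template[pos.astype(int)] - sample) ** 2) / (2 * 0.05 ** 2))
    w /= w.sum()
    # Resample to concentrate particles on plausible alignments.
    idx = rng.choice(n, size=n, p=w)
    pos, speed = pos[idx], speed[idx]

# Because the observed gesture is slower than the template, the estimated
# speed factor should settle well below 1.
print(round(speed.mean(), 2))
```

The key property the abstract highlights appears even in this sketch: the speed estimate is refined continuously while the gesture is still in progress, rather than after it completes as in offline DTW-style alignment.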

Heart rate monitoring through the surface of a drinkware Body signals / Chigira, Hiroshi / Ihara, Masayuki / Kobayashi, Minoru / Tanaka, Akimichi / Tanaka, Tomohiro Proceedings of the 2014 International Joint Conference on Pervasive and Ubiquitous Computing 2014-09-13 v.1 p.685-689
ACM Digital Library Link
Summary: There is a growing demand for daily heart rate (HR) monitoring in the fields of healthcare, fitness, activity recognition, and entertainment. Although various HR monitoring systems have been proposed, most employ a wearable device, which can be a burden and disturb daily living.
    To achieve the goal of pervasive HR monitoring in daily living, we present an HR monitoring method that works through the surface of a drinkware. The proposed method uses the surface of a drinkware as a broad sensing region by extending the principle of a basic photo-based HR sensor. The sensing surface works even on a curved shape and can be applied to various types of drinkware. This approach enables unobtrusive HR monitoring during beverage consumption. As a prototype, we implemented the proposed method on an ordinary transparent tumbler and evaluated its HR monitoring performance.
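The photo-based sensing principle the abstract builds on (photoplethysmography: reflected light varies with blood volume, so heart rate appears as the dominant periodicity of the light signal) can be illustrated with synthetic data. The sampling rate and signal shape here are assumptions standing in for real sensor readings, not the paper's setup.

```python
import numpy as np

# Illustrative sketch of the photo-based HR principle: estimate heart rate
# as the dominant frequency of a (synthetic) pulse signal within a
# physiologically plausible band.
fs = 50.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)                 # 10 s analysis window
bpm_true = 72
signal = np.sin(2 * np.pi * (bpm_true / 60) * t)              # pulse waveform
signal += 0.2 * np.random.default_rng(0).normal(size=t.size)  # sensor noise

# Find the spectral peak between 40 and 180 bpm.
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
bpm_est = 60 * freqs[band][spectrum[band].argmax()]
print(round(bpm_est))  # → 72
```

The drinkware's contribution, per the abstract, is on the sensing side (a broad, curved sensing surface) rather than in the estimation step itself, which remains standard pulse-frequency extraction.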

Why you follow: a classification scheme for Twitter follow links Posters and demos / Tanaka, Atsushi / Takemura, Hikaru / Tajima, Keishi Proceedings of the 2014 ACM Conference on Hypertext and Social Media 2014-09-01 p.324-326
ACM Digital Library Link
Summary: Twitter is used for various purposes, such as information publishing/gathering, open discussions, and personal communications. As a result, there are various types of follow links. In this paper, we propose a scheme for classifying follow links according to the followers' intention. The scheme consists of three axes: user-orientation, content-orientation, and mutuality. The combination of these three axes can classify most major types of follow links. Our experimental results suggest that the type of a follow link does not depend solely on the type of the followee nor solely on the type of the follower. The results also suggest that the proposed three axes are highly independent of one another.

Posters NIME 2014: New Interfaces for Musical Expression 2014-06-30 p.26
sched.co/RIsCh5
A Gesture Detection with Guitar Pickup and Earphones
	+ Suh, Sangwon
	+ Lee, Jeong-seob
	+ Yeo, Woon Seung
A Max/MSP Approach for Incorporating Digital Music via Laptops in Live Performances of Music Bands
	+ Amo, Yehiel
	+ Zissu, Gil
	+ Eloul, Shaltiel
	+ Shlomi, Eran
	+ Schukin, Dima
	+ Kalifa, Almog
A Real Time Common Chord Progression Guide on the Smartphone for Jamming Pop Song on the Music Keyboard
	+ Lui, Simon
An Exploration of Peg Solitaire as a Compositional Tool
	+ Keatch, Kirsty
Auraglyph: Handwritten Computer Music Composition and Design
	+ Salazar, Spencer
	+ Wang, Ge
Body As Instrument: Performing with Gestural Interfaces
	+ Mainsbridge, Mary
	+ Beilharz, Kirsty
Circle Squared and Circle Keys -- Performing on and with an unstable live algorithm for the Disklavier
	+ Dahlstedt, Palle
Composing Embodied Sonic Play Experiences: Towards Acoustic Feedback Ecology
	+ van Troyer, Akito
Design & Evaluation of an Accessible Hybrid Violin Platform
	+ Overholt, Dan
	+ Gelineck, Steven
Dynamical Interactions with Electronic Instruments
	+ Mudd, Tom
	+ Dalton, Nick
	+ Holland, Simon
	+ Mulholland, Paul
eMersion | Sensor-controlled Electronic Music Modules & Digital Data Workstation
	+ Udell, Chet
	+ Sain, James Paul
FingerSynth: Wearable Transducers for Exploring the Environment and Playing Music Everywhere
	+ Dublon, Gershon
	+ Paradiso, Joseph A.
Hand and Finger Motion-Controlled Audio Mixing Interface
	+ Ratcliffe, Jarrod
How to Make Embedded Acoustic Instruments
	+ Berdahl, Edgar
Interactive Parallax Scrolling Score Interface for Composed Networked Improvisation
	+ Canning, Rob
Mobile Device Percussion Parade
	+ Snyder, Jeff
	+ Sarwate, Avneesh
	+ Chen, Carolyn
	+ Fishman, Noah
	+ Collins, Quinn
	+ Ergun, Cenk
	+ Mulshine, Michael
Musical Interface to Audiovisual Corpora of Arbitrary Instruments
	+ Neupert, Max
	+ Goßmann, Joachim
New Open-Source Interfaces for Group Based Participatory Performance of Live Electronic Music
	+ Barraclough, Timothy J
	+ Murphy, Jim
	+ Kapur, Ajay
Orphion: A gestural multi-touch instrument for the iPad
	+ Trump, Sebastian
	+ Bullock, Jamie
Pd-L2Ork Raspberry Pi Toolkit as a Comprehensive Arduino Alternative in K-12 and Production Scenarios
	+ Bukvic, Ivica
PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following
	+ Van Zandt-Escobar, Alejandro
	+ Caramiaux, Baptiste
	+ Tanaka, Atau
Pitch Canvas: Touchscreen Based Mobile Music Instrument
	+ Strylowski, Bradley
	+ Allison, Jesse
Reappropriating Museum Collections: Performing Geology Specimens and Meteorology Data as New Instruments for Musical Expression
	+ Bowers, John
	+ Shaw, Tim
Rub Synth: A Study of Implementing Intentional Physical Difficulty Into Touch Screen Music Controllers
	+ Sarier, Ozan
Sound Analyser: A Plug-in for Real-Time Audio Analysis in Live Performances and Installations
	+ Stark, Adam
Tangle: a Flexible Framework for Performance with Advanced Robotic Musical Instruments
	+ Mathews, Paul
	+ Morris, Ness
	+ Murphy, Jim
	+ Kapur, Ajay
	+ Carnegie, Dale
The Politics of Laptop Ensembles
	+ Knotts, Shelly
	+ Collins, Nick

A Preliminary Study of Relation Induction between HTML Tag Set and User Experience Interacting with the Web / Nakano, Azusa / Tanaka, Asato / Akiyoshi, Masanori HCI International 2014: 16th International Conference on HCI, Part III: Applications and Services 2014-06-22 v.3 p.49-56
Keywords: Web interface; user experience; relation induction
Link to Digital Content at Springer
Summary: This paper addresses a preliminary study of relation identification between the HTML tag set and user experience. Today's Web technologies such as "HTML5" and "Ajax" enable content providers to design rich Web pages, which are sometimes complicated and not easy to use. On the other hand, "user experience" is becoming more and more significant as everyone from the young to the elderly uses the Web. A design principle grounded in "user experience" has not yet been established, because user experience includes users' practical activities. Our approach is therefore to collect user operations and user impressions for target Web pages, then induce relations between user impressions and the collected data using mining technologies. This paper reports preliminary experimental results towards such systematic analysis.

Muscular Interactions. Combining EMG and MMG sensing for musical practice Session 2: Multimodal / Donnarumma, Marco / Caramiaux, Baptiste / Tanaka, Atau NIME 2013: New Interfaces for Musical Expression 2013-05-27 p.4
Keywords: NIME, sensorimotor system, EMG, MMG, biosignal, multimodal, mapping
nime2013.kaist.ac.kr/program/papers/day1/paper2/90/90_Paper.pdf
Summary: We present the first combined use of the electromyogram (EMG) and mechanomyogram (MMG), two biosignals that result from muscular activity, for interactive music applications. We exploit differences between these two signals, as reported in the biomedical literature, to create bi-modal sonification and sound synthesis mappings that allow performers to distinguish the two components in a single complex arm gesture. We study non-expert players' ability to articulate the different modalities. Results show that purposely designed gestures and mapping techniques enable novices to rapidly learn to independently control the two biosignals.

Machine Learning of Musical Gestures Session 10: Gesture | Space / Caramiaux, Baptiste / Tanaka, Atau NIME 2013: New Interfaces for Musical Expression 2013-05-27 p.32
Keywords: Machine Learning, Data mining, Musical Expression, Musical Gestures, Analysis, Control, Gesture, Sound
nime2013.kaist.ac.kr/program/papers/day3/paper10/84/84_Paper.pdf
Summary: We present an overview of machine learning (ML) techniques and their application in interactive music and new digital instrument design. We first provide the non-specialist reader an introduction to two ML tasks, classification and regression, that are particularly relevant for gestural interaction. We then present a review of the literature in current NIME research that uses ML in musical gesture analysis and gestural sound control. We describe the ways in which machine learning is useful for creating expressive musical interaction, and in turn why live music performance presents a pertinent and challenging use case for machine learning.

Towards Gestural Sonic Affordances Posters (1) / Altavilla, Alessandro / Caramiaux, Baptiste / Tanaka, Atau NIME 2013: New Interfaces for Musical Expression 2013-05-27 p.51
Keywords: Gestural embodiment of sound, Affordances, Mapping
nime2013.kaist.ac.kr/program/papers/day1/poster1/145/145_Paper.pdf
Summary: We present a study that explores the affordance evoked by sound and sound-gesture mappings. In order to do this, we make use of a sensor system with minimal form factor in a user study that minimizes cultural association. The present study focuses on understanding how participants describe sounds and gestures produced while playing designed sonic interaction mappings. This approach seeks to move from object-centric affordance towards investigating embodied gestural sonic affordances.

Muscular Interactions. Combining EMG and MMG sensing for musical practice Demos (1) / Donnarumma, Marco / Caramiaux, Baptiste / Tanaka, Atau NIME 2013: New Interfaces for Musical Expression 2013-05-27 p.70

Beyond recognition: using gesture variation for continuous interaction alt.chi: Design Lessons / Caramiaux, Baptiste / Bevilacqua, Frederic / Tanaka, Atau Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing Systems 2013-04-27 v.2 p.2109-2118
ACM Digital Library Link
Summary: Gesture-based interaction is widespread in touch screen interfaces. The goal of this paper is to tap the richness of expressive variation in gesture to facilitate continuous interaction. We achieve this through novel techniques of adaptation and estimation of gesture characteristics. We describe two experiments. The first aims at understanding whether users can control certain gestural characteristics and if that control depends on gesture vocabulary. The second study uses a machine learning technique based on particle filtering to simultaneously recognize and measure variation in a gesture. With this technology, we create a gestural interface for a playful photo processing application. From these two studies, we show that 1) multiple characteristics can be varied independently in slower gestures (Study 1), and 2) users find gesture-only interaction less pragmatic but more stimulating than traditional menu-based systems (Study 2).

SIG NIME: music, technology, and human-computer interaction SIGs / Bevilacqua, Frederic / Fels, Sidney / Jensenius, Alexander R. / Lyons, Michael J. / Schnell, Norbert / Tanaka, Atau Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing Systems 2013-04-27 v.2 p.2529-2532
ACM Digital Library Link
Summary: This SIG intends to investigate the ongoing dialogue between music technology and the field of human-computer interaction. Our specific aims are to consider major findings of musical interface research over recent years and discuss how these might best be conveyed to CHI researchers interested but not yet active in this area, as well as to consider how to stimulate future collaborations between music technology and CHI research communities.

MubuFunkScatShare: gestural energy and shared interactive music Interactivity: exploration / Tanaka, Atau / Caramiaux, Baptiste / Schnell, Norbert Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing Systems 2013-04-27 v.2 p.2999-3002
ACM Digital Library Link
Summary: We present a ludic interactive music performance that allows live recorded sounds to be re-rendered through the users' movements. The interaction design made the control similar to a shaker where the motion energy drives the energy of the played music piece. The instrument has been designed for musicians as well as non-musicians and allows for multiple players. In the MubuFunkScatShare performance, one performer plays acoustical instruments into the system, subsequently rendering them by shaking a smartphone. He invites participation by volunteers from the audience, resulting in a fun musical piece that includes layers of funk guitar, scat singing, guitar solo, and beatboxing.

Designing musical interactions for mobile systems DIS workshops / Tahiroglu, Koray / Tanaka, Atau / Parkinson, Adam / Gibson, Steve Proceedings of DIS'12: Designing Interactive Systems 2012-06-11 p.807-808
ACM Digital Library Link
Summary: Mobile music making is an area where many of the specific design challenges and specific affordances of mobile technologies can be explored. Music applications have often been at the forefront of research in social-interactive aspects of emerging technologies. Music, as a social activity and time-based medium, makes demands in terms of intuitive and responsive interactions that later find relevance in other application domains. This workshop will discuss the specific interaction design challenges of deploying engaging and creative musical activities on mobile devices. These devices are characterised by powerful but not unlimited processing power, a touchscreen, small form factor, networking capability, embedded tilt, microphone, and camera sensors, and a compact GUI. The workshop will allow interaction designers, musicians, and developers who may not already be involved in mobile music development to engage with and learn about this rapidly developing field, and the design research methods that are at the core of creative mobile music applications.

A Survey and Thematic Analysis Approach as Input to the Design of Mobile Music GUIs Posters / Tanaka, Atau / Parkinson, Adam / Settel, Zack / Tahiroglu, Koray NIME 2012: New Interfaces for Musical Expression 2012-05-21 p.240
Keywords: NIME, Mobile Music, Pure Data
www.eecs.umich.edu/nime2012/Proceedings/papers/240_Final_Manuscript.pdf
Summary: Mobile devices represent a growing research field within NIME, and a growing area for commercial music software. They present unique design challenges and opportunities, which are yet to be fully explored and exploited. In this paper, we propose using a survey method combined with qualitative analysis to investigate the ways in which people use mobiles musically. We subsequently present as an area of future research our own PDplayer, which provides a completely self-contained end application on the mobile device, potentially making the mobile a more viable and expressive tool for musicians.

Articulating lines of research in digital arts, HCI, and interaction (invited SIG) SIGs / Fantauzzacoffin, Jill / Candy, Linda / Chenzira, Ayoka / Edmonds, Ernest / England, David / Schiphorst, Thecla / Tanaka, Atau Extended Abstracts of ACM CHI'12 Conference on Human Factors in Computing Systems 2012-05-05 v.2 p.1177-1180
ACM Digital Library Link
Summary: The establishment of a Digital Arts Featured Community at CHI 2012 indicates the general acceptance of mutually beneficial synergies between digital arts and HCI. At this juncture, the Digital Arts Community has an opportunity to build upon this established community platform to begin articulating lines of research. This SIG initiates this essential step in establishing traditions of contribution.

Music one participates in Creativity and music / Tanaka, Atau Proceedings of the 2011 Conference on Creativity and Cognition 2011-11-03 p.105-106
ACM Digital Library Link
Summary: Digital music has undergone fundamental shifts -- it has gone real time, it has become interactive, it has become miniaturized, and completely democratized. I'll map out my personal trajectory in this time to look at broader evolutions in the field with sensors, networks, and mobility. These are not just technological changes, but changes that bring about shifts in musical approaches. Form factors change, analogue is reconciled with digital, and new directions in Open Source and DIY culture continue to challenge our assumptions on what it means to be an artist, composer, performer, participant, in these evolving musical/technological landscapes.

Beyond participation: empowerment, control and ownership in youth-led collaborative design Poster session / Gaye, Lalya / Tanaka, Atau Proceedings of the 2011 Conference on Creativity and Cognition 2011-11-03 p.335-336
ACM Digital Library Link
Summary: We describe a collaborative design project with a group of young people in which an interactive educational information pack for teenagers was implemented. Instead of just providing input to a design project, the young people initiated, controlled, and partially implemented the project themselves, with the support of an interdisciplinary research team. Here we present this approach to participatory design research, describe the design process, and show that initiative, control, and hands-on engagement in youth-led collaborative design can bring young people a strong sense of ownership and empowerment.

The user in flux: bringing HCI and digital arts together to interrogate shifting roles in interactive media Workshops / Leong, Tuck W. / Gaye, Lalya / Tanaka, Atau / Taylor, Robyn / Wright, Peter C. Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011-05-07 v.2 p.45-48
ACM Digital Library Link
Summary: With the advent of interactive digital media, people are no longer simply 'users'. They actively shift between various roles: author, collaborator, and even performer. We coin the term "user in flux" to problematize static definitions of "the user" and highlight how people's roles and practices switch and evolve when engaged in such interactions. Drawing from participatory practices and seeking inspiration from interactive artists, this workshop explores the "user in flux" with an aim to establish directions and approaches that can revitalize the HCI community's understanding of the user and inform the design of technologies used for interacting with digital media, and promote a new research agenda.

Classical music for rock fans?: novel recommendations for expanding user interests KM track: information filtering and recommender systems (II) / Nakatsuji, Makoto / Fujiwara, Yasuhiro / Tanaka, Akimichi / Uchiyama, Toshio / Fujimura, Ko / Ishida, Toru Proceedings of the 2010 ACM Conference on Information and Knowledge Management 2010-10-26 p.949-958
ACM Digital Library Link
Summary: Most recommender algorithms recommend items of types similar to those the active user has accessed before. This is because they measure user similarity only from co-rating behavior on items and compute recommendations by analyzing the items possessed by the users most similar to the active user. In this paper, we define item novelty as the smallest distance over the taxonomy from a class the user accessed before to the class that includes the target item. We then try to accurately recommend highly novel items to the user. First, our method measures user similarity by employing items rated by users and a taxonomy of items. It can accurately identify many items that may suit the user. Second, it creates a graph whose nodes are users; weighted edges are set between users according to their similarity. It analyzes the user graph and extracts users that are related on the graph even though their similarity to the active user is not high. The users so extracted are likely to have highly novel items for the active user. An evaluation conducted on several datasets finds that our method identifies items with higher novelty more accurately than previous methods.
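The novelty definition in the abstract (smallest taxonomy distance from any class the user has accessed to the target item's class) can be sketched as below. The toy taxonomy and class names are illustrative assumptions, not the paper's dataset.

```python
# Illustrative sketch of taxonomy-based item novelty: the novelty of an
# item is the smallest tree distance from any class the user accessed
# before to the item's class. (Toy taxonomy; assumed names.)
taxonomy = {                       # child -> parent
    "rock": "music", "classical": "music", "jazz": "music",
    "music": "media", "film": "media",
}

def path_to_root(cls):
    """Ancestor chain of a class, starting from the class itself."""
    path = [cls]
    while path[-1] in taxonomy:
        path.append(taxonomy[path[-1]])
    return path

def distance(a, b):
    """Number of edges between two classes via their lowest common ancestor."""
    pa, pb = path_to_root(a), path_to_root(b)
    for i, anc in enumerate(pa):
        if anc in pb:
            return i + pb.index(anc)
    return float("inf")

def novelty(user_classes, item_class):
    return min(distance(c, item_class) for c in user_classes)

# For a rock fan: classical music is nearby in the taxonomy (novelty 2),
# while film is further away (novelty 3) -- hence the paper's title.
print(novelty({"rock"}, "classical"), novelty({"rock"}, "film"))  # → 2 3
```

The paper's contribution is then to recommend accurately at higher novelty values, rather than only among items with novelty near zero.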