
Shitsukan—Understanding and Manipulating Material and Quality Perception

  • Open Access
  • 2026
  • Book

About this book

This open access book presents interdisciplinary research on "Shitsukan", a unique Japanese word referring to the qualitative, hard-to-quantify attributes concerning the material and quality of objects and scenes. By perceiving the large and rich variety of Shitsukan, humans can obtain information vital to daily life, appreciate the richness of the world, judge the value of things, and plan appropriate bodily actions. Understanding how Shitsukan is recognized and how it can be manipulated is therefore a fundamental research topic for us humans. This book describes the effort to define and unearth the deep Shitsukan information in the real world in order to advance our understanding of Shitsukan perception at both the computational and the experimental level. The book is organized as follows: Part I introduces how humans and machines perceive Shitsukan. Part II presents various studies on the neural processing of Shitsukan. Part III deals with the control of Shitsukan experiences. By reading this book, readers will gain insight into how humans and machines perceive Shitsukan (psychophysics and computer vision), how the brain processes Shitsukan (neuroscience), and how Shitsukan can be created and modulated (virtual reality, digital art, and industrial engineering).

Table of contents

  1. Chapter 1. What Is Shitsukan?

    Shin’ya Nishida, Ko Nishino
    Abstract
    This brief introductory chapter provides the definition of Shitsukan, the historical background of our Shitsukan projects, and an outline of the following chapters.
  2. Perceiving Shitsukan

    1. Frontmatter

    2. Chapter 2. Human Visual Perception of Shitsukan

      Shin’ya Nishida, Masataka Sawayama
      Abstract
      Visual estimation of material properties, such as gloss, viscosity, and wetness, is one of the main tasks of human Shitsukan recognition. It is computationally challenging because image formation involves complex optical interplay of material, geometry, and illumination, and because physical material properties are inherently complex and high-dimensional. It seems difficult for human vision to estimate material characteristics from a single image in a strict inverse-optics way. The hypothesis that human vision heuristically uses simple image-computable features to estimate material properties (e.g., the skewness of the luminance histogram for gloss estimation) has been proposed, but it has been criticized for ignoring other relevant stimulus factors (e.g., 3D shape). The complexity of human material computation remains controversial, and the recent success of artificial neural networks in explaining human vision has amplified expectations about the complexity of the underlying neural computation. Here, we argue that specifying relevant image features for material perception is still a useful strategy for an explicit understanding of the human visual computation of material, as long as the study also analyzes the underlying physical mechanism and explains why these features work and when they fail to predict human material perception. Furthermore, for a full understanding of human material processing, we should investigate not only how material perception is affected by other attributes, such as shape and motion, but also how material affects the perception of those attributes.
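      The gloss-from-skewness heuristic mentioned in this abstract is easy to state concretely: positive skew in an image's luminance histogram (a long bright tail, as produced by specular highlights) has been proposed as a simple cue to perceived glossiness. A minimal sketch of that feature, with our own function name and toy images rather than anything from the chapter:

```python
import numpy as np

def luminance_skewness(image: np.ndarray) -> float:
    """Third standardized moment of the pixel luminance distribution."""
    lum = image.astype(float).ravel()
    mu, sigma = lum.mean(), lum.std()
    if sigma == 0:  # uniform patch: define skewness as 0
        return 0.0
    return float(((lum - mu) ** 3).mean() / sigma ** 3)

# A uniform "matte" patch has zero skew; a few bright "highlight"
# pixels skew the histogram positively, the proposed gloss cue.
matte = np.full((64, 64), 0.5)
glossy = matte.copy()
glossy[:2, :2] = 1.0
print(luminance_skewness(matte))   # 0.0
print(luminance_skewness(glossy))  # positive
```

      As the abstract notes, such a feature is only a heuristic; it ignores factors like 3D shape, which is exactly the criticism the chapter discusses.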
    3. Chapter 3. Computational Visual Modeling of Shitsukan

      Ko Nishino
      Abstract
      Computational understanding of visual Shitsukan, i.e., understanding the nuanced looks and feels of objects and scenes from their images, is a challenging task that goes to the very heart of computer vision. This chapter reviews the foundations of appearance modeling, the sub-discipline in computer vision centered on the analytical and image-based representations of the radiometric properties of real-world surfaces, namely its geometry, reflectance, and illumination. We particularly focus on the modeling of light reflection at a surface point, the estimation of the radiometric ingredients that participate in it, and also the recognition of the traces of materials that quietly dictate such light interaction, all from images. We then introduce our recent work on each of these fundamental topics of computer vision which collectively advance the frontier of computational visual Shitsukan perception.
    4. Chapter 4. Shitsukan at the Intersection of Vision and Language

      Takayuki Okatani
      Abstract
      This chapter examines Shitsukan recognition as a key challenge for computer vision and multimodal AI. Unlike object recognition, Shitsukan involves subtle properties, such as glossiness, transparency, and softness, as well as holistic impressions like warmth or atmosphere. We review classical approaches that frame Shitsukan recognition as material classification, multi-label attribute prediction, or ranking-based estimation, noting their limitations in scope and robustness. Recent advances in web-scale image-text mining and contextual captioning broaden recognition beyond predefined categories. With the rise of generative AI, multimodal large language models (MLLMs) trained on massive image-text corpora now achieve remarkable image understanding, often capturing Shitsukan at near-human levels. Yet challenges remain, including generalization beyond training distributions and the absence of embodied experience. We argue that Shitsukan recognition highlights both the potential and the limits of current AI, offering a path toward deeper alignment with human perception.
    5. Chapter 5. Shitsukan of Sound

      Takuya Koumura
      Abstract
      This chapter explores the auditory perception of sound Shitsukan, which differs from the basic contents of sound and includes more abstract, often difficult-to-verbalize qualities, such as material properties, spatial characteristics, and environmental factors including reverberation and ambient noise. I begin by examining how the physical properties of sound sources shape perceived Shitsukan, focusing on studies related to material perception of impact sounds. I then explore various other aspects of Shitsukan, such as the effects of reverberation on material and emotion perception, the perception of silent objects and sound textures, and the experience of a sound source moving closely around the listener’s head. These discussions illustrate how sound interacts with physical environments, including the listener’s body, to create a perceptual experience. The chapter concludes by considering the potential to quantify more elusive Shitsukan experiences, such as particular emotions evoked by voices and the unique auditory qualities of traditional songs, suggesting that Shitsukan may be key to understanding both modern and traditional cultural expressions of sound.
  3. Shitsukan in the Brain

    1. Frontmatter

    2. Chapter 6. Decoding Shitsukan in the Brain

      Yukiyasu Kamitani, Misato Tanaka, Tomoyasu Horikawa
      Abstract
      Understanding Shitsukan, or realistic qualities of the world, in visual experiences requires more than simple models or language-based descriptions. This chapter introduces a brain decoding approach centered on mid-level visual features represented in both the brain and deep neural networks (DNNs). By translating brain activity into DNN features and using generative models to reconstruct detailed visual images, we achieve high-fidelity decoding of subjective experiences, including mental imagery and visual illusions. Our approach extends beyond individual subjects, demonstrating inter-individual and inter-site generalization of these brain-derived patterns. Despite these advances, challenges persist in ensuring accurate hierarchical correspondence between brain and DNN features and mitigating spurious reconstructions. We propose a new research direction, the Neuroverse, which aims to externalize not only sensory representations but also the brain’s internal model of the world. This framework opens new avenues for exploring how Shitsukan and other complex perceptual experiences are constructed and represented in the brain.
    3. Chapter 7. The Neural Processing of Shitsukan Perception

      Hidehiko Komatsu
      Abstract
      Material perception likely plays a significant role in the understanding of Shitsukan in the real world. In recent years, significant advances have been made in understanding the neural mechanisms of material perception, especially in the domain of vision. Many studies provide converging evidence that the ventral visual pathway, one of the cortical visual pathways, processes information related to material perception. Information on natural texture, which is tightly related to the categorization of materials, is hierarchically processed in the middle visual areas along the ventral visual pathway, and a multimodal impression of materials is formed in the higher ventral visual areas as a result of association between visual and tactile experiences of objects. Information on the surface reflection properties of objects, another important aspect of material perception, is also represented in the higher ventral visual areas. Neurons selectively responsive to object images with specific reflection properties have been recorded in the macaque monkey, and these neurons have been found to precisely represent the perceptual gloss parameters defined by human psychophysics experiments on gloss perception.
    4. Chapter 8. Shitsukan Perception When Impaired

      Kyoko Suzuki, Shigeki Nakauchi, Yuya Kinzuka
      Abstract
      Material perception can change dramatically as people age. It can also be altered by brain damage. Studying these changes gives us important insights into the perceptual mechanism of materials. Material perception in a broad sense, or Shitsukan perception, includes surface quality and object appearance as well as texture, which are related to preference or emotional reactions toward an object. Thus, Shitsukan integrates multiple sensory modalities and is essential for object recognition and emotional processing. This chapter examines the influence of neurological disorders and aging on human Shitsukan perception. Two investigative approaches are highlighted: (1) evaluating performance in tasks related to material perception and recognition (“surface Shitsukan”) and (2) analyzing neurophysiological responses to changes in visual appearance (“deep Shitsukan”). Findings reveal that aging and brain damage impair Shitsukan perception across visual, tactile, and auditory modalities. In individuals with brain lesions, the extent and modality of the perceptual impairment of Shitsukan are determined by lesion localization. Moreover, studies on pupil responses to glare illusions indicate that aging and brain damage diminish physiological responses, reflecting a decline in Shitsukan perception. These results underscore that surface and deep Shitsukan perception are differently affected by aging and neurological disorders. Addressing these changes through environmental adaptations could significantly enhance the quality of life for older adults and individuals with brain dysfunction.
    5. Chapter 9. The Neural Value Transformation from Shitsukan

      Takafumi Minamimoto, Koshi Murata
      Abstract
      The sensory system enables the recognition and categorization of objects, allowing us to interpret the surrounding environment and guide our actions. Beyond this initial stage of sensory processing lies another level, where sensory representations are evaluated based on their biological significance. At this level, sensory representations give rise to value-based judgments, emotional responses, and motivational regulation, which in turn lead to decision-making and behavioral responses. Understanding sensory stimuli on the basis of their biological significance requires computational frameworks distinct from those used for object identification: a transformation from recognizing surface attributes to evaluating deeper, context-dependent, and biologically grounded meaning. This deeper level of processing reflects the essence of “deep Shitsukan.” Despite its importance for adaptive behavior, this neural transformation remains poorly understood. This chapter reviews the neural mechanisms underlying object perception in both vision and olfaction, focusing on how the brain recognizes and categorizes sensory inputs. We then move beyond this surface-level object perception to explore how the brain derives biological meaning, such as value judgments, from these sensory representations. Finally, we discuss open questions and future directions for understanding how the brain transforms perception into value-based decisions.
  4. Creating and Modulating Shitsukan

    1. Frontmatter

    2. Chapter 10. The Haptics of Shitsukan

      Hiroyuki Kajimoto, Masashi Konyo, Shogo Okamoto
      Abstract
      This chapter explores the role of touch in the perception of Shitsukan, referring to material qualities like texture, hardness, and temperature. Haptics plays a critical role in understanding Shitsukan through five key sensations: hardness–softness, friction, micro-level roughness, macro-level roughness, and thermal. In addition, vibrotactile perception is involved in all of these factors except the thermal one. The chapter presents examples of how these haptic sensations can be measured and reproduced. The first example is the presentation of softness using both force and cutaneous feedback. Previous research showed that force feedback alone is insufficient to replicate soft objects, and combining it with cutaneous feedback significantly improves the realism of the sensation. Based on this finding, a method employing an electro-tactile display has been proposed to simulate variations in contact area associated with the deformation of soft objects under pressure. The second example addresses the perception of high-frequency vibrations and introduces a method for reproducing them with high fidelity based on perceptual characteristics. It also introduces a spatial vibration representation that uses multiple oscillators to express the direction and distance of external vibration sources to improve the sense of realism. The third example shows texture rendering through vibrations and friction on touch panels. Vibratory stimuli and friction control create sensations of roughness and texture, offering new possibilities for tactile interfaces. By accurately controlling skin deformation, these methods can simulate a wide range of textures, making haptic feedback increasingly useful in touch-based devices.
    3. Chapter 11. Augmenting Reality with Shitsukan

      Daisuke Iwai, Yuta Itoh
      Abstract
      This chapter explores the capabilities and challenges of in situ Shitsukan (material appearance) manipulation through augmented reality (AR) technologies, focusing on two main approaches: projection mapping (PM) and optical see-through AR (OST-AR) systems. Digital Shitsukan manipulation allows for modifying the perceived material properties of physical surfaces, and AR enables these modifications to be experienced in real-world scenarios. We discuss various applications, including makeup simulation, product design, and tactile feedback, as well as the technical challenges of geometric alignment, radiometric compensation, and visual consistency in AR-based Shitsukan manipulation. Additionally, we address emerging hybrid systems, such as beaming displays, that combine PM and OST-AR technologies for enhanced experiences. Finally, we outline future research directions, highlighting the potential of AR-based Shitsukan manipulation in various fields.
    4. Chapter 12. Dynamic Projection of Shitsukan

      Yoshihiro Watanabe
      Abstract
      Dynamic projection mapping (DPM) is a technology that overlays virtual imagery onto moving physical objects in real time. By creating dynamic virtual appearances that adapt to the ever-changing surroundings, DPM enables the direct manipulation of Shitsukan, namely the perceived material properties of objects. This chapter reviews the current state, capabilities, and potential of this dynamic projection of Shitsukan; examines the technological advances in sensing, tracking, and projection systems that underlie it; and explores its wide range of applications across diverse industries. Finally, the chapter discusses future trends and challenges, offering insights to guide ongoing research and development in this rapidly evolving field.
    5. Chapter 13. Digital Fabrication of Shitsukan

      Masashi Nakatani, Yasuaki Kakehi, Shinsaku Hiura
      Abstract
      Recent advancements in digital fabrication technologies have enabled not only the shaping of physical forms but also the design of material qualities, including tactile sensations and interactive functions. This chapter explores novel approaches to designing Shitsukan through digital fabrication from three perspectives: (1) controlling haptic feedback via internal structural design, (2) developing fully flexible deformation sensors using porous and conductive materials, and (3) realizing dynamic visual and tactile transformations through methods such as 4D printing and refractive index modulation. By treating material texture as a programmable and dynamic design element of real-world Shitsukan, this work points toward new directions for embodied interaction and material-centered interface design.
    6. Chapter 14. Shitsukan in Industrial Engineering

      Takahiko Horiuchi, Maki Sakamoto
      Abstract
      This chapter explores the integration of Shitsukan into industrial engineering, highlighting its significance in product development, user experience, and manufacturing optimization. We review our collaborative research with industries and offer practical insights and methods for implementing Shitsukan-based design. First, three major industrial applications are presented: halftone image quality enhancement for the printing industry, perceptual modeling in leather design, and star field reproduction in visual entertainment. Each case exemplifies how perceptual qualities influence industrial value and consumer satisfaction. We then expand on practical implementations, such as the quantification of onomatopoeia and the development of the Kansei Material Platform. These initiatives bridge academia and industry, enabling the creation of emotionally resonant products through systematic Shitsukan modeling. This chapter underscores the emerging role of perceptual engineering in fostering innovation and enhancing the value of industrial design.
    7. Chapter 15. Depicting Shitsukan

      Shin Ishihara, Shota Nakamoto, Hiroyuki Uchiro, Manabu Miki, Yuta Asano, Yuichiro Taira, Imari Sato
      Abstract
      Paintings are among the most important objects for understanding Shitsukan, as they reflect the process by which artists perceived and expressed the material and texture of objects. Human skin has long been an object of great interest for artists, who seek to convey Shitsukan qualities such as softness, glossiness, and transparency on a two-dimensional canvas. Léonard Foujita, an artist active in the École de Paris in the 1920s, is world-renowned for his original technique of rendering smooth, transparent skin, known as ‘milky-white’ skin. Destructive analysis has identified several white pigments used by Foujita; however, how he effectively utilized these pigments to achieve his ‘milky-white’ skin texture has remained a secret. Our preliminary investigation revealed that each of the white pigments reported to have been used by Foujita exhibits a unique fluorescent emission. Based on this finding, we measured and analyzed the fluorescent emission of Foujita’s paintings under ultraviolet illumination using a hyperspectral camera. Through our analysis, we found that white pigments that exhibit similar reflective properties under illumination without ultraviolet light were intentionally arranged according to their fluorescent emission colors. Building on this analysis, we discuss how artists may have exploited the optical properties of pigments to represent the Shitsukan, the optical and perceptual qualities, of human skin.
Title
Shitsukan—Understanding and Manipulating Material and Quality Perception
Edited by
Shin'ya Nishida
Ko Nishino
Copyright Year
2026
Publisher
Springer Nature Singapore
Electronic ISBN
978-981-95-4762-3
Print ISBN
978-981-95-4761-6
DOI
https://doi.org/10.1007/978-981-95-4762-3

The PDF files of this book were produced according to the PDF/UA-1 standard to improve accessibility. Features include screen-reader compatibility, described non-textual content (images, graphs), bookmarks for easy navigation, keyboard-friendly links and forms, and searchable, selectable text. We recognize the importance of accessibility and welcome inquiries about the accessibility of our products. If you have questions or accessibility needs, please contact us at accessibilitysupport@springernature.com.
