
Our research aims to perform objective studies of the emotions induced by architectural spaces, by sensing and statistically analyzing physiological signals of users while they walk through pre-defined locations in spaces designed for an immersive Virtual Environment (VE) (as opposed to a real space), in a lab setting, and by correlating this classification with subjective post-experimental questionnaires. For this, virtual buildings were designed to include architectural elements that could affect users' physiological states and perceptions, such as fear of falling. We started by studying sensations close to fear of heights, claustrophobia, frustration and relief (Dias et al. 2014a); later we focused on fear of falling (Dias et al. 2014b). We are now studying the impact of sound on space perception, comparing the outcomes with Space Syntax theory. For this we are also performing statistical analysis of physiological signals while people walk in real space.

Main results show that physiological measurement of users' emotions can discriminate between spaces, providing designers with basic information on people's emotional state when using the buildings they design.
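The kind of statistical discrimination described above can be sketched as follows. This is a minimal illustrative example, not the group's actual analysis pipeline: the electrodermal activity (EDA) values, the two space names, and the choice of Welch's t-test are all hypothetical assumptions.

```python
# Minimal sketch (hypothetical data and pipeline): compare electrodermal
# activity (EDA) samples recorded in two virtual spaces with Welch's
# t-test, to check whether the spaces induce different arousal levels.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    with possibly unequal variances."""
    va, vb = variance(a), variance(b)  # sample variances (n - 1 denominator)
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical EDA readings (microsiemens): a calm corridor vs. a glass
# walkway meant to induce fear of heights.
eda_corridor = [2.1, 2.0, 2.2, 2.1, 2.3, 2.0]
eda_walkway  = [3.4, 3.6, 3.1, 3.5, 3.7, 3.3]

t = welch_t(eda_walkway, eda_corridor)
print(f"Welch t = {t:.2f}")  # a large positive t suggests higher arousal on the walkway
```

In practice such a comparison would use many participants and a proper significance test (e.g. `scipy.stats.ttest_ind` with `equal_var=False`), but the sketch shows how per-space signal samples can be contrasted statistically.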

Our research group on this topic is composed of post-docs, PhD and master's students from Computer Science and Information Technologies, Architecture, Psychology and Ergonomics, with a strong background in virtual reality.

Audiovisual Space Perception – Marie Curie IRIS partners (MSFT, METU) and ISCTE-IUL are analyzing the effects of environmental variables, in particular sound, on users' space perception, movement, physiological arousal and emotional response. Various experiments in both VR and real environments are proposed, using various sound/noise post-processing and rendering techniques to evaluate the relation between the perception and use of architectural space, audio stimuli and sensed emotions. This work not only continues the group's research on VR, space perception, space syntax, physiological data, sound simulation and emotions, but also introduces new areas of study such as eye tracking and user tracking (GPS, manual tracking and virtual path recording).

PlayArch – Interactive Architecture Environment is a study, part of a PhD thesis, that aims to examine, by means of experimental methods and objective results analysis, which spaces can be considered good, bad or even indifferent, and what the connection is between the selected spaces and the social and individual culture of the tested individual. This evaluation study aims to reach a more profound and well-defined knowledge of architectural space and its relation with man. The simulation will be based on an immersive virtual reality prototype capable of natural and multimodal interaction, including speech, gesture and biometric interfaces – Brain-Computer Interfaces and other biometric sensing technologies such as electroencephalography (EEG), electrodermal activity (EDA), heart rate (HR) and electromyography (EMG).

Coordination

Researchers

Other Researchers

Sibila Marques (CIS-IUL), Miguel Carreiro, Ana Moural, Elisângela Vilar, João Freitas, Joana Cruz, Jorge D’Alpuim, Nelson Carvalho, Pedro Proença, Tiago Pedro, Sérgio Azevedo, Sofia Leite

Related Publications

Dias, Miguel Sales; Eloy, S.; Carreiro, M.; Proença, P.; Moural, A.; Silva Pedro, T.; Vilar, E.; d'Alpuim, J.
In Proceedings of the CAADRIA 2014 Conference

Partners

Middle East Technical University (METU), Turkey

Funding

This work was partially funded by Marie Curie IRIS (ref. 610986, FP7-PEOPLE-2013-IAPP) and OLA, "Organisation Life Assistant" (AAL 2014-076).

Related Projects