http://tangible.media.mit.edu/

https://www.qmul.ac.uk/robotics/

https://cs.stanford.edu/groups/manips/

https://affect.media.mit.edu/

https://resenv.media.mit.edu/

http://www.zaha-hadid.com/

http://miroslaw-balka.com/en/

http://www.pilarcorrias.com/artists/philippe-parreno/

 

RESEARCH & PROJECT MANIFESTO

Research Stage Drawings & Sketches

ANTI-GRAVITY REALITY: INSIDE THE BLACK CUBE OR BRAIN AI is an interactive cross-reality installation that collides Virtual (VR), Physical, and Mental Realities. Human emotions and feelings are consumed by a living AI organism in order to create itself in both virtual and physical realms. The system measures your anxiety level and feeds itself with it through biofeedback sensors. Biosignals from an electroencephalogram (EEG), galvanic skin response (GSR), and a pulse sensor manipulate the environment. Virtual Reality acquires physical qualities and tactile sensations. The experience is totally disorienting. Interacting with structures produced by their own brain, a participant unknowingly becomes a performer. Each experience will be unique, and the space constructed at its culmination will be catalogued and transformed into an outdoor anti-gravitational pavilion. Recorded, it will travel to a gallery again as a 360 VR installation.
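A minimal sketch of the kind of biofeedback loop described above, assuming an Arduino-class board with the GSR and pulse sensors on analog pins; the pin assignments, weighting, and smoothing constant are hypothetical illustrations, and the EEG channel of the actual installation is not shown:

// Hypothetical sketch: derive a crude "anxiety index" from GSR and pulse
// readings and stream it over serial to the machine rendering the VR scene.

const int GSR_PIN = A0;      // galvanic skin response sensor (assumed wiring)
const int PULSE_PIN = A1;    // pulse sensor (assumed wiring)

float anxiety = 0.0;         // smoothed 0..1 index fed to the environment

void setup() {
  Serial.begin(115200);      // stream to the PC running the virtual space
}

void loop() {
  // Raw 10-bit readings, normalised to 0..1
  float gsr   = analogRead(GSR_PIN)   / 1023.0;
  float pulse = analogRead(PULSE_PIN) / 1023.0;

  // Crude composite: weight skin conductance over pulse amplitude (illustrative)
  float sample = 0.7 * gsr + 0.3 * pulse;

  // Exponential smoothing so the environment morphs gradually rather than jitters
  anxiety = 0.95 * anxiety + 0.05 * sample;

  Serial.println(anxiety, 3);  // the VR engine maps this value onto the space
  delay(20);                   // roughly 50 Hz update rate
}

On the VR side, the streamed index can then be mapped to the scale, density, or motion of the generated structures.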

 

The project merges art, science, and technology and stands for erasing labels, identities, disciplines, and geographical borders by advocating freedom of choice and movement. At no given moment is a participant sure whether they are still in the virtual or already in the physical space, or perhaps somewhere in a space of their own imagination.

CROSS-REALITY ZONE: VIRTUAL - PHYSICAL - MENTAL SPACE

INTERACTIVE - IMMERSIVE - DISORIENTING INSTALLATION

AT THE LONDON FESTIVAL OF ARCHITECTURE

Inside the Black Cube Digital

Inside the Black Square Physical

ANTI-GRAVITY REALITY: INSIDE THE BLACK CUBE OR THE BRAIN AI

London, UK

Research Stage Video & CGI Simulation

https://www.londonfestivalofarchitecture.org/event/brain-ai-interactive-cross-reality-installation/

 

Many thanks to

Paolo Maffini for software and hardware prototype development; neuroscientists Dr Sasha Ondobaka, Dr Nuri Gene-Cos, Dr Simeran Sharma, and Dr Dimitris Fotis Sakellariou for valuable input on the brain, emotions, and EEG; Professor of Advanced Robotics at Queen Mary University of London (ARQ) Kaspar Althoefer for support, encouragement, consultation, and inspiration;

the Goldsmiths, University of London Computing Department for support and advice, especially Professor Mick Grierson, Nicholas Donald, Dr Marco Gillies, Peter Mackenzie, and Dr Sylvia Xueni Pan; PhD candidate of ARQ Luis Andres Puertolas Balint for fun in the studio, help, and making me dance with Kinect; Adam Sutcliffe (QMUL) for consultations and conversations, and an invaluable introduction to Kaspar Althoefer; Adam Laschinger for amazing sound recording; the Imperial College Neuroscience Society for introducing me to EEG and the Brain Cutting Session, and also millions of conversations; Dr Dario Brescianini from the Flying Machine Arena at ETHZ for inventing the Omnicopter;

the MIT Media Lab for the ArtScience Manifesto, and its Affective Computing, Tangible Media, and Responsive Environments groups for the huge inspiration of their research;

and Olimex, Hobs 3D, Sword Fish Works, and many more manufacturers and suppliers who provided materials and various components throughout the project.

 

Special thank you to Gordon Beshaw.

 

Green Screen Video of VR Experience

Combined Video of Simultaneous Virtual and Physical Experiences

Cross-Reality Zone

Analysis of a Participant's Experience

Physical and Virtual Spaces Constructed by Participants

EXPERIENCE

Experiences from Inside the VR Headset

PART I. FIRST ITERATION.

PERFORMATIVE PARTICIPATION

From the Catalogue of Constructed Spaces, an Anti-Gravitational Pavilion will be built [Part II], which will then become a gallery installation again as 360 VR [Part III]. Thus, the experience is transferred between virtual and physical, inside and outside.

 

ROBOTIC PROTOTYPE

In its first iteration, the physical structure is constructed as a one-wall prototype [to be realised as a full cube in the second iteration]. It is a robotic machine providing haptic feedback to VR. The physical robots are actuated by servo motors and determine the position of a participant using ultrasonic distance sensors. The physical robot matches the virtual robot, and a participant unexpectedly realizes that VR elements have physical qualities, while other elements in between the main grid escape their touch and shrink back.
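One wall module of such a robotic surface might be sketched as follows, assuming a hobby servo and an HC-SR04-style ultrasonic sensor; the pin numbers, the 60 cm reach threshold, and the servo angles are illustrative assumptions, not the project's actual firmware:

#include <Servo.h>

// Hypothetical wall module: the panel holds its position until a participant
// reaches for it, then retracts so the element "escapes the touch".

const int TRIG_PIN  = 9;     // ultrasonic trigger (assumed wiring)
const int ECHO_PIN  = 10;    // ultrasonic echo (assumed wiring)
const int SERVO_PIN = 6;
const long REACH_CM = 60;    // distance at which the element reacts

Servo panel;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);   // 10 us pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // echo time in microseconds
  return duration / 58;           // convert round-trip time to centimetres
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  panel.attach(SERVO_PIN);
  panel.write(90);                // start extended, matching the virtual robot
}

void loop() {
  long d = readDistanceCm();
  if (d > 0 && d < REACH_CM) {
    panel.write(0);               // a hand is near: shrink back
  } else {
    panel.write(90);              // otherwise hold the extended position
  }
  delay(50);
}

Inverting the condition gives the opposite behaviour for the main-grid elements, which extend to meet the participant's hand and acquire tactile presence.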

AGR is proud to be a part of LFA

PROTOTYPE DEVELOPMENT DURING LONDON FESTIVAL OF ARCHITECTURE

CONCEPT VIDEO & CGI

RESEARCH INSPIRATIONS

PROCESS

https://www.ucl.ac.uk/icn/