- Project Introduction
- Project, Objectives and Competencies
- Client Information
- List of Available Resources/Experts
… the cinema is truth 24 frames a second … -- Jean-Luc Godard
Gods, I like Gods. I know exactly how they feel. -- Jeremy Prokosch in Contempt
Movies create an imaginative experience that stimulates our dreams, hopes, desires and fears, where an increased sense of presence and the feeling of ‘being there’ capture the observer’s lust – the imaginary film becomes the truth and the dream becomes reality. To feel that you are an inescapable part of the action, the happening; the confirmation of your movements that transcribe into and interact with the film; the ‘in your face’ vision of surprise, cacophony, deafening silence – movies are becoming ambient, and you are the main character. Take us there…
The truly believable computer-generated character of the morphing, liquid-metal T-1000 cyborg in Terminator 2: Judgment Day… Quentin Tarantino’s non-formulaic and inventive hit Pulp Fiction, three interwoven stories told in non-linear order… Pixar’s Toy Story, the first fully digital feature-length film… the 40-minute IMAX 3-D movie Wings of Courage, viewed through high-tech goggles with liquid-crystal lenses… James Cameron’s Titanic… Final Fantasy: The Spirits Within… Spider-Man… Roman Polanski’s The Pianist… Crouching Tiger, Hidden Dragon… Star Wars: Episode II – Attack of the Clones… The Lord of the Rings: The Fellowship of the Ring… and oh yes, The Matrix, Minority Report, etc.
Designing action, designing a way to move into movies
In the vision of Ambient Intelligence (Aarts and Marzano, 2003), the next age of people’s interactive media experience will not take place on a computer or television, or in a headset, but in a whole physical space that responds to and activates people’s movements, gestures, speech and touch: projectors, screens, cameras, sensors, sound makers, lights, colours and objects, all actively connected with each other and with the people who enter. How to activate this event, and what exactly to activate?
Nowadays film is experienced through empathy: you experience without actually participating. Ambient intelligent movies open an interesting field in which the relation between the movie and the ‘watcher’ can change. In creating such movies it is important to have a vision of what kind of relation you want to arise between the two. Does the movie involve you in it, or is it you who manipulates the movie? Are you part of the movie, an actor in it, or do you go along with it as an empathising experience parasite? The same goes for the space you are in, the objects in it, and the objects you take with you… So the question is: how do you move into movies?
This is all about designing action, and about the technology, objects and use of space that support this move. It concerns designing a way to move into movies. The use of technology and the use and design of space and objects depend on how you envision this new kind of movie, and on what you envision to activate. You have to create an idea of the action, to know how people, technology, objects and space are involved in it. How does it all move, move you and move with you? What, how and why to move into the movie?
Incorporation in home environment
In ambient intelligence, the ambience (the space or the environment) involves multiple connected yet independent devices with interfaces that enable natural interaction and adapt to the users and their needs. This confronts us with a new challenge and an opportunity at the same time: presenting interactive movies in such ambient intelligent spaces to provide users with compelling entertainment experiences. Everyday objects, multiple displays, surround audio systems, ambient lighting facilities and many others yet to be designed can be connected to, or in other words moved into, the movies. It is indeed technically challenging:
- There is not yet a set of standard technologies that cover all the aspects of distributed media presentation and synchronization over multiple devices.
- The environments, especially domestic environments, differ from each other in many possible ways: the space, the layout, the configuration and the connectivity all vary. We cannot expect living rooms to be standardized.
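To make the first challenge concrete: one simple approach to playing media in sync across devices is to agree on a shared start time on a master device and let every other device translate it into a delay on its own clock. The sketch below is our own hypothetical illustration, not code from any of the project's tools; the class and method names are invented for this example.

```java
// Hypothetical sketch of distributed playback scheduling: a master device
// broadcasts a start time, and each device converts it to a local delay
// using an estimated offset between the master's clock and its own.
public class SyncSketch {

    /**
     * Milliseconds this device should still wait before starting playback.
     * masterStartMs: agreed start time on the master's clock;
     * masterNowMs and localNowMs: a paired reading of both clocks, as
     * obtained, for instance, with an NTP-style message exchange.
     */
    public static long localDelayMs(long masterStartMs, long masterNowMs, long localNowMs) {
        long clockOffset = masterNowMs - localNowMs;      // master clock minus local clock
        long localStartMs = masterStartMs - clockOffset;  // start time expressed on the local clock
        return Math.max(0, localStartMs - localNowMs);    // never wait a negative time
    }

    public static void main(String[] args) {
        // The master schedules playback at t = 10000 ms on its clock; at the
        // moment of measurement the master clock reads 9000 ms and ours reads
        // 8500 ms, so this device should wait another 1000 ms.
        System.out.println(localDelayMs(10000, 9000, 8500)); // prints 1000
    }
}
```

In practice the clock offset drifts and must be re-estimated, which is exactly why there is no off-the-shelf standard covering synchronization over an arbitrary collection of devices.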
Merging the real and the virtual
Connecting the real physical environment to the virtual world created by a movie, to serve people with a compelling immersive experience, has been implemented by many theme parks as tourist attractions known as “motion simulation rides”. Good examples are the Star Tours spaceship in Disneyland Paris (2004), Merlin’s Magic Castle in SixFlags Holland (2004) and the Batman Adventure in Warner Bros. Movie World Germany (2004). These attractions combine movies with lighting effects, surround sound, vibrations and movements of physical objects and even of the entire environment. The Batman Adventure also involves live shows to make it more believable and convincing. These motion simulation rides are so attractive that we cannot help thinking about moving such an experience to domestic environments. There are some differences that we should keep in mind: 1) The motion simulation rides and the theme park environments are made for each other, while domestic environments are first of all for living and working. 2) Most of the motion simulation rides are not interactive, at least not for the audience. 3) The motion simulation rides are made for larger audiences. 4) The simulation ride is a particular kind of movie experience, an attraction, while at home the interest may be different.
Distribution to the senses
Distribution is not a new idea for enhancing the entertainment experience. A good example is multi-channel audio. Over the years, the mono sound of the early days has been improved to multi-channel sound at different levels: stereo (2.0), stereo with bass (2.1), five-channel surround (5.0) and 5.1 DTS surround (five channels with bass). Recently 6.1 DTS surround (six channels with bass) has appeared on the market. Distribution of the sound makes the experience more compelling and more natural because of the multidirectional nature of the human auditory system (Freeman and Lessiter, 2001). The success of multi-channel, distributed surround sound systems and the promising vision of ambient intelligence inspire us to look for the next step. Will the distribution of visual content elements also enhance the entertainment experience? What about touch? What about a combination of multiple sensory modalities, just like how humans perceive and interact with their environments?
Project, Objectives and Competencies
This project is not going to answer all these research questions. Instead, we are going to approach the answers by doing: we are going to create a way to move into ambient intelligent movies. We will be challenged with a new area of design: entertainment experience design. An entertainment experience is an experience that people voluntarily go through for their pleasure and release, that has a beginning and an end, and that affects the people and the context as a result. Movies are good examples of media for creating entertainment experiences. While integrating movies into physical environments, many design issues have to be taken into account.
Tools and examples
Starting from a set of existing software and hardware tools and an example of distributed interactive movie (Feijs and Hu, 2004; Hu and Feijs, 2003a, 2003b), the project may also need to:
- Investigate the technologies of interactive media, such as SMIL (W3C, 2001) and MPEG-4 (Koenen, 1999);
- Get familiar with software and hardware tools, for example the Java Media Framework (Sun Microsystems, 2004), a Java implementation of SMIL (X-Smiles.org, 2004) and LEJOS (Solorzano, 2002);
- Explore the movie experience and investigate how the home can be a place, and make space, for a new kind of home cinema;
- Create a vision and derive requirements out of your earlier investigations/research.
- Create your concept for action based on earlier steps. Move to explore and create your action.
- Let the technique, interaction devices, the lights, the objects, the use of home environment evolve with and out of your action concept.
- Model and move to develop your design.
- User tests and feedback;
- Final presentation of the experience. Think of:
- The presentation: distributed over the data-hungry net – fast, slow, high, low capacity?
- The environment: DIY as a commodity value – how do the punters quick-fix it?
- The people: customization, personalization and adaptation in “real time” (the holy grail);
- Make it, build it – rapid prototyping (Bartneck and Hu, 2004).
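As background for the SMIL-related items above: SMIL 2.0 structures a presentation with timing containers, where the children of a seq element play one after another and the children of a par element play at the same time. The following Java sketch is a simplified model of our own, not code from X-Smiles or the Java Media Framework; it shows how the duration of such a composition can be computed.

```java
import java.util.Arrays;
import java.util.List;

// A simplified, illustrative model of SMIL 2.0 timing containers:
// <seq> children play in sequence (durations add up), <par> children
// play in parallel (the longest child determines the duration).
public class SmilTiming {
    public interface Node { int durationMs(); }

    public static class Clip implements Node {
        private final int ms;
        public Clip(int ms) { this.ms = ms; }
        public int durationMs() { return ms; }
    }

    public static class Seq implements Node {
        private final List<Node> children;
        public Seq(Node... children) { this.children = Arrays.asList(children); }
        public int durationMs() {
            int total = 0;
            for (Node n : children) total += n.durationMs(); // sequential: sum
            return total;
        }
    }

    public static class Par implements Node {
        private final List<Node> children;
        public Par(Node... children) { this.children = Arrays.asList(children); }
        public int durationMs() {
            int max = 0;
            for (Node n : children) max = Math.max(max, n.durationMs()); // parallel: max
            return max;
        }
    }

    public static void main(String[] args) {
        // A 4 s video in parallel with a 3 s light cue, followed by a 2 s sound.
        Node show = new Seq(new Par(new Clip(4000), new Clip(3000)), new Clip(2000));
        System.out.println(show.durationMs()); // prints 6000
    }
}
```

The same compositional idea extends naturally from screens to lights, sound and moving objects, which is what makes SMIL an interesting starting point for distributed presentations.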
The students who are interested in this project are expected to have sufficient experience with object-oriented programming in Java, and at least to have finished the Java assignment and the Java/Lego project. The students will get opportunities to develop the other competencies in the usual way, but specifically:
- Integrating technology:
- Object-oriented programming in depth;
- Preliminary understanding of network protocols and network programming;
- Multimedia standards;
- Ideas and concepts:
- In this process the design movement approach will provide the red line (guiding thread) for conceptual development. This approach focuses on the creation of action and on the use of movement as pivotal in designing.
- Form and senses:
- Everyday objects for distributed multimedia presentation and interaction.
- User focus:
- Design for families and for living environments.
- Design and research process:
- Rapid prototyping.
- A red line (guiding thread) of conceptual thinking
- Analysing complexity:
- Object orientation
- Design patterns
The final deliverables should include:
- The final presentation, including a working demo that shows a distributed interactive movie targeted at home environments, together with the working prototyped technology, objects and other facilities that support this movie;
- A report and a website introducing the project and the results;
- All of the above are to be delivered on a CD at least one day before the final presentation.
Maddy D. Janse joined Philips in 1987 as a specialist in user-system interaction. In this function she worked at the CFT, at the IPO (Institute for Perception Research, a joint venture between Philips Research and the Technical University of Eindhoven) and at Philips Research. Her main research interests are in the areas of human factors, human perception and behaviour, and user interface design for easy access and interaction for consumer systems. She has worked on methodologies for the subjective evaluation of advisory systems for video content in the context of the ACTS SMASH and STORit projects. She has been the project manager for the IST ICE-CREAM and NexTV projects, and is currently working in the Ozone and Amigo projects. She is Managing Director of a two-year international postgraduate Master’s programme in User-System Interaction Design at the Technical University of Eindhoven. Before joining Philips she worked at the Unisys Company (then Sperry) in an Advanced Systems Group in Minneapolis (US), responsible for technology transfer between the company and the MCC organisation in the areas of artificial intelligence, expert systems and advanced user interface technology. She has a PhD in Cognitive Psychology (human problem solving) from the University of Minnesota and a graduate degree in Food Chemistry and Technology from the University of Wageningen.
List of Available Resources/Experts
Aarts, Emile and Stefano Marzano, eds. 2003. The New Everyday: Views on Ambient Intelligence. 010 Publishers. Learn what ambient intelligence is about; pay special attention to “Physical Markup Language”.
Koenen, R. 1999. MPEG-4: Multimedia for Our Time. IEEE Spectrum 36 (2): 26-33. Learn what is possible with up-to-date multimedia standards. We are not going to actually use MPEG-4, though.
W3C. 2001. Synchronized Multimedia Integration Language (SMIL 2.0). Available from http://www.w3.org/TR/smil20/.
Sun Microsystems. 2004. Java Media Framework API (JMF). Available from http://java.sun.com/products/java-media/jmf/index.jsp.
X-Smiles.org. 2004. X-Smiles. Available from http://www.x-smiles.org/. We will use it as a basic tool for SMIL presentations.
Bartneck, Christoph and Jun Hu. 2004. Rapid Prototyping for Interactive Robots. Paper read at the 8th Conference on Intelligent Autonomous Systems (IAS-8), Amsterdam, The Netherlands. Available from http://www.idemployee.id.tue.nl/j.hu/publications/RapidPrototyping_ias8/RapidPrototyping_ias8.html
Klooster, S., R. Appleby and K. Overbeeke. Design (Education) Moves. Available from http://www.io.tudelft.nl/iepde04/Documents%20for%20Downloads/Final%20Papers/EG_Klooster.pdf
Solorzano, J. 2002. leJOS: A Java-Based OS for the Lego RCX. Available from http://www.lejos.org.
SixFlags Holland. 2004. Merlin's Magic Castle. Available from http://www.sixflags.nl/sixflagsholland/attractions/familykids/index_nl.cfm.
Warner Bros. Movie World Germany. 2004. Batman Adventure. Available from http://www.movieworld.de/movieworld/attractions/thrills/index_en.cfm.
Disneyland. 2004. Star Tours. Available from http://en.wikipedia.org/wiki/Star_Tours.
Feijs, Loe M. G. and Jun Hu. 2004. Component-Wise Mapping of Media-Needs to a Distributed Presentation Environment. Paper read at the 28th Annual International Computer Software and Applications Conference (COMPSAC 2004), Hong Kong.
Freeman, J. and J. Lessiter. 2001. Hear There & Everywhere: The Effects of Multi-Channel Audio on Presence. Paper read at ICAD 2001 - the Seventh International Conference on Auditory Display, Espoo, Finland.
Hu, Jun and Loe M. G. Feijs. 2003a. An Adaptive Architecture for Presenting Interactive Media onto Distributed Interfaces. Paper read at the 21st IASTED International Conference on Applied Informatics (AI 2003), Innsbruck, Austria.
Hu, Jun and Loe M. G. Feijs. 2003b. An Agent-Based Architecture for Distributed Interfaces and Timed Media in a Storytelling Application. Paper read at the 2nd International Joint Conference on Autonomous Agents and Multiagent Systems, Melbourne, Australia.