Carnegie Mellon School of Design
Design for Environments
Projects bridging physical and digital environments, including experiential and speculative design alongside exploratory research-through-design prototypes, from third-year undergraduates in CMU's Environments Track.
Thanks to: Dan Lockton
Participating Students
Amber Lee

Hello! I'm Amber, a third-year environments and communication design major at Carnegie Mellon University. I like projects that make people think or feel differently, and that offer more than their measurable results.

Sound of the Sky — generative music installation

Initial research For this project, I was interested in the sky and its invisible trace of sound. Although the sky is generally a visual experience, I wanted to experiment with ways it might become auditory as well. To begin my image-to-sound research, I looked into a color-to-sound device, an article on different types of background noise, and Yuri Suzuki's Face the Music project.

Concept An installation that showcases generative music derived from images of the sky.

See images and read more https://amberlee.me/sound-of-the-sky
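The project page has the real installation; purely as an illustration of the image-to-sound idea, here is a minimal, hypothetical Python sketch (not Amber's actual implementation) that maps pixel brightness in a sky image to notes on a pentatonic scale:

```python
# Hedged sketch: one way to derive notes from sky-image pixels.
# Brighter sky -> higher note on a C-major pentatonic scale.

PENTATONIC_HZ = [261.63, 293.66, 329.63, 392.00, 440.00]  # C D E G A

def brightness(pixel):
    """Perceived brightness of an (r, g, b) pixel, 0-255."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def pixels_to_notes(pixels):
    """Map each pixel to a frequency from the scale."""
    notes = []
    for p in pixels:
        idx = min(int(brightness(p) / 256 * len(PENTATONIC_HZ)),
                  len(PENTATONIC_HZ) - 1)
        notes.append(PENTATONIC_HZ[idx])
    return notes

# A pale blue sky pixel lands near the top of the scale,
# a dark storm-grey pixel at the bottom.
print(pixels_to_notes([(180, 200, 255), (40, 40, 50)]))  # → [392.0, 261.63]
```

A generative-music version would sample pixels over time and feed these frequencies to a synthesizer.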


Carnival — digital platform for streaming, browsing and sharing.

Concept Art and music frequently go hand-in-hand at festivals to create rich experiences for attendees. My project explores the benefits of at-home streaming, browsing, and sharing to create a contemporary digital experience in the form of a website.

Action Users can explore galleries, videos, live streams, and music from the makers (professionals) featured on the site. In addition, every user becomes a curator with their own display. Displays feature each user's saved content so that friends or people with similar interests can browse and discover.

Addressed gap Existing digital galleries, streams, and similar platforms flesh out many different methods of display (e.g. 2D and 3D), but are mostly for browsing. My project explores how users can become further involved in these online experiences by contributing, connecting, and sharing what they see.

Images and demo video See attached and embedded below!

Dan Lockton (course leader)

Design for Environments Studio IV is a course for juniors (third-year undergraduates) taking Carnegie Mellon School of Design's Environments Track. We explore design, behavior, and people's understanding, in physical, digital, and 'hybrid' environments. The course comprises practical projects focused on investigating, understanding, and materializing invisible and intangible qualitative phenomena, from intelligence to social relationships, through new forms of probe, prototype, speculative design and exhibit.

In spring 2020, our first set of projects focused on Autographic Visualizations. The term (by Dietmar Offenhuber) refers to things that create a record or trace of something that's happened to them—how they are used by people, or how something in the surrounding environment affects them (they are forms of indexical visualization, or qualitative interface). Sometimes the trace is accidental or incidental, or solely a side-effect of a property of the materials. But it can be designed intentionally, and perhaps can reveal patterns which would otherwise be invisible.

We took this prompt as a starting point, with an intro to data materialization from Marion Lean, and created a wide range of interactive projects which build on and expand the idea, addressing everything from physicalizing a person's heartbeat through ferrofluid, to a VR physicalization of race and privilege. We have sonification of sunsets, new kinds of educational building blocks, ways to experience laughter, 3D printing as a real-time visualization of a human-machine conversation, synchronized breathing, a physicalization of the thread of a conversation, and design fiction extrapolating from your digital traces.

In the second half of the semester, we were forced by circumstances to pivot to an online form of distributed studio—which offered challenges but also some interesting new ways of responding to the environments of our everyday lives: the interior world, the endless Zoom, and the connections across space and time zones. These Virtual Environments Studio projects are a diverse mix of ideas ranging from new ways of experiencing music and art online, to remote therapy environments, some addressing aspects of the current COVID-19 situation directly, and others looking to a world beyond or above our contemporary distress.

In working on the projects, and 'showing' them internally, we explored combinations of Discord, Figma, and other tools to see whether we could partially replicate the feeling of 'stopping by someone's desk' in a physical studio—I don't think we're quite there yet, but this group of students have the sensitivity to the design of physical and digital environments, and the talent, to be at the forefront of evolving design practice in the new environments of the lives ahead of us.


Danny Cho

OURS: Reaching beyond one's boundary

Image 1-4

90%. That's the failure rate for non-profit fundraisers on GoFundMe, the biggest crowdfunding platform for such purposes. I looked into this issue and also ran a fundraiser myself. This project is about how to frame the idea of sharing, and of associating oneself with a cause, differently in the donating environment.

Realtime Interactive Visuals

Image 5-6

Whenever it shows up in Iron Man, I can't help but marvel at Jarvis, the holographic AI that reacts to Tony Stark's gestures. I wanted to explore creating visuals that react to my own gestures in real time. I used TouchDesigner and a little After Effects to create the visuals.

On the Other Side: A short VR Film

Embedded Video

I also wanted to explore creating a surreal experience in a space filled with lights and intriguing shapes, playing with the audience's emotions. This is a VR experience created using Unity, After Effects, and Mandelbulb 3D.

Vapor or Sweat [link: vimeo.com/41088535]
I connected the webcam to TouchDesigner and selected the pixels with high luminance values. From this silhouette, I created a particle emitter that responds to the webcam data in real time. I then added an audio-reactive effect to the resulting footage for extra entertainment.
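The selection step happens inside TouchDesigner's node graph, but the underlying "select high-value pixels" idea can be sketched in plain Python (a hypothetical illustration, with an assumed threshold, not the actual network):

```python
# Hedged sketch: threshold a grayscale frame to get silhouette pixels
# that a particle emitter could sample as spawn points.

def silhouette_mask(frame, threshold=200):
    """frame: 2D list of 0-255 luminance values.
    Returns (row, col) coordinates of pixels bright enough
    to count as the silhouette."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, v in enumerate(row)
            if v >= threshold]

frame = [[0, 250, 0],
         [0, 255, 10],
         [5, 240, 0]]
print(silhouette_mask(frame))  # → the bright center column: (0,1), (1,1), (2,1)
```

In the installation this runs per frame, so the emitter follows the body as it moves.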
Animal Crossing Vibe [link: vimeo.com/404878901]
I created a landscape reactive to the accelerometer data on my iPhone. By tilting the device, you can morph the landscape back and forth.
Davis Dunaway

My name is Davis Dunaway. I'm a student at CMU studying Environment Design and Game Design. Below are the two projects I did for my Environments Studio this semester.

The first is a project about human conversation. I generated unique 3D prints as two participants took a Turing test. Participants could watch how the print changed based on their answers in real time, and got to take the artifact with them as soon as the experience ended.

The second project is a design for a single-piece face shield. Given the increased demand for PPE, it's important that face shield designs be cost-, time-, and material-effective. 3D-printed solutions can take up to 5 hours to produce a single shield. I worked to design a shield that can be produced in under 5 minutes using a single piece of Mylar and some clever folding techniques borrowed from origami. I am currently working to get the shield into mass production.

Thanks for taking the time to look at these projects; I'll be happy to answer any questions you might have about them.
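To make the "print responds to answers" idea concrete, here is a hedged Python sketch (not Davis's actual generator; the radius-per-answer rule is invented for illustration) that emits G-code for a cylindrical print whose radius shifts with each Turing-test answer:

```python
# Hedged sketch: a responsive G-code generator. Each answer nudges the
# layer radius, so the finished artifact records the conversation.
import math

def layer_gcode(z, radius, segments=12):
    """One circular layer approximated by G1 linear moves at height z."""
    lines = [f"G1 Z{z:.2f}"]
    for i in range(segments + 1):
        a = 2 * math.pi * i / segments
        lines.append(f"G1 X{radius * math.cos(a):.2f} "
                     f"Y{radius * math.sin(a):.2f}")
    return lines

def responsive_print(answers, base_radius=10.0, step=1.0):
    """answers: True if the judge guessed 'human' for that round.
    Each answer widens or narrows the next layer."""
    gcode, radius = [], base_radius
    for layer, human in enumerate(answers):
        radius += step if human else -step
        gcode += layer_gcode(z=layer * 0.2, radius=radius)
    return gcode

print(responsive_print([True, False, True])[:2])  # first moves of layer 0
```

A real generator would also emit extrusion (`E`) values and feed rates, which are omitted here for brevity.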

Examples of G-Code Prints (Project 1)
A mixture of exploratory and responsive prints.
Early Testing (Project 1)
Working to connect the interactive test to a custom-built G-code generator to create responsive prints.
Experience in action (Project 1)
Two participants watch the 3D printer react in real time to their answers on the Turing test, generating a one-of-a-kind artifact.
Face Shield Testing (Project 2)
Testing the first production-ready design. The design would still change slightly, but this version is what was used to pitch the design to manufacturers.
Early Prototypes (Project 2)
A series of early prototypes of the face shield design spanning the first paper test to the first production-ready test.
First Production Batch (Project 2)
Over 100 folded shields from our first couple of days of production. Made using only 1 laser cutter and 2 sets of hands.
Davis Dunaway and Vicky Zhou

The Turing test was developed by Alan Turing in 1950 to assess a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. The Loebner Prize is an annual artificial intelligence competition that awards the computer program judged to be the most human-like.

Computers are advancing, but some argue that humans' ability to communicate is regressing. We wanted to create an experience to visualize and reflect on the human-ness of conversations today, so we created The Most Human Human. The setup is as follows: there are two human participants, a judge and a competitor, plus one bot. The human competitor and the bot compete to prove their "human-ness", and the judge infers, to the best of their ability, which response is the human's. As the test progresses, a 3D print is generated in real time.

The end product is a conversation piece and a tangible artifact of the experience. We hope to leave users with a few takeaway questions: Will this change the way you approach conversations in the future? How do you think human conversation will change? Where does computer conversation fit within this?


Gautham Sajith

About Gautham

I'm a student in CMU's Master of Human-Computer Interaction program. Before this program, I worked professionally as a software engineer for several years in the Bay Area. I consider myself a design technologist. My passions include design systems, generative art, and ethical technology.

About Stage

With Stage, I'm building a concert livestreaming application for the post-COVID era. As many musicians move to virtual concerts under quarantine restrictions, Stage pushes the boundaries of what a digital concert livestreaming platform can do. Check out the teaser video!

Stage incorporates direct artist-fan interactions, a virtual currency to reward the most loyal fans, in-room AR experiences, and more.

When building Stage, my main focus was on nailing the visual branding. I was meticulous about building the tone and voice of the app, from the moodboard to the "brand words" to the app name to the wordmark. There are some peeks at the behind-the-scenes process in the project images.

I hope you enjoy my work on this project! I would love any comments and feedback during the showcase, or through Twitter. My portfolio is still in progress. :)


Stage Concept
A short description of the Stage concept, and the context it was conceived within.
Stage Screens
An exploratory set of features afforded through the Stage livestreaming platform.
Stage Stylescape
Stage's visual branding and moodboard.
Stage Visual Explorations
A high-level look at the process behind Stage's branding and visuals.
Meijie Hu

This is a dystopian design fiction inspired by the recent implementation of health QR codes in China. A tool for public health monitoring, and potentially for future mass surveillance, the system pushed me to think about the consequences of such a massive data system. While these surveillance measures can give people more security and may come from goodwill, we surrender our responsibility for self-discipline and let others, whether authorities or technology, dictate our health practices and even our daily lives. Especially with rapidly evolving sensing technology, how much autonomy is left for us beyond pressing the "yes" and "no" buttons?

The first part of the fiction presents an omnipotent IoT system, hosted on the phone, that monitors your health. In 2025, your smart home analyzes different sensor data inputs and generates health indexes, which are sent to other platforms not only to monitor your environment but also to provide the necessary compensations. It is dedicated to creating a supportive, non-biased smart home system that provides the health care you need.
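The fiction doesn't specify an algorithm, but the sensor-fusion idea such a system implies can be sketched; the weights, ranges, and sensor names below are invented for illustration:

```python
# Hedged illustration: fusing smart-home sensor readings into one
# "health index" of the kind the fiction's IoT system reports.

# Hypothetical weights and normal ranges per sensor.
SENSORS = {
    "heart_rate":  {"range": (55, 100),     "weight": 0.4},
    "body_temp":   {"range": (36.1, 37.2),  "weight": 0.4},
    "sleep_hours": {"range": (7, 9),        "weight": 0.2},
}

def health_index(readings):
    """Score 0-100: a reading contributes its full weight when it
    falls inside its normal range, nothing otherwise."""
    score = 0.0
    for name, value in readings.items():
        lo, hi = SENSORS[name]["range"]
        if lo <= value <= hi:
            score += SENSORS[name]["weight"]
    return round(score * 100)

# Heart rate and temperature in range, sleep out of range.
print(health_index({"heart_rate": 72, "body_temp": 36.8, "sleep_hours": 5}))  # → 80
```

Part of the fiction's unease is exactly this: a handful of arbitrary weights quietly becomes the authority on whether you may leave the house.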

The second part of the fiction tells the story of Josh, a first-day user of the health monitoring system, and how he gets through his first day of self-quarantine with the machine's help. :)

Future Artifact
object system of the health monitoring IoT
Future Artifact 2
how the system collects sensor data
Future Artifact 3
health index and compensations
Future Fiction
Josh purchasing the health monitoring system
Future Fiction 2
Josh getting locked in his house
What happens next?
check my website for the rest of the story!
Meijie Hu

This breathing pillow is a physicalization of breathing, inspired by the moment when you lie on a loved one's chest and feel it rise and fall. To recreate a similar intimacy, we built a pair of breathing pillows: lying on one, you can feel the breathing of the person lying on the other through the rise and fall of your own pillow.

We used linear actuators to push the air in and out of the pillows and stretch sensors to detect the expansion and contraction of the chests.
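The sensor-to-actuator mapping can be sketched as a simple proportional loop; the raw sensor range and extension units below are assumptions, not our calibration values:

```python
# Hedged sketch: map one person's chest stretch-sensor reading to the
# extension of the linear actuator driving the paired pillow.

STRETCH_MIN, STRETCH_MAX = 300, 700     # assumed raw sensor range
EXT_MIN, EXT_MAX = 0, 100               # actuator extension, percent

def stretch_to_extension(raw):
    """Clamp a raw stretch reading, then map it linearly to 0-100%
    actuator extension, so the pillow rises as the chest expands."""
    raw = max(STRETCH_MIN, min(STRETCH_MAX, raw))
    t = (raw - STRETCH_MIN) / (STRETCH_MAX - STRETCH_MIN)
    return EXT_MIN + t * (EXT_MAX - EXT_MIN)

# A mid-inhale reading drives the paired pillow halfway up.
print(stretch_to_extension(500))  # → 50.0
```

On the physical build this loop runs continuously on a microcontroller, so the pillow tracks each breath as it happens.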

Initial Visualization of Breathing
Schlieren effect of the airflow when a person breathes in front of a candle
Breathing Pillow Mechanism
Breathing Pillow System Mechanism
Machine Components
Machine Components
Working Machine
When a person breathes, the linear actuator pushes/pulls air to and from the pillow, causing it to inflate/deflate
Final Outcome
A participant sleeping on our breathing pillow