In many households, tablets are used as babysitting devices to keep children occupied when parents are busy. That isn’t inherently bad, but because tablets are mobile, children end up on iPads in the car, at the dinner table, and everywhere in between.
This project explores what interactions would look like if the child/family tablet were transformed into a coffee table. I designed three scenarios to help children spend their time better and to help parents be more involved in their kids’ digital lives.
Plan Play Sessions:
Kids can play forever, and every day is a struggle to get them to stop. Planning helps them be more responsible and builds a habit for the future. Kids plan what they want to do and for how long; when the time comes, they have to stop. No negotiation, no tantrums.
Play Together:
Sometimes, kids just want to play with their parents. Play Together takes away the friction — parents can just jump in. By sitting next to their child on the couch, they too can play whatever their child is playing, creating opportunities to connect.
Catch Up:
When kids use electronics, they have a whole life out there that parents don’t know about. Parents want to be involved. They want to talk to their kids about what they’re experiencing and feeling. Catching up should be effortless. As parents sit on the couch, they can get passive updates on what their child has been doing.
As interactions get more complex, we’ve moved away from physical interfaces (control panels with knobs and buttons) toward software interfaces. And that’s a great thing: software UIs aren’t constrained by physical space and can adapt to be whatever is needed.
However, we also lost something. Physical interfaces are approachable, tactile, and provide immediate physical feedback. With Augmented Reality, where the digital is layered onto the real world, we can leverage both the physicality of objects and the versatility of software.
Most current AR designs simply take 2D touch interfaces and project them onto 3D environments. Our case study explores how we can forgo the traditional 2D UI model for a more tangible way of interacting with music.
Team:
Derek Burkhardsmeier (MDes)
Derrick Ho (MHCI+D)
Matt Imus (MDes)
Most dogs are understimulated, meaning they’re often bored or sleeping. They also have natural skills they want to develop: using their ears, eyes, and noses to hunt for food. Collaborating with Microsoft, we designed AI-powered devices that keep dogs stimulated throughout the day and help them develop these skills by having them hunt for food.
The Sherlock and Watson devices use lights, sounds, and smells to create a game for dogs to play. Dogs become little detectives and chase down these stimuli, exploring the home and finding patterns along the way to earn food.
With cooking, there are so many things going on at once: there’s asparagus in the oven, there’s chicken in the pan, and oh my god, the pot is boiling over.
Multitasking can be intimidating for beginners. Umami is a watch-like wearable that walks them through the chaos, step by step. By making recipes glanceable and providing contextual timers for specific tasks, users can focus on the task at hand and keep an eye on what’s coming up next.
Umami isn’t scared of greasy fingers. The case comes right off and can be easily cleaned.