MIT IAP 2018 Cooking Helper for Alzheimer’s

Team Members:

Zoe Sherina Lemon

The Redesign:

Alzheimer’s disease affects 1 in 10 people over the age of 65, according to the Alzheimer’s Association (https://www.alz.org/facts/), making it an extremely prevalent issue for the elderly. The disease has debilitating effects on patients’ memory and cognition, exacting a huge toll on independence and autonomy. In the most severe stages of Alzheimer’s, the person must depend almost entirely on others for daily living (https://www.caring.com/articles/stages-of-alzheimers).

One common difficulty as Alzheimer’s develops is with cooking (https://www.alz.org/nca/in_my_community_22019.asp#Living_Alone), because it becomes challenging to remember the steps of a recipe from start to finish; even a step like boiling water can be difficult. A person in the later stages of Alzheimer’s will need assistance or someone to cook for them, but in the earlier stages, when a person is facing initial symptoms, losing the ability to cook (often something they’ve done for themselves their entire lives) can be frustrating and upsetting.

For my project, I set out to design a voice command tool that helps people struggling with the onset of Alzheimer’s retain as much of their autonomy as they can by guiding them through the steps of a recipe of their choice. At this stage, the design takes the form of an application compatible with Google Home or a similar digital assistant device; in the future, a physical cookware set could be built alongside the app, with visual cues to help the user know which utensils to use throughout. While human assistance with cooking eventually becomes necessary for people with Alzheimer’s, a tool that preserves autonomy in the earlier stages of the disease can be a valuable emotional support, easing the frustration of needing outside help with daily tasks. This is especially true at a time when there is still no cure for Alzheimer’s disease.
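As a rough illustration of the recipe-guidance idea, below is a minimal sketch in Python of how a recipe might be stored as a list of short, single-action steps for the assistant to read aloud one at a time. The names (RecipeStep, utensil_hint) and the sample pasta recipe are hypothetical placeholders of my own, not part of any real smart-speaker API or recipe database.

    # Minimal sketch (hypothetical names) of a recipe broken into short,
    # single-action steps that a voice assistant could read aloud one at a time.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RecipeStep:
        instruction: str          # one simple action, phrased for speech
        utensil_hint: str = ""    # optional cue for a future matching cookware set

    @dataclass
    class Recipe:
        name: str
        steps: List[RecipeStep] = field(default_factory=list)

    # Example: pasta, split into very small steps so each spoken prompt
    # asks the user to do only one thing at a time.
    pasta = Recipe(
        name="Simple pasta",
        steps=[
            RecipeStep("Fill the large pot halfway with water.", utensil_hint="large pot"),
            RecipeStep("Place the pot on the stove and turn the burner to high."),
            RecipeStep("When the water is bubbling, add the pasta."),
            RecipeStep("Set a timer for ten minutes."),
            RecipeStep("Carefully drain the pasta using the colander.", utensil_hint="colander"),
        ],
    )

Keeping each step this small is deliberate: the user never has to hold more than one action in mind, and the utensil hints leave room for the future cookware set with visual cues.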

_______________________________________________________

How human-centered and human-factors concepts guided your team:

The first concept I had to consider as I began my project was the target population, and how my design would fit our assigned audience of older adults. I began by thinking about the problems that typically emerge as we age. The most vivid example that came to mind was Alzheimer’s dementia, since my grandmother struggled with it in her final years, so I decided to design for someone in the earlier stages of the disease. I remember how debilitating it became for her as it progressed to more severe stages, which gave me a grasp of what makes the idea ‘stick’ (a concept we discussed in the first lecture): the loss of autonomy that results, and how my design could help ease it.

The other things I considered as I brainstormed were the aspect of ‘training’ from Lecture 1 and “Exploring the rhetorical situation” from Lecture 3. My design’s purpose is to make life easier for a population trying to continue daily tasks they have done before but now struggle with, so I wanted to avoid frustration caused by unfamiliar technology; everything needed to be as intuitive as possible. I did consider that a human assistant could help with some of the startup tasks, like turning on the device and getting it ready for the user, but the actual process of using the device had to consist of familiar actions, like speaking to another person. Otherwise, the difficulty of using the device would outweigh the motivation to use it. I therefore decided that my design would take the form of an application for a smart speaker like Google Home or Amazon Echo, so that using the tool resembles having a conversation with someone who is in the kitchen with the user, ready to help with anything they need. Hal’s lecture on the 18th also helped me think about how to adjust my project to better fit my target population’s needs: I initially had plans for a visual app (e.g. for iPad) along with other physical props, and Hal’s discussion of the “usability evaluation” helped me rethink what my audience actually needed.

Dr. Sawyer’s discussion of Augmented Reality in Lecture 4 made me view my design through a lens I hadn’t considered before. Rather than an outside tool disconnected from the user, I could think of my app as AR itself, since its purpose is precisely to enrich the individual’s environment with additional, audible cues that guide them through a process. Ashley Nunes’ guest lecture on the importance of thought, the final discussion of the class, emphasized that I needed to carefully consider each step of using the app, along with the key features that had to be included: in particular, ways for the user to request feedback, such as asking the virtual assistant to repeat itself or to give the next step in the recipe.
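To make those feedback features concrete, here is a minimal sketch in Python of how the app might track its place in a recipe and respond to “repeat” and “next” requests. The class name RecipeGuide and the plain-text keyword matching are my own illustrative assumptions; a real Google Home or Alexa app would receive these requests as recognized intents from the platform rather than as raw text.

    # Minimal sketch (hypothetical) of the repeat/next feedback commands:
    # the guide keeps a single step index and answers simple spoken requests.
    class RecipeGuide:
        def __init__(self, steps):
            self.steps = steps      # ordered list of spoken instructions
            self.index = 0          # current position in the recipe

        def current_step(self):
            return self.steps[self.index]

        def handle(self, command):
            """Map a simple spoken command to a spoken response."""
            command = command.lower()
            if "repeat" in command:
                return f"Sure. {self.current_step()}"
            if "next" in command:
                if self.index + 1 < len(self.steps):
                    self.index += 1
                    return f"Next step: {self.current_step()}"
                return "That was the last step. You're all done!"
            return "You can say 'repeat that' or 'what's next'."

    guide = RecipeGuide([
        "Fill the large pot halfway with water.",
        "Place the pot on the stove and turn the burner to high.",
        "When the water is bubbling, add the pasta.",
    ])
    print(guide.handle("Can you repeat that?"))
    print(guide.handle("What's next?"))

Keeping the guide’s only state as a single step index means the user can ask for a repeat as many times as they need without losing their place, which fits the goal of easing frustration rather than adding to it.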