HomeE
HomeE is an interface for smart home displays that intuitively shows users their current utility usage through an engaging metric: cost. It drives sustainable action through budget setting and recommendations, all while functioning as an interactive home interface.
Year
2021
My Role
User experience designer/ User Interface designer
My responsibilities
I worked as a user interface designer and user experience designer.
-
Collaborated with the engineer, project managers, and UX researcher to create a prototype for HomeE.
-
Worked with the team to research and communicate user pain points, understand user motivations, triggers, desires, and goals, and conduct competitor analysis.
-
Created and designed user flows, wireframes, user interfaces, and interactive prototypes.
-
Led the user interface design.
-
Developed the user experience in line with project goals and tested it with users.
-
Created illustrations for the prototypes.
-
Assisted project managers in planning project milestones and writing the final documents.
Design Components




Design System
Colour Palette

Base colours are vibrant, highly saturated colours. They appear mostly in the background of the interface and are also applied to the colour matching of key components. The key colour is Turquoise (hex #02A6A7). We encourage the consistent use of Turquoise and greens throughout our visual communications to give the identity a cohesive, harmonious look across all our designs.
Secondary, or accent, colours have also been identified that complement and highlight the primary colours. Secondary colours have low vibrancy. They are used mostly for small component backgrounds and a few infrequently used components.
Components

Typography


We use only one type family, Futura PT, to keep the design consistent. It is a sans-serif font that brings a modern feel to users interacting with HomeE on a touch screen. Headlines use the extra-bold and bold weights at sizes between 48 and 96 points. Body content uses Futura PT Book at two sizes, 48 points and 64 points.
Problem Statement
How might we increase millennials' adoption of sustainable home practices in North America?
Background
Homes are consuming large amounts of energy.
-
In the US, residential energy use accounts for about 20% of greenhouse gas emissions.
-
Residential buildings used 33.3% of total electrical energy and 24% of total natural gas in Canada in 2017.
Smart home devices can help users monitor energy consumption and uncover opportunities to save energy and money.
Millennials are major users of smart home devices.
-
In the US, 47% of millennials use smart home products.
-
In Canada, 28.4% of smart home users were 25-34 years old in 2020.
Current management apps are complicated and overload the user with information.
Target Audience
Millennial (25-30s) home buyers/owners
User Needs
-
Want to save money by saving energy in their daily lives
-
Want to track their utility consumption and see which areas consume the most
-
Have trouble staying engaged in saving energy and money (not enough time, too much effort)
-
Cannot understand CO2 measurement units or relate carbon emissions to their own lives without contextualization
-
Are not interested in detailed analysis
-
Want to receive customized, actionable suggestions based on their energy-use trends, with reasons explained
Our Solution
A smart home display with intuitive, action-driven data visualization and the following features:
-
Budget-limit setting for energy use to help users stay engaged
-
Energy-saving tips to educate users on how to save money by saving energy
-
Progress bars and charts showing how much energy is used in terms of money, and progress toward the weekly target
-
Bar charts showing daily, weekly, and monthly energy-use trends
-
An alerting system for energy overuse
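The budget-limit and alerting features above amount to simple threshold logic against the user's weekly budget. The sketch below is purely illustrative — the function name and the 90% warning threshold are assumptions, not values from the HomeE design:

```python
def budget_status(spent_so_far: float, weekly_budget: float,
                  alert_threshold: float = 0.9) -> str:
    """Classify weekly utility spending against the user's budget limit.

    `alert_threshold` (the fraction of the budget that triggers a warning)
    is an illustrative assumption, not part of the actual design.
    """
    if spent_so_far > weekly_budget:
        return "over"          # would trigger an overuse alert
    if spent_so_far >= alert_threshold * weekly_budget:
        return "near-limit"    # nudge the user with an energy-saving tip
    return "under"

# Example: $13.80 spent against a $15/week electricity budget.
print(budget_status(13.80, 15.00))  # -> near-limit
```

The same check, run per utility, could drive both the progress bars and the overuse alerts.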
Project Goal
By quantifying home electricity, gas, and water usage against the homeowner's targets, HomeE encourages users to save money and thereby adopt more sustainable practices.
User research
Persona
Our user persona was created based on our primary research:

User Journey Map
A user journey map was created to illustrate the expectations, struggles, and emotions of our persona, Sara, as she uses HomeE.

User Flow
The user flow for using the HomeE smart home system is detailed below:

Design Process
Our team's design process can be broken down into four phases. In each phase we ideated, designed, iterated, and tested. The team reviewed all the comments we received and ran a retrospective on the results to inform our design decisions, which kept us on the same page despite our different skill sets and responsibilities. Following this process, we delivered a high-fidelity mock-up and a technical prototype as a proof of concept within 11 weeks, tested our assumptions together, and had numerous opportunities to discuss and voice our ideas.
Phase 1
This phase was the shortest of the four, lasting only three days: our team took part in a Design Jam on September 24-26, 2021. In this phase we framed our problem and produced the first iteration of the smart home system. By the end of the Design Jam, we had received useful comments from the cohort and the faculty suggesting we consider privacy concerns and add onboarding to our prototype. These comments helped us determine our next steps.
Phase 2
The second phase started after the Design Jam and finished on the Midterm Demo Day, running from September 27 to October 24. Our team focused on solidifying our understanding of the problem and our target audience through both desk and primary research, and on creating a more integrated prototype for our next test. This included additional pages, such as settings, and a user interface redesign that incorporated feedback from the previous phase. The midterm test allowed us to conduct usability testing and gather critical feedback on the intuitiveness of the interface and the units of measurement we were using.
Phase 3
The third phase started after the Midterm Demo Day and ended on the Community Test Day, running from October 25 to November 21. Our team changed the measurement of users' utility consumption to cost and developed a new cost graph that clearly incorporates all the reference values, making our design more human-centered. In addition, we finished designing the onboarding process and revised the badge designs. During the weekly demo, we realized that our word choice might cause confusion, so we renamed the target-setting function to budget-limit setting.
Phase 4
The final phase started after the Community Test Day and wrapped up the project with the final presentation. In this phase we added our final feature, the alert system, and developed a simpler design for the achievement system. We also finalized our Adobe XD prototype by organizing the assets library and continued developing the technical proof of concept. To close out the phase and the project, we presented our concept, final product, and process in the final presentation, where we gathered comments and questions as our final round of feedback.
We designed the components on a card-based background for visual simplicity, ease of editing, and uniformity. Depending on a component's size, we defined two sets of parameters: the small card component ranges from 242 to 829 points in height and from 238 to 1066 points in width, while the big card component ranges from 454 to 1395 points in height and from 945 to 2560 points in width.
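The two card parameter sets above can be captured as simple range checks. The helper below is hypothetical, written only to make the stated ranges concrete; it is not part of the actual design system:

```python
# Card size ranges from the design system, in points.
SMALL_CARD = {"height": (242, 829), "width": (238, 1066)}
BIG_CARD = {"height": (454, 1395), "width": (945, 2560)}

def fits(card: dict, height: float, width: float) -> bool:
    """Check whether a card of the given dimensions fits a parameter set."""
    h_lo, h_hi = card["height"]
    w_lo, w_hi = card["width"]
    return h_lo <= height <= h_hi and w_lo <= width <= w_hi

print(fits(SMALL_CARD, 300, 500))  # -> True
print(fits(BIG_CARD, 300, 500))    # -> False: too short and too narrow
```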
User Test 1: Midterm Demo Day (Oct 21, 2021)
We conducted the first prototype user testing with MDM students and faculty. During the usability testing phase, six users tested our prototype.
Objective
The main objective of the testing was to determine the effectiveness of the implementation of the concept. During the testing process, we monitored the users to reveal points of friction and confusing experiences.
Methodology
Testing was announced to the students and faculty of the CDM through Discord, and participants who booked a time received an invitation to test. The user testing was conducted over Zoom, with participants sharing their screens, and sessions were recorded with the participants' permission.
Testing
During the usability testing session, each participant was asked to complete a series of tasks on the prototype. The participant’s interaction with the HomeE application prototype was monitored by the facilitator. Before starting the tasks, participants completed a pretest demographic and experience information questionnaire.
Here is the list of the tasks:
-
You want to check your electricity usage. Can you tell us how much electricity you consumed today and if you are over or under your target setting?
-
After looking at your usage you want to save some power, please turn off the lights in the main bedroom.
-
Let’s establish a new electricity usage target. Please set 23 kWh as your electricity target.
-
Could you tell us what is your most recent gas achievement?
-
Looking at your water usage you wonder how you can save more. Please share a tip or recommendation found in the app on saving water.
Participants were also asked to rate each task's expected difficulty beforehand and its actual difficulty afterwards. The facilitator monitored users' behavior and took notes during the session. After completing all the tasks and scenarios, each participant was asked to complete a post-test questionnaire, which used the standard SUS (System Usability Scale) structure to evaluate effectiveness, efficiency, and satisfaction.
Results
Once usability testing finished, results from the user-test surveys were categorized and analyzed.
The average SUS score across participants was 65.
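For context, a SUS score is derived from the standard 10-item questionnaire with 1-5 Likert responses. The sketch below shows the standard scoring formula (the function name is ours; the formula itself is the published SUS method):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A neutral response to every item (all 3s) yields the midpoint score of 50.
print(sus_score([3] * 10))  # -> 50.0
```

Our average of 65 therefore sits somewhat above the neutral midpoint but below the commonly cited "good usability" threshold of 68.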
After analyzing all the data gathered during the usability test, we found that Task 1 had the longest completion time on average, reflecting participants' initial scan of the interface. As users continued using the prototype, completion times decreased: participants became more comfortable with it and came across some of the information while working on other tasks. Task 2 had the shortest time on average, which might be due to its placement on the dashboard. SUS scores have a modest correlation with task performance; however, this could simply be because people's subjective assessments may not be consistent with whether they actually succeeded in using the system.
Our team also analyzed behavioral observations and in-session notes, and broke out the fundamental takeaways below:
-
All the users could complete the tasks easily and quickly.
-
Most of the users found the various functions in this system were well integrated.
-
A few users showed some confusion about the tasks' intent.
-
Some of the users found utility units meaningless.
-
UI elements were not intuitive and could be improved.
-
The target and cost charts were not synchronized.
-
Users preferred to see their weekly usage instead of daily.
Pre- and post-task difficulty ratings were not analyzed because they were not collected from all participants: testing online, participants tended to start the tasks before the question could be asked. This part of the protocol was removed in the second test.
Action Items
To increase our system's effectiveness, our team decided to take the following actions and revise some features in our next prototype:
-
Keep the overall visual format and simplicity.
-
Represent energy consumption by money.
-
Incorporate clearer heading and description for the entire dashboard.
-
Develop the graphs to show past and average usage.
User Test 2: Nov 18, 2021
We conducted the second prototype user test with the community. During the usability testing phase, six users tested our prototype; participants were drawn from the MDM faculty and community.
Objective
The main objective of the testing was to determine the effectiveness of the implementation of the concept. During the testing process, we monitored the users to reveal points of friction and confusing experiences.
Methodology
Testing was announced to the students and faculty of the CDM by email, and participants who booked a time received an invitation to test. The user testing was conducted over Zoom, with participants sharing their screens, and sessions were recorded with the participants' permission.
Testing
During the usability testing session, each participant was asked to complete a series of tasks on the prototype. The participant's interaction with the HomeE application prototype was monitored by the facilitator. Before starting the tasks, participants completed a pretest demographic and experience questionnaire.
Here is the list of the tasks:
-
You just purchased a Google Nest Hub and want to save money by reducing your utility consumption. You have placed the Google Nest Hub on the kitchen counter and just installed the HomeE application. Please set it up (the numbers will autofill on click) and get to the dashboard. You will have to drag the bedroom light to the control center.
-
You are leaving your home, but you see your bedroom light is on. Can you please turn off the bedroom light on the dashboard?
-
Having used HomeE for six months, you check it with your morning coffee. Tell us whether you are under, at, or over your weekly budget for each utility this week, and how you are doing compared to last week.
-
Seeing your budgets, you want to set a new electricity budget limit. Please set it to $15/week.
-
You think you have done well over the last six months, so you want to check which achievements you have earned. Please tell us how many achievements you have.
During the test, the facilitator monitored users' behavior while taking notes. After completing all the tasks and scenarios, each participant was asked to complete a post-test questionnaire, which used the standard SUS structure.
Results
Once usability testing finished, results from the user-test surveys were categorized and analyzed.
The average SUS score across participants was 66.6.
After analyzing all the data gathered during the usability test, we found that Task 1 had the longest completion time on average, which was not surprising since it included the onboarding process and participants spent time on an initial scan of the system. As users continued using the prototype, completion times decreased: participants became more comfortable with it and came across some of the information while working on other tasks. Task 2 had the shortest time on average, which might be due to its placement on the dashboard. Tasks 3, 4, and 5 took roughly the same time, which could be attributed to their relative proximity to each other; Task 3 was slightly longer, possibly because users tried the sliding feature on non-touch screens. SUS scores have a modest correlation with task performance; however, this could simply be because people's subjective assessments may not be consistent with whether they actually succeeded in using the system.
Our team also analyzed behavioral observations and in-session notes, and broke out the fundamental takeaways below:
-
All the users could complete the tasks easily and quickly.
-
Most of the users found the onboarding process helpful.
-
Some users had difficulty using the target-setting feature.
-
Users found the interface design and overall visual format interesting.
-
Users found the trends and achievement features engaging.
-
Some users felt the achievement badges did not clearly convey the level of accomplishment.
-
Users found the recommendation feature helpful.
Action Items
To increase our system's effectiveness, our team decided to take the following actions and revise some features in our next prototype:
-
Keep the overall visual format and simplicity.
-
Modify the achievement art to make it more engaging and understandable for users.
-
Revise the target setting feature.
-
Create a built-in alert system to increase users' engagement.
Moving Forwards
As next steps, our team has identified several directions that can carry this project forward.
Development Build
We would like to complete a development build that we can continue testing on and use for higher-level pitches. Increasing the adoption of sustainable practices requires collective action at every scale, from individual to global, and private individual action alone may not grow fast enough to address the problem in time. To increase our impact, we are considering pitching at higher levels, for example to BC Hydro.
Test on Touch Devices
We would like to test our interface on touch-screen devices, since the HomeE system is designed specifically for them. Because testing was virtual and users had limited access to touch-screen devices, most of our participants used a desktop with a mouse or keyboard to perform the tasks. We realized this caused some confusion, and the user experience would differ on an actual touch screen. To allow users to give more accurate feedback, we would therefore like to test our interface on touch-screen devices such as an iPad or the smart home hardware itself.
Production Build
To develop the production build for release, the priority is to follow the ER diagram to establish the schemas for the back-end database and connect the front-end interface to it to retrieve and display real data. Furthermore, a dedicated collaboration needs to be established with service providers (e.g., BC Hydro) to fetch utility data at different granularities, with secure and reliable open APIs either made accessible to the development team or created by it where missing. Thereafter, user authentication and home automation would be the next things to tackle.
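As a minimal sketch of what such a back-end schema might look like, the snippet below stores per-utility readings with a granularity column. All table and column names are hypothetical; the real schemas would follow the project's ER diagram:

```python
import sqlite3

# Hypothetical schema for utility readings fetched from a provider API.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE utility_reading (
        id           INTEGER PRIMARY KEY,
        home_id      INTEGER NOT NULL,
        utility      TEXT NOT NULL
                     CHECK (utility IN ('electricity', 'gas', 'water')),
        granularity  TEXT NOT NULL
                     CHECK (granularity IN ('daily', 'weekly', 'monthly')),
        period_start TEXT NOT NULL,   -- ISO-8601 date
        amount       REAL NOT NULL,   -- consumption in the utility's unit
        cost         REAL NOT NULL    -- dollar cost shown on the dashboard
    )
""")
conn.execute(
    "INSERT INTO utility_reading "
    "(home_id, utility, granularity, period_start, amount, cost) "
    "VALUES (1, 'electricity', 'daily', '2021-11-18', 23.0, 2.10)"
)
# The dashboard's cost view would aggregate over a period like this:
(total_cost,) = conn.execute(
    "SELECT SUM(cost) FROM utility_reading "
    "WHERE home_id = 1 AND utility = 'electricity'"
).fetchone()
print(total_cost)  # -> 2.1
```

Storing cost alongside raw consumption mirrors the design decision from Phase 3 of presenting usage to users in dollars rather than utility units.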
Business Plan
Develop a business plan that includes branding and marketing. We would like to create a marketing plan with detailed action items that help us connect with potential users and encourage them to bring our system into their homes. In addition, we would like to put more effort into key brand elements such as a logo, a website, and brand style guidelines.