Designing Ethically, Delivering Better

Thoughts on Human-Centered and Ethical Design

Software development team in a meeting.

Octobot’s Design team, where I work as a UX Designer, does a great job of sharing knowledge and tips among its members and with the company as a whole. A few weeks ago, I held a Learning Session based on the book “Laws of UX” by Jon Yablonski. The session was open to everyone at Octobot interested in the topic, and it was a great experience discussing how the UX practices recommended in the book are key for ethical design decisions.

I’d like to dive a little deeper into designing solutions that truly put the user’s interests at the center of the process, and share examples of how we strive to apply ethical design in our projects at Octobot.

What is ethical design?

After reading “Laws of UX”, researching, and preparing the Learning Session, some of Yablonski’s recommendations caught my attention because they were less obvious than other practices. While we’re used to making some design decisions in an intuitive, almost automatic way, others are harder to make because we need more information and have to put ourselves in other people’s shoes.

For example, we all know most languages are read from left to right, top to bottom. It’s natural for us to design solutions with this in mind. However, it can be tougher to define the right path when thinking about the user’s overall experience with our system, even when it fails. How do we design and respond when the user faces a problem our software can’t solve? I want to focus on a few of the book’s points that I believe have the highest impact on how users perceive a product.

To understand ethical design, I believe it’s important to start by accepting that we, designers and devs, are usually quite tech-savvy. We are immersed in the tech world and its solutions, and we can easily understand and engage with them. But when we develop a product, we are generally not the ones who will be using it, so we need to shift our focus to the user. People are the way they are, not the way we would like them to be, so we have to be sensitive to their problems, dreams, and needs.

“The first step into making ethical design decisions is to acknowledge how the human mind can be exploited. We must be intentional with the technology we create and consider how it is impacting people's lives.”

What Yablonski means here is that ethical design decisions have to start from the awareness that the human mind can be exploited. We have to be intentional and consider the ways in which what we are creating affects people’s lives and relationships, and we must provide for their needs in an empathetic, simple, and transparent way.

Every decision in a software development project is made by the team, and the designers play a key role in this process. If we are going to use dark patterns, intermittent variable rewards, infinite loops and scrolls, or any other red flag in terms of ethical design, it’s 100% up to us.

Human-Centered Design in practice

The UX laws I want to highlight relate to Human-Centered Design, an approach that states that we, as humans, have a blueprint for how we perceive and process the world around us. By having this concept in mind, instead of forcing users to adapt to a product or experience, we can use some key principles from psychology to design in a way that fits people’s expectations and habits. 

Heuristic Laws

Heuristics are mental shortcuts that reduce the cognitive load of making a decision. These laws are super important because they help users understand an interface faster and with less effort.

Hick’s Law

This mostly intuitive law states that “The time it takes to make a decision increases with the number and complexity of choices available”.

The statement is based on the psychological concept of cognitive load, which holds that the time we need to understand and process information grows with the amount of information presented. In other words, the more options we have to process, the longer we take to choose one, and the more likely we are to abandon (or at least feel frustrated by) the task.

So, how do we keep this condition in mind in our work?

First, we must design with a critical eye and remove all the information that – to be honest – doesn’t need to be displayed. More often than not, we find ourselves showing off how much our tool can do for you: all the things this product can help you with, all the tasks we’ve discovered you’ll need to accomplish. However, we must avoid “the remote control syndrome”. Our design needs to show what it can help you with right now, at this step, and leave each piece of information to be shown where it is strictly needed and, most importantly, expected.

In this desktop app designed for a health care service provider, many options were available. First, we had to prioritize and remove every item that did not need to be shown on this screen: repeated information, nested categories, secondary flows, and mixed types of user flows. As we redesigned the site’s IA (Information Architecture), the top bar became the main navigation tool, serving as a primary, strong filter for many of the previous design’s items. After doing so, the “online services” screen, only available to logged-in users, was reduced to the information strictly needed at this step.

Before:

System before applying Hick’s Law

After:

System after applying Hick’s Law
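To make the idea concrete, here is a minimal TypeScript sketch of that prioritization, with hypothetical navigation items and context names (not the actual product’s information architecture): each screen only surfaces the options relevant to its context, and everything else stays behind the top-bar navigation.

```typescript
// Hypothetical navigation items for illustration only — not the real
// health care product's information architecture.
interface NavItem {
  label: string;
  contexts: string[]; // the screens where this item is genuinely needed
}

const allItems: NavItem[] = [
  { label: "Appointments", contexts: ["online-services"] },
  { label: "Test results", contexts: ["online-services"] },
  { label: "Billing", contexts: ["account"] },
  { label: "Insurance plans", contexts: ["public-site"] },
  { label: "Careers", contexts: ["public-site"] },
];

// Hick's Law in practice: render only what this step requires,
// so the number of simultaneous choices stays small.
function itemsForContext(context: string): NavItem[] {
  return allItems.filter(item => item.contexts.includes(context));
}

console.log(itemsForContext("online-services").map(item => item.label));
// -> [ "Appointments", "Test results" ]
```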

Miller’s Law

Following the same cognitive-load premise and working with another psychology concept known as chunking, Miller’s Law states that “The average person can keep only 7 (± 2) items in their working memory”.

Miller’s Law is closely related to Hick’s: both aim to minimize the amount of information users need to hold in memory, which would otherwise compromise their decision making. Since working memory can only keep a limited amount of detail, we should design in a way that doesn’t exceed these limits. We can do this both by reducing the number of options and by chunking similar pieces of data.

If we take a look at this redesign, many decisions were made with these considerations in mind. In this case, there was a lot of information that could not be removed, so the best solution to keep the user interested and focused was to chunk related questions and spread the flow across several pages. The previous design presented everything as a single, far too long checkbox list. The new design groups the questions into four steps, making the information more scannable and encouraging completion through a goal-gradient effect.

Before:

System before applying Miller’s Law

After:

System after applying Miller’s Law
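As a rough illustration of the chunking itself, here is a minimal TypeScript sketch with made-up questions and an arbitrary group size (neither taken from the actual project): a long list is split into small steps so each screen stays within working-memory limits.

```typescript
// Chunking sketch: split a long questionnaire into smaller steps.
interface Question {
  id: number;
  text: string;
}

// Split an array into groups of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const groups: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}

// Eighteen placeholder questions stand in for the original checklist.
const questions: Question[] = Array.from({ length: 18 }, (_, i) => ({
  id: i + 1,
  text: `Question ${i + 1}`,
}));

// Four short steps of ~5 questions each instead of one 18-item screen.
const steps = chunk(questions, 5);
steps.forEach((step, index) => {
  console.log(`Step ${index + 1} of ${steps.length}:`, step.map(q => q.text));
});
```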

Principle Laws

Principles are laws that come from scientific behavioral research on the human-computer relationship. This relationship is often taken for granted, yet it shapes how the human brain perceives an interaction, and it needs to be fully understood in order to design human-friendly processes.

The Doherty Threshold

Originating in a research paper that set requirements for computer response times, this principle speaks to the importance of how humans perceive the passage of time: “Productivity soars when a computer and its users interact at a pace (<400 ms) that ensures that neither has to wait on the other”.

The time a system takes to respond leads users to different conclusions about how reliable a process is. Our brains naturally tolerate and expect a certain amount of waiting for each task, judged by our preconception of the task’s complexity; this means that a response that is too slow, or even too fast, affects our perception of the product. However, as designers we don’t really control the system, so how do we manage this information and act as a bridge between user behavior and computer interactions?

The principle in practice

We constantly come across designs that handle waiting times well, probably without noticing. Every time we don’t notice reinforces the idea that “you can’t tell good design as easily as you can tell bad design”. It’s easy to think of sites and apps that got stuck for too long, but can you equally remember all the times this hasn’t happened?

As designers we should identify responses that are too slow as well as ones that are too fast. This helps us design states and interactions that enhance the experience and don’t let our users down. Long waiting times can lead users to believe the product stopped working, while a response that is too quick can make them doubt the product even registered their action. Successful examples can be found in products such as Google, Facebook, or Instagram, which disguise slow loading with engaging animations and turn fast responses into an opportunity to show the user something valuable.

Our team pays close attention to these situations and stays in close communication with developers, who keep us informed about system constraints. This modal is a common example of designing for the waiting period, when we know loading time will exceed the user’s expectations. We don’t want them to see an empty card and believe the app isn’t working when it is actually loading a lot of complex information; instead, in many cases we design skeletons that help the app feel more responsive.
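Below is a minimal TypeScript sketch of that timing logic, assuming a generic asynchronous request; fetchPatientServices, showSkeleton, and hideSkeleton are hypothetical names used only for illustration. The skeleton is revealed only if the response is still pending after the ~400 ms threshold, so fast responses never flash a loader and slow ones give immediate feedback instead of an empty card.

```typescript
// Doherty Threshold sketch: delay the loading state until ~400 ms have
// passed, so only genuinely slow requests show a skeleton.
const DOHERTY_THRESHOLD_MS = 400;

function showSkeleton(): void {
  console.log("Showing skeleton placeholders…");
}

function hideSkeleton(): void {
  console.log("Hiding skeleton, rendering real content.");
}

// Hypothetical request, simulated here with a 1.2 s delay.
async function fetchPatientServices(): Promise<string[]> {
  return new Promise<string[]>(resolve =>
    setTimeout(() => resolve(["Appointments", "Test results"]), 1200)
  );
}

async function loadWithFeedback<T>(request: () => Promise<T>): Promise<T> {
  // Arm the skeleton, but only reveal it if the request is still
  // pending once the threshold has passed.
  const timer = setTimeout(showSkeleton, DOHERTY_THRESHOLD_MS);
  try {
    return await request();
  } finally {
    clearTimeout(timer);
    hideSkeleton();
  }
}

loadWithFeedback(fetchPatientServices).then(services =>
  console.log("Loaded:", services)
);
```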

Cognitive Bias Effects

A cognitive bias is a subjective interpretation of information, made in an attempt to simplify its processing. As the saying goes, “we do not perceive the world as it is, but as we are”, meaning that users’ beliefs and context shape the way they perceive products, with most of their judgment resting on a few critical moments and the overall experience.

Von Restorff Effect

Although the Von Restorff effect seems quite obvious, it is interesting to understand all the psychological concepts it implies and how we can commit to it from many perspectives: “When multiple similar objects are present, the one that differs from the rest is most likely to be remembered”.

This is true in almost every context in life; however, when designing interfaces it has a potential we need to explore and leverage. Our mind is wired to pay attention to items that look different from the rest, a primitive instinct related to survival and the detection of potential threats. Although this appears obvious and intuitive, we must be intentional about it when the user’s focus on an action or a message is especially important; we can use contrast techniques such as shape, size, color, and position, and sometimes all of them at the same time, to make sure our design is successfully user-centered.

An important consideration is the opposite case: if everything is contrasted and highlighted, nothing in particular will capture the user’s attention. Furthermore, this rule leads us to analyze whether or not something needs to stand out at all. In other words, as simple as it may seem, don’t overdesign Von Restorff effects; be intentional.

In this particular case, the highlighted value becomes the priority when in alert mode. Under this condition, the designers chose to focus strongly on the alert state using contrast in color, size, shape, and position. The highlight effect was applied at several points of the card to make sure the message gets across: first, visually prioritizing the existing information by adding color contrast when the value exceeds the recommended range; and second, incorporating a tooltip and an icon, which contrast through size and shape.

App designed by Octobot's UX team.
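As a simple sketch of that conditional contrast, here is some TypeScript with hypothetical metrics, thresholds, and class names (not the product’s actual values): the card stays visually quiet until a value exceeds its recommended range, and only then do color, size, and an icon make it stand out from its neighbors.

```typescript
// Von Restorff sketch: contrast is applied only in the alert state.
interface Metric {
  label: string;
  value: number;
  recommendedMax: number;
}

function cardClasses(metric: Metric): string[] {
  const classes = ["metric-card"];
  if (metric.value > metric.recommendedMax) {
    // Contrast through color, size, and shape — but only when it matters.
    classes.push(
      "metric-card--alert",
      "metric-card--enlarged",
      "metric-card--with-icon"
    );
  }
  return classes;
}

console.log(cardClasses({ label: "Heart rate", value: 128, recommendedMax: 100 }));
// -> ["metric-card", "metric-card--alert", "metric-card--enlarged", "metric-card--with-icon"]
console.log(cardClasses({ label: "Temperature", value: 36.8, recommendedMax: 37.5 }));
// -> ["metric-card"]
```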

Peak-End Rule

Last but not least, I would like to talk about this rule, which is – to me – really interesting when it comes to the human mind and, even more, to our commitment to ethical design. The Peak-End Rule has to do with managing the perception of a process from a purely experiential point of view; it explains how users’ subjective interpretation, which is completely out of our control as designers, shapes the way they feel and behave while using our products.

“People judge an experience largely based on how they felt at its peak and at its end, rather than on the total sum or average of every moment of the experience”. This creates a challenge for designers: even before thinking about solving the problem, we have to successfully identify the peaks of the experience.

So, how do we address the extremes of their emotional journey? 

The ultimate tool to feel and understand what the user is going through is to map their experience journey. Don’t assume, don’t jump to conclusions, don’t force the outcome to match the product’s preconceptions: do deep research. Interview users, prepare good questions, and test. Our designs must address people’s real worries and emotions, so our job here is to commit to empathizing. By building a journey map we can identify the highs and lows of the experience and, therefore, design in a way that supports the user throughout.

This international money transfer product wasn’t as typical as many others of its kind. In this case, the business was very focused on understanding and supporting users who were likely living in a foreign country and needing to send money to their family. These users are not making random, routine money transfers; they are probably thinking about helping someone who is urgently waiting for the money on the other side. Waiting times and access to information were crucial for this product’s success.

After doing a lot of research and getting a clear view of the journey map, we focused on a design that paid careful attention to the stages in which the user would be impatient for news and full of questions. They needed fast, easy access to the transfer status as well as clear calls to action, so the design involved color psychology, clear user flows, progress bars, and thoughtful wording.

Screen we designed.
Fintech app we designed.
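Here is a minimal TypeScript sketch of the kind of status-to-progress mapping described above, with hypothetical statuses and copy rather than the product’s actual states or wording: each status maps to a step in a progress indicator and a reassuring message, so the user always knows where their money is.

```typescript
// Peak-End sketch: make the transfer's status and wording explicit at
// the moments users care about most.
type TransferStatus = "created" | "processing" | "sent" | "delivered";

const statusCopy: Record<TransferStatus, { step: number; message: string }> = {
  created: { step: 1, message: "We received your transfer." },
  processing: { step: 2, message: "Your money is on its way." },
  sent: { step: 3, message: "Sent! The receiving bank is processing it." },
  delivered: { step: 4, message: "Delivered. Your family can use it now." },
};

function progress(status: TransferStatus): string {
  const totalSteps = 4;
  const { step, message } = statusCopy[status];
  return `Step ${step} of ${totalSteps}: ${message}`;
}

console.log(progress("processing")); // -> "Step 2 of 4: Your money is on its way."
```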

Want to keep learning? Read “Laws of UX”

It’s worth exploring this book beyond accumulating knowledge or finding yet another checklist of everything a screen has to have when developing a product. In my opinion, the most valuable thing this material provides is a set of key arguments against the tendency to put into the system everything we’d like to see in it. Instead, it teaches us to prioritize what the user truly needs and wants.

You can draw your own conclusions about “Laws of UX” by reading the book here. Another resource on these topics is Humane by Design.

I’d also like to recommend this article my teammates and I created to share our favorite UX/UI resources, as well as this interview with our Team Leader about the value of UX in software development projects.

You can also learn more about my experience as a UX Designer in this podcast (for Spanish speakers only).
