Design Playground

Miscellaneous work from hackathons, student projects, and graphic design.

Role

UX Designer
Co-Researcher

Team with

Janani Ravikumar
Ayu Larasati

Progress

Concept design, ideation to UI

Date

Sep 2019 - Nov 2019

Client kickoff slide

Overview

Verge is an AI audit tool for banks and lenders that reviews banks’ algorithmic loan decisions to ensure fairness in lending. As a cross-functional team of data scientists, a UX researcher, and a UX designer, we took Verge from ideation and research through high-fidelity design and algorithm training, grounded in interviews with banking staff and the dataset ???. When designing the interface of this human-in-the-loop AI tool, I aimed to address the challenges of explainability, trust-building, transparency, and human-centeredness. By championing fairness and explainable AI in support of human decision makers, our project won second place in the final project voting and peer evaluation.
My Role
UX Designer
Co-Researcher
Duration
3 months
Mar - May 2021
Teammates
Data Scientist
Yunyi Li, Nicholas Wolczynski
UX Researcher
Elena Gonzales Melinger
Tools
Paper & pen
Figma, Illustrator

Prompt

Bias in loan application decisions is a persistent issue. The Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA), which aim to protect groups from lending discrimination, make it illegal to offer less favorable loan terms to applicants on the basis of their race, nationality, gender, age, or marital status. Despite these laws, cases of discrimination in home loan and credit approval persist, as news stories continue to show. AI is often perceived as a decision maker free from the racial or gender biases of individual human beings, yet it can just as easily inherit biases hidden in historical data.

Screenshots: The New York Times, "Is an Algorithm Less Racist Than a Loan Officer?" and CNBC, "A troubling tale of a Black man trying to refinance his mortgage."
Stories from NYT and CNBC about racial discrimination in loan applications. Akridge, a Black man with all the necessary financial credentials, including a steady job, a well-paid salary, and a high FICO credit score, was declined when he tried to refinance his mortgage.
According to the CNBC report, "A majority (59%), of Black homebuyers are concerned about qualifying for a mortgage, while less than half (46%) of White buyers are, according to a recent survey by Zillow. Lenders deny mortgages for Black applicants at a rate 80% higher than that of White applicants, according to 2020 data from the Home Mortgage Disclosure Act." In 2019, the Apple Card was investigated after complaints of gender discrimination, though investigators ultimately found no wrongdoing. These reports not only show that bias in loan application decisions persists, but also remind financial agencies that they need to ensure lending decisions are free from gender and racial bias.

Solution

Verge assists the loan approval decision process by helping lenders double-check whether the prediction results of the AI tools they use for approving loan applications meet fairness standards.

Key features include:
- Offers a "second opinion" on the bank’s original algorithmic decision
- Surfaces cases where the bank’s model and our models, each optimized for a different fairness metric, disagree (see the sketch after this list)
- Presents predictions from fairer algorithms in a transparent, trustworthy, and intuitive way
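As an illustration of the "second opinion" and disagreement-surfacing features, here is a minimal sketch of the comparison logic. The column names (applicant_id, bank_decision, fair_decision) are hypothetical placeholders for illustration, not Verge's actual data schema; in the interface, the flagged rows become the queue of cases a reviewer double-checks.

```python
# Minimal sketch (hypothetical column names, not Verge's actual schema):
# flag applications where the bank's original model and a fairness-optimized
# model disagree, so a loan officer can review them.
import pandas as pd

def find_disagreements(decisions: pd.DataFrame) -> pd.DataFrame:
    """Return the applications on which the two models disagree."""
    disagree = decisions[decisions["bank_decision"] != decisions["fair_decision"]]
    # Show cases the fairer model would approve first, since those carry
    # the highest risk of an unfair denial.
    return disagree.sort_values("fair_decision", ascending=False)

# Toy example: 1 = approve, 0 = deny.
decisions = pd.DataFrame({
    "applicant_id": [101, 102, 103, 104],
    "bank_decision": [0, 1, 0, 1],
    "fair_decision": [1, 1, 0, 0],
})
print(find_disagreements(decisions))   # applicants 101 and 104 need review
```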

Discovery

Racial and gender biases exist in algorithmic lending

Race and ethnicity are among the most concerning fairness areas in bank loan applications, and many earlier studies provide evidence of racial bias in loan and housing markets. Bartlett et al. (2019) found evidence of racial discrimination by studying more than 13 million mortgage and refinance applications. They found that racial bias against Black borrowers can come from both face-to-face lending and algorithmic lending (Bartlett et al., 2019).

Gender differences in access to bank loans have also been studied thoroughly. Female entrepreneurs find it more difficult, or more costly, to access bank credit (Calcagnini et al. 2015, Fay 1993, Ongena et al. 2015). Given the same level of income, female loan applicants encounter higher rejection rates and lower loan approval amounts.

Garbage in, garbage out

While many people assume that machine learning and AI-driven systems bring more objectivity to loan application decisions, that is unfortunately not true. AI-driven systems rely on algorithms and models to make predictions, and algorithms do what they are taught: biased societal patterns hidden in the data are unintentionally learned and reproduced as biased predictions. In addition, applicants' demographics are fed to the machine to predict default risk, so an algorithm can correlate gender or race with default without any substantive basis.
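To make the "garbage in, garbage out" point concrete, here is a small synthetic example (toy data, not our training set or model): a model trained on historically biased approval decisions learns a penalty on the protected attribute even though that attribute has nothing to do with ability to repay.

```python
# Toy illustration with synthetic data: a model trained on biased historical
# decisions reproduces that bias. Not our real dataset or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)            # protected attribute (0 or 1)
income = rng.normal(55, 10, n)           # drawn identically for both groups

# Historical approvals applied a stricter bar to group 1 -- the hidden bias.
historical_bar = np.where(group == 1, 60, 50)
approved = (income > historical_bar).astype(int)

# Trained on these labels, the model learns to penalize group membership.
X = np.column_stack([income, group])
model = LogisticRegression(max_iter=1000).fit(X, approved)
print("coefficient on protected attribute:", model.coef_[0][1])  # clearly negative
```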

Fair algorithms

"Unfair algorithms" are not designed to be evilly unfair by data scientists. Instead, these algorithms are usually selected based on their performance in prediction accuracy. Accuracy, however, could be a trade-off of ruling out irrelevant factors. For instance, a machine learned that for 10 raining days, in 6 days I wore a hat; if I wore a hat today, it naturally predicted that there's a large chance it's going to rain.
Ruling out the possibilities that race and gender are mistakenly taken by algorithms as causes of default, is what fair algorithms address (Equalized odds). Data scientists in our group decides to employ post-processing as our fairness algorithm.
To the detailed reasoning on data science reasoning, please refer to our class slides: Project brief slides
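Our data scientists' post-processing approach is documented in the project slides linked above; as a simplified sketch of the general idea, the snippet below equalizes true positive rates across groups by choosing group-specific score thresholds. Full equalized-odds post-processing also balances false positive rates (libraries such as Fairlearn implement this), so treat this as an illustration of the intuition rather than our exact method.

```python
# Simplified sketch of fairness post-processing: pick a per-group threshold on
# the model's risk score so that true positive rates are roughly equal across
# groups. Equalized odds additionally balances false positive rates; this is
# only the intuition, not the full method our team used.
import numpy as np

def group_thresholds(scores, y_true, group, target_tpr=0.80):
    """For each group, find the lowest threshold that reaches the target TPR."""
    thresholds = {}
    for g in np.unique(group):
        pos_scores = np.sort(scores[(group == g) & (y_true == 1)])
        k = int(np.floor((1 - target_tpr) * len(pos_scores)))
        thresholds[g] = pos_scores[k]   # keeps >= target_tpr of true positives
    return thresholds

def fair_predict(scores, group, thresholds):
    """Approve when an applicant's score clears their own group's threshold."""
    return np.array([int(s >= thresholds[g]) for s, g in zip(scores, group)])
```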

Competitor Analysis

The market needs new tools and methods to evaluate traditionally overlooked loan applicants beyond the credit score.

AI is already used in the fintech market for credit scoring and lending assessment. We studied ZestAI, Lenddo, and Upstart as our competitors. ZestAI aims to facilitate underwriting with machine learning by providing streamlined data and algorithm training with model explainability. Seeking to give more opportunities to groups affected by bias without relying too heavily on credit scores, Lenddo uses applicants’ digital history to predict default, while Upstart aims to give young people more opportunities by considering test scores in credit approval. Because most of these products are B2B, we were not able to view their actual interfaces and features. However, we learned about the significance of financial explanation and the market's need to evaluate applicants fairly beyond the single metric of a credit score. While all of these tools try to find alternative ways to evaluate applicants, none of them mention fairness metrics. Verge will cover this gap in the market.

Survey

We surveyed 30 people on their experience, knowledge, and opinions on algorithmically judged loan applications.

Loan applicants were not sensitive to whether their application was judged algorithmically or by a human in person.

55% of survey respondents reported having been declined for a loan instantly, but this did not translate into high rates of dissatisfaction or mistrust in the decision. While most customers reported that they understood what factors went into their loan decision, most disagreed that there was a way to get a satisfactory explanation of their loan application results, with two-thirds of all respondents strongly agreeing or agreeing with the statement “Hypothetically, if my application were rejected, I would want to know why.”
Customers are not sensitive to whether their application is judged algorithmically or by a human in person, with the majority of people saying that they don’t think it makes a difference or that they don’t know.

People either felt their applications were judged fairly, or acknowledged that they did not know whether they were.

73% agreed that if they were hypothetically rejected for a loan, they would want an explanation of why.
50% felt there was not a clear way to get a satisfactory explanation for their loan results.

User interview

We interviewed professionals working in banks, both to understand which aspects of fairness and compliance bank employees are sensitive to and to understand their current feelings about automated loan decisions and AI. One interviewee was certain that AI was not used at his workplace to judge applications, and the other was unsure whether the platform they use to record customer information had an AI component. The bank employees we spoke with had generally positive impressions of AI and fairness, and saw incorporating algorithmically judged loan applications as an important step in scaling up their operations.

After collecting this stakeholder feedback, we concluded that there was potential for this tool, especially in the area of explaining loan application results, which both customers and bank employees identified as important.

Design

Find Printers

Provide users with a more straightforward view of the available printers on campus through a two-tier map.

Enable users to quickly find nearby printers and understand their status.

Poster redesign and iteration

This came from a class assignment about redesigning posters for presenting information. We were asked to find a poster online and redesign it so that the information is communicated efficiently and attractively. Out of interest in space exploration, I chose an activity poster made by a student to share their internship experience at NASA. In the first draft, I visualized the information using space elements and reorganized the layout. Based on class feedback, in the final version I connected the scattered points into a story: the internship is framed as a journey of exploration narrated by the student.

First draft

Second iteration

Illustrations for iSchool news

As a web designer for the School of Information at UT, I made illustrations for their project on librarianship & AI, iterating based on design critiques from supervisors and colleagues.

Illustration of human-AI interaction in librarianship

Sticker design for the first gen of iSchool undergraduates

Excited to design a series of stickers for the incoming iSchool undergraduates, using vibrant colors, iconic Texas symbols, and UT icons.

Invites for a virtual happy hour

Designed three layouts and two color systems for a joint virtual happy hour event.

Welcome postcards for new undergraduates

Vibrant colors, iconic Texas symbols, UT icons, and iSchool features to welcome new undergraduates.

Infographic redesign for career choices

Redesigned the iSchool website infographic to display the intersections between different tracks and the diversity of career choices in the iSchool.

Usability study of POP

Usability study of POP, a startup social app that helps students make genuine friends at college. From Jan to May 2020, our team used heuristic analysis, competitive studies, contextual interviews, and usability testing to dig into usability problems and find the product's market niche.

View Full Problem Analysis  🪄

Problem analysis

Key takeaways

Attachment: Heuristic Evaluation Report
