Brandeis Graduate Professional Studies

Sins of our past modeling our future: Diversity and bias in AI and data

March 8, 2020

[Image: word cloud with words like "data," "diversity," "bias," and "AI."]

By Deniz A. Johnson

With International Women’s Day approaching, I was recently interviewed regarding the gender gap in Fintech and Financial Services. This is a hot topic with a variety of efforts underway to address it. To name a few:

  • A recent California law (SB 826) mandated that publicly held companies headquartered in the state include women on their boards.
  • Goldman Sachs CEO David Solomon announced that the investment bank will no longer take a company public unless said company has at least one “diverse” board member.

These are just the most recent examples of current shifts in the industry.

Perhaps the most compelling reason to increase diversity is that it pays! A 2020 KPMG study concluded that “boards that include more women and directors with diverse backgrounds and experiences are more effective on a variety of measures, including financial performance, risk oversight and sustainability.”

While these are steps in the right direction, I believe that diversity in financial data sets is a much larger issue.

Without resolving the bias in AI and its data, we cannot make diversity in financial services a sustainable reality.

Every day, we generate data trails as part of our lives as we engage in financial transactions large and small; post on social media; or even just log into a website or app. This data is and will be available for building and refining our Machine Learning (ML) and other Artificial Intelligence (AI) technologies.

As these new technologies are adopted to guide business decisions, including the creation of new investment products and services, the diversity challenges have the potential to create significant limitations:

  • When data sets represent only a small percentage of the actual population’s activities, preferences and needs
  • When past decisions contain identifiable or hidden prejudice/bias
  • When past business decisions omit segments of the population
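The first limitation above can be made concrete with a simple check. The sketch below (using made-up numbers and hypothetical group names, not real demographic data) compares each group's share of a data set against its share of the actual population and flags groups that are underrepresented:

```python
# A minimal sketch (with illustrative numbers) of checking whether a data
# set's demographic mix matches the population it is meant to represent.

# Hypothetical shares; in practice these would come from census data and
# the firm's own records.
population_share = {"group_a": 0.60, "group_b": 0.30, "group_c": 0.10}
dataset_share = {"group_a": 0.85, "group_b": 0.12, "group_c": 0.03}

def representation_gaps(population, dataset, tolerance=0.05):
    """Return the groups whose share of the data set falls short of their
    population share by more than `tolerance`, with the size of the gap."""
    return {
        group: round(population[group] - dataset.get(group, 0.0), 2)
        for group in population
        if population[group] - dataset.get(group, 0.0) > tolerance
    }

print(representation_gaps(population_share, dataset_share))
# {'group_b': 0.18, 'group_c': 0.07}
```

A check like this does not fix the gap, but it makes the omission visible before a model is trained on the data.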

If we do not openly address these problems, we will carry narrow customer insights and potential biases into future products and services, missing the opportunity to add greater value for more clients. For example, members of a minority group that has historically avoided loan applications could be automatically rejected in the future simply because the data set is incomplete.

Let’s begin to address this problem by first using ML/AI itself to identify bias, bad data, and data gaps. Further, let’s leverage community and educational programs to increase workforce diversity and encourage firms to create inclusive work environments. Both will make diversity a reality rather than a goal, and enable broader thinking about client segments and their diverse needs and preferences.
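One simple, widely used screen for the kind of historical bias described above is to compare outcome rates across groups. The sketch below (built on a tiny hypothetical loan-decision data set; the column names are illustrative) computes the "four-fifths" disparate-impact ratio, a common rule of thumb under which values below 0.8 flag a decision process for closer review:

```python
import pandas as pd

# Hypothetical historical loan decisions; in practice this would be a
# firm's real decision log with a protected-attribute column.
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

def disparate_impact_ratio(df, group_col, outcome_col):
    """Ratio of the lowest group's approval rate to the highest group's.
    A ratio below 0.8 is a common rule-of-thumb flag for possible bias."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates.min() / rates.max()

ratio = disparate_impact_ratio(decisions, "group", "approved")
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.25 / 0.75 -> 0.33
```

A screen like this is only a starting point; it surfaces where past decisions diverge across groups so that the underlying data and models can be examined.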

Diversity and inclusion are not just feel-good concepts, but investments in the future. Both are necessities for creating better data sets for the new technologies that will help us build the financial solutions of the future and secure our industry’s success.

Taking a mindful and intentional look at identifying and solving bias in data, as well as in models, is the key to building diverse organizations.


Deniz Johnson is a FinTech thought leader, advisor and executive in the Boston area. You can find her on LinkedIn.

Brandeis Graduate Professional Studies is committed to creating programs and courses that keep today’s professionals at the forefront of their industries. To learn more, visit www.brandeis.edu/gps.