WWCode Podcast #31 — Jinzhou Huang, Director of Data Science at The Home Depot

Women Who Code
5 min read · Jun 7, 2022

Written by WWCode HQ

Women Who Code Podcast — Episode 31 | Spotify | iTunes | Google | YouTube | Podcast Page
Stephanie Rideout, Python Leadership Fellow at Women Who Code, interviews Jinzhou Huang, Director of Data Science at The Home Depot. They discuss work and diversity at The Home Depot and the social responsibilities of a data scientist, and Jinzhou shares advice on advancing and maintaining a successful career.

Share a little bit about your career history?

I started in the humanities, earning my Ph.D. in Sociology at the University of Minnesota. Sociology turned out to be very quantitative, so I decided to pursue a career in statistics and finished a Master’s degree in Biostatistics. Both disciplines helped me recognize my passion, which is to use experimentation to discover the truth. I started at Target, where I did a lot of machine learning and also helped build their experimentation platform. Now I am at The Home Depot, continuing to use machine learning to drive experimentation.

Tell us about working at the Home Depot and the work that you do there.

I joined about 10 years ago. At that time, we did not have a data science practice. Our first data science team picture had about five people; now we have over a hundred. The growth has been tremendous. We focus on applying machine learning to see how we can best help our customers. We also do a lot of data science in our supply chain.

What can you tell us about the company culture and The Home Depot’s commitment to diversity?

I am really happy to see a lot of initiatives at The Home Depot that focus on promoting diversity. We have monthly talks around diversity. A while ago, for example, there was violence against Asians, especially women, in the Atlanta area. When that happened, we hosted multiple talks about respect in the workplace. I feel very supported by those conversations. More importantly, The Home Depot’s hiring represents diverse cultural backgrounds.

What are some advances in the field of data science you are passionate about?

The first one I’m super excited about is reinforcement learning. For example, how does your robot mop learn to figure out where the dirt is and not to run over your cat? That’s reinforcement learning applied to robotics. How does reinforcement learning apply in a retail setting? That’s one question we are actively trying to figure out.

A family of algorithms called multi-armed bandits, including contextual bandits, is where we apply reinforcement learning techniques. At The Home Depot, we have many different versions of algorithms, and finding the right parameters or hyperparameters could take forever. Wouldn’t it be great if there were a bot that could select the optimal algorithm for you?
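The selection problem Jinzhou describes can be sketched with a minimal epsilon-greedy multi-armed bandit. Each "arm" stands in for one candidate algorithm or parameter configuration, and the reward could be a click or conversion signal; the class and names here are illustrative, not The Home Depot's actual system.

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy bandit: explore a random arm with
    probability epsilon, otherwise exploit the best-looking arm."""

    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm, reward):
        # Incrementally update the chosen arm's mean reward.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Toy simulation: three "algorithm versions" with different true success rates.
random.seed(0)
bandit = EpsilonGreedyBandit(3, epsilon=0.1)
true_rates = [0.2, 0.5, 0.8]
for _ in range(2000):
    arm = bandit.select_arm()
    reward = 1.0 if random.random() < true_rates[arm] else 0.0
    bandit.update(arm, reward)
```

After enough rounds, the bandit concentrates its pulls on the arm with the highest observed reward, which is the "bot that selects the optimal algorithm" idea in miniature. A contextual bandit extends this by conditioning the choice on features of the current request.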

We also use what we call the “complete the room” algorithm. From a room scene, we can detect, for example, a sofa, a chair, and a side table. Then we apply a visual-similarity calculation to find similar products in The Home Depot’s inventory. This is how AI is helping The Home Depot’s customers shop for their homes.
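The visual-similarity step described above can be sketched as ranking catalog items by the cosine similarity of their embeddings to the detected object's embedding. The vectors and product IDs below are toy stand-ins; a real pipeline would compute embeddings with a vision model and query an approximate-nearest-neighbor index rather than scanning the whole catalog.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query_embedding, catalog):
    """Return the catalog product whose embedding is closest to the query.

    catalog: dict mapping product_id -> embedding vector.
    """
    return max(catalog, key=lambda pid: cosine_similarity(query_embedding, catalog[pid]))

# Hypothetical 3-dimensional embeddings for three catalog items.
catalog = {
    "sofa_a": [1.0, 0.1, 0.0],
    "chair_b": [0.0, 1.0, 0.2],
    "table_c": [0.1, 0.0, 1.0],
}
# Embedding of a sofa detected in the customer's room scene.
query = [0.9, 0.2, 0.1]
```

Here `most_similar(query, catalog)` would surface the visually closest sofa, which is the recommendation shown to the shopper.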

What are some ways that AI impacts our daily lives, and how can data scientists practice social responsibility?

Having grown up a sociologist, it’s natural for me to take a social-responsibility lens. How does AI directly impact human life? How do we minimize the biases we would potentially carry from our human reality into algorithms, and avoid replicating those biases there?

When you think about how AI impacts human lives, one example is automation replacing manual work. Take our call center associates: we recognize that when customers call, they want to talk to a human, and that our associates are our most critical and important resource. We want to make sure we are using technology to make our associates’ jobs easier, not to replace them.

I think it’s inevitable that some bias will leak into algorithms. We’ve seen companies whose AI-driven hiring turned out to be sexist rather than fair. We have seen AI label people incorrectly because the training data was biased; it becomes racial profiling rather than a fair label. How do you prevent that?

I’m responsible for machine teaching at The Home Depot. We spend a tremendous amount of effort on machine teaching to ensure the machine is not making predictions that reflect bias. Still, machine teaching is probably only a small measure for preventing that. Ideally, all of us, as an industry, should come together and treat this problem more seriously.

What advice would you give female leaders and data scientists to level up in their careers?

It’s critical to manage your personal brand and make that a conscious effort as your career progresses. Identify a few trusted mentors or friends with whom you are very comfortable, and sincerely ask for feedback. Your brand is not who you think you are; expose yourself to how other people see you. Perception is very important.

What is a pro tip you would like to share with our audience?

It is absolutely critical that you never stop learning. Develop your own lifelong curriculum. Just as we want our workplace to be diverse, you need to make sure your sources of information are diverse as well.

Is there anything else you want to tell us?

Communities like Women Who Code are so important. We’re entering the third year of the pandemic, and I think a lot of us are in a state of semi-isolation. I don’t think any of us fully understood the long-term psychological impact this would have on us.


Women Who Code

We are a 501(c)(3) non-profit organization dedicated to inspiring women to excel in technology careers. https://www.womenwhocode.com/