Wayfair's computer vision technology gets your style

At Wayfair, the digital and physical worlds continue to converge thanks to computer vision and mixed reality tech, enabling customers to get a better feel for products before they buy.

Wayfair LLC is taking some of the guesswork out of the home decorating process. The Boston-based retailer is developing applications that can decipher the look of a room and make style-appropriate product recommendations, as well as help customers visualize exactly what a rug will look like in their hallway.

In this Q&A, John Kim, vice president and global head of algorithms and analytics at Wayfair, says efforts like these, which utilize computer vision technology and mixed reality, help the company bridge the gap between the digital and brick-and-mortar worlds. Kim, a self-professed business person who oversees a team of 400 employees and reports to the chief merchandising officer, described his organization as data-oriented and data-intensive.

"We do everything from making sure that data is accessible to the rest of the company through our business intelligence teams, which arm all other decision-makers with data they need to analyze, to inform, to monitor their progress, as well as to inform our models and our algorithms, and then, even further downstream, to make decisions on how to manage margin or revenue growth and the like," he said.

Let's talk about computer vision technology. What's the most exciting project your team is working on?

John Kim: One application of computer vision that I'm most excited about is our ability to decipher the style of a room and then offer up products that would complement and support the style of that room.

[Photo: John Kim, vice president and global head of algorithms and analytics, Wayfair]

That would require the ability not only to understand the style of every single product, but also to decipher the style of a room, which is not just the sum of the individual product styles but also takes into account the combined effect of those products.
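One way to read that "combined effect" point in rough machine learning terms is that a room-level model has to aggregate per-product signals rather than simply average them. The sketch below is an illustrative assumption, not Wayfair's actual architecture: it pools hypothetical per-product style embeddings with learned attention weights before scoring the room against a made-up set of style labels, using PyTorch.

```python
# Hypothetical sketch (not Wayfair's model): pool per-product style embeddings
# into a room-level vector so the prediction reflects the combination of
# products, not just an average of their individual styles.
import torch
import torch.nn as nn

STYLES = ["rustic", "contemporary", "midcentury", "modern"]  # illustrative labels

class RoomStyleClassifier(nn.Module):
    def __init__(self, embed_dim=128, num_styles=len(STYLES)):
        super().__init__()
        # Attention weights let certain products (a statement sofa, say)
        # influence the room-level style more than others.
        self.attention = nn.Linear(embed_dim, 1)
        self.head = nn.Linear(embed_dim, num_styles)

    def forward(self, product_embeddings):  # shape: (num_products, embed_dim)
        weights = torch.softmax(self.attention(product_embeddings), dim=0)
        room_vector = (weights * product_embeddings).sum(dim=0)  # combined effect
        return self.head(room_vector)  # room-style logits

# Usage: embeddings for three products detected in a room photo.
products = torch.randn(3, 128)
logits = RoomStyleClassifier()(products)
print(dict(zip(STYLES, torch.softmax(logits, dim=-1).tolist())))
```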

What's the big computer vision technology challenge in delivering an app like that?

Kim: A lot of these AI- or machine learning-based models are highly dependent on the training sets. And so, in our case, what that requires is a ton of tagged room images -- someone saying this room is rustic or this room is contemporary or midcentury or whatever it may be. Developing that training set is a huge barrier in being able to do this kind of work.

The other is the ability to determine the style of a product. An end table, for example, could be categorized as both modern and contemporary. And so learning how to create those distinctions or gradients is something that could be challenging.
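Both challenges -- the need for a large set of tagged images and the fact that a single product can carry more than one style label -- map naturally onto multi-label image classification. The following sketch is a hypothetical illustration of that framing, not Wayfair's system: it fine-tunes a pretrained backbone with an independent yes/no score per style, using PyTorch/torchvision and invented labels.

```python
# Hypothetical sketch: because an end table can be both "modern" and
# "contemporary", style prediction is framed here as multi-label
# classification with independent per-style scores rather than one
# forced choice. Labels, data and model are illustrative only.
import torch
import torch.nn as nn
from torchvision import models

STYLES = ["rustic", "modern", "contemporary", "midcentury"]

# Fine-tune a pretrained image backbone on the tagged images.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, len(STYLES))

# BCE-with-logits treats each style as its own yes/no question, which is
# what allows gradients between styles instead of mutually exclusive classes.
criterion = nn.BCEWithLogitsLoss()

image_batch = torch.randn(8, 3, 224, 224)               # stand-in tagged images
labels = torch.randint(0, 2, (8, len(STYLES))).float()  # multi-hot style tags
loss = criterion(backbone(image_batch), labels)
loss.backward()
```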

Labeled data is a common pain point for computer vision algorithms. Where does the labeled data come from at Wayfair?

Kim: We have a combination of efforts going on. One is we do have actual people -- we have stylists. We're also taking advantage of the fact that we have certain features on our site where we're getting user input and data. And because we have the largest collection of 3D images of our products being tagged, we have that library of tagged images, as well.

So much of the data also comes from visitors to the site. For instance, when visitors look at 10 different bar stools, you can start to see patterns -- when those bar stools are viewed together not just by one visitor but by many visitors, we might label those bar stools as having a similar style. And so leveraging the 11 million customers we have has been enormously powerful.
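The co-view signal Kim describes can be approximated, in very simplified form, by counting how often pairs of products are viewed together across many visitors and treating frequently co-viewed pairs as candidates for a shared style label. The snippet below is an illustrative sketch with made-up session data and an arbitrary threshold, not Wayfair's pipeline.

```python
# Hypothetical sketch of the co-view idea: if many different visitors view the
# same pair of bar stools, treat that as weak evidence the items share a style.
from collections import Counter
from itertools import combinations

# Each inner list is the set of bar stools one visitor viewed in a session.
sessions = [
    ["stool_a", "stool_b", "stool_c"],
    ["stool_a", "stool_b"],
    ["stool_b", "stool_c"],
    ["stool_a", "stool_b", "stool_d"],
]

co_views = Counter()
for viewed in sessions:
    for pair in combinations(sorted(set(viewed)), 2):
        co_views[pair] += 1

# Pairs co-viewed by many distinct visitors become candidate "similar style" labels.
MIN_VISITORS = 2  # threshold is an assumption; tuned in practice
similar_style_pairs = [pair for pair, n in co_views.items() if n >= MIN_VISITORS]
print(similar_style_pairs)  # [('stool_a', 'stool_b'), ('stool_b', 'stool_c')]
```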

You've also been experimenting with mixed reality, a kind of hybrid of augmented and virtual reality. What's the benefit of using mixed reality to buy, say, a bed?

Kim: We have a shopping app that allows you to take furniture from our site and plop it into your room. Because it's mixed reality, you're able to walk up close to the bed, look underneath it, walk around it, manipulate it and see it in the context of your room. And if you want to buy it, you can buy it. And if you want to swap it out for a different bed, you can easily do so.

The other app is for when you're trying to style a room. You might have an empty room or an existing room, but if you want to try out some different concepts -- what would happen if I swapped out this rug for a different rug? -- you can. And you can start to develop a roomscape of sorts that allows you to think about how to decorate your home.

Wayfair is a digital native with no brick-and-mortar locations -- at least not yet. How does that influence your decision to experiment with mixed reality or embark on a computer vision project?

Kim: Our lack of a brick-and-mortar presence has forced us to think creatively about how to bridge the divide between brick-and-mortar and digital.

You can imagine that, for many people who want to buy furniture online, one of the big challenges is that they can't touch or feel the furniture they're buying. We've had to come up with ways to surmount that barrier, such as sending customers fabric samples before they commit to buying and making it easy to return products.

In addition to that, we also want to play to our strengths. In the digital world, we can allow people to visualize these products in their homes in ways that are better than going to an actual store. You can visit a Macy's or an Ikea or wherever it may be, but it's not your home. It might look great in the store, but will it look great in your home? It's hard to picture that without some of the tools we're developing here.
