Ben Clark, chief architect at Wayfair LLC, refers to himself as a technology generalist. Seven years ago, he started the data science program at the Boston-based e-commerce company -- a year before data scientist was dubbed the sexiest job of the 21st century. His brief now as Wayfair's chief architect: enable the 60-person data science team by making data and data science tools as accessible as possible.
That team is experimenting with so-called black box testing techniques in an effort to make artificial intelligence an integral part of operations. Black box testing is currently a foundational technique to advance self-learning algorithms and neural networks, but it can leave CIOs -- and enterprises -- in the dark on how to support such initiatives.
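To make the idea concrete: black box testing treats a system -- here, a predictive model -- as opaque, exercising only its public interface and checking behavioral properties of its outputs, never its internals. The sketch below is illustrative only; the `predict` function is a hypothetical stand-in, not Wayfair's code.

```python
# Minimal black box testing sketch: the tests call only predict()
# and check observable properties, never the model's internals.

def predict(features):
    # Hypothetical stand-in for an opaque model (e.g., a trained
    # neural network loaded from disk). Here: a toy scoring rule.
    score = 0.4 * features["recency"] + 0.6 * features["frequency"]
    return min(max(score, 0.0), 1.0)

def test_output_in_valid_range():
    # Boundary and interior inputs should all yield a valid score.
    for recency, frequency in [(0.0, 0.0), (1.0, 1.0), (0.2, 0.9)]:
        score = predict({"recency": recency, "frequency": frequency})
        assert 0.0 <= score <= 1.0

def test_monotonic_in_frequency():
    # Property-style check: a more frequent customer should never
    # score lower than a less frequent one, holding recency fixed.
    low = predict({"recency": 0.5, "frequency": 0.2})
    high = predict({"recency": 0.5, "frequency": 0.8})
    assert high >= low

test_output_in_valid_range()
test_monotonic_in_frequency()
print("all black box checks passed")
```

The same input-output discipline applies whether the box is a toy rule or a deep neural network, which is what makes the technique useful when the model's internals resist inspection.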
I asked Clark if he could share some advice on how CIOs can support AI endeavors, and he encouraged them to become familiar with black box testing techniques. Here's his response in its entirety.
As companies start to experiment with AI, what tips can you give CIOs on how they can support -- and maybe even challenge -- data scientists?
Ben Clark: With any new technology or any new technique or family of techniques, which is what we're really talking about here, we take a kind of test-and-learn, lean startup [approach], if you will, even though Wayfair is not a startup anymore. But wherever uncertainty is large -- which describes a big part of what we do -- I think this set of principles applies. So, speed to market for testing your ideas is as important in this area as in any other.
The thing that makes AI techniques tricky is that, as in so many other areas of technology, technology executives often come up through their own deep background with some knowledge of how things work. And if you yourself don't have a quant background, you can't really challenge the things being asked of you on merit.
So, how do you get around that? If you are taking a kind of test-and-learn approach, you have to become savvy with black box testing. You can say, 'What do you need in order to prove that this is valuable? How can you show that this is a thing worth creating more resources around -- or more investment or more this or more that?'
And I think data scientists will respond to that. They will say, 'Well, I need to run an experiment.' And then CIOs can challenge them to say how small an experiment they can run that will move the needle on something or show some value in a way that everyone can understand -- even those who aren't conversant with the techniques.
Let them participate in that process and prove to you that what they're doing is not a kind of ivory tower-like experiment that will take a long time to bear any kind of reasonable fruit.
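Clark's "smallest experiment that can show value" framing is, in practice, often an A/B test read out with a standard significance check. As a minimal sketch -- with made-up traffic numbers, not Wayfair's data -- a two-proportion z-test is enough to tell whether a small variant beat its control:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical small experiment: 5,000 sessions per arm,
# 5% conversion on control vs. 6% on the variant.
z = two_proportion_z(conv_a=250, n_a=5000, conv_b=300, n_b=5000)
print(round(z, 2))  # |z| > 1.96 is significant at the 5% level
```

A result everyone can read off one number is exactly the kind of small, legible proof of value the interview describes -- no knowledge of the model's internals required.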