The demand for digital tools is booming, fuelled by customers who have flocked to digital solutions amid global lockdowns and the acceleration of technology transformation in the era of COVID-19. Consequently, the number of online transactions and, ultimately, the volume of data available has exponentially grown.
McKinsey’s Global Survey of executives points to this trend. The study found that companies’ digitization of customer and supply-chain interactions has accelerated by three to four years, with the share of digital products in firms’ portfolios advancing by seven years amid the COVID-19 pandemic.
With all this digitization comes a lot of data—and with all that data comes opportunity. But that should come with a sense of responsibility, as well.
Companies can dive deeper into understanding their customers, predicting needs and wants to delight customers in new ways. We’ve entered a data-rich world, feeding decision-making models and advanced analytics tools with more data than ever before. But with all this data and the quest to be more data driven, we must not underestimate the heightened responsibility to ethically use both the technologies and the information that fuels them.
At the root of this thinking is trust. Trust, simply put, is about fairness, reliability, and transparency. Take financial institutions, for example. Banking is built on the hard-earned trust customers place in their bank to handle their money. When we think about customer data, it’s really about the trust customers place in organizations to safeguard their data and use it for their good. It’s about asking the question: “Would the customer be OK with you using the data this way?”
As we hit this inflection point, where digital activity is at an all-time high, organizations need to take a hard look at the ethical considerations that stem from the broad collection, analysis, and use of data, especially when that data is informing artificial intelligence and advanced analytics tools. To preserve customer trust, organizations must be thoughtful about using customers’ data transparently and for their good. Just because the data is available doesn’t mean it should be used.
The challenge, as you may have guessed, is that data ethics don’t just come off the shelf. Organizations are tasked with building the culture that supports ethical action. It’s not about optics but about outcomes, which can only truly be delivered through accountability and investment. When an organization takes ownership through accountability, it begins to foster both the spirit of the rules and the culture that underpins them.
As a start, some organizations have established guiding principles for the ethical use of data and technology, which act as guideposts for building the culture around it. Scotiabank, for example, has defined six guiding principles for its ethical use of data and artificial intelligence. Coupled with certification programs for employees and more formal governance practices, the Bank is prioritizing the ethical use of data and technology as more and more customers migrate to digital channels and the Bank becomes even more data driven.
Operationalizing data and AI ethics is no easy task. It requires commitment from senior leaders and collaboration across the organization. Those that succeed in making the commitment and investment will not only minimize reputational risk but also build on the trust of their customers, enabling them to continuously improve delivery and offer enhanced capabilities. With greater technology efficiencies and clearer insight into customer needs, trust will not only be preserved but cemented in the digital era.