Laurence Morvan is a member of Accenture’s global management committee. As chief of staff to Accenture’s CEO, she is responsible for ensuring that the CEO’s agenda is effectively represented in Accenture’s core initiatives, leadership forums, and internal and external engagements. As the company’s Chief Corporate Social Responsibility Officer, she has management oversight of Accenture’s corporate citizenship initiatives, which include programs around skilling for the digital age and innovating for society. She also represents Accenture in external forums such as G20YEA, an international forum that develops business policy recommendations for the G20 heads of state.
Prior to joining Accenture in 1994, Morvan worked in investment banking on mergers and acquisitions and corporate finance transactions. Morvan, who teaches Strategic Human Resources Management at Sciences Po, a leading French university, holds a master’s degree in business administration from the Wharton School of the University of Pennsylvania and a Master of Science degree in management from ESSEC Business School (École Supérieure des Sciences Économiques et Commerciales) in Paris. She recently spoke to The Innovator about how companies should approach ethical issues surrounding the uses of technology and data.
Q: How should the C-suite and company boards approach ethical issues surrounding the uses of technology and data?
LM: We are experiencing exponential change. The speed of technology adoption and the opportunities it creates are unprecedented, so the question is: how can companies consciously, responsibly and rightfully leverage technology for good? While experimenting with new business models and new technologies, companies need to think more proactively about potential consequences, not only in terms of growth and business efficiency but also in terms of sustainability and trust.
Q: There is a lot of talk these days about unintended consequences, i.e., products being used in ways that are unpredictable and sometimes dangerous. This is why Salesforce said it has hired its first Chief Ethical and Humane Use Officer. Is this something that all companies should think about doing?
LM: Yes, all companies should be thinking about this. We are very much applying the same approach within Accenture. We have organized a multi-disciplinary approach that includes our legal team, our business leaders, in particular in the area of data and AI, and also myself as Corporate Social Responsibility Officer, and we have put together a proactive way of thinking about the use cases we see that could have potential unintended consequences. We are looking at how we can equip people on the ground in this area, because it is never black and white. We are currently trying to identify what our role as a professional services company should be in making sure that the projects we are involved in are structured in the best way. It involves balancing opportunities and risks, but this is not just about risk management. It is about doing things in ways that clearly articulate the purpose, for instance “innovating for impact”: using tech to solve some of the biggest societal challenges and designing solutions responsibly.
Q: Trust was a big theme at the World Economic Forum’s annual meeting in Davos this year. There is a lot of fear of new technologies and of their misuse. Accenture issued a report in Davos showing that, globally, $3.1 trillion of future revenue growth is at stake for large companies, depending on how their workforce data strategies affect employee trust. The report says companies that put in place responsible data strategies could see revenue growth up to 12.5% higher than that of companies that fail to do so. What guidelines do you suggest companies follow?
LM: The report we released in Davos was very specifically about the use of employee data. Our report, “Decoding Organizational DNA,” is based on qualitative and quantitative research, including global surveys of 1,400 C-level executives and 10,000 workers across 13 industries. Among the key findings: while more than six in 10 C-level executives said that their organizations are using new technologies to collect data on their people and their work to gain more actionable insights, from the quality of work and the way people collaborate to their safety and well-being, fewer than one-third are very confident that they are using the data responsibly. In addition, more than half of workers think that the use of new sources of workforce data risks damaging trust, and 64% said that recent scandals over the misuse of data make them concerned that their employee data might be at risk too. The question is how employers can leverage the wealth of employee data in a way that is thoughtful and creates the right trust with employees. Of course, the research shows there are many opportunities to enhance and manage the workforce and create value-added services for employees. Many employees would be willing to give their data if they could understand how it will be used and know that they will benefit personally, but trust has to be established. We have three areas of recommendation for companies seeking to establish that trust. The first is to ensure that employees know how their data is being used and to allow them to access and own their data. The second is to share responsibility with employees by involving them in the design of workforce data systems, in order to create benefits for all; typically, companies’ employee-related policies are not formed from the start with the employees in mind, so many companies will have to rethink their approaches. The third is to craft processes that ensure machines and humans collaborate in the best way: to have as a principle that technology should be used to elevate people’s capabilities, and, when some jobs risk disappearing, to make sure that all of the consequences are taken into account, for instance through reskilling.
Q: Should policies also be put in place to prevent artificial intelligence from reinforcing existing biases in areas such as hiring?
LM: The people in charge of our AI practices have come up with tools and techniques to help companies be more proactive about the impact of algorithms, and bias is one of those areas. Some companies found that when they tried to use technology to scan résumés to select candidates, the machines were reproducing biases because the programs were trained on historical data that contained biases. Accenture has developed a tool, with the help of the Alan Turing Institute, to cross-check outcomes for bias. This is just one example of how we need to continuously make sure there is explainability and transparency when we develop algorithms, in order to gain and maintain users’ trust. Trust comes from transparency, so we are making sure that companies build steps to explain outcomes into the development process. There needs to be constant innovation, but there also need to be safeguards along the way. If you are too risk-management centric you can end up stifling innovation, so there needs to be a balance between opportunities and challenges. At the moment there are many innovation projects, such as automation, that yield efficiency gains, but they need to be human-centric and address questions such as “how do they help humans focus on value-added tasks?” and “how will we reskill people?” There needs to be a dialogue that considers all of these dimensions. That is the approach we guide our clients to take in order to be proactive about embedding responsibility into innovation.
Q: How should we define business responsibility in the digital age?
LM: [French President Emmanuel] Macron has spoken a lot about tech for good, and many companies, in France in particular, are thinking about the role they have to play in the broader community, asking themselves: what is the role of corporations in society at large? Business responsibility in the digital age includes reskilling and clarity of intent with workers. I see two trends. Companies are stepping up to use tech for good and to clarify how technology helps achieve their mission, and they are looking at how to achieve their digital transformation in a responsible way, e.g. by embedding key considerations around data privacy, cybersecurity and reskilling into their core activities.