While JPMorgan Chase announced this week that it is restricting employee use of the ChatGPT AI-powered chatbot, other corporates are starting to experiment with generative AI, with an eye to developing new business models, according to a new report from BCG Henderson Institute, Boston Consulting Group’s think tank.
The think tank is advising companies to experiment with use cases that go beyond short-term productivity gains.
Some corporates have assigned special project status to the exploration of generative AI systems, with an innovation sandbox as a first step, says the think tank. It also sees companies creating cross-functional teams that span design, marketing, business, and technical functions to spark creative ideas.
Some companies are developing custom foundation model applications by fine-tuning them with proprietary data. For example, consulting firms are experimenting with their proprietary data to centralize and democratize their extensive but distributed knowledge bases. This both ensures that knowledge remains in the company despite turnover and allows employees to access expert information without overwhelming specific individuals with requests.
The concept of storing knowledge in an interactive chatbot like ChatGPT will also have a dramatic impact on industries such as mining, a sector that is struggling to attract new engineers at a time when many experts are retiring, predicts the think tank. Jasper.ai is another example of a specialized application for marketing content generation, built by fine-tuning with proprietary marketing data. Marketing and sales teams can use these tools to quickly create high-quality content. Because Microsoft plans to enable customization of ChatGPT, such applications will become easier to create. But companies also need to look beyond ChatGPT; many other large language models, such as those from Anthropic as well as BERT and Macaw, also have useful capabilities, says the think tank.
Beyond existing proprietary data, companies should investigate ways to digitize specialized data for use with foundation models, says the report. In December 2022, NVIDIA unveiled BioNeMo, a foundation model designed to support molecular data representations, chemical structures, and more. Use of BioNeMo is expected to significantly reduce the drug discovery timeline. In fact, in January 2023, ProFluent demonstrated how it used a foundation model trained on 280 million protein sequences to create previously undiscovered enzymes, some of which had antimicrobial properties.
In material science, scientists have gone beyond traditional approaches to design (searching through known material structures to find the optimal properties for a given use case) and implemented the concept of inverse design, which focuses on property-to-structure rather than the traditional structure-to-property approach. The desired properties are specified first, and the foundation model is then used to generate chemical compounds that would have those properties. While this approach has primarily been used in material science to date, with applications in aerospace, medicine, electronics, and more, other industries might learn from the idea of inverting design spaces by questioning assumptions about what can be designed versus determined, says the think tank's report.
Examples such as these allow companies to benefit directly from the financial, engineering, and R&D investments that went into training foundation models, such as those underpinning apps like DALL-E 2, ChatGPT, Stable Diffusion, and Midjourney, among others, says the think tank. In so doing, companies can remain competitive without making significant investments in new models, opening up fundamentally new uses and sources of value ahead of the competition.
“In a world where automated systems are increasingly being used to provide basic capabilities, company leaders should be innovative and strategic in their use of Generative AI,” concludes the report. “They also need to keep in mind that the ultimate outcome of experimentation is business model innovation. Competitive advantage awaits those that go beyond simple tweaking—and that identify business models not previously possible.”
IN OTHER NEWS THIS WEEK
Google Claims Breakthrough In Quantum Computer Error Correction
Google has claimed a breakthrough in correcting for the errors that are inherent in today’s quantum computers, marking an early but potentially significant step in overcoming the biggest technical barrier to a revolutionary new form of computing. The Internet company’s findings, which have been published in the journal Nature, mark a “milestone on our journey to build a useful quantum computer”, said Hartmut Neven, head of Google’s quantum efforts.
Jaguar Land Rover To Open New Autonomous Vehicle Engineering Hubs
Luxury carmaker Jaguar Land Rover said it is opening three new engineering hubs in Europe to develop autonomous vehicle technologies as part of its partnership with Silicon Valley artificial intelligence company Nvidia. The hubs in Munich, Bologna and Madrid will develop self-driving systems for JLR’s next generation of luxury vehicles. JLR already has six global tech hubs in the United States, China and Europe.
Energy Company Announces World First As Its Tidal Power Project Hits Major Milestone
In what is being hailed as a significant milestone in delivering tidal stream power at scale, Edinburgh-headquartered firm SAE Renewables said its project had achieved a world first by producing 50 gigawatt hours of electricity. Located in waters north of mainland Scotland, SAE Renewables’ MeyGen array is made up of four 1.5 megawatt turbines and has a total capacity of 6 MW when fully operational. Currently, three turbines are in operation.