06 Nov 2023

Identifying upstream enablers in the AI revolution

By Aanand Venkatramanan, Aude Martin

The cycle of more data leading to more sophisticated AI models means we expect the rate of change to continue to take many by surprise.


The following is an extract from our latest CIO Outlook.

Data is the fuel that fires the knowledge economy. This simple axiom explains AI’s potential to increase productivity across virtually all industries, with a commensurately vast impact on the broader economy.

Goldman Sachs estimates generative AI could increase global GDP by 7% over the coming decade, while PwC believes AI could contribute up to $15.7 trillion to the world economy in 2030.

In the here and now, a 2023 academic paper analysed the impact of generative AI on more than 5,000 real-life customer support agents. It found a 14% increase in productivity as measured by issues resolved per hour, and improved customer satisfaction levels.

For investors, those companies that are upstream enablers of AI are understandably the focus of attention. How can we conceptually frame the AI revolution and identify them?

A symbiotic relationship

As AI models have become increasingly sophisticated and datasets have grown, the relationship between the two has become symbiotic.

Just as humans draw on a vast range of experiences to formulate responses, AI systems need a vast training ground to be able to learn. These training grounds, meanwhile, are so large that they are only navigable with AI.

The symbiotic relationship between data and AI is increasingly gaining recognition;1 we believe it could provide a useful framework to identify the most important segments in this field.

Compute: the power behind AI

The computational intensity of today’s AI systems – according to some sources, GPT-4 has around 1.7 trillion parameters – means the AI story is unavoidably a hardware story. Today, the task of processing this data falls to graphics processing units (GPUs) and Google’s2 tensor processing units (TPUs), both of which are optimised for machine learning applications.

Looking further ahead, the level of computing power needed to train AI systems is rising exponentially, meaning we will run up against the limits of today’s computers.

This problem has spurred increased investment into quantum computers, whose fundamentally different architecture could herald a new era for AI computation.

[Chart: AI model performance on knowledge tests relative to the training computation used]

Performance on knowledge tests is measured using the MMLU benchmark.3 Training computation is measured in total petaFLOP; one petaFLOP is 10 to the power of 15 floating-point operations4

Big data: separating signal from noise

Big data refers to datasets that cannot be captured, curated or processed using conventional computational approaches in a tolerable timeframe.

The challenges posed by big data are best encapsulated by the ‘three Vs’:

  1. Volume: The sheer quantity of data being created poses challenges for storage
  2. Velocity: The rate at which new information is created is increasing rapidly
  3. Variety: Data is being created in an increasing range of formats

Exponential rate of change

The self-perpetuating cycle of more data leading to more sophisticated AI models that can make better use of data means we expect the rate of change to take many by surprise.

As the volume, velocity and variety of data grows year on year, its potential utility continues to increase.

Companies that can address these challenges – as well as the additional vectors of veracity and value – will, in our view, play an essential part in the evolution of big data and AI.

Appendix: AI subsectors

Enablers and developers of AI systems and capabilities, including companies at the forefront of the AI ecosystem:

  • Big data/analytics: Companies providing solutions to the growing scale and scope of data
  • Cloud providers: Public and private clouds offer a solution for storing the vast amounts of data needed for AI development
  • Semiconductors: A new generation of chip architectures targets the highly parallel processing needed for machine learning applications
  • Network and security: As the value of data continues to increase, sophisticated solutions are needed to protect it from falling into the hands of cybercriminals
  • Cognitive computing: Computing platforms that replicate some of the functionality of the human mind, encompassing machine learning, natural language processing and narrative generation

The above is an extract from our latest CIO Outlook.


1. See, for example, the EU’s comments on the importance of open data for regional AI development:

2. For illustrative purposes only. Reference to this and any other security is on a historical basis and does not mean that the security is currently held or will be held within an LGIM portfolio. Such references do not constitute a recommendation to buy or sell any security

3. MMLU benchmark: The Massive Multitask Language Understanding (MMLU) benchmark mimics a multiple-choice knowledge quiz designed to gauge how proficiently AI systems can comprehend various topics such as history, science or psychology. It has 57 different sections, each covering a particular subject, and 15,908 questions in total, split into smaller sets with at least 100 questions per subject. The questions come from many sources, such as practice tests for major exams and university course material, and their difficulty varies: some are as easy as elementary school level, while others are as hard as what professionals in a field might know. The scores achieved by humans on this test depend largely on their level of expertise in the subject matter. Individuals who are not specialists in a given area typically achieve a correctness rate of around 34.5%, while those with a deep understanding and proficiency in their field, such as doctors sitting a medical examination, can attain a score of up to 89.8%
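For readers curious about how a multiple-choice benchmark of this kind is typically scored, the following is a minimal sketch in Python. The subjects, answers and the mmlu_style_score helper are hypothetical and purely illustrative, not taken from the official MMLU code: a model's chosen options are compared with the correct ones, accuracy is computed per subject, and a headline figure is taken as a simple average across subjects (one common way of reporting such scores).

# Minimal, hypothetical sketch of scoring an MMLU-style multiple-choice benchmark.
from collections import defaultdict

def mmlu_style_score(examples):
    """examples: list of (subject, correct_option, model_option) tuples."""
    per_subject = defaultdict(lambda: [0, 0])  # subject -> [correct count, total count]
    for subject, correct, predicted in examples:
        per_subject[subject][1] += 1
        if predicted == correct:
            per_subject[subject][0] += 1
    subject_accuracy = {s: c / t for s, (c, t) in per_subject.items()}
    # Headline score: simple average of per-subject accuracies
    overall = sum(subject_accuracy.values()) / len(subject_accuracy)
    return subject_accuracy, overall

# Illustrative data only
examples = [
    ("history", "B", "B"),
    ("history", "C", "A"),
    ("medicine", "D", "D"),
    ("medicine", "A", "A"),
]
per_subject, overall = mmlu_style_score(examples)
print(per_subject)  # {'history': 0.5, 'medicine': 1.0}
print(overall)      # 0.75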

4. Floating-point operation: A floating-point operation (FLOP) is a type of computer operation. One FLOP is equivalent to one addition, subtraction, multiplication, or division of two decimal numbers
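To put the petaFLOP unit used in the chart above into context with a purely hypothetical figure: since 1 petaFLOP = 10^15 FLOP, a model trained using a total of 10^23 floating-point operations would correspond to 10^23 ÷ 10^15 = 10^8, or 100 million, petaFLOP.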

Aanand Venkatramanan

Head of ETFs, EMEA

Aanand leads the development and growth of the ETF business. He joined the investment manager from ETF Securities after the successful acquisition of the Canvas ETF business, which completed in March 2018. He joined ETF Securities as a Director, Quantitative Investment Strategies, in May 2017. Prior to that, he worked at Barclays Capital and Goldman Sachs International as a vice president within their index research and structuring groups respectively, and at the University of Sussex as an assistant professor in finance. He has published papers in top academic journals and co-authored book chapters. Aanand holds a PhD in Mathematical Finance and a Master’s in Applied Mathematics from the University of Reading.


Aude Martin

ETF Investment Specialist

Aude joined L&G ETF in July 2019 as a cross-asset ETF Investment Specialist. Prior to that, Aude worked as a delta one trader at Goldman Sachs and within the structured-products sales teams at HSBC and Credit Agricole CIB. As an investment specialist, she contributes towards the design of investment strategies and actively supports the ETF distribution and marketing efforts. She graduated from EDHEC Business School in 2016 with an MSc in Financial Markets.
