Thursday, December 5, 2024

Should You Sell Alphabet (GOOG) Now Amid Google Chrome Worries?

We recently published a list of Top 10 AI Stocks on Investors’ Radar These Days. In this article, we are going to take a look at where Alphabet Inc. (NASDAQ:GOOG) stands against other top AI stocks on investors’ radar these days.

The debate over whether AI systems are hitting a “data wall,” or performance plateau, is heating up in the tech industry. Many argue that AI models are no longer showing meaningful improvement because high-quality training data is running short, creating scaling problems across the industry.

CNBC’s Deirdre Bosa recently discussed this debate in a program and said:

“All it feels like anyone is talking about right now in tech is this debate over scaling laws and a data wall, which continues to rage in Silicon Valley. This is the idea that more data and bigger models will always lead to better AI, with some arguing that progress has peaked or is starting to plateau. Put another way, it’s a debate over a core assumption in AI that could have massive implications for the industry, from valuations to the GPUs powering it, and of course, the Nvidia story. I was at the Newcomer AI conference yesterday here in San Francisco. It was the theme of the day, with everyone from Scale AI’s Alexander Wang to Anthropic’s Dario Amodei to Databricks’ Ali Ghodsi weighing in.”

Nvidia CEO Jensen Huang was also asked about AI systems hitting a data wall and potential scaling issues on a recent earnings call. Here is what he said:

“A foundation model pre-training scaling is intact and it’s continuing. As you know, this is an empirical law, not a fundamental physical law, but the evidence is that it continues to scale. What we’re learning, however, is that it’s not enough that we’ve now discovered two other ways to scale. One is post-training scaling. Of course, the first generation of post-training was reinforcement learning human feedback, but now we have reinforcement learning AI feedback and all forms of synthetic data generated data that assists in post-training scaling.”

Bosa also noted that other tech leaders are pushing back against the idea of AI hitting a data wall, saying:

“…also acknowledge that this alone isn’t enough to push AI further, and progress will come from post-training scaling, which is the development of AI applications on top of existing models. He and others say this will still require massive amounts of compute power.”
