DigitalBridge's Ganzi on AI Infrastructure Outlook

Digital infrastructure is the foundation of AI, cloud computing, and mobile networks, enabling the digital economy. Ten years ago the focus was on building LTE networks; today the emphasis is on edge computing and AI applications, both of which require robust infrastructure. The rapid evolution of AI, including the challenges of building large language models and the data centers that support them, parallels the earlier growth of public cloud services. Investment in AI infrastructure is projected to continue for the next decade, underscoring the necessity of reliable connectivity for technology and business operations.

Today’s focus includes enabling cloud applications and edge computing for various sectors.

AI development is still in its early stages, underscoring the need for continued infrastructure growth.

Edge computing is critical, moving workloads to tiered markets for improved performance.

AI Expert Commentary about this Video

AI Infrastructure Expert

The focus on edge computing is crucial as organizations increasingly demand low-latency solutions to support real-time AI applications. With AI's rapid evolution, investment in infrastructure to support inference capabilities will be paramount. As suppliers like NVIDIA struggle to meet GPU demand, businesses must innovate in how they acquire compute resources to ensure they meet growing computational needs.

AI Market Analyst Expert

The trends discussed illustrate a significant investment trajectory within AI infrastructure, particularly as organizations seek to leverage edge computing. The anticipated ten-year investment cycle reflects a strategic shift to meet scaling demands. Companies like Microsoft are at the forefront of cloud innovations, driving the urgency to secure reliable components in an increasingly competitive landscape.

Key AI Terms Mentioned in this Video

Edge Computing

The discussion emphasizes building edge data centers to enable applications and connectivity nearer to users.

Large Language Models

Their development relies on substantial data center infrastructure, which is highlighted as a current challenge.

Inference

The conversation notes the significance of moving inference capabilities to edge environments for lower latency.

Companies Mentioned in this Video

NVIDIA

The company's supply chain challenges for GPUs highlight the operational bottlenecks impacting AI deployment in data centers.

Mentions: 2

Corning

The company's role in supplying fiber for connectivity is crucial amid rising demand for digital infrastructure.

Mentions: 3

Microsoft

Its cloud infrastructure supports the robust AI applications discussed in the context of resource management and operational capabilities.

Mentions: 2

