New Video – PCIe® Technology Enables Hardware Solutions in AI/ML

  • Posted on: 1 March 2024
  • By Matt Burns, Samtec

Designing artificial intelligence (AI) and machine learning (ML) applications presents many unique challenges, and the exuberance around ChatGPT has highlighted the influence that AI can have on day-to-day life.

Software designers, hardware designers, and system designers face the challenge of processing and distributing data with low latency and in small form factors as AI/ML models and datasets grow ever larger. With GPUs, CPUs, field-programmable gate arrays (FPGAs), systems on a chip (SoCs) and more, PCI Express® (PCIe®) technology can provide unique solutions to tie these hardware platforms together.

How PCIe Technology Enables Hardware Solutions for AI

PCI Express® Card Electromechanical (CEM) specification add-in cards (AICs) for high power and M.2 for low power are widely adopted within the industry, making PCIe architecture an excellent solution for AI and ML applications. PCIe technology allows developers to build accelerators regardless of the underlying application-specific integrated circuit (ASIC) technology and to scale system architectures quickly to meet technical needs. When designing for AI and ML applications or verticals, developers can scale by adding more AI accelerators to servers or compute platforms using PCIe CEM AIC form factors.
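As a rough illustration of how a host sees the PCIe accelerators plugged into its slots, the sketch below walks the Linux sysfs PCI tree and prints each device's IDs and negotiated link width and speed. It is a minimal sketch under stated assumptions: a Linux host, and treating PCI class codes 0x03 (display controller) and 0x12 (processing accelerator) as "accelerator" candidates purely for illustration; neither choice comes from the article or from any PCI-SIG specification.

```python
#!/usr/bin/env python3
"""Minimal sketch: enumerate PCIe devices via Linux sysfs.

Assumptions (not from the article): a Linux host, and that class codes
0x03xxxx (display controller) and 0x12xxxx (processing accelerator) are
flagged as likely AI accelerators, for illustration only.
"""
from pathlib import Path

PCI_ROOT = Path("/sys/bus/pci/devices")
ACCEL_CLASS_PREFIXES = ("0x03", "0x12")  # display ctrl / processing accel


def read_attr(dev: Path, attr: str) -> str:
    """Return a sysfs attribute as a stripped string, or '' if absent."""
    try:
        return (dev / attr).read_text().strip()
    except OSError:
        return ""


def main() -> None:
    for dev in sorted(PCI_ROOT.iterdir()):
        dev_class = read_attr(dev, "class")             # e.g. "0x030000"
        vendor = read_attr(dev, "vendor")               # e.g. "0x10de"
        device = read_attr(dev, "device")
        width = read_attr(dev, "current_link_width")    # negotiated lane count
        speed = read_attr(dev, "current_link_speed")    # e.g. "16.0 GT/s PCIe"
        tag = "ACCEL" if dev_class.startswith(ACCEL_CLASS_PREFIXES) else "     "
        print(f"{tag} {dev.name} vendor={vendor} device={device} "
              f"width=x{width or '?'} speed={speed or '?'}")


if __name__ == "__main__":
    main()
```

Running it on a server with several CEM AICs installed shows, at a glance, how many accelerators are present and whether each one trained to the expected lane width and generation.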

Another advantage of PCIe technology is that each new specification release doubles the data rate. Thanks in part to PCI-SIG®'s three-year specification release cadence, AI chipset vendors and AI accelerator developers can count on a clear path for future growth. Additionally, as AI datasets grow and bandwidth increases, power consumption becomes a growing concern. Low-power modes such as L0p in the PCIe 6.2 specification are therefore appealing to AI and ML chipset vendors and application providers looking to reduce power requirements.
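To make the doubling trend concrete, the short sketch below lists the published per-lane raw transfer rates from PCIe 1.x through 6.x and estimates the raw one-direction byte rate of a x16 link. The byte-rate figure is only an approximation: it ignores encoding, FLIT and protocol overhead, and the x16 lane count is simply a typical CEM slot width chosen for illustration.

```python
# Rough illustration of the doubling trend in PCIe per-lane transfer rates.
# Raw GT/s figures are the published per-generation rates; the x16 byte-rate
# estimate ignores encoding/FLIT and protocol overhead, so it is an upper
# bound rather than a measured throughput.

RAW_RATE_GTPS = {          # giga-transfers per second, per lane
    "PCIe 1.x": 2.5,
    "PCIe 2.x": 5.0,
    "PCIe 3.x": 8.0,       # raw rate rose 1.6x here, but the move to
    "PCIe 4.x": 16.0,      # 128b/130b encoding kept usable bandwidth
    "PCIe 5.x": 32.0,      # on the ~2x-per-generation curve
    "PCIe 6.x": 64.0,      # PAM4 signaling, FLIT mode
}

LANES = 16  # a typical CEM add-in-card slot width (assumption for illustration)

for gen, gtps in RAW_RATE_GTPS.items():
    # One transfer moves roughly one raw bit per lane; divide by 8 for bytes.
    approx_gbytes_per_s = gtps * LANES / 8
    print(f"{gen}: {gtps:5.1f} GT/s per lane -> ~{approx_gbytes_per_s:6.1f} GB/s raw (x{LANES}, one direction)")
```

The output shows the familiar progression, ending at roughly 128 GB/s of raw unidirectional bandwidth for a PCIe 6.x x16 link.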

Finally, PCIe technology offers many security features such as Component Measurement and Authentication (CMA), TEE Device Interface Security Protocol (TDISP) and Integrity and Data Encryption (IDE). More information about these security protections is available on our blog.

The Future of PCIe Technology in AI/ML

The widespread reach of PCIe technology adds value not only for existing vendors and suppliers but also for startups that will influence the evolution of AI and ML applications. According to the “PCI Express Market Vertical Opportunity” report from ABI Research, the total addressable market (TAM) for PCIe technology in AI, which includes both edge and data center deployments, is expected to grow from US $449.33 million to US $2.784 billion by 2030, at a compound annual growth rate (CAGR) of 22%. The Edge AI market is likely to grow more rapidly, at a CAGR of 50%, as more enterprise verticals deploy edge servers and AI continues to democratize.
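Readers who want to sanity-check those figures can apply the standard CAGR formula, as in the sketch below. The nine-year horizon (an assumed baseline of roughly 2021 through 2030) is my assumption to make the arithmetic concrete; the ABI Research report defines the exact time span.

```python
# Sanity check of the quoted market figures using the standard CAGR formula:
#   CAGR = (end_value / start_value) ** (1 / years) - 1
# The 9-year horizon (assumed ~2021 baseline through 2030) is an assumption
# for illustration only; the ABI Research report defines the actual span.

start_tam_musd = 449.33     # total addressable market, US$ millions
end_tam_musd = 2_784.0      # US$2.784 billion by 2030, in US$ millions
years = 9                   # assumed horizon

cagr = (end_tam_musd / start_tam_musd) ** (1 / years) - 1
print(f"Implied CAGR over {years} years: {cagr:.1%}")   # roughly 22%
```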

Watch the Video and Join PCI-SIG

PCIe technology will continue to be a strong solution for AI/ML applications thanks to the proliferation of PCIe CEM AICs already in the market, its ever-increasing data rates and its low-power modes. With the AI/ML market forecast to keep growing exponentially, the solutions the PCIe specification provides will only become more prevalent.

To learn more about the role of PCIe technology in AI/ML, watch my PCIe Technology in AI/ML video on the PCI-SIG YouTube channel. To continue reading about the future of PCIe market opportunities, become a PCI-SIG member to access the full ABI Research Report.