AMD has filed a patent for a high-bandwidth DIMM (HB-DIMM) approach that doubles memory bandwidth by integrating additional chips on the module to multiplex and re-time data streams, rather than by improving the DRAM silicon itself. This technique is especially beneficial for AI and other bandwidth-intensive workloads, though it may increase power consumption and cooling requirements.
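To make the multiplexing idea concrete, here is a minimal sketch, not AMD's patented design: it assumes a hypothetical buffer stage that interleaves beats from two half-rate DRAM pseudo-channels onto a host-facing bus running at twice the per-channel data rate, which is where the bandwidth doubling would come from. All names and the two-channel structure are illustrative assumptions.

```python
# Illustrative sketch only: a buffer interleaving two half-rate DRAM
# pseudo-channels onto one host bus at twice the per-channel data rate.
# The structure is an assumption for illustration, not AMD's design.

from itertools import chain, zip_longest
from typing import List, Optional


def multiplex_pseudo_channels(channel_a: List[int],
                              channel_b: List[int]) -> List[int]:
    """Interleave data beats from two DRAM pseudo-channels.

    Each channel supplies beats at rate R; the combined host-facing
    stream carries beats at 2R.
    """
    interleaved: List[Optional[int]] = list(
        chain.from_iterable(zip_longest(channel_a, channel_b))
    )
    # Drop padding if one channel delivered fewer beats in this burst.
    return [beat for beat in interleaved if beat is not None]


if __name__ == "__main__":
    # Two 4-beat bursts arriving in parallel from the DRAM side...
    burst_a = [0xA0, 0xA1, 0xA2, 0xA3]
    burst_b = [0xB0, 0xB1, 0xB2, 0xB3]
    # ...leave the buffer as one 8-beat burst on the host side.
    print(multiplex_pseudo_channels(burst_a, burst_b))
```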
Apple's latest M3 Pro chip in the new MacBook Pro models has 25% less memory bandwidth than the M1 Pro and M2 Pro chips used in previous generations: the M3 Pro provides 150GB/s, while the M3 Max offers up to 400GB/s. The M3 Pro also changes its core ratios relative to its predecessor, with 6 performance cores, 6 efficiency cores, and an 18-core GPU, alongside a slightly weaker Neural Engine. Despite these cuts, Apple touts the M3 Pro as its fastest and most power-efficient silicon yet, claiming up to 40% faster performance than the M1 Pro. The real-world impact of these changes remains unclear, but Apple's new Dynamic Caching memory-allocation technology aims to optimize memory usage.
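As a quick check on the 25% figure, the sketch below assumes the commonly cited 200GB/s bandwidth for the M1 Pro and M2 Pro; the numbers are stated for illustration, not taken from the article.

```python
# Arithmetic behind the "25% less bandwidth" claim, assuming
# M1 Pro / M2 Pro at 200 GB/s (commonly cited) and M3 Pro at 150 GB/s.
m1_m2_pro_gbps = 200
m3_pro_gbps = 150
reduction = (m1_m2_pro_gbps - m3_pro_gbps) / m1_m2_pro_gbps
print(f"Reduction: {reduction:.0%}")  # -> Reduction: 25%
```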