In an industry first, SK hynix has announced its 16-Hi HBM3E memory, offering capacities of 48GB per stack alongside other ...
SK Hynix plans to reduce legacy DRAM to 20% of its production by the fourth quarter of 2024, responding to increased supply and ...
SK hynix unveils the industry's first 16-Hi HBM3E memory, offering up to 48GB per stack for AI GPUs with even more AI memory in the future.
TL;DR: NVIDIA CEO Jensen Huang has asked SK hynix to bring forward by six months the supply of its next-generation HBM4 memory, which was originally planned for the second half of 2025. NVIDIA currently uses SK ...
SK Hynix has strengthened its collaboration with TSMC, designating the foundry to manufacture its next-generation HBM4 logic die. The company's development of high-capacity Compute Express Link (CXL) ...
SK hynix, the world's second-largest memory chip maker, is racing to meet explosive demand for the high-bandwidth memory (HBM) chips that are used to process vast amounts of data to train AI, including demand from Nvidia ...
At the SK AI Summit 2024, SK hynix CEO Kwak Noh-Jung unveiled the world's first 16-high 48GB HBM3E memory solution, pushing AI memory capabilities to unprecedented levels. The advanced HBM3E solution ...
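As a quick sanity check on the headline figure, the sketch below works out the per-die capacity implied by a 16-high, 48GB stack, assuming the capacity is split evenly across the stacked DRAM dies; the even-split assumption and the helper name are illustrative, not details from the announcement.

```python
# Minimal sketch: per-die capacity implied by a 16-high, 48 GB HBM3E stack,
# assuming an even split across the stacked DRAM dies (illustrative assumption).

GB_TO_GBIT = 8  # 1 gigabyte = 8 gigabits


def per_die_capacity(stack_gb: float, stack_height: int) -> tuple[float, float]:
    """Return (GB per die, Gbit per die) for an evenly split stack."""
    gb_per_die = stack_gb / stack_height
    return gb_per_die, gb_per_die * GB_TO_GBIT


if __name__ == "__main__":
    gb, gbit = per_die_capacity(stack_gb=48, stack_height=16)
    print(f"16-Hi at 48 GB implies {gb:.0f} GB ({gbit:.0f} Gb) per DRAM die")
    # Prints: 16-Hi at 48 GB implies 3 GB (24 Gb) per DRAM die
```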
SEOUL, Nov 4 (Reuters) - Nvidia (NVDA.O) CEO Jensen Huang had asked memory chip maker SK Hynix (000660.KS) to bring forward by six months the supply of its next ...
South Korea's SK Hynix said in October that it aimed to supply the chips to customers in the second half of 2025. An SK Hynix spokesperson said on Monday that this timeline was faster than an ...