<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>PIM - SK hynix Newsroom</title>
	<atom:link href="https://skhynix-news-global-stg.mock.pe.kr/tag/pim/feed/" rel="self" type="application/rss+xml" />
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<description></description>
	<lastBuildDate>Mon, 10 Feb 2025 15:03:21 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.2</generator>

<image>
	<url>https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2019/10/29044430/152x152-100x100.png</url>
	<title>PIM - SK hynix Newsroom</title>
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>SK hynix to Unveil ‘Full Stack AI Memory Provider’ Vision at CES 2025</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 02 Jan 2025 23:30:26 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[HBM]]></category>
		<category><![CDATA[eSSD]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[HBM3E]]></category>
		<category><![CDATA[On-Device AI]]></category>
		<category><![CDATA[16-layer HBM3E]]></category>
		<category><![CDATA[Full Stack AI Memory Provider]]></category>
		<guid isPermaLink="false">https://skhynix-news-global-stg.mock.pe.kr/?p=16959</guid>

					<description><![CDATA[<p>News Highlights SK hynix to showcase technological capabilities, participating in the world&#8217;s largest consumer electronics show, CES 2025, from January 7-10 Featuring a wide range of products driving the AI era, from HBM, the core of AI infrastructure, to next-gen memories like PIM Company to present new possibilities in the AI era through technological innovation [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025/">SK hynix to Unveil ‘Full Stack AI Memory Provider’ Vision at CES 2025</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<h3 class="tit" style="text-align: left;">News Highlights</h3>
<ul style="color: #000; font-size: 18px; padding-left: 20px;">
<li>SK hynix to showcase technological capabilities, participating in the world&#8217;s largest consumer electronics show, CES 2025, from January 7-10</li>
<li>Featuring a wide range of products driving the AI era, from HBM, the core of AI infrastructure, to next-gen memories like PIM</li>
<li>Company to present new possibilities in the AI era through technological innovation and provide irreplaceable value</li>
</ul>
<h3 class="tit">Seoul, January 3, 2025</h3>
<p>SK hynix Inc. (or “the company”, <span style="text-decoration: underline;"><a href="https://www.skhynix.com/eng/main.do" target="_blank" rel="noopener noreferrer">www.skhynix.com</a></span>) announced today that it will showcase its innovative AI memory technologies at CES 2025, to be held in Las Vegas from January 7 to 10 (local time).</p>
<p>A large number of C-level executives, including CEO Kwak Noh-Jung, CMO (Chief Marketing Officer) Justin Kim, and CDO (Chief Development Officer) Ahn Hyun, will attend the event. &#8220;We will broadly introduce solutions optimized for on-device AI and next-generation AI memories, as well as representative AI memory products such as HBM and eSSD at this CES,&#8221; said Justin Kim. &#8220;Through this, we will publicize our technological competitiveness to prepare for the future as a ‘Full Stack AI Memory Provider<sup>1</sup>’.&#8221;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>Full Stack AI Memory Provider</strong>: Refers to an all-round AI memory provider, which provides comprehensive AI-related memory products and technologies</p>
<p>SK hynix will also run a joint exhibition booth with SK Telecom, SKC and SK Enmove, under the theme &#8220;Innovative AI, Sustainable Tomorrow.&#8221; The booth will showcase how SK Group&#8217;s AI infrastructure and services are transforming the world, represented in waves of light.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-16653 size-full" title="SK hynix to Unveil 'Full Stack AI Memory Provider' Vision at CES 2025" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2025/02/10145644/SK-hynix_SK-hynix-to-unveil-Full-Stack-AI-Memory-Provider-Vision-at-CES-2025.jpg" alt="SK hynix to Unveil 'Full Stack AI Memory Provider' Vision at CES 2025" width="1000" height="563" /></p>
<p>SK hynix, the world&#8217;s first company to produce 12-layer products of the 5th-generation HBM and supply them to customers, will showcase samples of its 16-layer HBM3E product, which was officially developed in November last year. This product uses the advanced MR-MUF process to achieve the industry&#8217;s highest 16-layer configuration while controlling chip warpage and maximizing heat dissipation performance.</p>
<p>In addition, the company will display high-capacity, high-performance enterprise SSD products, including the ‘D5-P5336’ 122TB model developed by its subsidiary Solidigm in November last year. This product, which offers the industry&#8217;s largest capacity along with high power and space efficiency, has been attracting considerable interest from AI data center customers.</p>
<p>“As SK hynix succeeded in developing QLC<sup>2</sup> (Quadruple Level Cell)-based 61TB products in December, we expect to maximize synergy based on a balanced portfolio between the two companies in the high-capacity eSSD market,” said Ahn Hyun, CDO at SK hynix. The company will also showcase on-device AI products such as ‘LPCAMM2<sup>3</sup>’ and ‘ZUFS 4.0<sup>4</sup>,’ which improve data processing speed and power efficiency to implement AI in edge devices like PCs and smartphones. The company will also present CXL and PIM (Processing in Memory) technologies, along with modularized versions, CMM (CXL Memory Module)-Ax and AiMX<sup>5</sup>, designed to be core infrastructures for next-generation data centers.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><em><sup>2</sup></em><strong><em>QLC</em></strong><em>: NAND flash is divided into SLC (Single Level Cell), MLC (Multi Level Cell), TLC (Triple Level Cell), QLC (Quadruple Level Cell), and PLC (Penta Level Cell) depending on how much information is stored in one cell. As the amount of information stored increases, more data can be stored in the same area.<br />
</em><em><sup>3</sup></em><strong><em>Low Power Compression Attached Memory Module 2 (LPCAMM2)</em></strong><em>: LPDDR5X-based module solution that provides power efficiency and high performance as well as space savings. It has the performance effect of replacing two existing DDR5 SODIMMs with one LPCAMM2.<br />
</em><em><sup>4</sup></em><strong><em>Zoned Universal Flash Storage (ZUFS)</em></strong><em>: A NAND Flash product that improves efficiency of data management. The product optimizes data transfer between an operating system and storage devices by storing data with similar characteristics in the same zone of the UFS, a flash memory product for various electronic devices such as digital camera and mobile phone.<br />
</em><em><sup>5</sup></em><strong><em>Accelerator-in-Memory based Accelerator (AiMX)</em></strong><em>: SK hynix’s accelerator card product that specializes in large language models using GDDR6-AiM chips</em></p>
<p>In particular, CMM-Ax is a groundbreaking product that adds computational functionality to CXL’s advantage of expanding high-capacity memory, contributing to improving performance and energy efficiency of the next-generation server platforms<sup>6</sup>.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>6</sup><strong>Platform</strong>: Refers to a computing system that integrates both hardware and software technologies. It includes all key components necessary for computing, such as the CPU and memory.</p>
<p>“The changes in the world triggered by AI are expected to accelerate further this year, and SK hynix will produce 6<sup>th</sup> generation HBM (HBM4) in the second half of this year to lead the customized HBM market to meet the diverse needs of customers,” said Kwak Noh-Jung, CEO at SK hynix. “We will continue to do our best to present new possibilities in the AI era through technological innovation and provide irreplaceable value to our customers.”</p>
<h3 class="tit">About SK hynix Inc.</h3>
<p>SK hynix Inc., headquartered in Korea, is the world’s top-tier semiconductor supplier offering Dynamic Random Access Memory chips (“DRAM”), flash memory chips (“NAND flash”), and CMOS Image Sensors (“CIS”) for a wide range of distinguished customers globally. The Company’s shares are traded on the Korea Exchange, and the Global Depository shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at <span style="text-decoration: underline;"><a href="https://www.skhynix.com/eng/main.do" target="_blank" rel="noopener noreferrer">www.skhynix.com</a></span>, <span style="text-decoration: underline;"><a href="https://news.skhynix.com/" target="_blank" rel="noopener noreferrer">news.skhynix.com</a></span>.</p>
<h3 class="tit">Media Contact</h3>
<p>SK hynix Inc.<br />
Global Public Relations</p>
<p>Technical Leader<br />
Sooyeon Lee<br />
E-Mail: <span style="text-decoration: underline;"><a href="mailto:global_newsroom@skhynix.com">global_newsroom@skhynix.com</a></span></p>
<p>Technical Leader<br />
Kanga Kong<br />
E-Mail: <span style="text-decoration: underline;"><a href="mailto:global_newsroom@skhynix.com">global_newsroom@skhynix.com</a></span></p>
<p><a href="https://linkedin.com/showcase/skhynix-news-and-stories/" target="_blank" rel="noopener noreferrer"><img loading="lazy" decoding="async" class="size-full wp-image-15776 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2025/02/10074354/SK-hynix_Newsroom-banner_1.png" alt="" width="800" height="135" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-680x115.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-768x130.png 768w" sizes="(max-width: 800px) 100vw, 800px" /></a></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-unveil-full-stack-ai-memory-provider-vision-at-ces-2025/">SK hynix to Unveil ‘Full Stack AI Memory Provider’ Vision at CES 2025</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>[SK hynix’s 41st Anniversary] “Celebrating 40+1” … Harnessing 40 Years of Technological Expertise to Stand Alone as the Global No. 1 AI Memory Provider</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-41st-anniversary-rise-to-ai-memory-leader/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 10 Oct 2024 00:00:47 +0000</pubDate>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[HBM]]></category>
		<category><![CDATA[Anniversary]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[AI Memory]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=15958</guid>

					<description><![CDATA[<p>SK hynix entered the semiconductor business in 1983 and has ascended to become the global no. 1 AI memory provider following more than 40 years of relentless effort and innovation. Building on this longstanding technological expertise and entering a new chapter in 2024, the company is strengthening its leadership and marking the start of its [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-41st-anniversary-rise-to-ai-memory-leader/">[SK hynix’s 41st Anniversary] “Celebrating 40+1” … Harnessing 40 Years of Technological Expertise to Stand Alone as the Global No. 1 AI Memory Provider</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>SK hynix entered the semiconductor business in 1983 and has ascended to become the global <strong>no. 1 </strong>AI memory provider following more than <strong>40 years </strong>of relentless effort and innovation. Building on this longstanding technological expertise and entering a new chapter in 2024, the company is strengthening its leadership and marking the start of its “<strong>40+1 renaissance</strong>.” At the core of this success are <strong>AI memory</strong> solutions such as HBM, PIM, and CXL<sup>®</sup>, which are powered by advanced processes and packaging technologies. To mark SK hynix’s 41<sup>st</sup> anniversary, the newsroom reflects on the history, technological achievements, and the dedication of the company’s employees that have driven these innovative products.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15978 size-full" title="SK hynix has embarked on a 41-year journey to become a leader in HBM and AI memory" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/10012638/SK-hynix_41st-Company-Anniversary_1.png" alt="SK hynix has embarked on a 41-year journey to become a leader in HBM and AI memory" width="1000" height="1319" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/10012638/SK-hynix_41st-Company-Anniversary_1.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/10012638/SK-hynix_41st-Company-Anniversary_1-303x400.png 303w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/10012638/SK-hynix_41st-Company-Anniversary_1-768x1013.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/10012638/SK-hynix_41st-Company-Anniversary_1-776x1024.png 776w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">SK hynix has embarked on a 41-year journey to become a leader in HBM and AI memory</p>
<p>&nbsp;</p>
<h3 class="tit">The Rise of SK hynix &amp; Its HBM Propelled by the AI Era</h3>
<p>SK hynix’s rise to become the leader in the global memory market has been driven by the growth of the AI industry. Since the emergence of generative AI in 2022, a wide range of products and services have adopted AI as the technology has rapidly evolved. This has led to a surge in demand for high-performance memory, which is essential for processing massive datasets and enabling fast training and inference<sup>1</sup>. In response to this demand, SK hynix is providing advanced memory products and thereby playing a defining role in the development of the AI industry.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>AI inference</strong>: The process of running live data through a trained AI model to make a prediction or solve a task.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15964 size-full" title="SK hynix has continually advanced its HBM lineup to reach new standards in performance" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073852/SK-hynix_41st-Company-Anniversary_02.png" alt="SK hynix has continually advanced its HBM lineup to reach new standards in performance" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073852/SK-hynix_41st-Company-Anniversary_02.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073852/SK-hynix_41st-Company-Anniversary_02-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073852/SK-hynix_41st-Company-Anniversary_02-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">SK hynix has continually advanced its HBM lineup to reach new standards in performance</p>
<p>&nbsp;</p>
<p>SK hynix solidified its capabilities even before the AI boom by focusing on developing the early generations of HBM, a high-bandwidth memory which rapidly transmits large volumes of data. The company then gained market leadership and expanded its influence with the third-generation HBM, HBM2E. HBM3, the successor to HBM2E and a product optimized for AI and high-performance computing (HPC), also drew significant attention. Most notably, the company established itself as a key partner in the AI and data center markets by supplying HBM products to NVIDIA. Around this time, <a href="https://www.trendforce.com/presscenter/news/20230418-11647.html" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">SK hynix achieved a 50% market share in the HBM sector</span></a> to strengthen its HBM leadership.</p>
<p>Moving into 2024, SK hynix has maintained its prominence in the AI memory market. The company began supplying the world’s best-performing 8-layer HBM3E, first developed in 2023, to leading global tech giants in March 2024. Offering maximum data processing speeds of around 1.2 terabytes (TB) per second, HBM3E helped SK hynix further bolster its status as the global no. 1 AI memory provider.</p>
<p><strong>Next-Generation HBM: Utilizing 15 Years of HBM Technology Know-How </strong></p>
<p>SK hynix’s HBM success story can be traced back to 2009. This was the year when the company began full-scale product development after discovering that TSV<sup>2</sup> and WLP<sup>3</sup> technologies could break memory performance barriers. Four years later, the company introduced the first-generation HBM, incorporating these TSV and WLP technologies. Although HBM was hailed as an innovative memory solution, it did not receive an explosive market response because the HPC sector had not yet matured enough for widespread HBM adoption.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>2</sup><strong>Through-silicon via (TSV)</strong>: A technology that drills thousands of microscopic holes in the DRAM chip and connects the upper and lower chip layers with electrodes that vertically penetrate through these holes.<br />
<sup>3</sup><strong>Wafer-level packaging (WLP)</strong>: A method that is a step beyond the conventional package, where wafers are cut into individual chips and then packaged. In WLP, the packaging is completed at the wafer level, producing finished products.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15965 size-full" title="Building on its 15-year HBM history, SK hynix is well-placed to develop next-generation HBM products" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073858/SK-hynix_41st-Company-Anniversary_03.png" alt="Building on its 15-year HBM history, SK hynix is well-placed to develop next-generation HBM products" width="1000" height="610" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073858/SK-hynix_41st-Company-Anniversary_03.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073858/SK-hynix_41st-Company-Anniversary_03-656x400.png 656w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073858/SK-hynix_41st-Company-Anniversary_03-768x468.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Building on its 15-year HBM history, SK hynix is well-placed to develop next-generation HBM products</p>
<p>&nbsp;</p>
<p>Despite this, SK hynix pressed forward, focusing on developing the next generation of HBM and pursuing the goal of achieving the “highest performance.” During this period, the company applied <strong>MR-MUF</strong><sup>4</sup> technology, known for its high thermal dissipation and production efficiency, to HBM2E, which changed the market landscape. Building upon this, SK hynix developed <strong>Advanced MR-MUF</strong> technology, which excelled in thin chip stacking, thermal management, and productivity, and applied it to both HBM3 and HBM3E. Leveraging this technology, SK hynix set a series of industry-best performance records, successfully mass-producing the 12-layer HBM3 (24 GB) in 2023 and the <a href="https://news.skhynix.com/sk-hynix-begins-volume-production-of-the-world-first-12-layer-hbm3e/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">12-layer HBM3E (36 GB) in 2024</span></a>.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>4</sup><strong>Mass reflow-molded underfill (MR-MUF)</strong>: A technology that ensures secure and reliable connections in densely stacked chip assemblies by melting the bumps between stacked chips.</p>
<p>These achievements were driven by a strategy that precisely aligned with the rise of the AI revolution. SK hynix launched its AI memory products at the right time, fully meeting market demands. This was made possible through 15 years of accumulated technological expertise based on research and development, unwavering employee trust in the company’s know-how, and forward-looking strategic investments.</p>
<p>SK hynix has continued taking strategic steps to strengthen its AI leadership in 2024. In April, the company signed an investment agreement to build an advanced packaging production facility in the U.S. state of Indiana which will produce next-generation HBM and AI memory. In the same month, SK hynix entered a technology agreement with TSMC. The deal aims to establish a collaborative three-way framework between the customer, foundry, and memory provider to overcome technological limits and secure an advantage in the AI market.</p>
<h3 class="tit">Beyond HBM: Relentless Innovation &amp; Strengthening the AI Memory Lineup</h3>
<p>SK hynix’s pursuits and innovations are unfolding across all areas of memory. The company has established its “memory-centric”<sup>5</sup> vision and is developing a wide range of memory solutions based on over 40 years of accumulated technological knowledge. In 2024, SK hynix is making concerted efforts to strengthen its lineup with PIM, CXL, and AI SSD products to mark the first year of its renaissance.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>5</sup><strong>Memory-centric</strong>: An environment where memory semiconductors play the central role in ICT devices.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15966 size-full" title="SK hynix’s AI memory lineup includes PIM, CXL, and AI SSD products" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073905/SK-hynix_41st-Company-Anniversary_04.png" alt="SK hynix’s AI memory lineup includes PIM, CXL, and AI SSD products" width="1000" height="633" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073905/SK-hynix_41st-Company-Anniversary_04.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073905/SK-hynix_41st-Company-Anniversary_04-632x400.png 632w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073905/SK-hynix_41st-Company-Anniversary_04-768x486.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">SK hynix’s AI memory lineup includes PIM, CXL, and AI SSD products</p>
<p>&nbsp;</p>
<p>SK hynix is developing its lineup of processing-in-memory (PIM), an intelligent semiconductor memory which breaks the boundary between storage and computation. PIM, which features a processor for computational functions, is capable of processing and delivering the data required for AI computation. In terms of PIM-based products, SK hynix has launched the GDDR6-Accelerator-in-Memory (AiM) and last year <a href="https://news.skhynix.com/sk-hynix-debuts-first-gddr6-aim-accelerator-card-aimx-for-generative-ai/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">introduced the accelerator card AiMX</span></a>, an AiM-based accelerator that boosts performance by connecting multiple AiM units. In 2024, the company drew attention by <a href="https://news.skhynix.com/sk-hynix-presents-upgraded-aimx-solution-at-ai-hw-edge-ai-summit-2024/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">unveiling a 32 GB version of AiMX</span></a> which offers double the capacity of its predecessor.</p>
<p>SK hynix is also actively investing in Compute Express Link (CXL), a technology that integrates different interfaces such as those for CPUs and memory, to expand memory bandwidth and capacity. In May 2024, the company introduced the <a href="https://news.skhynix.com/sk-hynix-presents-ai-memory-solutions-at-cxl-devcon-2024/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;"><strong>CXL Memory Module (CMM)-DDR5</strong></span></a> which offers 50% greater bandwidth and double the capacity compared to standard DDR5. Then in September, SK hynix <a href="https://news.skhynix.com/sk-hynix-applies-cxl-optimization-solution-to-linux/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">integrated key features of its CXL-optimized software <strong>HMSDK</strong><sup>6</sup> into the open-source operating system Linux</span></a>, setting a new standard for the use of CXL technology.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>6</sup><strong>Heterogeneous Memory Software Development Kit (HMSDK)</strong>: SK hynix&#8217;s proprietary heterogeneous memory software development tool. Enhances the performance of heterogeneous memory systems, including CXL memory, through effective memory control.</p>
<p>Ultra-high-speed, high-capacity enterprise SSDs (eSSDs) for AI servers and data centers are another area of focus for SK hynix. A prime example is the <strong>60 TB Quad Level Cell (QLC) eSSD</strong>, co-developed with the company’s U.S. subsidiary Solidigm. This product stores 4 bits per cell while maintaining low power consumption. Looking ahead, the company is planning to develop a <strong>300 TB eSSD</strong> and launch the product in 2025.</p>
<p>The company also offers a robust lineup for on-device AI. SK hynix <a href="https://news.skhynix.com/sk-hynix-develops-worlds-fastest-mobile-dram-lpddr5t/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">developed the low-power DRAM, <strong>LPDDR5T</strong><sup>7</sup>, in January 2023</span></a> to enhance the performance of AI smartphones. In November of the same year, the company unveiled the modularized version of the LPDDR5X, <strong>LPCAMM2</strong>, which is expected to deliver excellent performance in AI desktops and laptops. SK hynix has also completed the development of <a href="https://news.skhynix.com/sk-hynix-develops-pcb01-for-artificial-intelligence-pcs/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">the high-performance client SSD (cSSD), <strong>PCB01</strong>, for AI PCs</span></a> and <a href="https://news.skhynix.com/sk-hynix-develops-next-generation-mobile-nand-solution-zufs-4-0/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">the mobile NAND solution for AI, <strong>Zoned UFS (ZUFS) 4.0</strong></span></a>.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>7</sup><strong>Low Power Double Data Rate 5 Turbo (LPDDR5T)</strong>: Low-power DRAM for mobile devices, including smartphones and tablets, aimed at minimizing power consumption and featuring low voltage operation. LPDDR5T is an upgraded product of the 7th generation LPDDR5X and will be succeeded by the 8th generation LPDDR6.</p>
<h3 class="tit">Shaping the Future of Total AI Memory</h3>
<p>Today, AI is being used to write reports, generate images, and create various types of content. In healthcare, AI aids in making diagnoses, while in education, AI serves as an assistant for teachers. These are just a small selection of the current applications of AI, and the possibilities for the future are expected to be almost limitless as the technology continues to advance.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15967 size-full" title="SK hynix plans to develop customized AI memory and emerging memory products" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073912/SK-hynix_41st-Company-Anniversary_05.png" alt="SK hynix plans to develop customized AI memory and emerging memory products" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073912/SK-hynix_41st-Company-Anniversary_05.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073912/SK-hynix_41st-Company-Anniversary_05-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/08073912/SK-hynix_41st-Company-Anniversary_05-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">SK hynix plans to develop customized AI memory and emerging memory products</p>
<p>&nbsp;</p>
<p>At the center of this technological revolution is AI memory. Various AI memory solutions, such as HBM, PIM, CXL, and SSDs, transmit large volumes of data quickly with high bandwidth or send only the processed results directly to the processor, minimizing bottlenecks and enhancing AI learning and inference performance. Furthermore, these technologies improve the energy efficiency of AI systems, contributing to the establishment of more sustainable AI infrastructure. These advanced AI memory technologies are expected to be applied across a wider range of industries such as the automotive and healthcare sectors, enabling faster and more efficient AI services.</p>
<p>To further AI’s development, SK hynix is continuously overcoming technological limitations. The company is focused on developing <strong>custom AI memory</strong> optimized for each customer in line with the growing diversification of AI services. Moreover, SK hynix is also working on next-generation <strong>emerging memory</strong>, which is based on new structures and principles as well as innovative components, such as ReRAM<sup>8</sup>, MRAM<sup>9</sup>, and PCM<sup>10</sup>. By relentlessly investing in technology development, SK hynix aims to secure differentiated competitiveness with advanced technologies and establish a leading position in future markets.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>8</sup><strong>Resistive RAM (ReRAM)</strong>: A type of emerging memory with a simple structure containing a filament in which data is stored by applying voltage to the filament. It features a large data storage capacity through process miniaturization and low power consumption.<br />
<sup>9</sup><strong>Magnetic Random Access Memory (MRAM)</strong>: A type of emerging memory which utilizes both charge and spin, with resistance in the device changing based on the direction of the spin.<br />
<sup>10</sup><strong>Phase-Change Memory (PCM)</strong>: Semiconductor memory which stores data by utilizing the phase change of a specific material. It combines the benefits of non-volatile flash memory, which retains data even when powered off, with the rapid processing speeds of DRAM.</p>
<p>The semiconductor market itself is poised for significant growth. The World Semiconductor Trade Statistics (WSTS) forecasts that the semiconductor market will expand by 16% year-on-year in 2024. In particular, the semiconductor memory sector is <a href="https://www.wsts.org/76/Recent-News-Release" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">predicted to grow by an impressive 76.8%</span></a> as demand surges for AI memory such as HBM.</p>
<p>Standing at the forefront of the huge AI wave, SK hynix is preparing for another leap forward by building on its past achievements. As SK hynix celebrates its 41st anniversary, the company aims to maintain HBM leadership while securing dominance in the next-generation semiconductor market to stand alone in an era where its products become &#8220;the heart of AI.&#8221;</p>
<p>&nbsp;</p>
<p><a href="https://linkedin.com/showcase/skhynix-news-and-stories/" target="_blank" rel="noopener noreferrer"><img loading="lazy" decoding="async" class="size-full wp-image-15776 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1.png" alt="" width="800" height="135" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-680x115.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-768x130.png 768w" sizes="(max-width: 800px) 100vw, 800px" /></a></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-41st-anniversary-rise-to-ai-memory-leader/">[SK hynix’s 41st Anniversary] “Celebrating 40+1” … Harnessing 40 Years of Technological Expertise to Stand Alone as the Global No. 1 AI Memory Provider</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix Presents Upgraded AiMX Solution at AI Hardware &#038; Edge AI Summit 2024</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-presents-upgraded-aimx-solution-at-ai-hw-edge-ai-summit-2024/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 13 Sep 2024 06:00:11 +0000</pubDate>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[AI Hardware & Edge AI Summit]]></category>
		<category><![CDATA[AiMX]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=15762</guid>

					<description><![CDATA[<p>A glimpse of SK hynix’s booth at the AI Hardware &#38; Edge AI Summit 2024 &#160; SK hynix unveiled an enhanced Accelerator-in-Memory based Accelerator (AiMX) card at the AI Hardware &#38; Edge AI Summit 2024 held September 9–12 in San Jose, California. Organized annually by Kisaco Research, the summit brings together representatives from the AI [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-presents-upgraded-aimx-solution-at-ai-hw-edge-ai-summit-2024/">SK hynix Presents Upgraded AiMX Solution at AI Hardware & Edge AI Summit 2024</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15763 size-full" title="A glimpse of SK hynix’s booth at the AI Hardware &amp; Edge AI Summit 2024" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084255/SK-hynix_AI-HW-Edge-AI-Summit_01.png" alt="A glimpse of SK hynix’s booth at the AI Hardware &amp; Edge AI Summit 2024" width="1000" height="666" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084255/SK-hynix_AI-HW-Edge-AI-Summit_01.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084255/SK-hynix_AI-HW-Edge-AI-Summit_01-601x400.png 601w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084255/SK-hynix_AI-HW-Edge-AI-Summit_01-768x511.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084255/SK-hynix_AI-HW-Edge-AI-Summit_01-900x600.png 900w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">A glimpse of SK hynix’s booth at the AI Hardware &amp; Edge AI Summit 2024</p>
<p>&nbsp;</p>
<p>SK hynix unveiled an enhanced Accelerator-in-Memory based Accelerator (AiMX) card at the AI Hardware &amp; Edge AI Summit 2024 held September 9–12 in San Jose, California. Organized annually by Kisaco Research, the summit brings together representatives from the AI and machine learning ecosystem to share industry breakthroughs and developments. This year’s event focused on exploring cost and energy efficiency across the entire technology stack.</p>
<p>Marking its fourth appearance at the summit, SK hynix highlighted how its AiM<sup>1</sup> products can boost AI performance across data centers and edge devices<sup>2</sup>.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>Accelerator in Memory (AiM)</strong>: SK hynix’s PIM semiconductor product name, which includes GDDR6-AiM.<br />
<sup>2</sup><strong>Edge device</strong>: Hardware that controls the flow of data at the boundary between two networks. While they fulfill numerous roles, edge devices essentially serve as the entry or exit point to a network.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15764 size-full" title="Attendees gather to learn more about the upgraded AiMX card" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084310/SK-hynix_AI-HW-Edge-AI-Summit_02.png" alt="Attendees gather to learn more about the upgraded AiMX card" width="1000" height="666" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084310/SK-hynix_AI-HW-Edge-AI-Summit_02.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084310/SK-hynix_AI-HW-Edge-AI-Summit_02-601x400.png 601w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084310/SK-hynix_AI-HW-Edge-AI-Summit_02-768x511.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084310/SK-hynix_AI-HW-Edge-AI-Summit_02-900x600.png 900w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Attendees gather to learn more about the upgraded AiMX card</p>
<p>&nbsp;</p>
<h3 class="tit">Booth Highlights: Meet the Upgraded AiMX</h3>
<p>In the AI era, high-performance memory products are vital for the smooth operation of LLMs<sup>3</sup>. However, as these LLMs are trained on increasingly larger datasets and continue to expand, there is a growing need for more efficient solutions. SK hynix addresses this demand with its PIM<sup>4</sup> product AiMX, an AI accelerator card that combines multiple GDDR6-AiMs to provide high bandwidth and outstanding energy efficiency.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>3</sup><strong>Large language model (LLM)</strong>: An advanced AI system trained on extensive datasets to understand and generate human-like language, enabling applications such as natural language processing and translation.<br />
<sup>4</sup><strong>Processing-In-Memory (PIM)</strong>: A next-generation technology that embeds processing capabilities within memory, minimizing data transfer between the processor and memory. This boosts efficiency and speed, especially for data-intensive tasks like LLMs, where quick data access and processing are essential.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15765 size-full" title="The 32 GB AiMX prototype card was shown publicly for the first time at the event" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084325/SK-hynix_AI-HW-Edge-AI-Summit_03.png" alt="The 32 GB AiMX prototype card was shown publicly for the first time at the event" width="1000" height="666" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084325/SK-hynix_AI-HW-Edge-AI-Summit_03.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084325/SK-hynix_AI-HW-Edge-AI-Summit_03-601x400.png 601w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084325/SK-hynix_AI-HW-Edge-AI-Summit_03-768x511.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084325/SK-hynix_AI-HW-Edge-AI-Summit_03-900x600.png 900w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">The 32 GB AiMX prototype card was shown publicly for the first time at the event</p>
<p>&nbsp;</p>
<p>At the AI Hardware &amp; Edge AI Summit 2024, SK hynix presented its updated 32 GB AiMX prototype, which offers double the capacity of the original card featured at last year’s event. To highlight the new AiMX’s advanced processing capabilities in a multi-batch<sup>5</sup> environment, SK hynix held a demonstration of the prototype card with the Llama 3<sup>6</sup> 70B model, an open-source LLM. In particular, the demonstration underlined AiMX’s ability to serve as a highly effective attention<sup>7</sup> accelerator in data centers.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>5</sup><strong>Multi-batch</strong>: A computer processing method in which the system groups together multiple tasks (batches) and processes them at once.<br />
<sup>6</sup><strong>Llama 3</strong>: An open source LLM developed by Meta, featuring pretrained and instruction-fine-tuned language models.<br />
<sup>7</sup><strong>Attention</strong>: Mechanisms which give LLMs context about text, lessening the model’s chance of misunderstandings and allowing it to generate more accurate and contextually relevant outputs.</p>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img loading="lazy" decoding="async" class="aligncenter wp-image-4330 size-full" style="width: 800px;" title="The upgraded AiMX was demonstrated with the Llama 3 70B model LLM to highlight its processing capabilities" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084340/SK-hynix_AI-HW-Edge-AI-Summit_04.png" alt="The upgraded AiMX was demonstrated with the Llama 3 70B model LLM to highlight its processing capabilities" width="1000" height="666" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img loading="lazy" decoding="async" class="aligncenter wp-image-4330 size-full" style="width: 800px;" title="The upgraded AiMX was demonstrated with the Llama 3 70B model LLM to highlight its processing capabilities" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/12084357/SK-hynix_AI-HW-Edge-AI-Summit_05.png" alt="The upgraded AiMX was demonstrated with the Llama 3 70B model LLM to highlight its processing capabilities" width="1000" height="666" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source" style="text-align: center;">The upgraded AiMX was demonstrated with the Llama 3 70B model LLM to highlight its processing capabilities</p>
<p>&nbsp;</p>
<p>AiMX addresses the cost, performance, and power consumption challenges associated with LLMs in not only data centers, but also in edge devices and on-device AI applications. For example, when applied to mobile on-device AI applications, AiMX improves LLM speed three-fold compared to mobile DRAM while maintaining the same power consumption.</p>
<h3 class="tit">Featured Presentation: Accelerating LLM Services from Data Centers to Edge Devices</h3>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img loading="lazy" decoding="async" class="aligncenter wp-image-4330 size-full" style="width: 800px;" title="Euicheol Lim presenting on how the AiMX system accelerates LLM services" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13005139/SK-hynix_AI-HW-Edge-AI-Summit_06.png" alt="Euicheol Lim presenting on how the AiMX system accelerates LLM services" width="1000" height="666" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img loading="lazy" decoding="async" class="aligncenter wp-image-4330 size-full" style="width: 800px;" title="Euicheol Lim presenting on how the AiMX system accelerates LLM services" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13005151/SK-hynix_AI-HW-Edge-AI-Summit_07.png" alt="Euicheol Lim presenting on how the AiMX system accelerates LLM services" width="1000" height="666" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img loading="lazy" decoding="async" class="aligncenter wp-image-4330 size-full" style="width: 800px;" title="Euicheol Lim presenting on how the AiMX system accelerates LLM services" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13005202/SK-hynix_AI-HW-Edge-AI-Summit_08.png" alt="Euicheol Lim presenting on how the AiMX system accelerates LLM services" width="1000" height="666" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source" style="text-align: center;">Euicheol Lim presenting on how the AiMX system accelerates LLM services</p>
<p>&nbsp;</p>
<p>On the final day of the summit, SK hynix gave a presentation detailing how AiMX is an optimal solution for accelerating LLM services in data centers and edge devices. Euicheol Lim, research fellow and head of the Solution Advanced Technology team, shared the company’s plans to develop AiM products for on-device AI based on mobile DRAM and revealed the future vision for AiM. In closing, Lim emphasized the importance of close collaboration with companies involved in developing and managing data centers and edge systems to further advance AiMX products.</p>
<h3 class="tit">Looking Ahead: SK hynix’s Vision for AiMX in the AI Era</h3>
<p>The AI Hardware &amp; Edge AI Summit 2024 provided a platform for SK hynix to demonstrate AiMX’s applications in LLMs across data centers and edge devices. As a low-power, high-speed memory solution able to handle large amounts of data, AiMX is set to play a key role in the advancement of LLMs and AI applications.</p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-presents-upgraded-aimx-solution-at-ai-hw-edge-ai-summit-2024/">SK hynix Presents Upgraded AiMX Solution at AI Hardware & Edge AI Summit 2024</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>5 Queries to Boost Your Semiconductor Industry Knowledge</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/5-queries-to-boost-your-semiconductor-industry-knowledge/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Tue, 30 Apr 2024 06:00:09 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[ACIM]]></category>
		<category><![CDATA[AI Memory]]></category>
		<category><![CDATA[Moore's Law]]></category>
		<category><![CDATA[semiconductor industry]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=14932</guid>

					<description><![CDATA[]]></description>
										<content:encoded><![CDATA[<p><center><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="5 Queries to Boost Your Semiconductor Industry Knowledge " src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073318/SK-hynix_Fun-Facts-2_EN_Q0_Title.gif" alt="5 Queries to Boost Your Semiconductor Industry Knowledge" width="700" height="791" /><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="Does Moore’s Law still apply to the semiconductor industry? " src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073321/SK-hynix_Fun-Facts-2_EN_Q1.gif" alt="Does Moore’s Law still apply to the semiconductor industry?" width="700" height="900" /><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="How does the growth of AI affect demand for semiconductor memory?" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073325/SK-hynix_Fun-Facts-2_EN_Q2.gif" alt="How does the growth of AI affect demand for semiconductor memory?" width="700" height="930" /><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="Can memory products perform computations like logic chips?" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073328/SK-hynix_Fun-Facts-2_EN_Q3.gif" alt="Can memory products perform computations like logic chips?" width="700" height="870" /><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="What factors affect the price of chips?" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073330/SK-hynix_Fun-Facts-2_EN_Q4.gif" alt="What factors affect the price of chips?" 
width="700" height="515" /><img loading="lazy" decoding="async" class="aligncenter wp-image-11738 size-full" style="margin: 0;" title="Why are there different types of semiconductor companies?" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/04/29073334/SK-hynix_Fun-Facts-2_EN_Q5.gif" alt="Why are there different types of semiconductor companies?" width="700" height="985" /></center></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/5-queries-to-boost-your-semiconductor-industry-knowledge/">5 Queries to Boost Your Semiconductor Industry Knowledge</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix Debuts Prototype of First GDDR6-AiM Accelerator Card &#8216;AiMX&#8217; for Generative AI</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-debuts-first-gddr6-aim-accelerator-card-aimx-for-generative-ai/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Mon, 18 Sep 2023 00:00:48 +0000</pubDate>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[AI Hardware & Edge AI Summit]]></category>
		<category><![CDATA[AiMX]]></category>
		<category><![CDATA[Generative AI accelerator]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[PIM]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=12888</guid>

					<description><![CDATA[<p>SK hynix unveiled and demonstrated a prototype of AiMX1, a generative AI accelerator2 card based on GDDR6-AiM, at the AI Hardware &#38; Edge AI Summit 2023 held September 12–14 at the Santa Clara Marriott, California. 1Accelerator-in-Memory based Accelerator (AiMX): SK hynix&#8217;s accelerator card product that specializes in large language models (AI that learns with large [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-debuts-first-gddr6-aim-accelerator-card-aimx-for-generative-ai/">SK hynix Debuts Prototype of First GDDR6-AiM Accelerator Card ‘AiMX’ for Generative AI</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>SK hynix unveiled and demonstrated a prototype of AiMX<sup>1</sup>, a generative AI accelerator<sup>2</sup> card based on GDDR6-AiM, at the AI Hardware &amp; Edge AI Summit 2023 held September 12–14 at the Santa Clara Marriott, California.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>Accelerator-in-Memory based Accelerator (AiMX)</strong>: SK hynix&#8217;s accelerator card product that specializes in large language models (AI that learns with large amounts of text data such as ChatGPT) using GDDR6-AiM chips.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>2</sup><strong>Accelerator</strong>: A special-purpose hardware device that uses a chip designed specifically for processing and computing information.</p>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094216/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_01.png" alt="SK hynix's exhibition booth at the AI Hardware &amp; Edge AI Summit 2023" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094227/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_02.png" alt="SK hynix's exhibition booth at the AI Hardware &amp; Edge AI Summit 2023" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094237/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_03.png" alt="SK hynix's exhibition booth at the AI Hardware &amp; Edge AI Summit 2023" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094247/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_04.png" alt="SK hynix's exhibition booth at the AI Hardware &amp; Edge AI Summit 2023" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source" style="text-align: center;">Figure 1. SK hynix&#8217;s exhibition booth at the AI Hardware &amp; Edge AI Summit 2023</p>
<p>&nbsp;</p>
<p>Hosted annually by the UK marketing firm Kisaco Research, the AI Hardware &amp; Edge AI Summit brings together global IT companies and high-profile startups to share their developments in artificial intelligence and machine learning. This is SK hynix’s third time participating in the summit.</p>
<p>At the event, the company showcased the prototype of AiMX, an accelerator card that combines multiple <a href="https://news.skhynix.com/sk-hynix-develops-pim-next-generation-ai-accelerator/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">GDDR6-AiMs</span></a> to further enhance performance, along with the GDDR6-AiM itself, under the slogan &#8220;Boost Your AI: Discover the Power of PIM<sup>3</sup> with SK hynix&#8217;s AiM<sup>4</sup>.&#8221;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>3</sup><strong>Processing-In-Memory (PIM)</strong>: A next-generation technology that adds computational capabilities to semiconductor memories to solve the problem of data movement congestion in AI and big data processing.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>4</sup><strong>Accelerator in Memory (AiM)</strong>: SK hynix&#8217;s PIM semiconductor product name, which includes GDDR6-AiM.</p>
<p class="source" style="text-align: center;"><img loading="lazy" decoding="async" class="size-full wp-image-12913 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08.png" alt="The AiMX card utilizes multiple GDDR6-AiM chips for enhanced performance" width="1000" height="670" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08-597x400.png 597w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08-768x515.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08-900x604.png 900w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094325/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_08-400x269.png 400w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 2. The prototype AiMX card utilizes multiple GDDR6-AiM chips for enhanced performance</p>
<p>&nbsp;</p>
<p>As a low-power, high-speed memory solution capable of handling large amounts of data, AiMX is set to play a key role in the advancement of data-intensive generative AI<sup>5</sup> systems. The performance of generative AI improves as it is trained on more data, highlighting the need for high-performance products which can be applied to an array of generative AI systems.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>5</sup><strong>Generative AI</strong>: AI that learns from large amounts of data to actively generate results based on a user&#8217;s specific needs.</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-12910" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05.png" alt="Demonstrating a large AI language model with AiMX that utilizes GDDR6-AiM" width="1000" height="670" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05-597x400.png 597w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05-768x515.png 768w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05-900x604.png 900w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094257/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_05-400x269.png 400w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 3. Demonstrating a large AI language model with AiMX that utilizes GDDR6-AiM</p>
<p>&nbsp;</p>
<p>SK hynix also demonstrated Meta&#8217;s generative AI Open Pretrained Transformer (OPT) 13B model on a server system equipped with the AiMX prototype. The AiMX system featuring GDDR6-AiM chips processes data more than 10 times faster than GPU-based systems while consuming one-fifth the power. The company&#8217;s demonstration piqued the interest of global companies providing AI services by showing that it can deliver higher performance<sup>6</sup> than the most recent accelerators.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>6</sup>Performance is based on the condition that the AiM Control Hub inside the AiMX card is developed as an application-specific integrated circuit (ASIC).</p>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094308/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_06.png" alt="Eui-cheol Lim, vice president of SK hynix’s Solution Development division, delivers a presentation on AiMX" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="size-full wp-image-4330 aligncenter" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/14094319/SK-hynix_AI-Hardware-Edge-AI-Summit-2023_07.png" alt="Eui-cheol Lim, vice president of SK hynix’s Solution Development division, delivers a presentation on AiMX" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source" style="text-align: center;">Figure 4. Eui-cheol Lim, vice president of SK hynix’s Solution Development division, delivers a presentation on AiMX</p>
<p>&nbsp;</p>
<p>In addition, the company held a session outlining the benefits of AiMX. In a presentation titled &#8220;Cost-Effective Generative AI Inference Acceleration using AiM,&#8221; Eui-cheol Lim, vice president of the Solution Development division, compared the performance of GPUs and AiMX and discussed the future of next-generation intelligent semiconductor memories.</p>
<p>&#8220;SK hynix&#8217;s AiMX is a solution that delivers higher performance while consuming less power and costing less than conventional GPUs,&#8221; Lim explained. &#8220;We will continue to develop memory technologies that will lead the way in the era of artificial intelligence.&#8221;</p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-debuts-first-gddr6-aim-accelerator-card-aimx-for-generative-ai/">SK hynix Debuts Prototype of First GDDR6-AiM Accelerator Card ‘AiMX’ for Generative AI</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix Showcases Data Center Memory Solutions at HPE Discover 2023</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-showcases-data-center-memory-solutions-at-hpe-discover-2023/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 23 Jun 2023 00:00:51 +0000</pubDate>
				<category><![CDATA[Business]]></category>
		<category><![CDATA[featured]]></category>
		<category><![CDATA[SEMICONDUCTOR MEMORY]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[HBM3]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[HPE]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=12033</guid>

					<description><![CDATA[<p>&#160; SK hynix presented its next-generation memory technologies and products at HPE Discover 2023, an IT conference held this year in Las Vegas between June 20-22. Figure 1. SK hynix&#8217;s exhibition booth at HPE Discover 2023 &#160; Held annually by the American ICT company Hewlett Packard Enterprise (HPE), HPE Discover brings together the company’s customers [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-showcases-data-center-memory-solutions-at-hpe-discover-2023/">SK hynix Showcases Data Center Memory Solutions at HPE Discover 2023</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>&nbsp;</p>
<p>SK hynix presented its next-generation memory technologies and products at HPE Discover 2023, an IT conference held this year in Las Vegas between June 20-22.</p>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23015049/SK-hynixs-exhibition-booth.png" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23015034/SK-hynixs-exhibition-booth-2.png" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source">Figure 1. SK hynix&#8217;s exhibition booth at HPE Discover 2023</p>
<p>&nbsp;</p>
<p>Held annually by the American ICT company Hewlett Packard Enterprise (HPE), HPE Discover brings together the company’s customers and partners along with industry experts to explore data center trends and the latest technologies such as memory solutions. SK hynix was among the exhibitors at the show as the company further strengthened its partnership with HPE.</p>
<p>Under the slogan &#8220;Elevate Your Edge With Memory Performance,&#8221; SK hynix showcased its industry-leading memory solutions for data centers. These included its PS1010 E3.S, a high-performance PCIe<sup>1</sup> Gen5-based eSSD, and DDR5 RDIMM, a server DRAM module built with a 1bnm process. The capabilities of both products were highlighted in a joint promotion with HPE, in which they were featured in Gen11, the host company’s latest server range.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>Peripheral Component Interconnect Express (PCIe)</strong>: A serial-structured, high-speed I/O interface used on the motherboard of digital devices.</p>
<p>&nbsp;</p>
<p><!-- swiper start --></p>
<div class="swiper-container">
<div class="swiper-wrapper">
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23015020/SK-hynix-and-Solidigms-advanced-storage-and-memory-solutions.png" alt="" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23014956/SK-hynix-and-Solidigms-advanced-storage-and-memory-solutions-2.png" /></p>
</div>
<div class="swiper-slide">
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23015009/SK-hynix-and-Solidigms-advanced-storage-and-memory-solutions-3.png" /></p>
</div>
</div>
<div class="swiper-button-next"></div>
<div class="swiper-button-prev"></div>
<div class="swiper-pagination"></div>
</div>
<p class="source">Figure 2. SK hynix and Solidigm display their advanced storage and memory solutions at HPE Discover 2023</p>
<p>&nbsp;</p>
<p>Visitors could also see SK hynix’s lineup of advanced memory solutions including: HBM3<sup>2</sup>, a memory product that has been highlighted recently due to the rise of generative AI; CXL<sup>3</sup> memory, an interconnect technology that enables efficient scaling of memory bandwidth and capacity; and PIM<sup>4</sup>, a next-generation memory chip with computing capabilities. SK hynix’s subsidiary company, Solidigm, also showcased its portfolio of products including its PCIe Gen4 NVMe<sup>5</sup>-based SSD.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>2</sup><strong>High Bandwidth Memory (HBM)</strong>: A high-value, high-performance product that possesses much higher data processing speeds compared to existing DRAMs by vertically connecting multiple DRAMs with through-silicon via (TSV).<br />
<sup>3</sup><strong>Compute Express Link (CXL)</strong>: A next-generation interconnect protocol based on PCIe that efficiently bolsters high-performance computing systems.<br />
<sup>4</sup><strong>Processing-In-Memory (PIM)</strong>: A next-generation technology that adds computational capabilities to semiconductor memories to solve congestion in data movement found in AI and big data processing.<br />
<sup>5</sup><strong>Non-Volatile Memory Express (NVMe)</strong>: A communication protocol for storage devices based on the PCIe interface. It can achieve speeds up to six times greater than traditional SATA interfaces, making it suitable for ultra-fast processing of large volumes of data.</p>
<p>&nbsp;</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-12050" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/06/23015105/Vice-President-Eui-cheol-Lim-of-SK-hynix%E2%80%99s-Solution-Development-division.png" alt="" width="1600" height="1022" /></p>
<p class="source">Figure 3. Vice President Eui-cheol Lim of SK hynix’s Solution Development division explains how PIM semiconductors will increase the efficiency of GPTs in the future</p>
<p>&nbsp;</p>
<p>SK hynix also held a session where its members spoke about the role and vision of memory solutions in the future. For his part of the presentation, Vice President Eui-cheol Lim of the Solution Development division gave a talk on how PIM semiconductors can increase the efficiency of Generative Pre-trained Transformers (GPTs). Technical Leaders Tai-jin Choi and Santosh Kumar of SK hynix America presented on trends in SSD storage technology for next-generation servers, and their colleague Technical Leader Yoosung Lee presented on how DDR5 is set to be the standard for next-generation DRAMs in the era of big data. All three presentations emphasized how memory solutions are essential in responding to the rapidly changing IT environment.</p>
<p>&#8220;Going forward, we plan to not only strengthen our partnerships with key customers but also showcase our unprecedented, next-generation memory solutions,&#8221; said Seok Kim, the head of GSM Strategy at SK hynix.</p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-showcases-data-center-memory-solutions-at-hpe-discover-2023/">SK hynix Showcases Data Center Memory Solutions at HPE Discover 2023</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>How SK hynix is Set to Power the Generative AI Revolution</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/how-sk-hynix-is-set-to-power-the-generative-ai-revolution/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Tue, 23 May 2023 06:00:52 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[SK hynix]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[HBM3]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[Generative AI]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=11709</guid>

					<description><![CDATA[]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="size-full wp-image-11738 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/22003547/SK-hynix_Generative-AI-Infographic_EN_013.gif" alt="" width="1000" height="1132" /><img loading="lazy" decoding="async" class="size-full wp-image-11736 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19040511/SK-hynix_Generative-AI-Infographic_EN_021.gif" alt="" width="1000" height="1044" /><img loading="lazy" decoding="async" class="size-full wp-image-11714 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021510/SK-hynix_Generative-AI-Infographic_EN_03.gif" alt="" width="1000" height="885" /><img loading="lazy" decoding="async" class="size-full wp-image-11715 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021552/SK-hynix_Generative-AI-Infographic_EN_04.gif" alt="" width="1000" height="810" /><img loading="lazy" decoding="async" class="size-full wp-image-11716 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021614/SK-hynix_Generative-AI-Infographic_EN_05.gif" alt="" width="1000" height="1078" /><img loading="lazy" decoding="async" class="size-full wp-image-11717 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021727/SK-hynix_Generative-AI-Infographic_EN_06.gif" alt="" width="1000" height="856" /><img loading="lazy" decoding="async" class="size-full wp-image-11718 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021745/SK-hynix_Generative-AI-Infographic_EN_07.gif" alt="" width="1000" height="765" /><img loading="lazy" decoding="async" class="size-full wp-image-11732 aligncenter" style="margin: 0;" 
src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19030037/SK-hynix_Generative-AI-Infographic_EN_0809.gif" alt="" width="1000" height="1434" /><img loading="lazy" decoding="async" class="size-full wp-image-11721 aligncenter" style="margin: 0;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/05/19021829/SK-hynix_Generative-AI-Infographic_EN_10.gif" alt="" width="1000" height="666" /></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/how-sk-hynix-is-set-to-power-the-generative-ai-revolution/">How SK hynix is Set  to Power the Generative AI Revolution</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix to Showcase Energy-Efficient, High-Performance Memory Products at CES 2023</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-showcase-energy-efficient-high-performance-memory-products-at-ces-2023/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Mon, 26 Dec 2022 23:30:48 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Press Release]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[HBM3]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[CXL]]></category>
		<category><![CDATA[PS1010]]></category>
		<category><![CDATA[CES2023]]></category>
		<category><![CDATA[Green Digital Solution]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=10586</guid>

					<description><![CDATA[<p>News Highlights Core and brand-new products introduced under the theme of “Green Digital Solution” Introduction of eSSD with ultrahigh-performance to solidify SK hynix’s leadership in server memory market Solution to solve customers’ pain point proposed Seoul, December 27, 2022 SK hynix Inc. (or “the company”, www.skhynix.com) announced today that it will showcase a number of [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-showcase-energy-efficient-high-performance-memory-products-at-ces-2023/">SK hynix to Showcase Energy-Efficient, High-Performance Memory Products at CES 2023</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<h3 class="tit">News Highlights</h3>
<ul style="color: #000; font-size: 18px; padding-left: 20px;">
<li>Core and brand-new products introduced under the theme of “Green Digital Solution”</li>
<li>Introduction of an ultrahigh-performance eSSD to solidify SK hynix’s leadership in the server memory market</li>
<li>Solutions proposed to solve customers’ pain points</li>
</ul>
<h3 class="tit">Seoul, December 27, 2022</h3>
<p>SK hynix Inc. (or “the company”, <a href="https://urldefense.com/v3/__https:/www.skhynix.com/eng/main.do__;!!N96JrnIq8IfO5w!mA80I9OXgyLho-eXDg2fttNQQBXKvVfOSZvkXNmFsgQDbCQq6zwGJB84bBRElqnJHAiFZkquLcIEPfIGPD46jqgrwXPETlQ$" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">www.skhynix.com</span></a>) announced today that it will showcase a number of its core and brand-new products at CES 2023, the world’s most influential tech event, taking place in Las Vegas from Jan 5<sup>th</sup> through Jan 8<sup>th</sup>.</p>
<p>The products, introduced under the theme of “Green Digital Solution” as part of the SK Group’s “Carbon-Free Future” campaign, are expected to attract Big Tech customers and experts, offering significantly improved performance and energy efficiency over the previous generation while lessening the impact on the environment.</p>
<p>Attention to energy-efficient memory chips has been rising as global tech companies pursue products that process data faster while consuming less energy. SK hynix is confident that the products it will display at CES 2023 will meet these customer needs with outstanding performance and performance per watt*.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">* Performance per watt: an indicator of how much computation is performed per watt of power consumed.</p>
<p>The core product put forward at the show is the PS1010 E3.S, an eSSD composed of multiple 176-layer 4D NAND chips that supports the fifth generation of the PCIe* interface.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">*PCIe (Peripheral Component Interconnect Express): a high-speed serial input/output interface used in the mainboard of digital devices. PCIe’s data-transfer speed doubles with each generation shift.</p>
<h3 class="tit"><img loading="lazy" decoding="async" class="size-full wp-image-10594 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/26093443/CES2023_SK-hynix-Products.png" alt="" width="1000" height="707" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/26093443/CES2023_SK-hynix-Products.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/26093443/CES2023_SK-hynix-Products-566x400.png 566w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/26093443/CES2023_SK-hynix-Products-768x543.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></h3>
<p>SK hynix said that the introduction of the PS1010, a combination of the company’s industry-leading technologies, is timely as the server chip market continues to grow despite the current industry downturn.</p>
<p>The PS1010 improves read and write speeds by 130% and 49%, respectively, compared with the previous generation. Its performance per watt is also improved by more than 75%, helping customers reduce server operating costs and carbon emissions.</p>
<p>“We’re proud to launch the PS1010, an ultrahigh-performance product with a self-developed controller and firmware, at CES 2023, the world’s largest technology show,” Yun Jae Yeun, Head of NAND Product Planning, said. “This product will solve the pain points of our server-chip customers while paving the way for stronger competitiveness in our NAND business.”</p>
<p>Other products to be introduced at the show are HBM3*, a memory product with the world’s best specifications for high-performance computing; GDDR6-AiM, which adopts PIM* technology; and CXL* memory, capable of flexibly expanding memory capacity and performance.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">* HBM (High Bandwidth Memory): High-value, high-performance memory that vertically interconnects multiple DRAM chips and dramatically increases data processing speed in comparison to traditional DRAM products.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">* PIM (Processing In Memory): A next-generation technology that provides a solution to data congestion issues in AI and big data processing by adding computational functions to semiconductor memory.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">* CXL (Compute Express Link): A next-generation, PCIe-based interconnect protocol that efficiently supports high-performance computing systems.</p>
<p>SK hynix will also present the immersion cooling* technology of SK enmove, which specializes in energy efficiency. The technology, designed to dissipate the heat servers generate during operation, marks a successful case of SK hynix cooperating with other SK companies and external business partners to create new value in the semiconductor business.</p>
<p style="font-size: 14px; font-style: italic; color: #555;">* Immersion Cooling: A next-generation thermal-management technology that cools servers by submerging them in cooling oil, reducing total electricity consumption by 30% compared with existing air-cooling technology.</p>
<h3 class="tit">About SK hynix Inc.</h3>
<p>SK hynix Inc., headquartered in Korea, is the world’s top tier semiconductor supplier offering Dynamic Random Access Memory chips (“DRAM”), flash memory chips (&#8220;NAND flash&#8221;) and CMOS Image Sensors (&#8220;CIS&#8221;) for a wide range of distinguished customers globally. The Company’s shares are traded on the Korea Exchange, and the Global Depository shares are listed on the Luxembourg Stock Exchange. Further information about SK hynix is available at <span style="text-decoration: underline;"><a href="https://urldefense.com/v3/__https:/www.skhynix.com/eng/main.do__;!!N96JrnIq8IfO5w!mA80I9OXgyLho-eXDg2fttNQQBXKvVfOSZvkXNmFsgQDbCQq6zwGJB84bBRElqnJHAiFZkquLcIEPfIGPD46jqgrwXPETlQ$" target="_blank" rel="noopener noreferrer">www.skhynix.com</a></span>, <span style="text-decoration: underline;"><a href="https://urldefense.com/v3/__https:/news.skhynix.com/__;!!N96JrnIq8IfO5w!mA80I9OXgyLho-eXDg2fttNQQBXKvVfOSZvkXNmFsgQDbCQq6zwGJB84bBRElqnJHAiFZkquLcIEPfIGPD46jqgroMl7UVQ$" target="_blank" rel="noopener noreferrer">news.skhynix.com</a></span>.</p>
<h3 class="tit">Media Contact</h3>
<p>SK hynix Inc.<br />
Global Public Relations</p>
<p><em>Technical Leader</em><br />
Kanga Kong<br />
E-Mail: <a href="mailto:global_newsroom@skhynix.com">global_newsroom@skhynix.com</a></p>
<p><em>Technical Leader</em><br />
Jaehwan Kevin Kim<br />
E-Mail: <a href="mailto:global_newsroom@skhynix.com">global_newsroom@skhynix.com</a></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-to-showcase-energy-efficient-high-performance-memory-products-at-ces-2023/">SK hynix to Showcase Energy-Efficient, High-Performance Memory Products at CES 2023</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 17 Jun 2022 07:00:57 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[ISSCC]]></category>
		<category><![CDATA[AI]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=9359</guid>

					<description><![CDATA[<p>When IBM-developed computer Watson beat out its human competitors on the quiz show Jeopardy in 2011, it was thought to be the beginning of the end of the superior reign of human intelligence. Watson brought discussions of AI to the mainstream. Its ability to apply machine learning to gather and analyze massive amounts of data [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/">Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="size-full wp-image-9360 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045057/SK-hynix_Let-PIM-Do-the-Learning_thumbnail.png" alt="" width="680" height="400" /></p>
<p>When IBM-developed computer Watson beat out its human competitors on the quiz show Jeopardy in 2011, it was thought to be the beginning of the end of the superior reign of human intelligence. Watson brought discussions of AI to the mainstream. Its ability to apply machine learning to gather and analyze massive amounts of data in a flash was something most thought exclusive to sci-fi.</p>
<p>Quintillions of bytes of data are now being generated each day, with the <a class="-as-ga" style="text-decoration: underline;" href="https://www.statista.com/statistics/871513/worldwide-data-created/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.statista.com/statistics/871513/worldwide-data-created/">amount of data generated by 2025</a> predicted to be 181 zettabytes. While this volume of data extends far beyond the realm of human consumption, cloud computing, faster processing, faster networks, and faster chips mean it can be processed and applied efficiently. AI isn’t a pipe dream &#8211; it’s a reality.</p>
<h3>From Synapses to Circuits</h3>
<p>Semiconductors supporting AI functions must make efficient use of space and support parallel processing of complex tasks. Enter Processing-In-Memory (PIM) chips. A PIM chip integrates a processor with Random Access Memory (RAM) on a single memory module. This structure removes the boundary between memory and system semiconductors, allowing data storage and data processing to happen in the same place.</p>
<p>By eliminating the need for data to traverse modules, response times are greatly improved, allowing for <a class="-as-ga" style="text-decoration: underline;" href="https://www.techtarget.com/searchbusinessanalytics/definition/processing-in-memory-PIM" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.techtarget.com/searchbusinessanalytics/definition/processing-in-memory-PIM">real-time data processing.</a> More traditional computer architectures, which manage processing and storage in separate modules, often fall prey to latency issues, commonly referred to as the von Neumann bottleneck. Adding processing functions to memory semiconductors presents a unique solution to overcome this long-standing problem.</p>
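As a loose, back-of-the-envelope illustration of the bottleneck described above, the toy model below compares total processing time when data must first cross a memory bus against computing where the data lives. All numbers and function names are assumptions made up for this sketch, not SK hynix figures or code.

```python
# Toy model of the von Neumann bottleneck (illustrative assumptions only).

def conventional_time(data_bytes, bus_gbps, compute_s):
    """Conventional architecture: data crosses the memory bus to the
    processor before compute can finish; transfer time adds to the total."""
    transfer_s = data_bytes * 8 / (bus_gbps * 1e9)
    return transfer_s + compute_s

def pim_time(compute_s, pim_overhead=1.2):
    """PIM-style architecture: compute happens where the data lives, so
    bulk transfer disappears. We assume the in-memory compute units are
    somewhat slower per operation (20% overhead, an assumed figure)."""
    return compute_s * pim_overhead

data = 8 * 2**30   # 8 GiB working set (assumed)
bus = 50           # 50 Gb/s effective bus bandwidth (assumed)
compute = 0.4      # seconds of pure compute (assumed)

print(f"conventional: {conventional_time(data, bus, compute):.2f} s")
print(f"in-memory:    {pim_time(compute):.2f} s")
```

Even with slower in-memory compute units, the in-memory path wins whenever the saved transfer time exceeds the compute overhead, which is exactly the regime of data-heavy AI workloads.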
<p>SK hynix <a class="-as-ga" style="text-decoration: underline;" href="https://news.skhynix.com/sk-hynix-develops-pim-next-generation-ai-accelerator/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://news.skhynix.com/sk-hynix-develops-pim-next-generation-ai-accelerator/">unveiled its next-generation PIM</a> in February 2022 at ISSCC in San Francisco. The GDDR6-AiM (Accelerator in Memory) adds computational functions to GDDR6 memory chips, allowing for data to be processed at speeds of up to 16 Gbps.</p>
<p>GDDR6-AiM is also more energy efficient, reducing power consumption by 80% by eliminating data movement to the CPU and GPU. Advancing technology in a manner that supports a greener and more equitable world is an integral part of SK hynix’s future vision. GDDR6-AiM can help reduce carbon emissions and shrink the carbon footprint of any technology it’s applied to, advancing <a class="-as-ga" style="text-decoration: underline;" href="https://www.skhynix.com/sustainability/UI-FR-SA1601/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.skhynix.com/sustainability/UI-FR-SA1601/">SK hynix’s ESG-related goals</a> and expanding its positive impact across its clients’ industries.</p>
<p>While particularly effective in managing the needs of AI-based systems, PIM can be applied to a broad spectrum of technologies. Databases, query engines, data grids, and more all require some version of data storage and processing coupled with custom applications leveraging a variety of inputs.</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-9361" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01.jpg" alt="" width="1000" height="614" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01.jpg 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01-651x400.jpg 651w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01-768x472.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source">The next generation of smart memory</p>
<h3>Machine Learning vs. Deep Learning</h3>
<p>Unbeknownst to many, artificial intelligence is a broad term that describes the science of creating machines that think like humans. The term machine learning refers to functionalities that enable computers to perform tasks without explicit programming, and it includes deep learning, a subset that relies on artificial neural networks.</p>
<p>Deep learning can be seen as the most independent AI system as it manages both <a class="-as-ga" style="text-decoration: underline;" href="https://www.computer.org/publications/tech-news/trends/deep-learning-vs-machine-learning-whats-the-difference" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.computer.org/publications/tech-news/trends/deep-learning-vs-machine-learning-whats-the-difference">feature input and classification.</a> These systems also require vast amounts of data and rely on parallel processes as their algorithms are primarily self-directed once trained.</p>
<p>AI machines, including deep learning models, are already a part of our lives. There are countless real-world AI applications, and their number only stands to increase. Everything from mobile devices to autonomous vehicles utilizes AI models for tasks like location-based recommendations, auto-braking, camera-based object classification, and navigation through complex environments.</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-9362" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning.png" alt="" width="1000" height="551" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning-680x375.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning-768x423.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source">The art of computationally mimicking human intelligence takes many forms</p>
<h3>Overcoming the Challenges</h3>
<p>The road to PIM development was not without detours, roadblocks, and congestion. As the technology continues to advance, there are still obstacles to surmount across design, manufacturing, cost, and more.</p>
<p>Designing PIM requires the application of novel approaches to chip structures. Traditional semiconductors do not need to accommodate near-memory queues or perform parallel functions in the way PIM chips do. At the manufacturing stage, space and distance considerations become paramount. It is crucial to reduce how far signals must travel without increasing cost or the risk of thermal issues.</p>
<p>Furthermore, integrated chips such as PIM have an increased dependency on memory – a unique feature that is both a blessing and a curse. Any damage to the memory components could result in compromised data.</p>
<p>With the AI market expected <a class="-as-ga" style="text-decoration: underline;" href="https://www.statista.com/statistics/607716/worldwide-artificial-intelligence-market-revenues/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.statista.com/statistics/607716/worldwide-artificial-intelligence-market-revenues/">to reach $190 billion by 2025,</a> the field is ripe for investment. According to a Boston Consulting Group and MIT Sloan Management Review study, <a class="-as-ga" style="text-decoration: underline;" href="https://www.forbes.com/sites/louiscolumbus/2017/09/10/how-artificial-intelligence-is-revolutionizing-business-in-2017/?sh=53667e385463" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.forbes.com/sites/louiscolumbus/2017/09/10/how-artificial-intelligence-is-revolutionizing-business-in-2017/?sh=53667e385463">83% of businesses</a> say AI is a strategic priority. SK hynix will continue to advance its expertise in the area and lead this growing sector in the years to come.</p>
<p><iframe loading="lazy" title="SK hynix GDDR6-AiM (Accelerator in memory)" width="1080" height="608" src="https://www.youtube.com/embed/rTULRWpbd1k?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/">Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
