<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Deep Learning - SK hynix Newsroom</title>
	<atom:link href="https://skhynix-news-global-stg.mock.pe.kr/tag/deep-learning/feed/" rel="self" type="application/rss+xml" />
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<description></description>
	<lastBuildDate>Mon, 07 Oct 2024 09:29:56 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.2</generator>

<image>
	<url>https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2019/10/29044430/152x152-100x100.png</url>
	<title>Deep Learning - SK hynix Newsroom</title>
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>[All About AI] The Origins, Evolution &#038; Future of AI</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/all-about-ai-the-origins-evolution-future-of-ai/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Mon, 14 Oct 2024 06:00:50 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[Generative AI]]></category>
		<category><![CDATA[All About AI]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=15942</guid>

					<description><![CDATA[<p>AI has revolutionized people’s lives. For those who want to gain a deeper understanding of AI and use the technology, the SK hynix Newsroom has created the All About AI series. This first episode covers the historical evolution of AI and explains how it became integrated into today’s world. &#160; AI-powered robots that walk, talk, [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/all-about-ai-the-origins-evolution-future-of-ai/">[All About AI] The Origins, Evolution & Future of AI</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<div style="border: none; background: #D9D9D9; height: auto; padding: 10px 20px; margin-bottom: 10px; color: #000;"><span style="color: #000000; font-size: 18px;">AI has revolutionized people’s lives. For those who want to gain a deeper understanding of AI and use the technology, the SK hynix Newsroom has created the All About AI series. This first episode covers the historical evolution of AI and explains how it became integrated into today’s world.</span></div>
<p>&nbsp;</p>
<p>AI-powered robots that walk, talk, and think like humans have long been a staple of sci-fi comics and movies. However, AI and robotics are no longer merely works of fiction—they have become a reality. Now that AI is here and transforming people’s lives, it is prudent to look back at AI’s origins and the milestones that have shaped the technology’s evolution, and to consider what the future might hold.</p>
<h3 class="tit">From the Turing Test to Machine Learning: AI’s Early Beginnings</h3>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15944 size-full" title="An overview of AI’s evolution through the decades from the 1950s to the 2020s" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053213/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_01.png" alt="An overview of AI’s evolution through the decades from the 1950s to the 2020s" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053213/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_01.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053213/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_01-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053213/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_01-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 1. An overview of AI’s evolution through the decades from the 1950s to the 2020s</p>
<p>&nbsp;</p>
<p>The birth of AI can be traced back to the 1950s. In 1950, British mathematician Alan Turing proposed that machines could “think,” introducing what is now known as the “Turing test” to evaluate this capability. This is widely recognized as the first study to present the concept of AI. In 1956, the Dartmouth Summer Research Project on Artificial Intelligence formally introduced the term “AI” to the wider world. Held in the U.S. state of New Hampshire, the conference fueled further debates on whether machines could learn and evolve like humans.</p>
<p>During the same decade, the development of artificial neural network<sup>1</sup> models marked a significant milestone in computing history. In 1957, U.S. neuropsychologist Frank Rosenblatt introduced the “perceptron” model<sup>2</sup>, empirically demonstrating that computers could learn to recognize patterns. This practical application built on the “neural network theory” developed in 1943 by neurophysiologists Warren McCulloch and Walter Pitts, who conceptualized nerve cell interactions as a simple computational model. Despite the high expectations these early breakthroughs raised, research in the field soon stagnated due to limitations in computing power, logical frameworks, and data availability.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup><strong>Neural network</strong>: A machine learning program, or model, that makes decisions in a manner similar to the human brain. It creates an adaptive system to make decisions and learn from mistakes.<br />
<sup>2</sup><strong>Perceptron</strong>: The simplest form of a neural network. It is a model of a single neuron that can be used for binary classification problems, enabling it to determine whether an input belongs to one class or another.</p>
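<p>As an illustrative sketch only (not code from the article), the single-neuron perceptron described above can be written in a few lines of Python; the OR task, learning rate, and epoch count here are hypothetical choices for demonstration:</p>

```python
# A minimal single-layer perceptron in the spirit of Rosenblatt's 1957 model.
# Illustrative sketch; the task, learning rate, and epochs are invented here.

def perceptron_train(samples, epochs=10, lr=0.1):
    """Learn weights for binary classification with the perceptron rule."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire (1) if the weighted sum exceeds 0
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            # Nudge each weight in proportion to the error and its input
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The OR problem is linearly separable, so a single perceptron can learn it
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = perceptron_train(data)
predictions = [1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
               for (x1, x2), _ in data]
print(predictions)  # matches the OR targets: [0, 1, 1, 1]
```

<p>The same update rule cannot learn a nonlinear boundary, which is exactly the limitation discussed in the next section.</p>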
<p>Then in the 1980s, “expert systems” emerged, which operated solely on human-defined rules. These systems could make automated decisions to perform tasks such as diagnosis, categorization, and analysis in practical fields such as medicine, law, and retail. However, expert systems of this period were limited by their reliance on rules set by humans and struggled to capture the complexities of the real world.</p>
<p>In the 1990s, AI evolved from following human commands to autonomously learning and discovering new rules by adopting machine learning algorithms. This became possible due to the advent of digital technology and the internet, which provided access to vast amounts of online data. At this point, AI was able to unearth new rules even humans could not discover. This period marked the start of renewed momentum for AI research, based on machine learning.</p>
<h3 class="tit">The Rise of Deep Learning: A Key Technology in AI’s Growth</h3>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15945 size-full" title="Timeline showing advances in artificial neural networks and deep learning" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053217/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_02.png" alt="Timeline showing advances in artificial neural networks and deep learning" width="1000" height="596" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053217/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_02.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053217/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_02-671x400.png 671w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053217/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_02-768x458.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 2. Timeline showing advances in artificial neural networks and deep learning</p>
<p>&nbsp;</p>
<p>While the 1990s presented opportunities for AI to grow, the evolution of AI has had its share of setbacks. In 1969, early artificial neural network research hit a roadblock when it was discovered that the perceptron model could not solve nonlinear problems<sup>3</sup>, leading to a prolonged downturn in the field. However, computer scientist Geoffrey Hinton, often hailed as the “godfather of deep learning,” breathed new life into artificial neural network research with his groundbreaking ideas.</p>
<p>For example, in 1986, Hinton applied the backpropagation<sup>4</sup> algorithm to a “multilayer perceptron” model, essentially an artificial neural network with multiple stacked layers, proving it could address the limitations of the initial perceptron model. This sparked a revival in artificial neural network research, but as the depth of the networks increased, issues began to emerge in the learning process and outcomes.</p>
<p>In 2006, Hinton introduced the “deep belief network (DBN),” which enhanced the performance of a multilayer perceptron, in his paper “A Fast Learning Algorithm for Deep Belief Nets.” By pre-training each layer through unsupervised learning<sup>5</sup> and then fine-tuning the entire network, the DBN significantly improved the speed and efficiency of neural network learning—which had previously been deemed an issue. This progress paved the way for future advancements in deep learning.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>3</sup>The initial perceptron model was a single-layer perceptron that could not solve nonlinear problems such as the XOR problem, which involves two input values; it outputs 0 if the two input values ​​are the same and 1 if they are different.<br />
<sup>4</sup><strong>Backpropagation</strong>: An algorithm used in neural networks to minimize errors by adjusting the weights. It works by calculating the difference between the predicted and actual values and then updating the weights in reverse order, starting from the output layer.<br />
<sup>5</sup><strong>Unsupervised Learning</strong>: A type of machine learning where the model is trained on input data without explicit labels or predefined outcomes. The goal is to discover and understand hidden structures and patterns within the data.</p>
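<p>To make the backpropagation idea concrete, the following is a hypothetical sketch (not the historical implementation) of a small multilayer perceptron trained on the XOR problem from footnote 3; the network size, learning rate, and epoch count are arbitrary demonstration choices:</p>

```python
import math
import random

# A tiny multilayer perceptron trained with backpropagation on XOR,
# the nonlinear problem a single-layer perceptron cannot solve.
# Illustrative sketch only; all hyperparameters are invented for the demo.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(42)
HIDDEN = 4  # a few hidden units make the nonlinear XOR boundary learnable
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(HIDDEN)]
b_h = [0.0] * HIDDEN
w_o = [random.uniform(-1, 1) for _ in range(HIDDEN)]
b_o = 0.0
lr = 0.5

xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x1, x2):
    h = [sigmoid(w_h[j][0] * x1 + w_h[j][1] * x2 + b_h[j]) for j in range(HIDDEN)]
    return h, sigmoid(sum(w_o[j] * h[j] for j in range(HIDDEN)) + b_o)

def total_error():
    return sum((forward(x1, x2)[1] - t) ** 2 for (x1, x2), t in xor_data)

error_before = total_error()
for _ in range(5000):
    for (x1, x2), target in xor_data:
        h, out = forward(x1, x2)
        # Backpropagation: push the output error backward, layer by layer
        d_out = (out - target) * out * (1 - out)
        d_h = [d_out * w_o[j] * h[j] * (1 - h[j]) for j in range(HIDDEN)]
        for j in range(HIDDEN):
            w_o[j] -= lr * d_out * h[j]
            w_h[j][0] -= lr * d_h[j] * x1
            w_h[j][1] -= lr * d_h[j] * x2
            b_h[j] -= lr * d_h[j]
        b_o -= lr * d_out
error_after = total_error()
print(error_before, "->", error_after)  # training drives the error down
```

<p>The key step is computing each weight’s share of the error by working backward from the output layer, which is what lets networks with hidden layers learn at all.</p>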
<p>In 2012, deep learning made a historic leap forward when Hinton’s team won the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) with their deep learning-based model, AlexNet. This triumph demonstrated deep learning’s immense power by recording an error rate of just 16.4%, surpassing the 25.8% of the previous year’s winner.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15946 size-full" title="An Overview of ILSVRC’s Image Recognition Error Rate by Year (Kien Nguyen, Arun Ross, Iris Recognition With Off-the-Shelf CNN Features: A Deep Learning Perspective, IEEE Access, Sept. 2017 p.3)" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053222/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_03.png" alt="An Overview of ILSVRC’s Image Recognition Error Rate by Year (Kien Nguyen, Arun Ross, Iris Recognition With Off-the-Shelf CNN Features: A Deep Learning Perspective, IEEE Access, Sept. 2017 p.3)" width="1000" height="623" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053222/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_03.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053222/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_03-642x400.png 642w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053222/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_03-768x478.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 3. An Overview of ILSVRC’s Image Recognition Error Rate by Year (Kien Nguyen, Arun Ross, <em>Iris Recognition With Off-the-Shelf CNN Features: A Deep Learning Perspective</em>, IEEE Access, Sept. 2017 p.3)</p>
<p>&nbsp;</p>
<p>Deep learning, a focal point of AI research, has grown rapidly since the 2010s for two primary reasons. First, advances in computer systems, including graphics processing units (GPUs), have driven AI development. Originally designed for graphics rendering, GPUs can process many similar, repetitive tasks in parallel, enabling them to process such workloads faster than central processing units (CPUs). In the 2010s, general-purpose computing on GPUs (GPGPU) emerged, allowing GPUs to be used for broader computational tasks beyond graphics rendering and to replace CPUs in some instances. GPU adoption has grown further as the chips have been used to train artificial neural networks, accelerating the development of deep learning: because deep learning must perform iterative computations over large datasets to extract features, it benefits directly from the parallel processing capability of GPUs.</p>
<p>Second, the expansion of data resources has fueled progress in deep learning. Training an artificial neural network requires vast amounts of data. In the past, data was primarily sourced from users manually inputting information into computers. However, the explosion of the internet and search engines in the 1990s exponentially increased the range of data available for processing. In the 2000s, the advent of technologies such as smartphones and the Internet of Things (IoT) contributed to the birth of the Big Data era, where real-time information flows from every corner of the globe. Deep learning algorithms use this large quantity of data for training, growing increasingly sophisticated. This data revolution has therefore set the stage for significant advancements in deep learning technology.</p>
<p style="text-align: center;"><iframe loading="lazy" src="https://www.youtube.com/embed/WXuK6gekU1Y?si=W_lg7iEWjh4bfGDb" width="810" height="455" frameborder="0" allowfullscreen="allowfullscreen"><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start"></span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" 
class="mce_SELRES_start"></span></iframe></p>
<p class="source" style="text-align: center;">Figure 4. Google DeepMind’s <em>AlphaGo </em><em>&#8211; The Movie </em>is a documentary film about the epic battle between AlphaGo and Lee Sedol on March 9, 2016</p>
<p>&nbsp;</p>
<p>By 2016, the evolution of AI reached a dramatic turning point with the development of AlphaGo, an advanced AI program created by Google DeepMind to play the board game Go. This extraordinary AI program captivated the world when it defeated Go grandmaster Lee Sedol by an impressive 4-1 score. Combining deep learning with reinforcement learning<sup>6</sup> and Monte Carlo tree search (MCTS)<sup>7</sup> algorithms, AlphaGo learned to mimic human intuition, predict moves, and strategize through tens of thousands of self-played games. AlphaGo’s victory over a legendary human player signaled the beginning of a new AI era.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>6</sup><strong>Reinforcement Learning</strong>: A type of machine learning where an AI agent learns to make decisions by interacting with an environment. The agent receives rewards or penalties based on its actions and aims to maximize cumulative rewards over time by optimizing its strategy.<br />
<sup>7</sup><strong>Monte Carlo tree search (MCTS)</strong>: A search algorithm that uses repeated random simulations to numerically estimate the value of possible actions. It structures the possible actions of the current situation into a search tree and uses the simulation results to weigh the pros and cons of each branch, ultimately determining the optimal course of action.</p>
<h3 class="tit">ChatGPT: The Catalyst for the Generative AI Boom</h3>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15947 size-full" title="Generative AI explained through key AI subsets" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053226/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_04.png" alt="Generative AI explained through key AI subsets" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053226/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_04.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053226/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_04-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053226/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_04-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 5. Generative AI explained through key AI subsets</p>
<p>&nbsp;</p>
<p>At the close of 2022, humanity stood on the brink of a transformative leap with AI technology. OpenAI unveiled ChatGPT, powered by a type of LLM<sup>8</sup> known as generative pre-trained transformer (GPT) 3.5, marking the dawn of the generative AI era. Most notably, this leap propelled AI into the creative realm, a domain once considered uniquely human. Now, generative AI can produce high-quality content across diverse formats, moving beyond traditional deep learning, which merely predicts or classifies data. Instead, generative AI, using LLMs or various image generation models such as variational autoencoders (VAEs), generative adversarial networks (GANs), and diffusion models, creates original results tailored to user needs.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>8</sup><strong>Large language model (LLM)</strong>: Deep learning algorithms that perform a variety of natural language processing tasks by leveraging extensive data.</p>
<p>To provide clearer context for the evolution of generative AI, it is essential to examine its origins and key developments. The roots of generative AI trace back to 2014, when American researcher Ian Goodfellow introduced GANs. In a GAN, two neural networks engage in a continuous duel: one generates new data modeled on a dataset, while the other compares this new data to the original dataset to determine its authenticity. Through this iterative process, GANs produce increasingly refined and sophisticated outputs. Over time, researchers have enhanced and expanded upon this model, leading to its widespread use in applications such as image generation and transformation.</p>
<p>In 2017, the natural language processing (NLP)<sup>9</sup> model known as the “transformer” was introduced. This model treats the relationships between pieces of data as key variables. By focusing attention on the most relevant information, transformers can learn complex patterns and relationships within data, capturing essential details to produce higher-quality results. This advancement transformed NLP tasks such as language comprehension, machine translation, and conversational systems, leading to the development of LLMs such as the aforementioned GPT.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>9</sup><strong>Natural language processing (NLP)</strong>: A subfield of AI that uses algorithms to analyze and process natural language data. By examining syntactic structures, semantic relationships, and contextual patterns, NLP systems can perform tasks such as language translation.</p>
<p>First released in 2018, GPT models have rapidly advanced in performance by expanding their parameters and training data every year. By 2022, OpenAI’s chatbot ChatGPT, powered by GPT-3.5, completely changed the paradigm of AI. ChatGPT, with its exceptional ability to understand user context, deliver relevant responses, and handle diverse queries, quickly gained traction. <a href="https://www.statista.com/chart/29174/time-to-one-million-users/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">Within a week of its launch, it drew over 1 million users</span></a> and attracted <a href="https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">more than 100 million active users within two months</span></a>.</p>
<p>The rapid advancements in AI culminated in a major technological leap forward in 2023 with the launch of GPT-4 by OpenAI. This new model is built on a dataset roughly 500 times larger than that of GPT-3.5. GPT-4, now considered a Large Multimodal Model (LMM)<sup>10</sup>, can simultaneously process diverse formats of input data, including images, audio, and video, expanding far beyond its text-only predecessors. In 2024, OpenAI introduced GPT-4o, an enhanced model offering faster, more efficient processing of text, voice, and images. Capitalizing on the generative AI boom triggered by ChatGPT, companies have rolled out diverse services. For example, Google’s Gemini can simultaneously recognize and understand text, images, and audio; Meta’s SAM accurately identifies and isolates objects in images; and OpenAI’s Sora generates videos from text prompts.</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>10</sup><strong>Large Multimodal Model (LMM)</strong>: A deep learning algorithm that can handle many types of data, including images, audio, and more, not just text.</p>
<p>The generative AI market is only beginning to unleash its potential. According to a <a href="https://www.idc.com/getdoc.jsp?containerId=prUS51572023" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">report from the global market research firm International Data Corporation (IDC)</span></a>, the market is set to be worth 40.1 billion USD in 2024—2.7 times larger than the previous year. Looking ahead, the market is expected to continue its growth each year and reach 151.1 billion USD by 2027. As generative AI evolves, its influence will extend beyond software to various formats including hardware and internet services. The world can expect a leap in capabilities and a push towards greater accessibility, making cutting-edge AI technology available to an ever-growing audience.</p>
<h3 class="tit">AI’s Impact on Revolutionizing Today and Redefining Tomorrow</h3>
<p>Just as Google search revolutionized the early 2000s and mobile social media reshaped the 2010s, AI is now driving transformative changes across society. The pace of this technological advancement is unprecedented, and the challenges and concerns of humanity are growing along with it.</p>
<p>So what is the “next generative AI”? The most notable technology today is perhaps on-device AI. Unlike traditional AI, which relies on large cloud servers that exchange data with edge devices, on-device AI operates directly on electronic devices such as smartphones and PCs through integrated AI chipsets and small LLMs (sLLMs). This shift promises to enhance security, conserve resources, and deliver more personalized AI experiences.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15948 size-full" title="Cloud-based AI vs on-device AI structures" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053230/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_05.png" alt="Cloud-based AI vs on-device AI structures" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053230/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_05.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053230/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_05-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053230/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_05-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 6. Cloud-based AI vs on-device AI structures</p>
<p>&nbsp;</p>
<p>AI will seamlessly integrate into an increasing number of devices, continuously evolving in form and function. Thus, innovations that once seemed like science fiction are becoming reality. For instance, in 2023, U.S. startup Humane launched the AI Pin, a wearable device with a laser-ink display that projects a menu onto the user’s palm. At CES 2024, Rabbit&#8217;s R1 and Brilliant Labs’ Frame showcased their own cutting-edge wearable AI technology. Meanwhile, mixed reality (MR) headsets, like Apple’s Vision Pro and Meta’s Quest, are pushing beyond traditional virtual reality (VR) and metaverse experiences, opening up new markets.</p>
<p>However, as technology races forward, it not only creates new opportunities but also brings about social challenges. The rapid rise of AI has sparked concerns about society’s ability to keep up with these advancements. In particular, AI’s potential misuse and its impact on real-world issues have heightened these fears. Sophisticated AI-generated content, such as deepfake videos and manipulated images, creates fake news and disrupts society. Recently, concerns about fake content have intensified in many countries ahead of major elections, including the 2024 U.S. presidential election.</p>
<p><img loading="lazy" decoding="async" class="aligncenter wp-image-15949 size-full" title="Social anxiety and disruption due to deepfake technology portrayed by DALL-E, a generative AI platform" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053239/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_06.png" alt="Social anxiety and disruption due to deepfake technology portrayed by DALL-E, a generative AI platform" width="1000" height="563" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053239/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_06.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053239/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_06-680x383.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/10/07053239/SK-hynix_All-About-AI_The-Origins-Evolution-Future-of-AI_06-768x432.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source" style="text-align: center;">Figure 7. Social anxiety and disruption due to deepfake technology portrayed by DALL-E, a generative AI platform</p>
<p>&nbsp;</p>
<p>There are also risks associated with the development and use of AI. As generative AI models are trained on publicly available web content that has been crawled and merged, there are concerns about plagiarism. Moreover, copyright disputes can arise when similar prompts produce similar content from the same generative AI program. The potential for AI to shift from enhancing productivity to replacing jobs and disrupting the labor market also presents a troubling reality for some.</p>
<p>AI has created a world beyond human imagination. As this new world unfolds, it is crucial to prepare for the changes ahead through thoughtful planning and social discussion. Such preparation first requires a deep understanding of AI’s potential and implications, which will be provided throughout the All About AI series.</p>
<p>&nbsp;</p>
<p><a href="https://linkedin.com/showcase/skhynix-news-and-stories/" target="_blank" rel="noopener noreferrer"><img loading="lazy" decoding="async" class="size-full wp-image-15776 aligncenter" src=" https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1.png" alt="" width="800" height="135" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-680x115.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2024/09/13015412/SK-hynix_Newsroom-banner_1-768x130.png 768w" sizes="(max-width: 800px) 100vw, 800px" /></a></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/all-about-ai-the-origins-evolution-future-of-ai/">[All About AI] The Origins, Evolution & Future of AI</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Fri, 17 Jun 2022 07:00:57 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Machine Learning]]></category>
		<category><![CDATA[PIM]]></category>
		<category><![CDATA[GDDR6-AiM]]></category>
		<category><![CDATA[Deep Learning]]></category>
		<category><![CDATA[ISSCC]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=9359</guid>

					<description><![CDATA[<p>When IBM-developed computer Watson beat out its human competitors on the quiz show Jeopardy in 2011, it was thought to be the beginning of the end of the superior reign of human intelligence. Watson brought discussions of AI to the mainstream. Its ability to apply machine learning to gather and analyze massive amounts of data [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/">Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="size-full wp-image-9360 aligncenter" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045057/SK-hynix_Let-PIM-Do-the-Learning_thumbnail.png" alt="" width="680" height="400" /></p>
<p>When Watson, an IBM-developed computer, beat its human competitors on the quiz show Jeopardy! in 2011, it was seen by some as the beginning of the end of human intelligence’s reign of superiority. Watson brought discussions of AI into the mainstream. Its ability to apply machine learning to gather and analyze massive amounts of data in a flash was something most had thought existed only in science fiction.</p>
<p>Quintillions of bytes of data are now being generated each day, with the <a class="-as-ga" style="text-decoration: underline;" href="https://www.statista.com/statistics/871513/worldwide-data-created/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.statista.com/statistics/871513/worldwide-data-created/">amount of data generated annually</a> predicted to reach 181 zettabytes by 2025. While this volume of data extends far beyond what humans can consume, cloud computing, faster processing, faster networks, and faster chips mean it can be processed and applied efficiently. AI isn’t a pipe dream &#8211; it’s a reality.</p>
<h3>From Synapses to Circuits</h3>
<p>Semiconductors supporting AI functions must make the most of limited space and enable parallel processing for complex tasks. Enter Processing in Memory (PIM) chips. A PIM chip integrates a processor with Random Access Memory (RAM) on a single memory module. This structure removes the boundary between memory and system semiconductors, allowing data storage and data processing to happen in the same place.</p>
<p>By eliminating the need for data to travel between modules, PIM greatly improves response times, allowing for <a class="-as-ga" style="text-decoration: underline;" href="https://www.techtarget.com/searchbusinessanalytics/definition/processing-in-memory-PIM" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.techtarget.com/searchbusinessanalytics/definition/processing-in-memory-PIM">real-time data processing.</a> More traditional computer architectures, which handle processing and storage in separate modules, often fall prey to latency issues commonly referred to as the von Neumann bottleneck. Adding processing functions to memory semiconductors offers a direct solution to this long-standing problem.</p>
<p>SK hynix <a class="-as-ga" style="text-decoration: underline;" href="https://news.skhynix.com/sk-hynix-develops-pim-next-generation-ai-accelerator/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://news.skhynix.com/sk-hynix-develops-pim-next-generation-ai-accelerator/">unveiled its next-generation PIM</a> in February 2022 at ISSCC in San Francisco. The GDDR6-AiM (Accelerator in Memory) adds computational functions to GDDR6 memory chips, allowing for data to be processed at speeds of up to 16 Gbps.</p>
<p>GDDR6-AiM is also more energy efficient, reducing power consumption by 80% by cutting the movement of data to the CPU and GPU. Advancing technology in a manner that supports a greener and more equitable world is an integral part of SK hynix’s vision for the future. GDDR6-AiM can help reduce carbon emissions and shrink the carbon footprint of any technology it’s applied to, advancing <a class="-as-ga" style="text-decoration: underline;" href="https://www.skhynix.com/sustainability/UI-FR-SA1601/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.skhynix.com/sustainability/UI-FR-SA1601/">SK hynix’s ESG-related goals</a> and expanding its positive impact across its clients’ industries.</p>
<p>While particularly effective in meeting the needs of AI-based systems, PIM can be applied to a broad spectrum of technologies. Databases, query engines, data grids, and more all pair data storage and processing with custom applications that draw on a variety of inputs.</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-9361" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01.jpg" alt="" width="1000" height="614" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01.jpg 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01-651x400.jpg 651w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045059/GDDR6-AiM_01-768x472.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source">The next generation of smart memory</p>
<h3>Machine Learning vs. Deep Learning</h3>
<p>Unbeknownst to many, artificial intelligence is a broad term that describes the science of creating machines that think like humans. Machine learning refers to techniques that enable computers to perform tasks without explicit programming, and it includes deep learning, a subset that relies on artificial neural networks.</p>
<p>Deep learning can be seen as the most independent form of AI, as it manages both <a class="-as-ga" style="text-decoration: underline;" href="https://www.computer.org/publications/tech-news/trends/deep-learning-vs-machine-learning-whats-the-difference" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.computer.org/publications/tech-news/trends/deep-learning-vs-machine-learning-whats-the-difference">feature extraction and classification.</a> These systems also require vast amounts of data and rely on parallel processing, as their algorithms are largely self-directed once trained.</p>
<p>AI machines, including deep learning models, are already a part of our lives. There are countless real-world AI applications, and their number only stands to grow. Everything from mobile devices to autonomous vehicles utilizes AI models for tasks like location-based recommendations, auto-braking, camera-based object classification, and navigation through complex environments.</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-9362" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning.png" alt="" width="1000" height="551" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning-680x375.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/06/15045102/SK-hynix_Let-PIM-Do-the-Learning-768x423.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p>
<p class="source">The art of computationally mimicking human intelligence takes many forms</p>
<h3>Overcoming the Challenges</h3>
<p>The road to PIM development was not without detours, roadblocks, and congestion. As the technology continues to advance, there are still obstacles to surmount across design, manufacturing, cost, and more.</p>
<p>Designing PIM requires novel approaches to chip structures, as traditional semiconductors do not need to accommodate near-memory queues or perform parallel functions in the way PIM chips do. At the manufacturing stage, space and distance considerations become paramount: it is crucial to reduce how far signals must travel without increasing cost or the risk of thermal issues.</p>
<p>Furthermore, integrated chips such as PIM have an increased dependency on memory – a unique feature that is both a blessing and a curse. Any damage to the memory components could result in compromised data.</p>
<p>With the AI market expected <a class="-as-ga" style="text-decoration: underline;" href="https://www.statista.com/statistics/607716/worldwide-artificial-intelligence-market-revenues/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.statista.com/statistics/607716/worldwide-artificial-intelligence-market-revenues/">to reach $190 billion by 2025,</a> the time is ripe for investment in AI. According to a Boston Consulting Group and MIT Sloan Management Review study, <a class="-as-ga" style="text-decoration: underline;" href="https://www.forbes.com/sites/louiscolumbus/2017/09/10/how-artificial-intelligence-is-revolutionizing-business-in-2017/?sh=53667e385463" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.forbes.com/sites/louiscolumbus/2017/09/10/how-artificial-intelligence-is-revolutionizing-business-in-2017/?sh=53667e385463">83% of businesses</a> say AI is a strategic priority. SK hynix will continue to advance its expertise in the area and lead this growing sector in the years to come.</p>
<p><iframe loading="lazy" title="SK hynix GDDR6-AiM (Accelerator in memory)" width="1080" height="608" src="https://www.youtube.com/embed/rTULRWpbd1k?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/let-pim-do-the-learning-the-brainpower-behind-the-ai-memory-chip/">Let PIM Do the Learning: The Brainpower Behind the AI Memory Chip</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
