<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>CIS - SK hynix Newsroom</title>
	<atom:link href="https://skhynix-news-global-stg.mock.pe.kr/tag/cis/feed/" rel="self" type="application/rss+xml" />
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<description></description>
	<lastBuildDate>Tue, 05 Dec 2023 13:04:43 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.2</generator>

<image>
	<url>https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2019/10/29044430/152x152-100x100.png</url>
	<title>CIS - SK hynix Newsroom</title>
	<link>https://skhynix-news-global-stg.mock.pe.kr</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>SK hynix&#8217;s Evolution in CIS HDR Technology and Future Outlook</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-evolution-in-cis-hdr-technology/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Tue, 26 Sep 2023 06:00:34 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[HDR]]></category>
		<category><![CDATA[QHDR]]></category>
		<category><![CDATA[sHDR]]></category>
		<category><![CDATA[iDCG-HDR]]></category>
		<category><![CDATA[DAG-HDR]]></category>
		<category><![CDATA[analog gain]]></category>
		<category><![CDATA[dynamic range]]></category>
		<category><![CDATA[quad-sensor]]></category>
		<category><![CDATA[conversion gain]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=12770</guid>

					<description><![CDATA[<p>While smartphone cameras have improved over the years, they still lag behind DSLR cameras in terms of low-light image quality and dynamic range performance. To address this issue, SK hynix has been working on developing high dynamic range (HDR) technology to allow users to experience better smartphone cameras.  In this insightful EE Times article, Suram Cha, [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-evolution-in-cis-hdr-technology/">SK hynix’s Evolution in CIS HDR Technology and Future Outlook</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>While smartphone cameras have improved over the years, they still lag behind DSLR cameras in terms of low-light image quality and dynamic range performance. To address this issue, SK hynix has been working on developing high dynamic range (HDR) technology to allow users to experience better smartphone cameras.</p>
<p>In this insightful EE Times article, Suram Cha, Technical Leader of the Next Gen Biz Team at SK hynix, delves into the company’s pioneering work in HDR technology, which has enhanced smartphone camera capabilities and provided an elevated user experience.</p>
<p>The evolution of SK hynix&#8217;s HDR technology began with quad-sensor HDR (QHDR) in 2017, which used multiple exposure time-based HDR technology. This was followed by staggered HDR (sHDR) in 2018, which optimized image quality and dynamic range by collaborating between the sensor and the application processor. In 2021&#8211;2022, SK hynix developed intra-scene dual conversion gain (iDCG)-HDR technology, which merged high and low conversion gain images to expand the dynamic range. In 2023, the company introduced dual analog gain (DAG)-HDR, which achieved the effect of HDR by capturing frames with two different analog gains. Finally, SK hynix combined DAG and iDCG HDR to create a technology that minimizes motion artifacts while improving dynamic range under various shooting conditions.</p>
<p>SK hynix&#8217;s commitment to advancing HDR technology reflects its dedication to empowering all smartphone users with the full potential of HDR to deliver high-quality photos in any lighting condition.</p>
<p>To learn more about SK hynix&#8217;s evolution in CIS HDR technology and the company’s future outlook, read the full EE Times article here: <a href="https://www.eetimes.com/1409388-2/" target="_blank" rel="noopener noreferrer"><span style="text-decoration: underline;">SK hynix&#8217;s Evolution in CIS HDR Technology and Future Outlook</span></a></p>
<p>&nbsp;</p>
<p><img decoding="async" class="aligncenter wp-image-12773 size-full" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/08024940/Sk-hynix_CIS-HDR-Technology_profile-banner.png" alt="By Suram Cha, Technical Leader of Next Gen Biz Team, SK hynix" width="1000" height="170" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/08024940/Sk-hynix_CIS-HDR-Technology_profile-banner.png 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/08024940/Sk-hynix_CIS-HDR-Technology_profile-banner-680x116.png 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2023/09/08024940/Sk-hynix_CIS-HDR-Technology_profile-banner-768x131.png 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-evolution-in-cis-hdr-technology/">SK hynix’s Evolution in CIS HDR Technology and Future Outlook</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix&#8217;s BSI Technology a Leading Light in the Global Mobile Market</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynixs-bsi-technology-a-leading-light-in-the-global-mobile-market/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 22 Dec 2022 06:00:13 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[BSI]]></category>
		<category><![CDATA[semiconductor]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[CMOS Image sensor]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=10446</guid>

					<description><![CDATA[<p>As a technology innovator that always strives to be at the forefront of the latest advancements, SK hynix has been developing CIS (CMOS Image Sensor) products for the past 15 years. A CIS is a sensor that converts the color and brightness of light into an electrical signal before transmitting it to a processing unit, [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynixs-bsi-technology-a-leading-light-in-the-global-mobile-market/">SK hynix’s BSI Technology a Leading Light in the Global Mobile Market</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>As a technology innovator that always strives to be at the forefront of the latest advancements, SK hynix has been developing CIS (CMOS Image Sensor) products for the past 15 years. A CIS is a sensor that converts the color and brightness of light into an electrical signal before transmitting it to a processing unit, essentially acting as the eyes of mobile devices such as smartphones and tablets. Today, SK hynix is able to develop ultra-high resolution CIS products thanks largely to the application of its Backside Illumination (BSI)-based pixel technology.</p>
<p>Previously, CIS products relied on Frontside Illumination (FSI) technology, which was vulnerable to diffraction and impacted image quality. However, the introduction of BSI proved to be a transformative development for CIS technology, as it resolved the issues of FSI and led to the development of next-generation CIS products.</p>
<p>In this EE Times article, Technical Leader Kyoung-in Lee from SK hynix explains in detail how developments in BSI technology led to significant upgrades in image sensors. These developments include smaller pixel sizes, high-resolution products with millions of pixels, the prevention of light diffraction, and an increase in the quantum efficiency (QE) of products—which makes it possible to display bright images even in low-light conditions.</p>
<p>SK hynix is continuously evolving its BSI technology. The company first introduced basic element technologies including Backside Deep Trench Isolation (BDTI) and Air Grid—which increase the QE of products. More recently, the company succeeded in developing hybrid bonding technology that applies &#8216;Cu-to-Cu bonding&#8217; to stacked sensors based on TSV (Through Silicon Via), laying the foundation for increased competitiveness in chip size and expansion of multilayer wafer bonding technology.</p>
<p>Furthermore, these technological achievements are expected to contribute to the expansion of the market by being utilized in the development of various sensors that support AI, medical devices, AR, and VR in the future.</p>
<p>To learn more about BSI technology, check out the full article on EE Times: <span style="text-decoration: underline;"><a href="https://www.eetimes.com/sk-hynix-bsi-technology-a-leading-light-in-the-global-mobile-market/" target="_blank" rel="noopener noreferrer">SK hynix&#8217;s BSI Technology a Leading Light in the Global Mobile Market</a></span></p>
<p>&nbsp;</p>
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-10447" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/21045825/SK-hynix_BSI-Technology-in-CIS_08_byline-banner.jpg" alt="" width="1000" height="170" srcset="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/21045825/SK-hynix_BSI-Technology-in-CIS_08_byline-banner.jpg 1000w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/21045825/SK-hynix_BSI-Technology-in-CIS_08_byline-banner-680x116.jpg 680w, https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/12/21045825/SK-hynix_BSI-Technology-in-CIS_08_byline-banner-768x131.jpg 768w" sizes="(max-width: 1000px) 100vw, 1000px" /></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynixs-bsi-technology-a-leading-light-in-the-global-mobile-market/">SK hynix’s BSI Technology a Leading Light in the Global Mobile Market</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Future Era of Robotics and Metaverse Pioneered by SK hynix ToF Technology</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/future-era-of-robotics-and-metaverse-pioneered-by-sk-hynix-tof-technology/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 03 Mar 2022 07:00:31 +0000</pubDate>
				<category><![CDATA[featured]]></category>
		<category><![CDATA[Opinion]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[ToF]]></category>
		<category><![CDATA[VFM]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=8577</guid>

					<description><![CDATA[<p>The Star Wars movies that featured the Jedi who fought against the forces of evil despite unfavorable circumstances, the courage and sacrifice of the resistance, and the victory earned through a brilliant strategy in the end moved the hearts of many for a very long time. The actions of droids including R2-D2, C-3PO, and BB-8, [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/future-era-of-robotics-and-metaverse-pioneered-by-sk-hynix-tof-technology/">Future Era of Robotics and Metaverse Pioneered by SK hynix ToF Technology</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>The Star Wars movies, featuring the Jedi who fought the forces of evil despite unfavorable circumstances, the courage and sacrifice of the resistance, and the victory earned in the end through brilliant strategy, have moved the hearts of many for a very long time. The droids in particular, including R2-D2, C-3PO, and BB-8, along with the highlight of the films &#8211; the lightsaber duels &#8211; were impressive, and Star Wars could not have produced such an amazing ending without its robots.</p>
<p>The hottest topics at CES 2022<sup>1)</sup> were robotics and the metaverse. Considering the implications of CES, we will soon witness an era in which every household has at least one robot that looks like it came straight out of a sci-fi movie like Star Wars. We can already see machines in non-human form performing roles on our behalf, such as <a class="-as-ga" style="text-decoration: underline;" href="https://www.reuters.com/technology/lighter-robots-hi-tech-routing-ocado-innovates-deliver-growth-2022-01-26/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.reuters.com/technology/lighter-robots-hi-tech-routing-ocado-innovates-deliver-growth-2022-01-26/">delivery robots<sup>2)</sup></a>, autonomous cars, robot vacuums, and drones in the skies.</p>
<p>On the other hand, the COVID-19 pandemic, which accelerated contactless services, brought an exponential increase in the popularity of and demand for metaverse services that blur the boundary between the virtual world and reality. Many are turning toward AR/VR<sup>3)</sup> technologies. Soon we may all carry AR and VR devices everywhere, just as we carry smartphones, opening an era in which we can access services from anywhere without visiting a bank, or manufacture and maintain products without setting foot in a factory.</p>
<p><!-- Upload image at the specified size --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083625/220302_fig1.png" alt="" /></p>
<p class="source">Fig 1. Ocado delivery robots (Source: Reuters)</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083625/220302_fig1.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p><!-- // Upload image at the specified size --></p>
<h3 class="tit">The Machine’s Eye (Machine Vision)</h3>
<p>Complementary metal-oxide semiconductor (CMOS) image sensor (CIS) technology serves as the eyes of various devices, including smartphones, backed by stunning advancements in semiconductor processing and image signal processing (ISP) technologies as well as lower prices, outstanding resolution, and high performance. The competition over pixel count, a key determinant of a camera’s performance, has now reached 600 million pixels, exceeding the resolution of the human eye.</p>
<p>But does high resolution automatically make a sensor appropriate for machine vision? Even the clearest 2D image data is not enough for the eye of a cutting-edge machine responsible for safety and security when performing a human’s job. It may not be a droid like R2-D2 taking part in a tactical operation, but machines such as self-driving cars and drones, which must find the exact moment to brake during high-speed driving, facial recognition systems that must scan actual people rather than images, and AR devices that realize augmented reality by scanning large spaces in real time all require not only 2D image data but 3D data as well. A machine could obtain 3D data without a camera through a complex computation process using ancillary tools such as ultrasonic or laser devices. But a machine loaded with additional parts will be rejected by consumers on both design and price.</p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083630/220302_fig2.png" alt="" /></p>
<p class="source">Fig 2. Necessary features of a machine’s eye</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083630/220302_fig2.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>People recognize objects in three dimensions and perceive depth using two eyes and a brain. With a similar mechanism, machines can recognize objects in three dimensions and measure distance by applying triangulation, as in stereo vision, which uses two cameras and a processor. However, stereo vision has disadvantages, including complex calculation, poor accuracy when measuring the distance of a flat surface, and low accuracy in relatively dark places, which narrow its scope of application. Time-of-flight (ToF) has recently emerged as an alternative that overcomes these disadvantages. ToF measures distance by the time it takes light to reflect off an object and return. It is simple and fast to process, and because it uses its own light source, it can measure distance accurately regardless of the ambient lighting.</p>
<p>ToF: To measure round-trip time of emitted light for acquiring the distance</p>
<p>Stereo Vision: Two optical systems observe the same target from two different points with respect to the same baseline</p>
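<p>The two depth-sensing principles defined above can be sketched in a few lines. This is an illustrative sketch, not SK hynix code; the function names and example numbers are assumptions chosen to show the arithmetic.</p>

```python
# Illustrative comparison of the two depth-sensing principles described above.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """ToF: light travels to the object and back, so
    distance = (speed of light * round-trip time) / 2."""
    return C * round_trip_time_s / 2.0

def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Stereo vision: triangulate depth from the pixel disparity between
    two cameras separated by a known baseline (pinhole-camera model)."""
    return focal_px * baseline_m / disparity_px

# A light pulse returning after ~13.3 ns corresponds to roughly 2 m.
print(tof_distance(13.3e-9))               # ~1.99 m
# Cameras 5 cm apart, 800 px focal length, 20 px disparity -> 2.0 m.
print(stereo_distance(800.0, 0.05, 20.0))  # 2.0 m
```

<p>Note how the stereo estimate degrades as disparity shrinks (distant or textureless surfaces), while the ToF estimate depends only on timing the light itself, matching the trade-offs described above.</p>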
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083635/220302_fig3.png" alt="" /></p>
<p class="source">Fig 3. Comparison of how Stereo Vision and ToF recognize objects</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083635/220302_fig3.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<h3 class="tit">Time-of-Flight</h3>
<p>ToF can be categorized into direct ToF (d-ToF), which measures distance based on the time it takes light to return after reflecting off an object, and indirect ToF (i-ToF), which calculates distance from the phase shift of the returning light. SK hynix develops both ToF technologies so they can be utilized across various products. Who knows: maybe future robots will have one eye that uses i-ToF to recognize objects at close range, while the other uses d-ToF to explore long distances.</p>
<p>This article aims to shed light on the i-ToF technology of SK hynix.</p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083641/220302_fig4.png" alt="" /></p>
<p class="source">Fig 4. Comparative analysis of indirect ToF and direct ToF</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083641/220302_fig4.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>The i-ToF method measures distance by calculating the phase difference from the light source based on the proportion of electric charge accumulated in two or more storages inside one pixel [1, 2]. With this mechanism, its measurable distance is more limited than d-ToF’s, because light returning from far away is weaker and leaves less signal to separate. However, it delivers higher resolution than d-ToF: since each pixel separates the signals itself, the circuit is simple and the pixels are easy to shrink. To make up for this limitation and maximize the advantage, many studies are under way on improving the signal-to-noise ratio (SNR), increasing the QE for the IR light source, and removing background light (BGL).</p>
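<p>The charge-ratio mechanism described above can be sketched for the common four-phase sampling scheme, in which charge is accumulated at 0, 90, 180, and 270 degrees of the modulation cycle. This is a generic textbook sketch, not SK hynix’s implementation; the charge values and 20 MHz modulation frequency are illustrative assumptions.</p>

```python
# Generic four-phase i-ToF demodulation sketch: the ratio of charges
# accumulated in the pixel's storages encodes the phase shift of the
# returning light, and the phase shift encodes the distance.
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(q0: float, q90: float, q180: float, q270: float,
                  f_mod_hz: float) -> float:
    # Phase shift of the returning light relative to the emitted modulation
    phase = math.atan2(q270 - q90, q0 - q180) % (2 * math.pi)
    # A full 2*pi of phase corresponds to the unambiguous range c / (2 * f_mod)
    return C * phase / (4 * math.pi * f_mod_hz)

# Quarter-cycle phase shift (q0 == q180, q270 > q90) at 20 MHz modulation:
print(itof_distance(100, 50, 100, 150, 20e6))  # ~1.87 m
```

<p>The formula also shows why the returning signal strength matters: the smaller the accumulated charges, the more noise perturbs the charge differences, and hence the recovered phase and distance.</p>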
<p>Current i-ToF pixel structures can be largely categorized into gate structures and diffusion structures. A gate structure collects surrounding electrons by applying a modulated voltage<sup>4)</sup> to the photo gate to create an electric potential difference<sup>5)</sup> [2]. A diffusion structure collects electrons using the current generated by applying a modulated voltage to the substrate, as in a current-assisted photonic demodulator (CAPD) [3]. The latter can detect electrons generated in deeper regions more quickly than the former, giving it higher transmission efficiency, but it consumes more power because it uses a majority current [4]. It consumes even more power as pixels shrink and pixel counts rise to enhance resolution [5].</p>
<p>To maximize the advantages and minimize the limitations of CAPD, SK hynix developed 10um QVGA<sup>6)</sup>-grade and 5um VGA-grade [6] pixel technologies using a new structure called the Vertical Field Modulator (VFM). Let’s take a deeper look into VFM technology and its strengths.</p>
<h3 class="tit">The Advantages of VFM Pixel Technology</h3>
<p>There are various criteria for a good distance-measuring sensor, but above all it should detect distance accurately and, by consuming less power, suffer fewer heating issues. In other words, a good sensor must detect signals quickly and efficiently at low power consumption, while also accurately separating signals according to phase shift.</p>
<h3 class="tit">1. SK hynix’s CIS Back Side Illumination Technology and Combination</h3>
<p>As with CIS, back side illumination<sup>7)</sup> processing brings many advantages to the design and performance of ToF. The light source used to calculate the time of flight is infrared (IR), because it must be invisible to the human eye, and it allows accurate distance calculation even in low-light environments. Compared with visible light, IR has a longer wavelength, so unless a wafer thicker than a typical CIS wafer is used, most of the light passes straight through and a pixel generates extremely low levels of signal. But the thickness cannot be extended infinitely, either: it is difficult to quickly collect electrons generated in deeper regions, just as it is harder to catch a fish in the deep sea than at a fishing site. Applying back side illumination instead of front side illumination<sup>8)</sup> projects light from the opposite side, bringing the point of light collection closer to where the electric field, which plays the role of the fishing line, is stronger, so the signal is detected quickly and easily (Fig 5).</p>
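<p>The wafer-thickness argument above can be made concrete with the Beer-Lambert law, under which the fraction of light absorbed in a silicon layer of thickness t is 1 - exp(-alpha * t). The absorption coefficients below are rough order-of-magnitude textbook values for silicon, not SK hynix data.</p>

```python
# Beer-Lambert sketch: why near-IR needs a much thicker absorbing layer
# than visible light. Coefficients are rough illustrative values for silicon.
import math

def fraction_absorbed(alpha_per_cm: float, thickness_um: float) -> float:
    """Fraction of incident light absorbed in a layer of given thickness."""
    return 1.0 - math.exp(-alpha_per_cm * thickness_um * 1e-4)  # um -> cm

ALPHA_GREEN_550NM = 7e3  # 1/cm: visible light absorbed within a few microns
ALPHA_IR_940NM = 2e2     # 1/cm: near-IR penetrates tens of microns

for thickness_um in (3, 10, 30):
    print(thickness_um, "um:",
          round(fraction_absorbed(ALPHA_GREEN_550NM, thickness_um), 2), "green,",
          round(fraction_absorbed(ALPHA_IR_940NM, thickness_um), 2), "IR")
# With these values, a ~3 um layer absorbs most green light but only a few
# percent of 940 nm IR, so most IR passes straight through a thin wafer.
```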
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083559/220302_fig5.png" alt="" /></p>
<p class="source">Fig 5. Comparison of FSI and BSI (Penetration ratio and light collection per thickness)</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083559/220302_fig5.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>The performance of i-ToF depends on how well it separates signals according to the charge accumulation ratio. In this respect, front side illuminated sensors can introduce distance errors, because light passing through the pixel surface is more likely to enter the detecting node regardless of phase difference. It is like taking a roll call while an extra student is in the classroom. Back side illumination also allows a much wider choice of metal routing, which in front side illumination is heavily constrained to ensure a high fill factor<sup>9)</sup>, much as it is more effective to draw water from underground than to cut down trees in a thick forest to collect rainwater (Fig 6).</p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083604/220302_fig6.png" alt="" /></p>
<p class="source">Fig 6. i-ToF charge accumulation ratio according to method of illumination<br />
(compared to drawing water from underground instead of cutting down trees in a thick forest)</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083604/220302_fig6.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p><!-- // 이미지 사이즈 지정해서 업로드 --></p>
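<p>The charge accumulation ratio discussed above is what a 4-phase i-ToF sensor converts into distance. The sketch below is the textbook demodulation formula, not SK hynix&#8217;s implementation; the 20 MHz modulation frequency and the ideal sample values are assumptions for illustration.</p>

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(a0, a90, a180, a270, f_mod_hz):
    """Distance from four phase-shifted charge accumulations (textbook 4-phase i-ToF).

    a0..a270 are charges accumulated while the pixel's demodulation signal is
    shifted by 0/90/180/270 degrees relative to the emitted IR modulation.
    """
    phase = math.atan2(a90 - a270, a0 - a180) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * f_mod_hz)  # d = c * phi / (4*pi*f)

# Ideal samples for a target 1.0 m away with 20 MHz modulation
# (the unambiguous range c / (2f) is about 7.5 m, so 1.0 m resolves cleanly):
f_mod = 20e6
phi = 2.0 * math.pi * f_mod * (2.0 * 1.0 / C)  # phase lag of the returned light
samples = [1.0 + math.cos(phi - k * math.pi / 2.0) for k in range(4)]
print(itof_distance(*samples, f_mod_hz=f_mod))  # recovers ~1.0 (metres)
```

<p>Any stray light that enters a detection node out of phase, as in the front side illumination case above, distorts the four accumulations and therefore the recovered phase and distance.</p>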
<p>This advantage can be realized by combining it with SK hynix’s CIS back side illumination technology, which can create pixels smaller than 1 micrometer (one-millionth of a meter).</p>
<h3 class="tit">2. SLA &amp; Trench Guide Structure and Quantum Efficiency (QE)</h3>
<p>Because the i-ToF mechanism relies on the charge accumulation ratio, the maximum possible signal level is required to obtain accurate distance data at longer ranges. High QE<sup>10)</sup> in the IR wavelength range<sup>11)</sup> is therefore essential.</p>
<p>As explained above, the high penetration of the IR light source pushes light collection deep into the silicon, so the collected intensity is weaker than with visible light. One remedy is to intentionally build the micro lens structure (small lenses arranged under the camera lens to match the size and number of pixels) taller for better light collection, but technical restrictions limit the achievable height. SK hynix took a different approach to overcome this shortcoming: by placing several lenses, each smaller than a pixel, on every pixel, it increased the light collection depth and thereby the total amount of light received.</p>
<p>SK hynix also killed two birds with one stone by etching a special pattern into the back side: it lengthens the path of light that strikes the structure and reflects back, and it focuses that light onto the modulation region, reducing light loss and increasing transmission efficiency at the same light intensity. Testing confirmed that QE more than doubled with a 940nm light source, and the higher QE cut the error between actual and measured distance by a full 55% compared to previous methods.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083607/220302_fig7.png" alt="" /></p>
<p class="source">Fig 7. SLA (Small Lens Array) (left) and Trench Guide (right)</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083607/220302_fig7.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p><!-- // 이미지 사이즈 지정해서 업로드 --></p>
<h3 class="tit">3. Ensuring Low-Power, High-Performance</h3>
<p>Excluding the light source, a ToF sensor consumes the most power in the circuit that modulates signals during operation. The power of the modulation driver circuit is proportional to the current flowing through the substrate; in other words, power consumption can be reduced by reducing substrate current. At the same time, accurate and precise distance measurement requires a short modulation cycle and fast signal detection. A vehicle (the photon) has to hit the accelerator to cover the same distance (the Si thickness) quickly, which burns that much more fuel (current). Or consider drawing water from a deep well: hauling up the pulley takes a lot of strength, but with a pump you could draw as much water as you want simply by turning on the faucet.</p>
<p>The VFM (Vertical Field Modulator) method optimizes the conditions and structure of the pixel ion implantation to enlarge the depletion region, letting it play this pump-like role, and strengthens the vertical electric field. The force of the electric field is added on top of the current, so electrons are collected effectively, and collection remains fast even at low current, making the structure strong in terms of power consumption. Numerous experiments showed that VFM pixel performance actually degrades as the current increases, meaning the structure is better suited to low power and that current is no longer a critical factor. In other words, the method enhances pixel performance by securing a design that realizes a strong vertical electric field while relegating the current to a simple guiding role. The 5um VGA-grade ToF sensor showed lower current per pixel than a QVGA-grade ToF sensor despite its smaller pixels and higher resolution, with almost zero increase in power consumption.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083611/220302_fig8.png" alt="" /></p>
<p class="source">Fig 8. VFM enables more efficient power consumption in a ToF sensor</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02083611/220302_fig8.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p><!-- // 이미지 사이즈 지정해서 업로드 --></p>
<h3 class="tit">Summary</h3>
<p>SK hynix develops ToF technology while also generating economic and social value: by offering sensors and close technical support, it enables module manufacturers to enter a vast range of application markets.</p>
<p>We look forward to SK hynix’s depth solution technologies opening up a world where we travel the globe through AR/VR devices, where drones deliver packages and home droids bring them to us, and where robot vacuum cleaners tidy our homes while we watch the news in a self-driving car that unlocks with facial recognition.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p><strong>Footnotes</strong></p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1)</sup>CES: World’s largest technology show held in Las Vegas, U.S., as a venue for not only home appliances, but also new technologies for electronics across all industries including robotics and mobility.<br />
<sup>2)</sup>Delivery robots: currently being operated as part of pilot programs by application companies or at some convenience stores.<br />
<sup>3)</sup>AR: augmented reality, VR: virtual reality<br />
<sup>4)</sup>Modulated voltage: voltage that switches pixel nodes to isolate signals.<br />
<sup>5)</sup>Electric potential difference: the difference in energy between positions in an electric field. An electron moves from a point of low electric potential toward a point of high electric potential.<br />
<sup>6)</sup>QVGA: refers to a pixel’s resolution (320&#215;240), VGA is 640&#215;480<br />
<sup>7)</sup>Back side illumination: a processing method that builds the CIS in the order of micro lens &#8211; color filter &#8211; PD (photodiode) &#8211; metal from top to bottom. Light collection efficiency is much greater than with FSI + Light Guide.<br />
<sup>8)</sup>Front side illumination: a processing method that builds the CIS in the order of micro lens &#8211; color filter &#8211; metal &#8211; PD from top to bottom.<br />
<sup>9)</sup>Fill factor: the proportion of activated region (photodiode) out of the entire region of each pixel in a sensor.<br />
<sup>10)</sup>QE: Quantum Efficiency, the ratio of converted electrons to incident photons.<br />
<sup>11)</sup>IR wavelength range: wavelengths from 750nm to 1mm. ToF generally uses the 850nm/940nm range.</p>
<p><!-- //각주 스타일 --></p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p><strong>References</strong></p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>[1]</sup>R. Lange, P. Seitz, A. Biber, and R. Schwarte, “Time-of-flight range imaging with a custom solid-state image sensor,” in Proc. SPIE, Laser Metrology and Inspection, Munich, Germany, 1999, vol. 3823.<br />
<sup>[2]</sup>D. Stoppa et al., “A Range Image Sensor Based on 10-um Lock-In Pixels in 0.18-um CMOS Imaging Technology,” IEEE J. Solid-State Circuits, vol. 46, no. 1, pp. 248-258, Jan. 2011.<br />
<sup>[3]</sup>C. S. Bamji et al., “A 0.13um CMOS System-on-Chip for a 512&#215;424 Time-of-Flight Image Sensor With Multi-Frequency Photo-Demodulation up to 130MHz and 2GS/s ADC,” IEEE J. Solid-State Circuits, vol. 50, no. 1, pp. 303-319, Jan. 2015.<br />
<sup>[4]</sup>Y. Kato et al., “320&#215;240 Back-Illuminated 10-um CAPD Pixels for High-Speed Modulation Time-of-Flight CMOS Image Sensor,” IEEE J. Solid-State Circuits, vol. 53, no. 4, pp. 1071-1078, Apr. 2018.<br />
<sup>[5]</sup>L. Pancheri et al., “Current Assisted Photonic Mixing Devices Fabricated on High Resistivity Silicon,” in Proc. IEEE SENSORS, pp. 981-983, Oct. 2008.<br />
<sup>[6]</sup>Y. Ebiko et al., “Low power consumption and high resolution 1280&#215;960 Gate Assisted Photonic Demodulator pixel for indirect Time of flight,” in 2020 IEEE International Electron Devices Meeting (IEDM), 2020, pp. 33.1.1-33.1.4.<br />
<sup>[7]</sup>J. H. Jang et al., “An Ultra-low Current Operating 5-μm Vertical Field Modulator Pixel for in-direct Time of Flight 3D Sensor,” 2020 International Image Sensor Workshop (IISW), Sep. 2020.</p>
<p><!-- //각주 스타일 --></p>
<p><!-- 기고문 스타일 --></p>
<p><!-- namecard --></p>
<div class="namecard">
<p><img decoding="async" class="alignnone size-full wp-image-3446" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2022/03/02090014/jaehyung_jang.png" alt="" /></p>
<div class="name">
<p class="tit">By <strong>Jaehyung Jang</strong></p>
<p><span class="sub">TL(Technical Leader) of CIS AR/VR Technology Project Team at SK hynix Inc.</span></p>
</div>
</div>
<p><!-- //기고문 스타일 --></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/future-era-of-robotics-and-metaverse-pioneered-by-sk-hynix-tof-technology/">Future Era of Robotics and Metaverse Pioneered by SK hynix ToF Technology</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SK hynix’s Next-Generation CMOS Image Sensor: All 4-Coupled (A4C) Sensor</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-ee-times-sk-hynixs-next-generation-cmos-image-sensor-all-4-coupled-a4c-sensor/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Tue, 23 Nov 2021 01:00:10 +0000</pubDate>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[CMOS]]></category>
		<category><![CDATA[A4C]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=8138</guid>

					<description><![CDATA[<p>The amazing feat of vision first appeared on Earth more than 500 million years ago. Over time, most animals, including humans, developed two eyes along with a supporting function known as binocular disparity – the ability to measure distance using these two eyes. This same concept has been replicated in smartphone cameras to help them [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-ee-times-sk-hynixs-next-generation-cmos-image-sensor-all-4-coupled-a4c-sensor/">SK hynix’s Next – Generation CMOS Image Sensor: All 4-Coupled (A4C) Sensor</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>The amazing feat of vision first appeared on Earth more than 500 million years ago. Over time, most animals, including humans, developed two eyes along with a supporting function known as binocular disparity – the ability to measure distance using these two eyes.</p>
<p>This same concept has been replicated in smartphone cameras to help them focus on an image through a technology called phase detection auto focus (PDAF).</p>
<p>SK hynix, though, recently unveiled a new technology called the All 4-Coupled (A4C) image sensor that provides a giant leap forward in camera sensor technology, much better than conventional PDAF. In this recent EE Times column [LINK], Tae-hyun (Ted) Kim, head of the CIS ISP at SK hynix, explains how A4C technology works and the benefits it delivers.</p>
<p>Like today’s Quad sensors, A4C uses a photodiode to convert light into an electric current and color filters to selectively absorb certain wavelengths of light. Unlike Quad sensors, however, the A4C structure places one micro lens over each group of four color filters of the same color.</p>
<p>With this structure, a subject is determined to be in focus when the rays of light from the subject converge to a single focal point; in other words, when the intensity value is the same for all four pixels under one micro lens.</p>
<p>This unique structure and the supporting SK hynix technology promise to dramatically improve smartphone cameras with faster and more accurate focusing. The captured images also have higher resolution, providing more detail for computer vision applications.</p>
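<p>The in-focus criterion described above, equal intensity across the four same-color pixels under one micro lens, can be sketched as a simple spread metric. This is an illustrative toy, not SK hynix&#8217;s A4C algorithm; the function name and sample values are invented for the example.</p>

```python
import statistics

def a4c_focus_error(quad):
    """Relative spread of the four same-color pixel intensities under one
    micro lens; near zero when the rays converge to one focal point (in focus)."""
    return statistics.pstdev(quad) / (statistics.fmean(quad) or 1.0)

in_focus  = [0.80, 0.81, 0.79, 0.80]  # nearly equal -> subject in focus
defocused = [0.95, 0.60, 0.92, 0.55]  # unequal -> phase mismatch, adjust lens
print(a4c_focus_error(in_focus))
print(a4c_focus_error(defocused))
```

<p>A phase-detect autofocus loop would drive the lens to minimize such a disparity across many micro lenses at once.</p>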
<p>For more on this important technology, click on this link to read the column &#8212; <a class="-as-ga" style="text-decoration: underline;" href="https://www.eetimes.com/the-next-generation-cmos-image-sensor-all-4-coupled-a4c-sensor/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.eetimes.com/the-next-generation-cmos-image-sensor-all-4-coupled-a4c-sensor/">SK hynix’s Next-Generation CMOS Image Sensor: All 4-Coupled (A4C) Sensor [LINK]</a></p>
<p>&nbsp;</p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 1000px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/26013135/profile_Tae-hyun-Kim.png" alt="" /></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-ee-times-sk-hynixs-next-generation-cmos-image-sensor-all-4-coupled-a4c-sensor/">SK hynix’s Next – Generation CMOS Image Sensor: All 4-Coupled (A4C) Sensor</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>[Thought Leadership] Innovating for the CIS Surge: Chang-rock Song, Head of CIS Business</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/thought-leadership-innovating-for-the-cis-surge-chang-rock-song-head-of-cis-business/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Tue, 19 Oct 2021 07:00:45 +0000</pubDate>
				<category><![CDATA[Culture & People]]></category>
		<category><![CDATA[Chang-rock Song]]></category>
		<category><![CDATA[Head of CIS]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[interview]]></category>
		<category><![CDATA[Leader]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=8006</guid>

					<description><![CDATA[<p>Confidence is crucial when facing the uncertainties of the future. Those who continuously perform well have their confidence spurred by both big and small personal accomplishments. An individual’s confidence can grow and become a good influence on peers, leading to even more success. Chang-rock Song, Head of CIS Business, has been with SK hynix for [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/thought-leadership-innovating-for-the-cis-surge-chang-rock-song-head-of-cis-business/">[Thought Leadership] Innovating for the CIS Surge: Chang-rock Song, Head of CIS Business</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>Confidence is crucial when facing the uncertainties of the future. Those who continuously perform well have their confidence spurred by both big and small personal accomplishments. An individual’s confidence can grow and become a good influence on peers, leading to even more success.</p>
<p>Chang-rock Song, Head of CIS Business, has been with SK hynix for more than 20 years and seems to have a firm grasp on confidence. Although he has the difficult task of improving the foundations of SK hynix’s CIS business, he truly believes that the team’s current efforts will lead to future success.</p>
<h3 class="tit">Making Waves in the CIS Market: The Future Bread and Butter of the Semiconductor Industry</h3>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011912/cut_01.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011912/cut_01.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>A CMOS Image Sensor, or Complementary Metal Oxide Semiconductor Image Sensor—commonly referred to as CIS—is a semiconductor that converts the color and brightness of light received through a lens into electrical signals, which are then transmitted to processing devices. Acting as the increasingly important “eyes” in IT devices, such as smartphones, means CIS-related markets are experiencing rapid growth. According to a June 2021 Gartner report, the CIS market is expected to grow from USD 19.9 billion to USD 26.3 billion before 2026, or roughly 7.3% annually.<br />
Comparatively, the overall semiconductor and semiconductor memory markets are expected to achieve 4.0% and 4.1% annual growth respectively.</p>
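<p>The quoted growth rate can be sanity-checked with the standard CAGR formula. The four-year compounding span below is an assumption inferred from the June 2021 report date and the &#8220;before 2026&#8221; horizon; only the dollar figures come from the citation above.</p>

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1.0 / years) - 1.0

# USD billions, from the Gartner figures quoted above; 4 years is assumed.
print(f"{cagr(19.9, 26.3, 4):.1%}")  # ≈ 7.2%, consistent with the quoted ~7.3%
```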
<p>“CIS is not a replacement for the human eye, but an improvement as far as its ability to expand upon its features. In the future, CIS will be utilized beyond mobile technology in security, robotics, automotive, and AR/VR industries. I’m confident that the CIS business will be a pillar of SK hynix’s growth along with DRAM and NAND flash,” Song said.</p>
<p>He went on to describe why CIS fits nicely within SK hynix’s product portfolio.</p>
<p>“As the semiconductor memory market grows and new technologies are developed, new fabs are built and the adoption of new processes and equipment follows suit. Unused assets and retired technologies generated as a product of this growth can be repurposed for CIS production. This is because there is a lower level of scaling required for CIS compared to memory but the manufacturing process and equipment are similar. Additionally, CIS business plays an important role as a foothold for SK hynix to expand deeper into the non-memory market.”</p>
<p>Currently, Sony and Samsung Electronics lead the CIS market with about 80% market share based on sales, while SK hynix, OmniVision, and GalaxyCore compete over the remaining 20%.</p>
<p>SK hynix has faced some challenges expanding its market share, due to its late entry into the high-end CIS market.</p>
<p>“At first, customers doubted whether SK hynix could run a CIS business, but now we are recognized as a major supplier in the low-pixel segment below 13MP (megapixels). To expand into the high-pixel market of 32MP or higher, we are strengthening our R&amp;D capabilities and striving to increase productivity. In addition, SK hynix has secured Pixel Shrink technology, which determines CIS reliability and gives us a great advantage. We have accumulated know-how in cell scaling from our work in the DRAM field and possess proven equipment already in use on the production line. By comparison, our competitors must go through several additional steps, delaying their advancement while we optimize unique time-saving methods.”</p>
<p>SK hynix secured a meaningful global market share in CIS and has now set the goal of joining the list of industry leaders.</p>
<p>To achieve such a goal, Song pointed out that defining product portfolios and achieving competitiveness in regard to development are prerequisites. He emphasized that “we need to release our product line-up at the same time as our competitors with the same level of quality.”</p>
<p>Song believes that the market will dramatically change in the near future and is putting significant effort into preparing for it.</p>
<p>“Pixel size in analog CIS cannot be continuously reduced in the same way as DRAM. When the limit is reached, new innovations are needed in peripheral technologies, as opposed to process technologies. In the future, CIS will not just be a visual sensor but will evolve into an information sensor and intelligence sensor. As a result, the competitive paradigm will also change. Meanwhile, the existing technology gap will become meaningless and market share will be reorganized. Our top priority is to be a leader in this upcoming match.”</p>
<h3 class="tit">A Hero of Innovation: Writing Another Success Story for SK hynix</h3>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011915/cut_02.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011915/cut_02.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>Song joined the memory lab at SK hynix (previously Hyundai Electronics) in 1999 after getting his Doctorate in Materials Science and Engineering. He contributed to increasing yield rates by spearheading process innovation for DRAM production and leading R&amp;D. From 2017 to 2020, he guided company-wide efforts towards processes and system innovation as a CIO (Chief Information Officer) and ushered in DX (Digital Transformation). Currently, Song is responsible for SK hynix’s CIS business and strives to improve its foundations ahead of future opportunities.</p>
<p>How was it possible to create continued success in such an arduous industry? Song cited SK hynix’s special “DNA” as the means to overcoming challenges.</p>
<p>“It is in the ‘DNA’ of SK hynix team members to be passionate and overcome difficulties together as well as being innovative and reasonable. We learned a lot from our seniors who strived to rebuild the company following the Asian financial crisis in the 1990s and the global financial crisis in the late 2000s. So, we always keep in mind that we need to leave a better system and a strong heritage behind for the younger generation.”</p>
<p>As SK hynix’s CIS business is still relatively new, Song is designing fresh ways to awaken the passion in his team members’ “DNA” and understands that the business still needs to be supplemented in terms of overall systems and work processes.</p>
<p>First, he restructured the CIS team and changed the way individuals work together. After uncovering inefficient work processes, Song aimed to improve productivity and minimize the burden of unnecessary communications. He also increased the adoption of business automation and intelligence by promoting DX.</p>
<p>“To efficiently increase the amount of work one person can handle, simple repetitive tasks and paperwork should be completed by digital systems. To implement such processes, protocols should be in place from the time of product development through to mass production. Therefore, we actively introduced IT systems such as an internal messenger and an internal blog as well as a project management system. By utilizing these systems, we are setting strong operation standards for our CIS team members.</p>
<p>Once the agreed protocol is built into the system, work is automatically handed to the next person without any meeting or reporting procedure. When one phase is over, teams can move immediately to the next. Team members can focus on finding their own ways to perform efficiently using VWBE (Voluntarily, Willingly Brain Engagement) rather than preparing for meetings or finalizing reports. As a result, the competitiveness of our CIS business is continually enhanced as productivity per person improves.”</p>
<h3 class="tit">A ‘Warmhearted Leader’ Helping Team Members Find Happiness by Innovating the Way They Work</h3>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011917/cut_03.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011917/cut_03.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>Song’s concept of “innovating the way you work” is closely related to team members’ happiness. He understands that team members feel happiest when “there aren’t any troubles at work, and everything is going smoothly.” To make this possible, he points out that thorough proactive management is necessary.</p>
<p>“When work proceeds smoothly, team members are much happier. Thanks to confirmed protocols and simple communication between coworkers, team members are likely to spread their happiness around—sometimes even to their family if they feel inspired at work. Employees should be able to earn happiness at work, not just money.</p>
<p>To make businesses run smoothly, corporate pre-management, or “continuous care for team members”, is needed. The opposite of happiness is indifference, not sadness. Employee happiness increases when organizations show they care. It is important employers make changes that lead to a more positive working environment that inspires their teams.”</p>
<p>When asked what kind of leader he is, Song said he strives to be a “leader who feels confident in delegating.”</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011919/cut_04.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/10/19011919/cut_04.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>“I try to delegate more duties to the team members. Otherwise, junior team members cannot grow, as their work becomes limited to only the things they are already deemed good at. That being said, senior team members like myself still need to take responsibility for the result even though our juniors are accountable. With this approach, each team member can exercise self-leadership, meaning they can make decisions on behalf of the team.”</p>
<p>Finally, he shared the following important keywords as key takeaways for his team (listed in priority): safety, ethics, security, quality, yield rate, and productivity.</p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/thought-leadership-innovating-for-the-cis-surge-chang-rock-song-head-of-cis-business/">[Thought Leadership] Innovating for the CIS Surge: Chang-rock Song, Head of CIS Business</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Let’s Hit the Road: How Autonomous Vehicles are Making “10 and 2” Driving Obsolete</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/lets-hit-the-road-how-autonomous-vehicles-are-making-10-and-2-driving-obsolete/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 26 Aug 2021 07:00:28 +0000</pubDate>
				<category><![CDATA[Technology]]></category>
		<category><![CDATA[HBM2E]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[Autonomous Vehicle]]></category>
		<category><![CDATA[ADAS]]></category>
		<category><![CDATA[eMMC 5.1]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=7783</guid>

					<description><![CDATA[<p>In recent years, efforts to produce the first fully autonomous vehicle have really shifted into gear. While self-driving cars once seemed like futuristic inventions straight from a sci-fi classic, today, most standard vehicles are already utilizing some form of technology that classifies them alongside autonomous vehicles. Features like auto-braking and rear-view cameras, also known as [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/lets-hit-the-road-how-autonomous-vehicles-are-making-10-and-2-driving-obsolete/">Let’s Hit the Road: How Autonomous Vehicles are Making “10 and 2” Driving Obsolete</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/08/26014127/Autonomous_SKhynix_Thumbnail_680x400.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/08/26014127/Autonomous_SKhynix_Thumbnail_680x400.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>In recent years, efforts to produce the first fully autonomous vehicle have really shifted into gear. While self-driving cars once seemed like futuristic inventions straight from a sci-fi classic, today, most standard vehicles are already utilizing some form of technology that classifies them alongside autonomous vehicles. Features like auto-braking and rear-view cameras, also known as <a class="-as-ga" style="text-decoration: underline;" href="https://news.skhynix.com/lets-hit-the-road-automated-vehicles-are-pulling-into-the-fast-lane/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://news.skhynix.com/lets-hit-the-road-automated-vehicles-are-pulling-into-the-fast-lane/">advanced driver-assistance systems (ADAS)</a>, mark some of the first steps toward fully autonomous, self-driving vehicles.</p>
<p>Innovation in semiconductor technology is responsible for much of the recent development in the autonomous vehicle industry. And without semiconductors, the ongoing efforts to reach full autonomy would prove fruitless.</p>
<p>But where is the line drawn between assistance and autonomy? The engineers at SAE International created the industry standard for making this distinction. Its six-tier system classifies vehicles from level 0, where a car may merely issue warnings to the driver, up to level 5, where no driver intervention is needed whatsoever. Breaking the barrier between assistance and autonomy is the driving force behind innovation in the automotive industry today.</p>
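<p>The six-tier system can be summarized as a small lookup table. The one-line descriptions below are paraphrased from SAE J3016, not quoted from it, and the level-3 cutoff for &#8220;autonomy&#8221; reflects the assistance/autonomy line discussed above.</p>

```python
# SAE J3016 driving automation levels, paraphrased one-liners.
SAE_LEVELS = {
    0: "No automation: the driver does everything; the car may issue warnings.",
    1: "Driver assistance: steering OR speed support (e.g. adaptive cruise control).",
    2: "Partial automation: steering AND speed support; driver supervises constantly.",
    3: "Conditional automation: the system drives; driver must take over on request.",
    4: "High automation: no driver needed within a defined operational domain.",
    5: "Full automation: no driver intervention needed anywhere, anytime.",
}

def is_autonomous(level: int) -> bool:
    """Levels 3 and above: the system, not the driver, performs the driving task."""
    return level >= 3

for lvl, desc in SAE_LEVELS.items():
    tag = "autonomy" if is_autonomous(lvl) else "assistance"
    print(f"Level {lvl} ({tag}): {desc}")
```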
<h3 class="tit">Becoming driver-“less” with more</h3>
<p>The automotive industry is primed to conquer levels 4 and 5 of the SAE International scale but still has a few hurdles to clear. Beyond the complications of ethics and legislation, many of the key challenges lie in technology. Surmounting the existing limitations in connectivity, sensing, and judgment will only be possible with the support of faster, denser, more advanced semiconductors.</p>
<p>Engineers need to reevaluate a vehicle’s ability to connect with the world around it. Our streets are packed not only with pedestrians and road signs, but also with the many machines and other technologies that enable smooth traffic management. Establishing a connection with these other smart elements of the road enables an autonomous vehicle to better model its environment. When channels are established between vehicles (vehicle-to-vehicle, V2V) and between a vehicle and fixed infrastructure (vehicle-to-infrastructure, V2I), a vehicle can signal its presence and planned behavior. Additionally, it can share environmental information, such as icy roads, with nearby road users, and process warnings and information distributed via V2I channels, such as an impending change to a traffic light. While V2V and V2I technologies already exist, the burden of reliable connectivity limits their application, especially for rapidly moving vehicles.</p>
<p>Sensing presents yet another challenge in parallel. While connectivity establishes a link between smart technologies on the road, sensors like LiDAR and RADAR help vehicles identify and classify the various obstacles and elements in our road systems. A vehicle that has achieved level 5 autonomy must be able to organize and understand its environment in real time regardless of environmental conditions or other variables like low or impaired visibility. The development of <a class="-as-ga" style="text-decoration: underline;" href="https://news.skhynix.com/sk-hynix-in-edn-the-future-of-cis-technology/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://news.skhynix.com/sk-hynix-in-edn-the-future-of-cis-technology/">CMOS</a> as part of a dynamic system of sensors will be critical to establishing a reliable, steadfast sensing system even more detailed and expansive than human vision itself.</p>
<p>But what good is all this information if the vehicle can’t make the right decision? Overcoming judgment and decision-making challenges still requires significant development in AI and machine learning. Our roads are dynamic environments whose conditions are ever-changing, and the data collected in them is just as dynamic and turbulent.</p>
<p>Autonomous vehicles not only need to make quick work of processing this data, but they also need to be able to utilize and store it as input to multiple simultaneous computing operations. The cornerstone of all this data management and processing will be innovative semiconductor memory, with the processing power and storage capacity to support the configuration and reference of each piece of crucial, potentially life-saving data.</p>
<h3 class="tit">All roads lead to innovation</h3>
<p>In support of the overarching goals of the autonomous vehicle industry to create safer, greener roads for all, SK hynix has devoted countless hours and resources to push the envelope in automotive-grade semiconductor technology. Its dedicated automotive teams are conceptualizing and developing the semiconductor and chip technology responsible for increased processing power, data storage, and speed, ensuring the reliable and safe operation of autonomous vehicles for years to come.</p>
<p>In 2020, SK hynix began producing the industry’s fastest high-bandwidth memory chip, <a class="-as-ga" style="text-decoration: underline;" href="https://news.skhynix.com/behind-the-scenes-story-ofhbm2e-the-fastest-dram-in-history/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://news.skhynix.com/behind-the-scenes-story-ofhbm2e-the-fastest-dram-in-history/">HBM2E</a>. DRAM of this caliber supports supercomputing with ultra-high densities and ultra-fast speeds. This remarkable technology enables the simultaneous computations AI requires for lightning-fast road response, allowing vehicles to assess information from multiple sources and complete operations in parallel, much like the human mind. Just as a brain uses millions of neurons to process driving decisions, autonomous vehicles will rely on advanced DRAM like HBM2E to evaluate and respond to their own on-road experiences.</p>
<p>SK hynix’s HBM products will be capable of processing data even more quickly; such advancements will support the implementation of essential machine learning algorithms, enabling an autonomous vehicle to continually learn and improve its understanding of the road and the results of its own decisions.</p>
<p>SK hynix also offers solutions like eMMC 5.1, an advanced managed NAND flash memory. Automotive NAND flash technology supports the critical function of information storage: fully autonomous vehicles must be able to take in information from multiple sources, including sensor data, and store it in semiconductors like eMMC 5.1. This technology will be relied upon to store the sensing and map data needed for the proper operation of the advanced algorithms that allow vehicles to identify, classify, and save important details about the road environment.</p>
<p>Bringing driver-less technology to life is about more than just convenience. Filling our roads with autonomous vehicles has broad implications, including improvements to safety, efficiency, and mobility. SK hynix seeks to use technology to improve the world around us, and contributing to the development of autonomous vehicles is only one of the many ways it does so. It is SK hynix’s commitment to research, development, and leadership within the industry that is making the future of fully automated vehicles a safe and reliable one.</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/IppHJ0hjUPU" width="810" height="455" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/lets-hit-the-road-how-autonomous-vehicles-are-making-10-and-2-driving-obsolete/">Let’s Hit the Road: How Autonomous Vehicles are Making “10 and 2” Driving Obsolete</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The Future of CIS Technology</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-edn-the-future-of-cis-technology/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 13 May 2021 02:10:29 +0000</pubDate>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[HDR]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[Image sensors]]></category>
		<category><![CDATA[Kangbong Seo]]></category>
		<category><![CDATA[ToF]]></category>
		<category><![CDATA[Future Innovation Technology]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=7112</guid>

					<description><![CDATA[<p>Recent advancements in CMOS Image Sensor (CIS) technology have been nothing short of phenomenal. These small but powerful sensors have become an integral part of smartphones and other electronic devices, enabling us to capture more images and information than ever before. The journey, however, has just begun for CIS technology, as outlined in a recent [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-edn-the-future-of-cis-technology/">The Future of CIS Technology</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<p>Recent advancements in CMOS Image Sensor (CIS) technology have been nothing short of phenomenal. These small but powerful sensors have become an integral part of smartphones and other electronic devices, enabling us to capture more images and information than ever before.</p>
<p>The journey, however, has just begun for CIS technology, as outlined in a recent <a style="text-decoration: underline;" href="https://www.edn.com/metavision-of-cmos-image-sensors-the-eye-beyond-the-eye/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.edn.com/metavision-of-cmos-image-sensors-the-eye-beyond-the-eye/">EDN column</a> by Kangbong Seo, SK hynix Head of Future Innovation Technology.</p>
<p>CIS technology continues to play catch-up to the human eye, which took billions of years to evolve. In the quest to capture more complex information, Seo outlines how the CIS industry is pushing the boundaries of not only pixel size and resolution, but also power efficiency and value-added functions.</p>
<p>CIS is steadily expanding its application fields to wearable devices, self-driving cars, robots, drones, and more, enriching human life. However, “we will also have to solve the privacy issue of unauthorized information and the potential misuse of huge amounts of data,” he writes.</p>
<p>For Seo’s full column, click on this link [<a style="text-decoration: underline;" href="https://www.edn.com/metavision-of-cmos-image-sensors-the-eye-beyond-the-eye/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.edn.com/metavision-of-cmos-image-sensors-the-eye-beyond-the-eye/">EDN: Metavision of CMOS image sensors: The eye beyond the eye</a>]</p>
<p>&nbsp;</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/05/13020710/pic01.jpg" alt="" /></p>
<p class="source">Figure 1. SK hynix’s 1st ToF Sensor</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/05/13020710/pic01.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p><!-- namecard --></p>
<div class="namecard">
<p><img decoding="async" class="alignnone size-full wp-image-3446" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2021/05/13020727/namecard_seo.png" alt="" /></p>
<div class="name">
<p class="tit">By <strong>Kangbong Seo (KB)</strong></p>
<p><span class="sub">Head of Future Innovation Technology at SK hynix</span></p>
</div>
</div>
<p><!-- //기고문 스타일 --></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/sk-hynix-in-edn-the-future-of-cis-technology/">The Future of CIS Technology</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Applying Light to Semiconductors: Introducing CIS Key Process Technologies</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/applying-light-to-semiconductors-introducing-to-cis-key-process-technologies/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Thu, 10 Dec 2020 08:00:02 +0000</pubDate>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[CIS]]></category>
		<category><![CDATA[CMOS Image sensor]]></category>
		<category><![CDATA[CIS Process]]></category>
		<category><![CDATA[In-Chul Jeong]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=6130</guid>

					<description><![CDATA[<p>Applying Light to Semiconductors: Introducing CIS Key Process Technologies Cameras are one of the media which record objects embodied by light, allowing people to express various emotions, identity, and philosophy either objectively or subjectively. In particular, we are in a so-called “digital nomad era” where people carry digital devices and live without being restricted [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/applying-light-to-semiconductors-introducing-to-cis-key-process-technologies/">Applying Light to Semiconductors: Introducing CIS Key Process Technologies</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<h3 class="tit">Applying Light to Semiconductors: Introducing CIS Key Process Technologies</h3>
<p>Cameras are a medium that records objects embodied by light, allowing people to express various emotions, identities, and philosophies either objectively or subjectively. In particular, we live in a so-called “digital nomad era,” in which people carry digital devices and are no longer restricted by time and space. In this era, digital cameras equipped with image sensors are more widely used than film cameras, and smartphones, too, come equipped with digital camera functionality. In the digital and smartphone cameras that record our daily lives and memories, image sensors play the role that film plays in film cameras: they convert the information about the subject received through the lens into an electrical image signal.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030718/SK_hynix_Image_sensor_application__Image_generation_mechanism.png" alt="" /></p>
<p class="source">Figure 1. Image sensor applications &amp; Image generation mechanism</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030718/SK_hynix_Image_sensor_application__Image_generation_mechanism.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>Image sensors can be largely divided into CCD image sensors and CMOS image sensors based on their applications and manufacturing processes. In particular, CMOS image sensors (CIS) are moving beyond digital cameras and are actively being applied to various new markets, such as smartphones, tablets, CCTV, car dashcams, autonomous vehicle sensors, virtual reality (VR), medical equipment, and drones. As a result, sales of this semiconductor product line are growing rapidly<sup>1</sup>.</p>
<p>The operation of a CMOS-based image sensor is as follows: when light energy in the visible wavelength range (400 to 700nm) is condensed onto the photodiode (PD) of the silicon substrate, the silicon surface absorbs the light energy and forms electron-hole pairs. The electrons generated in this process are converted into a voltage through the floating diffusion (FD) node and then into digital data through an analog-to-digital converter (ADC). To build a CIS product that performs this series of processes, key manufacturing process technologies unique to CIS, distinct from those for semiconductor memory, are required. These process technologies can be classified into five categories.</p>
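<p>The signal chain described above can be sketched in a few lines of code. This is a purely illustrative model of the photon-to-digital-code path; all constants (quantum efficiency, full well capacity, conversion gain, reference voltage) are hypothetical example values, not SK hynix specifications.</p>

```python
# Illustrative sketch of the CIS signal chain:
# photons -> electrons (photodiode) -> voltage (floating diffusion) -> digital code (ADC).
# All parameter values are hypothetical examples.

def sense_pixel(photons, qe=0.8, full_well=6000, conv_gain_uv=80.0,
                adc_bits=10, vmax_uv=480000.0):
    # Photodiode: light forms electron-hole pairs; charge is clipped at full well capacity (FWC)
    electrons = min(int(photons * qe), full_well)
    # Floating diffusion (FD): charge-to-voltage conversion (microvolts per electron)
    voltage_uv = electrons * conv_gain_uv
    # ADC: quantize the voltage into a digital code
    max_code = 2 ** adc_bits - 1
    return min(int(voltage_uv / vmax_uv * max_code), max_code)

print(sense_pixel(1000))    # moderate light
print(sense_pixel(100000))  # bright light: output saturates at full well capacity
```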
<h3 class="tit">1. Deep PD Formation Process Technology</h3>
<p>Consumers’ desire for clearer image quality led to competition to increase pixel density and resolution in mobile CIS, accelerating the development of CIS process technology. Pixel size must shrink further to increase the number of pixels within the same chip size, and forming a deep PD is a key technology for avoiding deterioration in image quality. Securing sufficient full well capacity (FWC) in small pixels requires patterning and implant technologies of higher difficulty than those used for semiconductor memory. In particular, it is essential to secure a high-aspect-ratio (&gt;15:1) implant mask process technology that can block high-energy ion implantation; in fact, this aspect ratio tends to keep increasing across the industry<sup>2</sup>.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030722/SK_hynix_Schematic_diagram_of_photo_diode_structure_change_along_with_reduction_in_pixel_size.png" alt="" /></p>
<p class="source">Figure 2. Schematic diagram of photo diode structure change along with reduction in pixel size</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030722/SK_hynix_Schematic_diagram_of_photo_diode_structure_change_along_with_reduction_in_pixel_size.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<h3 class="tit">2. Pixel to Pixel Isolation Process Technology</h3>
<p>The technology to isolate pixels from one another is very important for making a high-definition CIS. Underdeveloped isolation technology can cause various image defects such as color mixing and color spreading. Each chipmaker has its own isolation technology, and these differences will be an important criterion for image quality in a CIS market where higher pixel density and resolution are becoming the common standard. Because various issues can occur during the isolation process, great effort goes into selecting better equipment and developing new recipes to improve yield and product quality.</p>
<h3 class="tit">3. Color Filter Array (CFA) Process Technology</h3>
<p>The color filter array (CFA) is a process unique to CIS that has no counterpart in the semiconductor memory process. The CFA is generally composed of a color filter (CF), which filters the incident light into red, green, and blue by wavelength range, and a microlens (ML), which increases light-condensing efficiency. Producing excellent image quality requires developing and evaluating R/G/B color materials and optimizing process conditions such as shape and thickness. Recently, a series of high-quality, high-function CIS products has been released thanks to application technologies such as Bayer and Quad patterns being combined with the basic CFA structure.</p>
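<p>The two layouts mentioned above can be sketched programmatically. This is a minimal, generic illustration of the classic Bayer pattern (a repeating 2&#215;2 RGGB tile) and a Quad-style layout (each color covering a 2&#215;2 block inside a 4&#215;4 tile); it is not a vendor-specific specification.</p>

```python
# Sketch of two color filter array (CFA) layouts: Bayer (RGGB) and Quad.

def bayer_cfa(h, w):
    # Bayer: each 2x2 tile is R G / G B
    tile = [["R", "G"], ["G", "B"]]
    return [[tile[y % 2][x % 2] for x in range(w)] for y in range(h)]

def quad_cfa(h, w):
    # Quad: same RGGB ordering, but each color spans a 2x2 block of pixels
    tile = [["R", "G"], ["G", "B"]]
    return [[tile[(y // 2) % 2][(x // 2) % 2] for x in range(w)] for y in range(h)]

for row in bayer_cfa(2, 4):
    print(" ".join(row))
print()
for row in quad_cfa(4, 4):
    print(" ".join(row))
```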
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030726/SK_hynix_Structure_of_color_filter_array.png" alt="" /></p>
<p class="source">Figure 3. Structure of CFA (Color Filter Array)</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030726/SK_hynix_Structure_of_color_filter_array.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<h3 class="tit">4. Wafer Stacking Process Technology</h3>
<p>Wafer stacking refers to bonding two wafers together and is an essential technology for making high-pixel, high-definition CIS products<sup>3</sup>. For high-pixel CIS products, the pixel arrays and logic circuits are formed on separate wafers, which are then attached in the middle of the process, a step called “wafer bonding”. Separating the pixel arrays and logic circuits increases manufacturing cost, but more chips can be produced from the same wafer area, and product properties improve as well. As a result, most CIS chipmakers have adopted this technology. Wafer stacking continues to develop in various forms and has recently been applied to the semiconductor memory segment as well, contributing to improved product properties.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030714/SK_hynix_Wafer_stacking_configuration.png" alt="" /></p>
<p class="source">Figure 4. Wafer Stacking configuration</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/12/09030714/SK_hynix_Wafer_stacking_configuration.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<h3 class="tit">5. Control Technology to Improve CIS Product Yield and Quality</h3>
<p>One of the most fundamental requirements in CIS product development and mass production is controlling metallic contamination. Because CIS products are several times more sensitive to contamination than memory products, and contamination directly affects product yield and quality, various contamination control technologies are required. The next important factor is plasma damage<sup>4</sup> <sup>5</sup> control. Because damage incurred during processing degrades image properties, causing defects such as hot pixels, the key processes must be managed precisely.</p>
<p>This article has covered the main characteristics of the key CIS process technologies. It is no exaggeration to say that the completeness of a CIS product is determined not only by process technology, but also by how organically the pixel devices, analog and digital design, and image signal processing (ISP) technology complement one another and are optimized together. Building on its existing semiconductor memory process technologies, SK hynix has secured the core process technologies unique to CIS products, including those listed above. Through this, the Company is proactively responding to market demand with the timely development of high-pixel, high-definition products. In the future, based on its process technologies, devices, design, and ISP technology, SK hynix’s CIS business is expected to expand into various application product lines in fields such as medical and security, as well as mobile CIS products. Ultimately, this is expected to contribute to the creation of both the economic and social values that SK Group pursues.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup>Yole Développement, “Status of the CMOS image sensors industry 2012”, <a class="-as-ga" href="https://www.slideshare.net/Yole_Developpement/yole-cmosimagesensorsoctober2012reportsample" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.slideshare.net/Yole_Developpement/yole-cmosimagesensorsoctober2012reportsample">https://www.slideshare.net/Yole_Developpement/yole-cmosimagesensorsoctober2012reportsample</a><br />
<sup>2</sup>SONY.net, “Perspectives from the creators of the image sensor ‘microcosm’” <a class="-as-ga" href="https://www.sony.net/SonyInfo/technology/stories/IMX586/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.sony.net/SonyInfo/technology/stories/IMX586/">https://www.sony.net/SonyInfo/technology/stories/IMX586/</a><br />
<sup>3</sup>Cheng-Ta Ko, et al., &#8220;Wafer-to-wafer hybrid bonding technology for 3D IC&#8221; 3RD Electronics System Integration Technology Conference ESTC, Berlin, 2010, pp. 1-5, doi: 10.1109/ESTC.2010.5642848.<br />
<sup>4</sup>Koji Eriguchi, “Defect generation in electronic devices under plasma exposure: Plasma-induced damage” Jpn. J. Appl. Phys. 56 (2017)<br />
<sup>5</sup>K. Eriguchi., “Application of Molecular Dynamics Simulations to Plasma Etch Damage in Advanced Metal-Oxide-Semiconductor Field-Effect Transistors”, Molecular Dynamics− Studies of Synthetic and Biological Macromolecules, 221-244 (2012)</p>
<p><!-- //각주 스타일 --></p>
<p><!-- 기고문 스타일 --><br />
<!-- namecard --></p>
<div class="namecard">
<p><img decoding="async" class="alignnone size-full wp-image-3446" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/11/16060502/namecard_In-Chul-Jung.png" alt="" /></p>
<div class="name">
<p class="tit">By <strong>In-Chul Jeong</strong></p>
<p><span class="sub">CIS Process Team Leader at SK hynix Inc.</span></p>
</div>
</div>
<p><!-- //기고문 스타일 --></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/applying-light-to-semiconductors-introducing-to-cis-key-process-technologies/">Applying Light to Semiconductors: Introducing CIS Key Process Technologies</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The Visual Evolution &#038; Innovation of Image Sensors</title>
		<link>https://skhynix-news-global-stg.mock.pe.kr/the-visual-evolution-innovation-of-image-sensors/</link>
		
		<dc:creator><![CDATA[user]]></dc:creator>
		<pubDate>Wed, 28 Oct 2020 01:00:40 +0000</pubDate>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[CMOS Image sensor]]></category>
		<category><![CDATA[Taehyun Kim]]></category>
		<category><![CDATA[CIS]]></category>
		<guid isPermaLink="false">http://admin.news.skhynix.com/?p=5930</guid>

					<description><![CDATA[<p>Visual Evolution – from the Cambrian Era to Today. Close your eyes and imagine a world without eyesight. Millions of years ago, that was the way of life on Earth. Image Download Visual Evolution – from the Cambrian Era to Today Close your eyes and imagine a world without eyesight. Millions of years ago, that [&#8230;]</p>
<p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/the-visual-evolution-innovation-of-image-sensors/">The Visual Evolution & Innovation of Image Sensors</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></description>
										<content:encoded><![CDATA[<div style="display: none;">Visual Evolution – from the Cambrian Era to Today. Close your eyes and imagine a world without eyesight. Millions of years ago, that was the way of life on Earth.</div>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27084846/CIS_Contribution.jpg" alt="" /></p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27084846/CIS_Contribution.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<h3 class="tit" style="text-decoration: underline;">Visual Evolution – from the Cambrian Era to Today</h3>
<p>Close your eyes and imagine a world without eyesight.</p>
<p>Millions of years ago, that was the way of life on Earth. It was nearly 540 million years ago that animals first developed the ability to see, transforming everything about the ways in which they could avoid enemies, secure food, and evolve into various species.<sup>1</sup> It led to an evolutionary event known as the Cambrian explosion, during which the number of animal groups exploded from three to 38.</p>
<p>A similar visual innovation took place much more recently with the proliferation of smartphones in the late 2000s. Suddenly, people around the world were equipped with high-performance cameras that could fit into their pocket or the palm of their hand. Photography was no longer restricted to photographers and the easy transfer of visual information became widely available.</p>
<p>This year, with the outbreak of COVID-19, the transition to the contactless digital era accelerated once again as video conferencing and online classes became a part of our daily lives.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>1</sup>Andrew Parker, “In the Blink of an Eye: How Vision Sparked the Big Bang of Evolution”, (2003) (<a class="-as-ga" style="text-decoration: underline;" href="https://www.theage.com.au/national/the-eyes-might-have-it-20030830-gdw96w.html" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.theage.com.au/national/the-eyes-might-have-it-20030830-gdw96w.html">URL</a>)</p>
<p><!-- //각주 스타일 --></p>
<p><strong>“Retina of Camera” CIS Technology Development, the Heyday of Smartphone Cameras</strong></p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/28012519/SK_hynix_Structure_of_CMOS_Image_Sensor.png" alt="" /></p>
<p class="source">Figure 1. Structure of CMOS Image Sensor</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/28012519/SK_hynix_Structure_of_CMOS_Image_Sensor.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>A camera is designed much like the human eye.</p>
<p>A smartphone camera is composed of various parts such as a lens, infrared cut-off filter<sup>2</sup>, auto focusing actuator,<sup>3</sup> and CMOS image sensor (CIS).<sup>4</sup> Among them, the CIS is a key component that acts as the retina of the human eye. As shown in Figure 1, it is composed of a photodiode that converts light into electrons, a color filter that passes only light of a specific wavelength, an analog/digital circuit that converts electrons into digital signals, and an image signal processor (ISP) responsible for correction and image processing.</p>
<p>Since resolution, sensitivity, and signal-to-noise ratio (SNR)<sup>5</sup> are determined by CIS performance, it can be said that the image quality of a smartphone camera is determined by its CIS. Today, the CIS image quality of smartphone cameras has surpassed that of compact cameras, and the gap with DSLRs continues to narrow.<sup>6</sup></p>
<p>In terms of performance, CIS has developed in the direction of increasing both the number of pixels and its functions. Since more detailed, clearer images can be obtained as the number of pixels increases, competition over pixel counts began in the early days of smartphone cameras. In addition, with innovations in semiconductor micro-processing such as back side illumination (BSI)<sup>7</sup> and deep trench isolation (DTI)<sup>8</sup>, more pixels can be integrated into the same area, making it possible to capture images of tens of millions of pixels even with an ordinary smartphone.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>2</sup>Infrared cut-off filter: A device that only passes the visible light and blocks the infrared wavelength<br />
<sup>3</sup>Auto focusing actuator: A lens driving device for autofocusing, implemented with a small motor<br />
<sup>4</sup>CIS: A device that detects light and converts it into an electrical signal, built with a &#8216;Complementary Metal Oxide Semiconductor (CMOS)&#8217; structure composed of complementary MOS integrated circuits.<br />
It acts as electronic film in devices such as smartphones and cameras, offering high speed and low power consumption.<br />
<sup>5</sup>SNR (Signal-to-Noise Ratio): Defined as 20 log (signal/noise)<br />
<sup>6</sup>David Cardinal, “Smartphones vs Cameras: Closing the gap on image quality”, DXO Mark. (2020) (<a class="-as-ga" style="text-decoration: underline;" href="https://www.dxomark.com/smartphones-vs-cameras-closing-the-gap-on-image-quality/" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://www.dxomark.com/smartphones-vs-cameras-closing-the-gap-on-image-quality/">URL</a>)<br />
<sup>7</sup>BSI (Back Side Illumination): A technology that increases the amount of light received on a photodiode by accepting light from the rear of the sensor. When light enters from the front of the sensor, light loss occurs due to scattering by metal wiring.<br />
<sup>8</sup>DTI (Deep Trench Isolation): A process technology that forms physical barriers between adjacent photodiodes inside the silicon to prevent signal interference between pixels.</p>
<p><!-- //각주 스타일 --></p>
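<p>Footnote 5&#8217;s definition can be checked with a few lines of Python. This is a toy illustration; the signal and noise values below are made up, not measured sensor data:</p>

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels, per the 20*log10(signal/noise) definition."""
    return 20 * math.log10(signal / noise)

# Illustrative values: halving the collected signal at constant noise
# costs 20*log10(2), roughly 6 dB of SNR.
print(round(snr_db(1000, 10), 1))  # 40.0
print(round(snr_db(500, 10), 1))   # 34.0
```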
<h3 class="tit">The Latest Technology Trend in CIS is about Function, not Pixel</h3>
<p>Nevertheless, this race for higher pixel counts in CIS is expected to hit technical limits soon, and innovation is shifting toward advanced functions centered on the ISP.</p>
<p>One reason is that the miniaturization of CIS pixels is constrained by the diffraction limit<sup>9</sup>. CIS is a complex component that combines optical technologies, such as micro lenses, with semiconductor technologies, such as devices and circuits. Current semiconductor technology can shrink the critical dimension of electric circuits to a few nanometers; however, since the amount of received light decreases as the pixel size decreases, sensitivity and signal level are reduced, resulting in a decline in SNR and degraded image quality.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>9</sup>Diffraction limit: The point at which two objects are too close together to be distinguished through an optical lens.</p>
<p><!-- //각주 스타일 --></p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085209/SK_hynix_Airy_disk_diffraction_image.png" alt="" /></p>
<p class="source">Figure 2. Airy disk diffraction image</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085209/SK_hynix_Airy_disk_diffraction_image.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>The optical system of the camera also has a physical limit where performance is bounded by the diffraction effect.<sup>10</sup> Even when a single point light source<sup>11</sup> is photographed, the image formed on the CIS through the lens is spread out, as seen in Figure 2. This pattern is called an Airy disk<sup>12</sup>; given the wavelength (λ), focal length (f), and lens diameter (d), the minimum distance (x) at which two points can be separated is determined by the following formula:</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 117px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085203/201027.jpg" alt="" /></p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>10</sup>Diffraction effect: A phenomenon in which light proceeds in a curved path rather than a straight path when it encounters an obstacle.<br />
<sup>11</sup>Point light source: A light source that is small enough to be considered as a dot.<br />
<sup>12</sup>Airy disk: The spreading out of the image formed when a single point light source is focused onto the CIS through the lens.</p>
<p><!-- //각주 스타일 --></p>
<p>For example, for a 400nm blue point light source, even if a high-performance lens with an F (=f/d) number of 1.4 is used, the distance that can separate the two points is 0.68μm. In other words, to distinguish the two blue point light sources, the distance should be at least 0.68μm. Therefore, even if the size of CIS pixels is made smaller than this, it is difficult to expect substantial improvement in resolution. Since the size of the commercially available CIS pixel has already reached 0.7 to 0.8μm, it is necessary to develop a new optical technology to reduce the F number or a new application that can merge several fine pixels.</p>
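<p>The 0.68μm figure above can be reproduced from the Rayleigh criterion, x = 1.22&#183;λ&#183;F, using the wavelength and F-number given in the text:</p>

```python
# Rayleigh criterion: minimum resolvable separation on the sensor
# x = 1.22 * wavelength * F-number, where F = f/d
wavelength_um = 0.4   # 400 nm blue light, in micrometers
f_number = 1.4        # high-performance lens from the example

x_um = 1.22 * wavelength_um * f_number
print(round(x_um, 2))  # 0.68
```

<p>Pixels much smaller than roughly 0.7μm therefore fall below what the optics can resolve, which is exactly where commercial pixel pitches sit today.</p>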
<p>Another driver of this innovation is the emergence of stack sensor technology. Since a conventional sensor has a structure where pixels and circuits are implemented on the same substrate, reducing the non-light-receiving area was essential for shrinking the CIS. Therefore, only essential analog/digital circuit functions were implemented, and adding circuits for extra functions was very limited.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085240/SK_hynix_Structure_comparison_of_conventional_sensor_and_stack_sensor.png" alt="" /></p>
<p class="source">Figure 3. Left: Conventional sensor structure / Right: Stack sensor structure</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085240/SK_hynix_Structure_comparison_of_conventional_sensor_and_stack_sensor.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>In contrast, the stack sensor implements pixels and circuits on separate substrates, as shown in Figure 3, and then connects the two substrates electrically with Through Silicon Via (TSV)<sup>13</sup> or hybrid bonding technology.<sup>14</sup> Since pixels and circuits are stacked, the lower substrate offers circuit area equal to the area occupied by the pixels on the upper substrate, freeing that space for additional circuitry.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>13</sup>TSV (Through Sillicon Via): An interconnecting technology that delivers electric signals through column-shaped paths that penetrate the entire silicon wafer thickness.<br />
<sup>14</sup>Hybrid bonding technology: A process technology that bonds metal electrodes of two wafers together and connects them, instead of an electrode penetrating through a silicon wafer. It can reduce size while increasing performance.</p>
<p><!-- //각주 스타일 --></p>
<p>For example, a 48 megapixel (8,000 x 6,000) sensor with 1μm pixels occupies an area of 48mm<sup>2</sup> or more on the upper substrate. If the same area on the lower substrate can be used to implement digital logic, there is enough space to integrate a large number of high-performance microprocessors.<sup>15</sup></p>
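<p>The 48mm<sup>2</sup> figure follows directly from the pixel count and pitch, a quick check of the arithmetic in the text:</p>

```python
# A 48 MP sensor at 1 um pixel pitch: active pixel area on the upper substrate.
width_px, height_px = 8000, 6000
pitch_mm = 1e-3  # 1 um expressed in millimeters

area_mm2 = (width_px * pitch_mm) * (height_px * pitch_mm)
print(area_mm2)  # 48.0 square millimeters matched on the lower substrate
```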
<p>In addition, the stack sensor has the advantage that an independent process can be applied to the pixels on the upper substrate and the circuits on the lower substrate. If an advanced logic process<sup>16</sup> is applied to the lower substrate for circuits, even a complicated ISP algorithm can be implemented with low power, high density digital circuits. In other words, while the ISP of the conventional sensor only supported simple functions such as lens correction and defect correction due to the limitation of the circuit area, the ISP of the stack sensor can implement innovative algorithms such as image processing, computer vision, and artificial intelligence (AI) by using an advanced logic process.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>15</sup>Microprocessor: A device that integrates the control functions of the processing unit and central processing unit into one chip<br />
<sup>16</sup>Logic process: Semiconductor process that manufactures digital devices to process logical operations such as AND, OR, NOT</p>
<p><!-- //각주 스타일 --></p>
<h3 class="tit">SK hynix’s CIS with Various Functions</h3>
<p>Currently, SK hynix’s CIS has built-in image processing functions such as phase detection auto focus (PDAF), Quad pixel processing, and high dynamic range (HDR) processing, and new functions are constantly being added to it.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/28012607/SK_hynix_PDAF_Structure.png" alt="" /></p>
<p class="source">Figure 4. Left: Half Shield PDAF structure / Right: Paired PDAF structure</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/28012607/SK_hynix_PDAF_Structure.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>PDAF is a function that applies the same principle human binocular vision uses to estimate the distance to a subject. For some pixels of the CIS, this method generates a phase difference<sup>17</sup> either by shielding the left or right half of a pixel, as shown on the left of Figure 4, or by placing a left and right pixel pair under one large micro lens, as shown on the right of Figure 4. The ISP algorithm then calculates the phase difference between the left and right images and converts it into a distance to the subject, allowing the camera to focus quickly and accurately.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>17</sup>Phase difference: The difference in phase between two vibrations (wavelengths). The phase refers to the relative position of a vibration (wavelength) at a certain point.</p>
<p><!-- //각주 스타일 --></p>
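<p>The conversion from phase difference to subject distance works like stereo triangulation. The sketch below illustrates the idea only; the focal length, baseline, and disparity values are hypothetical and are not SK hynix PDAF parameters:</p>

```python
def distance_from_disparity(focal_length_mm, baseline_mm, disparity_mm):
    """Triangulation: a farther subject produces a smaller left/right image shift."""
    return focal_length_mm * baseline_mm / disparity_mm

# Hypothetical numbers: 4.5 mm lens, 1 mm effective left/right separation.
near = distance_from_disparity(4.5, 1.0, 0.009)   # larger disparity, closer subject
far = distance_from_disparity(4.5, 1.0, 0.0009)   # smaller disparity, farther subject
print(near < far)  # True
```

<p>The AF algorithm drives the lens actuator toward the computed distance instead of hunting through focus positions, which is what makes phase detection fast.</p>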
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085230/SK_hynix_Output_by_sensor_1.png.jpg" alt="" /></p>
<p class="source">Figure 5. Left: Output of a conventional sensor / Right: Output of a Quad sensor</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085230/SK_hynix_Output_by_sensor_1.png.jpg" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>The Quad sensor places four color filters of the same color adjacent to each other and processes them together. In dark scenes, the four pixels are combined to receive more light; in bright scenes, the individual pixels are processed separately by the ISP algorithm to improve resolution. Figure 5 shows two images taken with a conventional sensor (left) and SK hynix&#8217;s 48 megapixel Quad sensor (right). Compared to the conventional sensor, the Quad sensor makes it possible to obtain a bright, noise-free image even in a dark place.</p>
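<p>The two Quad-sensor modes can be sketched in a few lines. This is a toy model of one 2x2 same-color block; real sensors bin in the analog domain and remosaic with proprietary ISP algorithms:</p>

```python
def bin_2x2(block):
    """Low-light mode: combine four same-color pixels into one brighter pixel."""
    return sum(sum(row) for row in block)

def full_res(block):
    """Bright-light mode: keep each pixel separate for maximum resolution."""
    return [px for row in block for px in row]

dark_scene = [[3, 4], [2, 3]]          # weak per-pixel signal
print(bin_2x2(dark_scene))             # 12: roughly 4x the signal, better SNR
print(full_res([[10, 12], [11, 9]]))   # [10, 12, 11, 9]: full detail retained
```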
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085227/SK_hynix_Output_by_sensor.png" alt="" /></p>
<p class="source">Figure 6. Left: Output of a conventional sensor / Right: Output of an HDR sensor</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085227/SK_hynix_Output_by_sensor.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>The HDR sensor provides a clear contrast between the bright and dark parts of an image by synthesizing multiple pixels with different sensitivities and exposure times. In particular, SK hynix&#8217;s CIS performs this image processing in its built-in ISP, enabling real-time processing and clear image quality even with moving subjects. Figure 6 shows output images from SK hynix&#8217;s CIS with and without HDR applied. Compared to the image without HDR, the HDR image restores the background clearly while maintaining the same overall brightness.</p>
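<p>The basic exposure-fusion step can be sketched with a simplified per-pixel rule. This is an illustrative sketch only; sensors like those described here fuse exposures inside the built-in ISP with far more sophisticated, motion-aware logic:</p>

```python
def merge_hdr(long_px, short_px, exposure_ratio, saturation=255):
    """Use the long exposure where it is valid; where it has clipped,
    fall back to the short exposure rescaled into long-exposure units."""
    if long_px < saturation:
        return long_px
    return short_px * exposure_ratio

# With an 8x exposure ratio: shadows come from the long exposure,
# clipped highlights are recovered from the short exposure.
print(merge_hdr(120, 15, 8))   # 120: shadow detail preserved
print(merge_hdr(255, 40, 8))   # 320: highlight recovered beyond the 8-bit range
```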
<p>Currently, SK hynix&#8217;s CIS, mainly the Black Pearl product line, is widely used in smartphone cameras, and its applications are expected to expand into various fields such as bio, security, and autonomous vehicles.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085214/SK_hynix_Camera_mounting_location_in_self-driving_car.png" alt="" /></p>
<p class="source">Figure 7. Camera mounting location in self-driving car</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27085214/SK_hynix_Camera_mounting_location_in_self-driving_car.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>In particular, autonomous vehicles use at least ten cameras to detect their surroundings.<sup>18</sup> To improve accuracy, various requirements must be satisfied, such as high resolution for distinguishing distant objects, HDR support for recognizing objects even in dark environments, and ISP pre-processing to reduce the computational load on the main processor.</p>
<p>In the security field, the CIS built-in ISP is required to compress and encrypt image signals before transmitting them to an external processor. If an unencrypted image signal is transmitted as-is, the risk of security vulnerabilities and information leakage increases. For this reason, an encryption function inside the CIS is essential.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>18</sup>Peter Brown, “Breaking Down the Sensors Used in Self-Driving Cars”, Electronics 360. (2018) (<a class="-as-ga" style="text-decoration: underline;" href="https://electronics360.globalspec.com/article/12563/breaking-down-the-sensors-used-in-self-driving-cars" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://electronics360.globalspec.com/article/12563/breaking-down-the-sensors-used-in-self-driving-cars">URL</a>)</p>
<p><!-- //각주 스타일 --></p>
<h3 class="tit">The Future of CIS: An Information Sensor That Supports Advanced Functions</h3>
<p>In the future, CIS is expected to evolve into an information sensor that supports advanced additional functions, rather than being limited to image quality improvement.<sup>19</sup> Thanks to its advanced semiconductor process, SK hynix&#8217;s stack sensor can already embed a simple AI hardware engine inside the ISP on the lower substrate. Building on this, SK hynix is currently developing new machine learning-based technologies such as super resolution, color restoration, face recognition, and object recognition.</p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>19</sup>Sungjoo Hong, “Smart Cloud and Information Sensor”, Smart Cloud Show. (2018) (<a class="-as-ga" style="text-decoration: underline;" href="https://convention.chosunbiz.com/%ED%96%89%EC%82%AC/%EC%8A%A4%EB%A7%88%ED%8A%B8%ED%81%B4%EB%9D%BC%EC%9A%B0%EB%93%9C%EC%87%BC/2018-1#h.p_3-FJhI7-cRiF" target="_blank" rel="noopener noreferrer" data-ga-category="sk-hynix-newsroom" data-ga-action="click" data-ga-label="goto_https://convention.chosunbiz.com/%ED%96%89%EC%82%AC/%EC%8A%A4%EB%A7%88%ED%8A%B8%ED%81%B4%EB%9D%BC%EC%9A%B0%EB%93%9C%EC%87%BC/2018-1#h.p_3-FJhI7-cRiF">URL</a>)</p>
<p><!-- //각주 스타일 --></p>
<p>Since machine learning-based ISP technology makes it possible to extract and classify various features from input images, CIS will become a key component of information sensors that collect diverse information such as images, location, distance, and biometric data.</p>
<p>In particular, utilizing CIS as an information sensor requires a new approach from a different perspective. The quality goal of CIS is now &#8220;achieving image quality optimized for machine algorithms&#8221;, whereas it was previously &#8220;achieving image quality optimized for the human eye&#8221;.</p>
<p><!-- 이미지 사이즈 지정해서 업로드 --></p>
<p class="img_area"><img decoding="async" class="alignnone size-full wp-image-4330" style="width: 800px;" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27090544/Less_noise_but_misrecognized_as_a_spotlight.png" alt="" /></p>
<p class="source">Figure 8. Left: Less noise but misrecognized as a spotlight<br />
Right: More noise but accurately recognized as an espresso</p>
<p class="download_img"><a class="-as-download -as-ga" href="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27090544/Less_noise_but_misrecognized_as_a_spotlight.png" target="_blank" rel="noopener noreferrer" download="" data-ga-category="sk-hynix-newsroom" data-ga-action="download" data-ga-label="download_image">Image Download</a></p>
<p>According to research from Stanford University, even if complex image processing produces an image that looks good to the human eye, that image does not always yield excellent results when computer vision algorithms are applied to it. For example, when an object recognition algorithm is applied to the coffee cup images in Figure 8, the image on the left with less noise is incorrectly recognized as a spotlight, while the image on the right with much more noise is accurately recognized as an espresso. This shows that a key function of future CIS is to provide image quality optimized for the computer algorithm being used.<sup>20</sup></p>
<p><!-- 각주 스타일 --></p>
<div style="border-top: 1px solid #e0e0e0;"></div>
<p>&nbsp;</p>
<p style="font-size: 14px; font-style: italic; color: #555;"><sup>20</sup>S. Diamond, V. Sitzmann, S. Boyd, G. Wetzstein, F. Heide, &#8220;Dirty Pixels: Optimizing Image Classification Architectures for Raw Sensor Data&#8221;, arXiv preprint arXiv:1701.06487.</p>
<p><!-- //각주 스타일 --></p>
<p>As described in this article, SK hynix is increasing the integration level of CIS pixels through continuous development of device and process technologies, and supporting various application fields through ISP technology development. In particular, to pioneer new technology fields, it has established and operates overseas research institutes in Japan and the United States, and joint research with domestic and foreign universities is actively underway through academic-industrial collaboration. Going forward, SK hynix&#8217;s CIS is expected to be used in various application fields, including smartphone cameras, contributing to the creation of economic and social value and growing into a key component of information sensors.</p>
<p><!-- 기고문 스타일 --><br />
<!-- namecard --></p>
<div class="namecard">
<p><img decoding="async" class="alignnone size-full wp-image-3446" src="https://d36ae2cxtn9mcr.cloudfront.net/wp-content/uploads/2020/10/27084850/namecard_kim_tae_hyun.png" alt="" /></p>
<div class="name">
<p class="tit">By <strong>Tae-hyun (Ted) Kim, Ph.D.</strong></p>
<p><span class="sub">Head of CIS ISP at SK hynix Inc.</span></p>
</div>
</div>
<p><!-- //기고문 스타일 --></p><p>The post <a href="https://skhynix-news-global-stg.mock.pe.kr/the-visual-evolution-innovation-of-image-sensors/">The Visual Evolution & Innovation of Image Sensors</a> first appeared on <a href="https://skhynix-news-global-stg.mock.pe.kr">SK hynix Newsroom</a>.</p>]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
