==Interactive ray tracing{{anchor|In real time}}== {{See also|Ray-tracing hardware}} The first implementation of an interactive ray tracer was the [[Supercomputing in Japan|LINKS-1 Computer Graphics System]] built in 1982 at [[Osaka University]]'s School of Engineering by professors Ohmura Kouichi, Shirakawa Isao, and Kawata Toru with 50 students.{{Citation needed|reason=There were other real-time ray tracing claims researched for and discussed at SIGGRAPH 2005, but none could be proven prior to 1986. This claim warrants proof of real-time update (e.g., over 1 frame/sec), not just high-speed as there were numerous fast parallel distributed systems in the early-mid 1980s. The provided LINKS-1 citation does not support a real-time claim.|date=January 2019}} It was a [[massively parallel]] processing [[computer]] system with 514 [[microprocessor]]s (257 [[Zilog Z8000|Zilog Z8001]]s and 257 [[iAPX 86]]s), used for [[3-D computer graphics]] with high-speed ray tracing. According to the [[Information Processing Society of Japan]]: "The core of 3-D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, [[Computer graphics lighting|light source]], and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using ray tracing. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images." It was used to create an early 3-D [[planetarium]]-like video of the [[Universe|heavens]] made completely with computer graphics. The video was presented at the [[Fujitsu]] pavilion at the 1985 International Exposition in [[Tsukuba]].<ref>{{cite web |title=【Osaka University】 LINKS-1 Computer Graphics System |url=http://museum.ipsj.or.jp/en/computer/other/0013.html |website=IPSJ Computer Museum |publisher=[[Information Processing Society of Japan]] |access-date=November 15, 2018}}</ref> It was the second system to do so, after the [[Evans & Sutherland]] [[Digistar]] in 1982. The LINKS-1 was claimed by the designers to be the world's most powerful computer in 1984.<ref>{{cite book |last1=Defanti |first1=Thomas A. |title=Advances in computers. Volume 23 |publisher=[[Academic Press]] |isbn=0-12-012123-9 |page=121 |url=http://www.vasulka.org/archive/Writings/VideogameImpact.pdf#page=29|year=1984 }}</ref> The next interactive ray tracer, and the first known to have been labeled "real-time", was credited at the 2005 [[SIGGRAPH]] computer graphics conference as being the REMRT/RT tools developed in 1986 by [[Mike Muuss]] for the [[BRL-CAD]] solid modeling system. Initially published in 1987 at [[USENIX]], the BRL-CAD ray tracer was an early implementation of a parallel, network-distributed ray tracing system that achieved several frames per second in rendering performance.<ref>See Proceedings of 4th Computer Graphics Workshop, Cambridge, MA, USA, October 1987. Usenix Association, 1987. pp. 86–98.</ref> This performance was attained by means of the highly optimized yet platform-independent LIBRT ray tracing engine in BRL-CAD and by using solid implicit [[Constructive solid geometry|CSG]] geometry on several shared-memory parallel machines over a commodity network.
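
The property that LINKS-1 and the REMRT/RT tools exploited, that each pixel's ray can be traced independently of every other pixel's, is what makes ray tracing straightforward to distribute across processors or networked machines. The following minimal sketch shows that structure (illustrative only; the stub <code>trace_pixel</code> function and its gradient output are assumptions, not code from either system):
<syntaxhighlight lang="cpp">
// Minimal sketch of the "one independent ray per pixel" structure that parallel
// ray tracers such as LINKS-1 and REMRT/RT exploited. The scene, camera, and
// shading are placeholders, not code from either system.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Color { std::uint8_t r, g, b; };

// Placeholder: trace the ray through pixel (x, y) and return its color.
// A real tracer would build a camera ray, intersect it with the scene, and
// shade the nearest hit; this stub just returns a gradient.
Color trace_pixel(int x, int y, int width, int height) {
    return { static_cast<std::uint8_t>(255 * x / width),
             static_cast<std::uint8_t>(255 * y / height),
             64 };
}

std::vector<Color> render(int width, int height) {
    std::vector<Color> image(static_cast<std::size_t>(width) * height);
    // Each iteration writes only its own pixel, so the loop can be split across
    // threads, processors, or networked machines without any coordination.
    #pragma omp parallel for collapse(2)
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            image[static_cast<std::size_t>(y) * width + x] = trace_pixel(x, y, width, height);
    return image;
}
</syntaxhighlight>
Because the pixels share no state, speedups come almost entirely from adding processors, which is what both the 514-processor LINKS-1 and the network-distributed REMRT/RT relied on.
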
BRL-CAD's ray tracer, including the REMRT/RT tools, continues to be available and developed today as [[Open-source software|open source]] software.<ref>{{cite web |url=http://brlcad.org/d/about |title=About BRL-CAD |access-date=2019-01-18 |archive-date=September 1, 2009 |archive-url=https://web.archive.org/web/20090901222818/http://brlcad.org/d/about |url-status=dead }}</ref> Since then, there have been considerable efforts and research towards implementing ray tracing at real-time speeds for a variety of purposes on stand-alone desktop configurations. These purposes include interactive 3-D graphics applications such as [[Demo (computer programming)|demoscene productions]], [[Video game|computer and video games]], and image rendering. Some real-time software 3-D engines based on ray tracing have been developed by hobbyist [[demoscene|demo programmers]] since the late 1990s.<ref>{{cite web |url=http://www.acm.org/tog/resources/RTNews/demos/overview.htm |title=The Realtime Raytracing Realm |author=Piero Foscari |work=ACM Transactions on Graphics |access-date=2007-09-17}}</ref> In 1999, a team from the [[University of Utah]], led by Steven Parker, demonstrated interactive ray tracing live at the Symposium on Interactive 3D Graphics. They rendered a 35-million-sphere model at 512 by 512 pixel resolution, running at approximately 15 frames per second on 60 CPUs.<ref> {{cite book | last1 = Parker | first1 = Steven | last2 = Martin | first2 = William | title = Proceedings of the 1999 symposium on Interactive 3-D graphics | chapter = Interactive ray tracing | date = April 26, 1999 | chapter-url = https://dl.acm.org/citation.cfm?id=300537 | series = I3D '99 | volume = 5 | issue = April 1999 | pages = 119–126 | doi = 10.1145/300523.300537 | isbn = 1581130821 | citeseerx = 10.1.1.6.8426 | s2cid = 4522715 | access-date = October 30, 2019 }}</ref> The OpenRT project included a highly optimized software core for ray tracing along with an [[OpenGL]]-like API to offer an alternative to the current [[rasterization]]-based approach for interactive 3-D graphics. [[Ray-tracing hardware|Ray tracing hardware]], such as the experimental [[Ray Processing Unit]] developed by Sven Woop at [[Saarland University]], was designed to accelerate some of the computationally intensive operations of ray tracing. [[File:Quake_Wars_Ray_Traced.ogv|thumb|''Quake Wars: Ray Traced'']] The idea that video games could ray trace their graphics in real time received media attention in the late 2000s.
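
At the heart of an interactive renderer such as the Utah sphere demonstration is a ray-primitive intersection test evaluated for every ray, and this is also the kind of computationally intensive operation that hardware such as the Ray Processing Unit was built to accelerate. A generic sketch of the standard ray-sphere test is shown below (a textbook quadratic formulation, not code from the Utah system, OpenRT, or the RPU):
<syntaxhighlight lang="cpp">
// Generic ray-sphere intersection via the quadratic formula: the per-ray
// primitive test that an interactive sphere renderer repeats millions of times
// per frame. This is a textbook formulation, not code from the Utah system,
// OpenRT, or the RPU.
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// Returns the distance t along the ray to the nearest intersection with the
// sphere, or std::nullopt if the ray misses. Assumes the direction is unit length.
std::optional<float> intersect_sphere(const Vec3& origin, const Vec3& dir,
                                      const Vec3& center, float radius) {
    const Vec3  oc = sub(origin, center);
    const float b  = dot(oc, dir);                 // half of the quadratic's linear term
    const float c  = dot(oc, oc) - radius * radius;
    const float discriminant = b * b - c;
    if (discriminant < 0.0f) return std::nullopt;  // the ray misses the sphere
    const float root = std::sqrt(discriminant);
    float t = -b - root;                           // nearer of the two solutions
    if (t < 0.0f) t = -b + root;                   // ray origin is inside the sphere
    if (t < 0.0f) return std::nullopt;             // sphere lies behind the ray
    return t;
}
</syntaxhighlight>
In practice, interactive ray tracers wrap such tests in spatial acceleration structures so that each ray examines only a small subset of the scene's primitives.
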
During that time, a researcher named Daniel Pohl, under the guidance of graphics professor Philipp Slusallek and in cooperation with [[Erlangen University]] and [[Saarland University]] in Germany, equipped ''[[Quake III]]'' and ''[[Quake IV]]'' with an [[game engine|engine]] he programmed himself, which Saarland University then demonstrated at [[CeBIT]] 2007.<ref>{{cite news |url=http://news.bbc.co.uk/1/hi/technology/6457951.stm |title=Rays light up life-like graphics |author=Mark Ward |work=BBC News |date=March 16, 2007 |access-date=2007-09-17}}</ref> [[Intel]], a patron of Saarland University, became impressed enough that it hired Pohl and embarked on a research program dedicated to ray-traced graphics, which it saw as a justification for increasing the number of cores in its processors.<ref name=Peddie>{{cite book|url=https://books.google.com/books?id=CS2oDwAAQBAJ|title=Ray Tracing: A Tool for All|last=Peddie|first=Jon|publisher=[[Springer Nature Switzerland]]|date=2019|access-date=2022-11-02|isbn=978-3-030-17490-3}}</ref>{{Rp|99–100}}<ref name=Abi-Chahla>{{cite web|url=https://www.tomshardware.com/reviews/ray-tracing-rasterization,2351.html|title=When Will Ray Tracing Replace Rasterization?|last=Abi-Chahla|first=Fedy|work=[[Tom's Hardware]]|date=July 22, 2009|access-date=2022-11-04|archive-url=https://archive.today/20221103235551/https://www.tomshardware.com/reviews/ray-tracing-rasterization,2351.html|archive-date=2022-11-03|url-status=live}}</ref> On June 12, 2008, Intel demonstrated a special version of ''[[Enemy Territory: Quake Wars]]'', titled ''Quake Wars: Ray Traced'', using ray tracing for rendering, running in basic HD (720p) resolution. ''ETQW'' operated at 14–29 frames per second on a 16-core (4 sockets, 4 cores each) Xeon Tigerton system running at 2.93 GHz.<ref>{{cite web|url=http://www.tgdaily.com/html_tmp/content-view-37925-113.html|title=Intel converts ET: Quake Wars to ray tracing|last=Valich|first=Theo|publisher=TG Daily|date=2008-06-12|access-date=2008-06-16|archive-url=https://web.archive.org/web/20081002030022/http://www.tgdaily.com/html_tmp/content-view-37925-113.html|archive-date=2008-10-02|url-status=dead}}</ref> At SIGGRAPH 2009, Nvidia announced [[OptiX]], a free API for real-time ray tracing on Nvidia GPUs. The API exposes seven programmable entry points within the ray tracing pipeline, allowing for custom cameras, ray-primitive intersections, shaders, shadowing, etc. This flexibility enables bidirectional path tracing, Metropolis light transport, and many other rendering algorithms that cannot be implemented with tail recursion.<ref>{{cite web |url=http://www.nvidia.com/object/optix.html |title=Nvidia OptiX|author=Nvidia |publisher=Nvidia |date=October 18, 2009 |access-date=2009-11-06}}</ref> OptiX-based renderers are used in [[Autodesk]] Arnold, [[Adobe Systems|Adobe]] [[AfterEffects|After Effects]], Bunkspeed Shot, [[Autodesk Maya]], [[3ds max|3ds Max]], and many other applications.
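
The division of labor among such programmable entry points can be sketched in plain C++ (the names, signatures, and toy ground-plane scene below are illustrative assumptions for exposition and are not the actual OptiX API): a ray-generation stage builds a camera ray for each pixel, the traversal finds the nearest hit, and either a closest-hit or a miss stage produces the pixel's color.
<syntaxhighlight lang="cpp">
// Illustrative plain C++ stand-ins for the kinds of programmable stages a ray
// tracing API such as OptiX exposes (ray generation, closest hit, miss). The
// names, signatures, and toy ground-plane scene are assumptions made for
// exposition; they are not the actual OptiX API.
#include <cmath>
#include <cstdint>

struct Ray     { float ox, oy, oz, dx, dy, dz; };
struct HitInfo { bool hit; float t; };

// Stand-in for the accelerated traversal: here, just a ground plane at y = -1.
HitInfo trace(const Ray& r) {
    if (r.dy >= 0.0f) return { false, 0.0f };          // ray never reaches the plane
    return { true, (-1.0f - r.oy) / r.dy };
}

// "Ray generation": a custom camera decides which ray each pixel fires.
Ray generate_camera_ray(std::uint32_t x, std::uint32_t y, std::uint32_t w, std::uint32_t h) {
    const float u = (x + 0.5f) / w - 0.5f;
    const float v = (y + 0.5f) / h - 0.5f;
    const float len = std::sqrt(u * u + v * v + 1.0f);
    return { 0.0f, 0.0f, 0.0f, u / len, v / len, 1.0f / len };
}

// "Closest hit": shading at the nearest intersection; a full renderer could
// fire further rays from here for shadows or reflections.
std::uint32_t closest_hit(const Ray&, const HitInfo& hit) {
    const std::uint8_t shade = static_cast<std::uint8_t>(255.0f / (1.0f + hit.t)); // darker with distance
    return 0xff000000u | (shade << 16) | (shade << 8) | shade;
}

// "Miss": background color when the ray leaves the scene.
std::uint32_t miss(const Ray&) { return 0xff202040u; }

// The pipeline runs the user-supplied programs around its managed traversal.
std::uint32_t shade_pixel(std::uint32_t x, std::uint32_t y, std::uint32_t w, std::uint32_t h) {
    const Ray     ray = generate_camera_ray(x, y, w, h);
    const HitInfo hit = trace(ray);
    return hit.hit ? closest_hit(ray, hit) : miss(ray);
}
</syntaxhighlight>
Because each of these stages is user-programmable, the same pipeline can be repurposed for the custom cameras, shadowing, and light transport algorithms mentioned above.
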
In 2014, a demo of the [[PlayStation 4]] video game ''[[The Tomorrow Children]]'', developed by [[Q-Games]] and [[Japan Studio]], demonstrated new [[Computer graphics|lighting]] techniques developed by Q-Games, notably cascaded [[voxel]] [[Cone tracing|cone]] ray tracing, which simulates lighting in real time and produces more realistic [[Reflection (computer graphics)|reflections]] than [[Screen space ambient occlusion|screen-space]] reflections.<ref name=ps10>{{cite web|url=http://blog.eu.playstation.com/2014/10/24/creating-striking-unusual-visuals-tomorrow-children-ps4-2/|title=Creating the beautiful, ground-breaking visuals of The Tomorrow Children on PS4|first=Dylan|last=Cuthbert|work=[[PlayStation Blog]]|date=October 24, 2015|access-date=December 7, 2015}}</ref> Nvidia introduced its [[GeForce 20 series|GeForce RTX]] and Quadro RTX GPUs in September 2018, based on the [[Turing (microarchitecture)|Turing architecture]], which allows for hardware-accelerated ray tracing. The Nvidia hardware uses a separate functional block, publicly called an "RT core". This unit is somewhat comparable to a texture unit in size, latency, and interface to the processor core. The unit features [[Bounding volume hierarchy|BVH]] traversal, compressed BVH node decompression, ray-AABB intersection testing, and ray-triangle intersection testing.<ref>{{cite web|url=https://developer.nvidia.com/blog/nvidia-turing-architecture-in-depth|title=NVIDIA Turing Architecture In-Depth|last1=Kilgariff|first1=Emmett|last2=Moreton|first2=Henry|last3=Stam|first3=Nick|last4=Bell|first4=Brandon|website=Nvidia Developer|date=2018-09-14|access-date=2022-11-13|archive-url=https://web.archive.org/web/20221113010753/https://developer.nvidia.com/blog/nvidia-turing-architecture-in-depth/|archive-date=2022-11-13|url-status=live}}</ref> The GeForce RTX, in the form of models 2080 and 2080 Ti, became the first consumer-oriented brand of graphics card that could perform ray tracing in real time,<ref>{{cite web|url=https://venturebeat.com/games/nvidia-unveils-geforce-rtx-graphics-chips-for-real-time-ray-tracing-games|title=Nvidia unveils GeForce RTX graphics chips for real-time ray tracing games|last=Takahashi|first=Dean|work=[[VentureBeat]]|date=2018-08-20|access-date=2022-11-13|archive-url=https://web.archive.org/web/20221113013850/https://venturebeat.com/games/nvidia-unveils-geforce-rtx-graphics-chips-for-real-time-ray-tracing-games|archive-date=2022-11-13|url-status=live}}</ref> and, in November 2018, [[Electronic Arts]]' ''[[Battlefield V]]'' became the first game to take advantage of its ray tracing capabilities, which it achieved via Microsoft's new API, [[DirectX Raytracing]].<ref>{{cite web|url=https://www.pcworld.com/article/402902/battlefield-v-dxr-rtx-ray-tracing.html|title=RTX on!
Battlefield V becomes the first game to support DXR real-time ray tracing|last=Chacos|first=Brad|work=[[PCWorld]]|date=2018-11-14|access-date=2018-11-13|archive-url=https://web.archive.org/web/20221113014909/https://www.pcworld.com/article/402902/battlefield-v-dxr-rtx-ray-tracing.html|archive-date=2022-11-13|url-status=live}}</ref> AMD, which already offered interactive ray tracing on top of [[OpenCL]] through its [[Radeon Pro#ProRender|Radeon ProRender]],<ref>{{cite web|url=https://gpuopen.com/announcing-real-time-ray-tracing|title=Real-Time Ray Tracing with Radeon ProRender|website=GPUOpen|date=2018-03-20|access-date=2022-11-13|archive-url=https://web.archive.org/web/20221113030034/https://gpuopen.com/announcing-real-time-ray-tracing|archive-date=2022-11-13|url-status=live}}</ref><ref>{{cite web|url=https://gpuopen.com/learn/radeon-prorender-2-0|title=Hardware-Accelerated Ray Tracing in AMD Radeon™ ProRender 2.0|last=Harada|first=Takahiro|website=GPUOpen|date=2020-11-23|access-date=2022-11-13|archive-url=https://web.archive.org/web/20221113025101/https://gpuopen.com/learn/radeon-prorender-2-0|archive-date=2022-11-13|url-status=live}}</ref> unveiled the [[Radeon RX 6000 series]], its [[RDNA 2|second-generation]] Navi GPUs with support for hardware-accelerated ray tracing, at an online event in October 2020.<ref>{{cite web|url=https://www.tweaktown.com/news/75066/amd-to-reveal-next-gen-big-navi-rdna-2-graphics-cards-on-october-28/index.html|title=AMD to reveal next-gen Big Navi RDNA 2 graphics cards on October 28|last=Garreffa|first=Anthony|work=TweakTown|date=September 9, 2020|access-date=September 9, 2020}}</ref><ref>{{cite web|url=https://www.theverge.com/2020/9/9/21429127/amd-zen-3-cpu-big-navi-gpu-events-october|title=AMD's next-generation Zen 3 CPUs and Radeon RX 6000 'Big Navi' GPU will be revealed next month|last=Lyles|first=Taylor|work=[[The Verge]]|date=September 9, 2020|access-date=September 10, 2020}}</ref><ref>{{cite web|url=https://www.anandtech.com/show/16150/amd-teases-radeon-rx-6000-card-performance-numbers-aiming-for-3080|title=AMD Teases Radeon RX 6000 Card Performance Numbers: Aiming For 3080?| website=anandtech.com| publisher=[[AnandTech]]|date=2020-10-08|access-date=2020-10-25}}</ref><ref>{{cite web|url=https://www.anandtech.com/show/16077/amd-announces-ryzen-zen-3-and-radeon-rdna2-presentations-for-october-a-new-journey-begins|title=AMD Announces Ryzen "Zen 3" and Radeon "RDNA2" Presentations for October: A New Journey Begins|website=anandtech.com|publisher=[[AnandTech]]|date=2020-09-09|access-date=2020-10-25}}</ref><ref>{{cite web|url=https://www.eurogamer.net/articles/digitalfoundry-2020-10-28-amd-unveils-three-radeon-6000-graphics-cards-with-ray-tracing-and-impressive-performance|title=AMD unveils three Radeon 6000 graphics cards with ray tracing and RTX-beating performance|last=Judd|first=Will|work=Eurogamer|date=October 28, 2020|access-date=October 28, 2020}}</ref> Games that render their graphics by such means have since appeared, a development credited to improvements in hardware and to efforts to make more APIs and game engines compatible with the technology.<ref>{{cite book|title=Ray Tracing Gems II: Next Generation Real-Time Rendering with DXR, Vulkan, and OptiX|last1=Marrs|first1=Adam|last2=Shirley|first2=Peter|author2-link=Peter Shirley|last3=Wald|first3=Ingo|publisher=[[Apress]]|date=2021|pages=213–214, 791–792|isbn=9781484271858|hdl=20.500.12657/50334}}</ref> Current home gaming consoles implement dedicated
[[Ray-tracing hardware|ray tracing hardware components]] in their GPUs for real-time ray tracing effects, beginning with the [[ninth generation of video game consoles|ninth-generation]] consoles [[PlayStation 5]] and [[Xbox Series X and Series S]].<ref>{{Cite web|url=https://www.theverge.com/2019/6/8/18658147/microsoft-xbox-scarlet-teaser-e3-2019|title=Microsoft hints at next-generation Xbox 'Scarlet' in E3 teasers|last=Warren|first=Tom|date=June 8, 2019|website=[[The Verge]]|access-date=October 8, 2019}}</ref><ref>{{cite web|url=https://www.theverge.com/2019/10/8/20904351/sony-ps5-playstation-5-confirmed-haptic-feedback-features-release-date-2020|title=Sony confirms PlayStation 5 name, holiday 2020 release date|last=Gartenberg|first=Chaim|work=[[The Verge]]|date=October 8, 2019|access-date=October 8, 2019}}</ref><ref>{{cite web|url=https://www.theverge.com/2020/2/24/21150578/microsoft-xbox-series-x-specs-performance-12-teraflops-gpu-details-features|title=Microsoft reveals more Xbox Series X specs, confirms 12 teraflops GPU|last=Warren|first=Tom|work=The Verge|date=February 24, 2020|access-date=February 25, 2020}}</ref><ref>{{cite web|url=https://www.theverge.com/2020/9/9/21428792/microsoft-xbox-series-s-specs-cpu-teraflops-performance-gpu|title=Microsoft reveals Xbox Series S specs, promises four times the processing power of Xbox One|last=Warren|first=Tom|work=The Verge|date=September 9, 2020|access-date=September 9, 2020}}</ref><ref>{{cite magazine|url=https://www.wired.co.uk/article/xbox-series-x-release-date-uk-price-specs|title=Making sense of the rampant Xbox Series X rumour mill|last=Vandervell|first=Andy|magazine=[[Wired (magazine)|Wired]]|date=2020-01-04|access-date=2022-11-13|archive-url=https://web.archive.org/web/20221113010340/https://www.wired.co.uk/article/xbox-series-x-release-date-uk-price-specs|archive-date=2022-11-13|url-status=live}}</ref> On November 4, 2021, [[Imagination Technologies]] announced their IMG CXT GPU with hardware-accelerated ray tracing.<ref>{{Cite web |last=93digital |date=2021-11-04 |title=Imagination launches the most advanced ray tracing GPU |url=https://www.imaginationtech.com/news/imagination-launches-the-most-advanced-ray-tracing-gpu/ |access-date=2023-09-17 |website=Imagination |language=en-GB}}</ref><ref>{{Cite web |title=Ray Tracing |url=https://www.imaginationtech.com/products/ray-tracing/ |access-date=2023-09-17 |website=Imagination |language=en-GB}}</ref> On January 18, 2022, Samsung announced their [[Exynos#Exynos 2000 series|Exynos 2200]] AP SoC with hardware-accelerated ray tracing.<ref>{{Cite web |title=Samsung Introduces Game Changing Exynos 2200 Processor With Xclipse GPU Powered by AMD RDNA 2 Architecture |url=https://news.samsung.com/global/samsung-introduces-game-changing-exynos-2200-processor-with-xclipse-gpu-powered-by-amd-rdna-2-architecture |access-date=2023-09-17 |website=news.samsung.com |language=en}}</ref> On June 28, 2022, [[Arm (company)|Arm]] announced their [[Mali (processor)#Valhall 4th Gen|Immortalis-G715]] with hardware-accelerated ray tracing.<ref>{{Cite web |date=2022-06-28 |title=Gaming Performance Unleashed with Arm's new GPUs - Announcements - Arm Community blogs - Arm Community |url=https://community.arm.com/arm-community-blogs/b/announcements/posts/gaming-performance-unleashed |access-date=2023-09-17 |website=community.arm.com |language=en}}</ref> On November 16, 2022, [[Qualcomm]] announced their [[List of Qualcomm Snapdragon systems on chips#Snapdragon 8 Gen 2 (2023)|Snapdragon 8 Gen 2]] with hardware-accelerated ray tracing.<ref>{{Cite web
|title=Snapdragon 8 Gen 2 Defines a New Standard for Premium Smartphones |url=https://www.qualcomm.com/news/releases/2022/11/snapdragon-8-gen-2-defines-a-new-standard-for-premium-smartphone |access-date=2023-09-17 |website=www.qualcomm.com |language=en}}</ref><ref>{{Cite web |title=New, Snapdragon 8 Gen 2: 8 extraordinary mobile experiences, unveiled |url=https://www.qualcomm.com/news/onq/2022/11/new-snapdragon-8-gen-2-8-extraordinary-mobile-experiences-unveiled |access-date=2023-09-17 |website=www.qualcomm.com |language=en}}</ref> On September 12, 2023, Apple introduced hardware-accelerated ray tracing in its chip designs, beginning with the [[Apple A17|A17 Pro chip]] for iPhone 15 Pro models.<ref>{{Cite web |last1=Smith |first1=Ryan |last2=Bonshor |first2=Gavin |title=The Apple 2023 Fall iPhone Event Live Blog (Starts at 10am PT/17:00 UTC) |url=https://www.anandtech.com/show/20051/the-apple-2023-fall-iphone-event-live-blog |access-date=2023-09-17 |website=www.anandtech.com}}</ref><ref name=":apple0">{{Cite web |title=Apple unveils iPhone 15 Pro and iPhone 15 Pro Max |url=https://www.apple.com/newsroom/2023/09/apple-unveils-iphone-15-pro-and-iphone-15-pro-max/ |access-date=2024-10-27 |website=Apple Newsroom |language=en-US}}</ref> Later the same year, Apple released the M3 family of processors with hardware-accelerated ray tracing support.<ref name=":apple1">{{Cite web |title=Apple unveils M3, M3 Pro, and M3 Max, the most advanced chips for a personal computer |url=https://www.apple.com/newsroom/2023/10/apple-unveils-m3-m3-pro-and-m3-max-the-most-advanced-chips-for-a-personal-computer/ |access-date=2024-10-27 |website=Apple Newsroom |language=en-US}}</ref> Currently, this technology is accessible across iPhones, iPads, and Mac computers via the [[Metal (API)|Metal API]]. Apple reports up to a 4x performance increase over previous software-based ray tracing on the phone<ref name=":apple0" /> and ray tracing up to 2.5x faster on the M3 than on the M1.<ref name=":apple1" /> The hardware implementation includes acceleration structure traversal and dedicated ray-box intersections, and the API supports RayQuery (Inline Ray Tracing) as well as RayPipeline features.<ref>{{Cite web |title=Discover ray tracing with Metal - WWDC20 - Videos |url=https://developer.apple.com/videos/play/wwdc2020/10012/ |access-date=2024-10-27 |website=Apple Developer |language=en}}</ref>
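
A common software formulation of the ray-box test that such hardware accelerates, whether in an RT core's ray-AABB unit or in Apple's dedicated ray-box intersectors, is the "slab" method. A generic sketch follows (a textbook formulation, not any vendor's hardware algorithm):
<syntaxhighlight lang="cpp">
// Generic "slab" ray-AABB (axis-aligned bounding box) test, the kind of
// operation that dedicated ray tracing hardware accelerates during BVH
// traversal. This is a textbook software formulation, not any vendor's
// hardware algorithm.
#include <algorithm>
#include <utility>

struct AABB { float min[3]; float max[3]; };

// Returns true if the ray hits the box within the interval [t_min, t_max].
// inv_dir holds 1/direction per axis, precomputed once per ray.
bool ray_aabb_hit(const float origin[3], const float inv_dir[3],
                  const AABB& box, float t_min, float t_max) {
    for (int axis = 0; axis < 3; ++axis) {
        // Distances to the two bounding planes ("slabs") on this axis.
        float t0 = (box.min[axis] - origin[axis]) * inv_dir[axis];
        float t1 = (box.max[axis] - origin[axis]) * inv_dir[axis];
        if (t0 > t1) std::swap(t0, t1);    // handle rays pointing in the negative direction
        t_min = std::max(t_min, t0);       // latest entry so far
        t_max = std::min(t_max, t1);       // earliest exit so far
        if (t_max < t_min) return false;   // the slab intervals do not overlap: miss
    }
    return true;                           // a common overlapping interval remains: hit
}
</syntaxhighlight>
During BVH traversal, this test is applied to a node's bounding box to decide whether the node's children, and ultimately its triangles, need to be examined at all.
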