GPU

What is a GPU?

The GPU, or graphics processing unit, is one of the most valuable components of modern computing, for both personal and industrial use. Built for parallel processing, the GPU serves many purposes, the best known being graphics and video rendering. Although it is most closely associated with gaming, its use in creative production and artificial intelligence (AI) has become widespread as well. The original aim of the GPU was to accelerate the rendering of 3D graphics. As the technology developed, GPUs became more programmable and more capable, giving graphics programmers the flexibility to create more interesting visual effects and more realistic scenes with advanced lighting and shadowing techniques. In turn, the application fields of GPUs expanded, ranging from high-performance computing to deep learning (Intel, n.d.).


GPU and CPU

The GPU complements the central processing unit (CPU), which is considered the brain of the computer and the main control center for its processes. GPUs were not developed to displace CPUs; rather, they are designed to accelerate computer graphics workloads. A GPU is also distinct from a graphics card (video card): the graphics card is the add-in board that incorporates the GPU. GPUs come in two basic forms: integrated and discrete.


GPU applications

As mentioned, GPUs were originally meant to accelerate real-time 3D graphics applications, primarily for gaming. Over the last couple of decades, scientists realized that GPUs could also be used to solve computationally demanding problems in other domains. This opened a new era for GPUs, which are now applied to a much broader range of fields. Thanks to technological advances, today's GPUs are far more programmable, giving them flexibility well beyond traditional graphics rendering.


Gaming

Though GPUs were initially designed for gaming, their role has grown alongside the games themselves, which have become more computationally intensive, ultrarealistic, and complex. 4K screen resolutions, modern display technologies, and the rise of virtual reality games have all increased the demand placed on GPUs.


Video Editing and Content Creation/Synthetic Media

One of the main struggles for editors, graphic designers, and the wider creative industries has been long rendering times, which consume a large share of a project's resources. With the emergence of parallel processing, rendering video and graphics in higher-definition formats has become much easier and faster. The development and integration of GPUs has also enabled increasingly realistic synthetic video and audio. Tools like DALL-E, Stable Diffusion, and RunwayML make it possible to create synthetic art and video in near real time. While this is often described as a democratization of access to the creative industries, it also raises ethical concerns around misinformation, consent, and identity theft.

Source: https://blogs.nvidia.com/blog/generative-ai-studio-ces-geforce-rtx-50-series


Artificial intelligence (AI) and Machine Learning

Behind the AI boom is the GPU and its capacity to process vast amounts of data. Because GPUs can perform a large number of operations simultaneously, the industry uses them to train AI models. This enables researchers and developers to iterate on models more quickly and unlock breakthroughs in AI capabilities (Google Cloud, 2025).
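
To make that parallelism concrete, here is a minimal, hypothetical CUDA sketch (an illustration, not code from any cited source): it applies the ReLU activation function, an element-wise operation that neural-network training performs constantly, to a million values at once, with each GPU thread handling a single element.

    #include <cuda_runtime.h>

    // Each GPU thread clamps one element to zero: x[i] = max(x[i], 0).
    __global__ void relu(float* x, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
        if (i < n) x[i] = fmaxf(x[i], 0.0f);
    }

    int main() {
        const int n = 1 << 20;                     // about one million activations
        float* d_x;
        cudaMalloc(&d_x, n * sizeof(float));       // allocate GPU memory
        cudaMemset(d_x, 0, n * sizeof(float));     // stand-in for real activations
        int threads = 256;
        int blocks = (n + threads - 1) / threads;  // enough blocks to cover all n
        relu<<<blocks, threads>>>(d_x, n);         // all elements run in parallel
        cudaDeviceSynchronize();                   // wait for the GPU to finish
        cudaFree(d_x);
        return 0;
    }

Where a CPU would walk through the array one element at a time, the GPU schedules thousands of these threads simultaneously, which is the property that makes large-scale model training practical.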

[Image: AI searches for the origin of the Universe, powered by an NVIDIA GPU]


Cryptocurrency mining

Bitcoin (in its early days) and Ethereum (until 2022) could be mined with GPUs. As a result, home miners began using consumer GPUs to mine cryptocurrency, which sent demand and prices soaring and caused significant shortages. Ethereum mining favored NVIDIA and AMD GPUs, and dedicated mining farms concentrated in China and Kazakhstan. Though mining was considered a profitable practice at first, soaring energy consumption and security concerns have since led several countries to restrict or ban it.

[Image: a cryptocurrency mining farm in Russia]


Main GPU producers, their years, and locations

  1. Samsung, 2016 (South Korea)
  2. NVIDIA, 1993 (Santa Clara, California)
  3. Taiwan Semiconductor Manufacturing Co., Ltd. (TSMC), 1999 (Taiwan)
  4. Intel, 1998 (Santa Clara, California)
  5. Broadcom Inc., 2007 (Palo Alto, California)
  6. Qualcomm, 2008 (San Diego, California)

NVIDIA GPU Development, 2005–2025

In 2006, NVIDIA introduced its game-changing GeForce 8 series, built on the Tesla architecture (no connection to Tesla cars). The same year, NVIDIA released its famous CUDA (Compute Unified Device Architecture) platform, a parallel computing platform and programming model that allows developers to harness the power of their NVIDIA GPUs for general-purpose computing tasks. Instead of being used only for rendering graphics, CUDA enables GPUs to serve a wide range of applications, including image processing, deep learning, numerical analytics, and computational science. This can be considered the beginning of a new period, as CUDA-based GPUs were applied across many industries: computational chemistry, bioinformatics, machine learning, data science, computational fluid dynamics, weather and climate modeling, and other fields.
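
To show what the CUDA programming model looks like in practice, here is a minimal vector-addition sketch (the standard introductory example, not NVIDIA's own code): the host CPU copies data to the GPU, launches a kernel across many threads, and copies the result back.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Kernel: each GPU thread computes one element of c = a + b.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1024;
        size_t bytes = n * sizeof(float);
        float h_a[n], h_b[n], h_c[n];  // host (CPU) arrays
        for (int i = 0; i < n; ++i) { h_a[i] = i; h_b[i] = 2.0f * i; }

        float *d_a, *d_b, *d_c;        // device (GPU) arrays
        cudaMalloc(&d_a, bytes); cudaMalloc(&d_b, bytes); cudaMalloc(&d_c, bytes);
        cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

        vecAdd<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n);  // launch the kernel
        cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);

        printf("c[42] = %.1f\n", h_c[42]);  // expect 126.0
        cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
        return 0;
    }

The <<<blocks, threads>>> launch syntax is what distributes the work: each thread computes its own global index and handles a single element, so the loop a CPU would run disappears into hardware parallelism.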

[Images: CUDA application examples, including Harvard University's work on finding hidden heart problems faster, NAMD molecular simulation, a Boston Scientific simulation of E-field distribution at 64 MHz, digital tomosynthesis, neural array simulation, a headwave analysis program, and instancing at work with numerous characters rendered]

In 2008, the company introduced the GeForce 200 series, based on the GT200 architecture, which improved performance and offered new GPU computing capabilities. According to an official technical brief, the new series significantly enhanced computation for high-performance CUDA™ applications and GPU physics, and rebalanced the architecture for future games using more complex shaders and more memory (NVIDIA, 2008).

[Image, 2008: a car driving on a track]

In 2010, NVIDIA introduced the Fermi architecture with the GeForce 400 series. It was developed for real-time physics, advanced cinematic effects, and ray tracing, then seen as the future of gaming.


In 2012, the company unveiled the Kepler architecture with the GeForce 600 series, calling it “the new era.” In a retrospective titled “How the World’s First GPU Leveled Up Gaming and Ignited the AI Era,” NVIDIA presents this period as the foundation for an AI-driven future.

A breakthrough came when Alex Krizhevsky from the University of Toronto used NVIDIA GPUs to win the ImageNet image recognition competition. His neural network, AlexNet, trained on a million images and crushed the competition, beating handcrafted software written by vision experts.

In its press release, NVIDIA highlighted that:
This marked a seismic shift in technology. What once seemed like science fiction — computers learning and adapting from vast amounts of data — was now a reality, driven by the raw power of GPUs.


In 2014, NVIDIA released the Maxwell architecture with the GeForce 900 series, which increased performance and introduced new features such as real-time global illumination, dynamic reflections, and dynamic geometry and lighting.

[Image, 2014: Debunking Lunar Landing Conspiracies with Maxwell and VXGI; an astronaut climbing a ladder]

In 2016, the company released its Pascal architecture with the GeForce 10 series, promising higher performance and introducing support for VR technologies.

[Images, 2016: assorted Pascal-era game and demo screenshots]

In 2020, NVIDIA released the Ampere architecture with the GeForce RTX 30 series, promising considerable performance gains and improved ray-tracing capabilities. The company called it “a new era of AI-powered computer graphics.”

[Images, 2020: Ampere-era diagrams and game screenshots]

In 2022, NVIDIA introduced the Ada Lovelace architecture with the GeForce RTX 40 series, further advancing ray-tracing performance and AI-driven graphics. The company labeled it “a revolution in neural graphics.”

[Images, 2022: Ada Lovelace-era game and application screenshots]

In 2025, the company released its latest GeForce RTX 50 series, which its technical brief says brings “game-changing AI and neural rendering capabilities” to gamers and creators.

[Images, 2025: RTX 50-era game and demo screenshots]



References

Intel (n.d.) What is a GPU? [Accessed: 6 May 2025].

Castells, M. (2011) The Rise of the Network Society. 2nd edn. Chichester: Wiley-Blackwell.

NVIDIA (2018) NVIDIA Turing Architecture Whitepaper. [Accessed: 6 May 2025].

NVIDIA (2019) GeForce RTX 20 Series SUPER GPUs Announced. [Accessed: 6 May 2025].

NVIDIA Developer (2023) AI-Generated Heat Maps Keep Seniors—and Their Privacy—Safe. [Accessed: 6 May 2025].

Dell Technologies (n.d.) NVIDIA Solutions with Dell Technologies. [Accessed: 6 May 2025].

Investopedia (2024) World’s Top 10 Semiconductor Companies. [Accessed: 6 May 2025].

NVIDIA (2020) GeForce RTX 30 Series Graphics Cards. [Accessed: 6 May 2025].

NVIDIA (2020) Introducing RTX 30 Series Graphics Cards. [Accessed: 6 May 2025].

NVIDIA (n.d.) Portal with RTX: Real-Time Ray Tracing Comparison. [Accessed: 6 May 2025].

NVIDIA (2022) Introducing GeForce RTX 40 Series GPUs. [Accessed: 6 May 2025].

NVIDIA (2024) GeForce RTX 50 Series GPU and Laptop Announcements. [Accessed: 6 May 2025].

NVIDIA (2024) GeForce RTX 50 Series Graphics Cards. [Accessed: 6 May 2025].