
GPUs in AI: How the most complex microchip is used to make machines learn

  • Writer: RRHS ScienceNHS
  • Dec 15, 2024
  • 2 min read


By: Jason Park


One of the most revolutionary inventions in today's world is Artificial Intelligence. Building these remarkable thinking machines requires many advanced computer technologies. However, not many people know that the GPU, the so-called graphics card, plays a major role in developing AI. Because of its complex functions, the GPU tends to be the most expensive part of a computer. Despite the high price, many IT enterprises are currently seeking high-quality GPUs specifically for AI development. If you have an interest in stocks, you probably know Nvidia Corporation, famous for its high-tech GPU chips.

GPU stands for Graphics Processing Unit, which literally means it helps the computer process picture-related functions like graphic effects and video. Specifically, the GPU performs graphics rendering, which determines the quality of video and game displays. A GPU consists of many processor clusters. This design lets the GPU take a single set of instructions and run it across many pieces of data at once: thousands of small cores each receive a distributed part of the computational task, each one computes the color of a specific pixel, and together they display a fully rendered image on the screen.
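The "single set of instructions, many pieces of data" idea above can be sketched in a few lines of Python. This is only an analogy, not real GPU code: the NumPy operation below applies one instruction to every pixel of a toy image at once, the way a GPU's many cores each shade one pixel of a frame in parallel.

```python
import numpy as np

# A tiny "image": 4x4 pixels, each with red/green/blue values 0-255.
image = np.zeros((4, 4, 3), dtype=np.uint8)
image[..., 0] = 200  # fill the red channel

# One instruction ("halve the brightness") applied to every pixel
# simultaneously -- the same shape of work a GPU distributes across
# its processor clusters when rendering a frame.
darker = image // 2

# Every pixel was transformed by the same rule.
print(darker[0, 0])
```

On a real GPU each pixel's computation would land on a different core; here NumPy just loops very quickly in one process, but the structure of the work (identical instruction, independent data) is the same.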

Then why, and how, is the GPU used for AI development? The reason is its fast data-processing speed. Compared with the CPU, another essential component of a computer, the GPU can process a significantly larger amount of data at once. As a rough estimate, it is believed that while a CPU can work on about 24 blocks of data, a GPU can process about 3,000 blocks simultaneously, which gives the GPU high energy and time efficiency. This speed makes the GPU ideal for parallel computing, which forms the basis of many deep learning AI models. As researchers strive toward the dream of creating a perfect "thinking machine", ever more complex data processing will be needed, and producing more powerful GPUs will be the solution we need to work on.
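To see why deep learning maps so well onto parallel hardware, consider a single neural-network layer, which is essentially a matrix multiplication: every output value is an independent dot product, so thousands of GPU cores could each compute one at the same time. A minimal Python sketch (the layer sizes here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# One "layer" of a toy neural network: 128 inputs -> 64 outputs.
inputs = rng.standard_normal(128)
weights = rng.standard_normal((64, 128))

# All 64 outputs computed "at once" as a matrix multiplication --
# on a GPU, each output's dot product could run on its own core.
outputs = weights @ inputs

# The same result computed one output at a time, the sequential
# "CPU-style" way.
sequential = np.array([weights[i] @ inputs for i in range(64)])

assert np.allclose(outputs, sequential)
```

Both versions give identical numbers; the difference is that nothing in the first version forces the 64 dot products to happen one after another, which is exactly the kind of work a GPU accelerates.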
