
How GeForce changed graphics forever, the GPU: what to know – Interesting Engineering

The term GPU gets thrown around all over the place but isn't always used correctly. In many ways, a GPU is a computer within a computer, and in recent years it has grown big enough to need its own dedicated cooling, just like the CPU. Let's take a closer look at the graphics processing unit.

There are two types of GPUs: the discrete graphics card and the graphics processor embedded in a central processing unit. This article focuses on the graphics card and how it grew to be a huge part of computer technology today.
Processing units are integrated circuits that process input data. They are built from cores, the basic building blocks of any processor, and a graphics processing unit (GPU) contains many more, much smaller cores than a central processing unit (CPU).
Those cores contain specialized circuits designed to build the image on the screen as data is handed off from the CPU to the monitor.
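To make that concrete, here is a minimal, hypothetical CUDA sketch (an assumed example, not any vendor's actual graphics pipeline) in which thousands of lightweight GPU threads each compute one pixel of an image independently; this is exactly the kind of embarrassingly parallel work those many small cores are built for.

```cuda
// Hypothetical sketch: each GPU thread shades one pixel of a simple gradient.
// Every pixel depends only on its own coordinates, so all threads can run
// independently and in parallel.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(unsigned char* image, int width, int height) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int idx = (y * width + x) * 3;
    image[idx + 0] = static_cast<unsigned char>(255 * x / width);   // red
    image[idx + 1] = static_cast<unsigned char>(255 * y / height);  // green
    image[idx + 2] = 128;                                           // blue
}

int main() {
    const int width = 1920, height = 1080;
    unsigned char* image;
    cudaMallocManaged(&image, width * height * 3);

    dim3 block(16, 16);                                // 256 threads per block
    dim3 grid((width + 15) / 16, (height + 15) / 16);  // cover the whole frame
    shade<<<grid, block>>>(image, width, height);
    cudaDeviceSynchronize();

    printf("Top-left pixel RGB: %d %d %d\n", image[0], image[1], image[2]);
    cudaFree(image);
    return 0;
}
```

A CPU would compute those two million or so pixels in a long loop; the GPU's many small cores chew through them all at roughly the same time.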
Dedicated graphics hardware first appeared in the 1970s, when RAM was expensive and unreliable and video game companies began using special circuits to draw images directly onto the screen. Think of Pong, one of the first video games on the market, whose graphics were generated entirely by dedicated circuitry.
The 1980s brought many improvements, design changes, and a general miniaturization of circuitry, and the first dedicated graphics processors were being built near the decade's end. They still took up a great deal of space inside a personal computer. Think of the all-in-one Apple Macintosh, which helped popularize the mouse: a sizable box with a good deal of its internals devoted to driving its built-in display.
It was in the 1990s that real, steady change was seen across every sector of the computer industry. The graphics card, with its own small dedicated processor, was introduced. Think of the Apple Performa or the mid-decade Windows personal computer.
The Apple computers had an integrated video card that ran off the CPU. It was a proprietary system and cost about $1,200, and that cost would be a major stumbling block for the company. Windows was something altogether different: an operating system that could be installed on just about any PC other than an Apple. A buyer who was handy could build a full Windows PC from components for a bargain price of around $600.
Inside the Windows PC was a separate video card. Although similar in purpose to Apple's proprietary part, it was an independent component that could be upgraded as advances were rapidly being made across the industry.
That leads us to today. From 1994 until the present, the GPU has seen steady, constant miniaturization and a continual expansion of its power.
Sony coined the term GPU in 1994, but for years most experts still called the devices "video cards," because while they had excellent resolution capabilities, they mainly performed one function: rendering graphics onto the monitor.
In recent years the term GPU has come to mean something entirely different, yet the basics remain the same.
Today's GPUs carry their own RAM and their own fan cooling, and they can perform some amazing feats of calculation, such as handling the hashing work behind blockchain mining.
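As a rough illustration of why that workload suits a GPU, here is a toy, hypothetical CUDA sketch of a mining-style search: every thread tries a different nonce against the same target at the same time. The hash function below is a simple stand-in for illustration, not a real blockchain hash such as SHA-256.

```cuda
// Toy sketch of a parallel nonce search. Each thread hashes one candidate
// nonce; whichever threads find a hash under the target record their nonce.
#include <cstdio>
#include <cuda_runtime.h>

__device__ unsigned int toyHash(unsigned int data, unsigned int nonce) {
    // FNV-1a-style mixing; a placeholder, NOT a cryptographic hash.
    unsigned int h = 2166136261u;
    h = (h ^ data) * 16777619u;
    h = (h ^ nonce) * 16777619u;
    h ^= h >> 15;
    return h;
}

__global__ void search(unsigned int data, unsigned int target,
                       unsigned int* foundNonce) {
    unsigned int nonce = blockIdx.x * blockDim.x + threadIdx.x;
    if (toyHash(data, nonce) < target) {
        atomicMin(foundNonce, nonce);  // keep the smallest winning nonce
    }
}

int main() {
    unsigned int* found;
    cudaMallocManaged(&found, sizeof(unsigned int));
    *found = 0xFFFFFFFFu;  // "nothing found yet"

    // Try roughly 16 million candidate nonces in one launch.
    search<<<65536, 256>>>(0xDEADBEEFu, 0x0000FFFFu, found);
    cudaDeviceSynchronize();

    printf("First nonce under target: %u\n", *found);
    cudaFree(found);
    return 0;
}
```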
Rendering of game worlds has become so fast that many now look like video of physically real people and objects. Culling is how a game decides which scenery and characters actually need to be drawn as you move through its world, discarding everything outside your view; a modern GPU performs it in milliseconds rather than seconds.
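Here is a deliberately simplified, hypothetical CUDA sketch of that kind of culling pass, assuming simple distance-based culling of bounding spheres (real engines add view-frustum and occlusion tests): one thread per object decides whether the object needs to be drawn at all.

```cuda
// Hypothetical culling sketch: each thread checks one object's bounding
// sphere against a maximum view distance and writes a visibility flag.
#include <cstdio>
#include <cuda_runtime.h>

struct Sphere { float x, y, z, radius; };

__global__ void cullByDistance(const Sphere* objects, int count,
                               float camX, float camY, float camZ,
                               float maxDist, int* visible) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    float dx = objects[i].x - camX;
    float dy = objects[i].y - camY;
    float dz = objects[i].z - camZ;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz) - objects[i].radius;
    visible[i] = (dist <= maxDist) ? 1 : 0;  // 1 = draw it, 0 = skip it
}

int main() {
    const int count = 1 << 20;  // about a million objects in the scene
    Sphere* objects;
    int* visible;
    cudaMallocManaged(&objects, count * sizeof(Sphere));
    cudaMallocManaged(&visible, count * sizeof(int));
    for (int i = 0; i < count; ++i) {
        objects[i] = { (float)(i % 2000), 0.0f, (float)(i / 2000), 1.0f };
    }

    // One thread per object: the entire scene is tested in a single pass.
    cullByDistance<<<(count + 255) / 256, 256>>>(objects, count,
                                                 0.0f, 0.0f, 0.0f,
                                                 500.0f, visible);
    cudaDeviceSynchronize();

    int drawn = 0;
    for (int i = 0; i < count; ++i) drawn += visible[i];
    printf("Objects to draw: %d of %d\n", drawn, count);

    cudaFree(objects);
    cudaFree(visible);
    return 0;
}
```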
For use as a mining appliance, the most formidable GPU is probably the NVIDIA GeForce RTX 3090, a reliable card that has proven powerful enough to handle heavy cryptographic mining workloads.
The technology company NVIDIA is often credited with developing the first GPU. That isn't strictly true: graphics processors have been around for roughly 40 years, while NVIDIA has only existed since 1993, about 30 years.
That said, by 1997 NVIDIA held nearly 25 percent of the graphics market, and GeForce has become the most powerful and famous graphics processor brand to date. There have been many iterations of the GeForce "video" card, and they have come a long way: the original from 1999 and today's models are galaxies apart.
In 1999, NVIDIA couldn't decide on a name for the graphics processing unit it had developed, so it held a contest where people could submit ideas for the product's name. Some 12,000 entries came in before GeForce was selected, and as a prize, some entrants received free GPUs.
Over the years, NVIDIA has developed many relationships with technology companies on the cutting edge of graphics and video capture. It has partnered with more than 370 companies on autonomous driving, a diverse range that includes Tesla, Volkswagen, Airbus, and Nuro; NVIDIA even works with the Massachusetts Institute of Technology (MIT) and Carnegie Mellon University.
One little-known fact about NVIDIA is where the name came from. The company's founders, Jensen Huang, Curtis Priem, and Chris Malachowsky, had been brainstorming names when they hit on "NV," short for "next version." From there they arrived at the Latin word for envy, invidia, hence the name NVIDIA.
GPUs are an integral part of computer processing, and their use will likely define the future of computing in general. The technique that sets a GPU apart from the central processing unit (CPU), the main chip in a computer, is called parallel processing.
The GPU has become versatile enough to handle many of the tasks a CPU does while maintaining the highest resolution possible. For the geek in all of us, parallel processing is when two or more processors (or processor cores) perform processing work in the same computer simultaneously.
Parallel processors can increase the speed of a computer: they run data-processing jobs concurrently to boost execution speed. A parallel architecture breaks a task down into its parts and then works on those parts at the same time.
Breaking this down, the computer can run one app while simultaneously running another. It can also run a single app and perform two separate tasks within that app at the same time.
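To show what "breaking a task down into its parts" looks like in practice, here is a minimal, hypothetical CUDA sketch that adds two arrays of ten million numbers. Instead of one processor looping over every element, each GPU thread handles exactly one element, and all of them run simultaneously.

```cuda
// Minimal parallel-processing sketch: element-wise addition of two arrays,
// with one GPU thread responsible for one element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];  // this thread's one small piece of the job
}

int main() {
    const int n = 10000000;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover n
    add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %.1f, c[n-1] = %.1f\n", c[0], c[n - 1]);  // both print 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The same pattern scales from adding numbers to shading pixels: split the job into millions of independent pieces and let the GPU's cores work on them all at once.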
Two defining traits of a computer with parallel processors are that the system is faster and that it costs more overall.
The cost of GPUs recently rose substantially, primarily because of the general shortage of chip fabrication capacity. But prices are already stabilizing, and the market is moving back toward normal pricing for these systems.
Over the next few years, GPUs will, in general, gain more processing power as well as more onboard RAM. Cooling systems have already reached the three-fans-per-card range, and some cards are water-cooled.
Anyone who thinks people are shying away from large, sometimes bulky, high-speed GPUs would be wrong. The GPU market, worth $46.4 billion in 2020, is projected to grow to over $292.1 billion by 2027, a compound annual growth rate (CAGR) of 30.1 percent.
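For reference, that 30.1 percent figure follows directly from the two market sizes and the seven years between 2020 and 2027:

```latex
\mathrm{CAGR} = \left(\frac{292.1}{46.4}\right)^{1/7} - 1 \approx 1.301 - 1 \approx 30.1\%
```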
That means many more graphics processing units are being fabricated today than were built just two years ago.