AMD sold their GPUs to DeepSeek

They can make better ones.
Nvidia was always better, even back to the ATI days.
No, this is more of a conflict where the government is concerned; they are making money by giving
the competition, the Chinese, the ability to do the same.
The Chinese will likely use this to spy on us and do other nefarious things.
China is not our friend; a trade partner when it's beneficial, but not a friend.
 
Sorry, but on any tech site Nvidia is always a step ahead of AMD; AMD plays second fiddle. Their GPUs were good for mining due to their hash algorithm, but AI works differently: you need power and memory for number crunching.

Bought my first Nvidia card 25 years ago, way before CPUs had built-in graphics.
ATI/AMD cards were always slower, ran hotter, and their driver package sucked.
They were slightly cheaper; that was their main selling point.
 
No. It's still parallel processing.
AMD was ahead of NVIDIA for years, and they're about due to be ahead of them again.
 
Nvidia holds a significantly larger market share in the GPU (Graphics Processing Unit) market compared to AMD, especially when focusing on discrete GPUs for desktop PCs and AI applications. Here's a detailed comparison based on the most recent data:

Discrete GPU Market Share:

AI Chip Market Share:
 
I call bullshit unless DeepSeek plays chess against AlphaZero, the Google AI, to demonstrate that it is not just a ChatGPT clone but has technical cred.
 
:blahblah:
Bubba. That's now, this was then.
I know what's possible and what's not.
NVIDIA once thought their 9800GT was the best thing ever until...
That's something I know. Do you-?
Could be you're fishing for me anyways, because I always got that vibe from ya.
 
Fishing? If AMD was better and they are cheaper, why does Nvidia vastly dominate the market?

When the 9800GT was out, what did AMD have that even came close?

I had an 8800 that played all my games at the time without any trouble; the blower could get loud when it was run hard.
I still have it in my old tech box; it was huge.

That story was from 2024, not 2004, and Nvidia kicked their ass then too. Just the facts.
 
Oh yeah, you gotta be trying to bait me. Because you know I know 100+ motherfuckers that would laugh at your silliness. So what now? Yeah.
Ya got me. If only there was some way I could invite my computer compadres to right here.
I mean, they'd have to sign up, meh.
I wouldn't put that on 'em. But I bet if I asked, I got people.
 
I don't bait, and why would they laugh at me? I'm just pointing out the facts: Nvidia dominated then and still does; AMD was always playing catch-up and still is.
I would hope that after building computers for 25 years and being a member of more than one tech site, I'd have an understanding of this.

It's OK if you are an AMD fanboy; nothing wrong with that. They are a good company, and if you are a stockholder who bought a few years ago, you are sitting on a great return.

Nvidia vs. AMD is like Asus vs. Gigabyte or Ford vs. Chevy: you are going to have fans who say theirs is better and defend their decision.
 
What happened in the days between the 9800GT and the GTX 970, hmm? :rolleyes-41:
 
You are not on my level. We are not the same.
I bricked my motherboard this morning and fixed it.
We are not the same.
 
Alright? This is not about who knows more; the original thread is about why AMD is selling their GPUs to China.

AMD needs to be around to push Nvidia and Intel so they don't get lazy.

How did you brick your board? I've done many builds for friends and family and never had one brick.

Just for fun, not to argue: my first build was an Athlon, and I don't recall it coming with a heatsink.
I had to buy an aftermarket one, and it still ran freaking hot.
The next build was a Pentium; it was faster and ran cooler, so there's that.
 
1. The AMD (H800?) chip is allowed to be sold to China; heard that on CNBC this morning.

2. My question is, can DeepSeek play chess and solve serious problems, or is it just a "glorified search engine"?
 
Still, if the USG is giving them money for AI here in the USA and they are selling their GPUs to a Chinese company for its AI, doesn't that seem wrong to you?
Kind of like the whole steal-from-Peter-to-pay-Paul scenario.

Let China develop their own GPUs for their own AI.
 
A GPU is a graphics card or chip. I don't see how that improves AI.
AI (Artificial Intelligence) uses GPUs (Graphics Processing Units) primarily because of their capability to perform parallel processing much more efficiently than CPUs (Central Processing Units). Here's how AI leverages GPUs:
Parallel Processing:
  • Matrix Operations: Many AI algorithms, particularly those in deep learning like neural networks, involve operations on large matrices. GPUs excel at these operations due to their architecture designed for handling thousands of threads simultaneously. Each core in a GPU can compute a small part of a matrix operation, significantly speeding up the process (see the sketch after this list).
  • Neural Network Training: Training neural networks involves forward and backward propagation of data through thousands or millions of neurons. Each neuron's computation can be done in parallel, which is where GPUs shine. They can process multiple data points or images in batches, speeding up both training and inference (prediction) phases.
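To make the matrix-operation point concrete, here is a minimal sketch using PyTorch (an assumption on my part: PyTorch installed and a CUDA-capable GPU available; it falls back to the CPU otherwise). The only thing that changes between CPU and GPU execution is where the tensors live:

    import torch

    # Use the GPU if one is available, otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Two large matrices; on a GPU, each output element of the product can be
    # computed in parallel across thousands of cores.
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)

    c = a @ b  # one matrix multiplication, dispatched as a parallel GPU kernel

    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for the GPU to finish before using the result
    print(c.shape)  # torch.Size([4096, 4096])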
Specific Uses in AI:
  • Deep Learning Frameworks: Frameworks like TensorFlow, PyTorch, and others are optimized to leverage GPU acceleration. They use CUDA (NVIDIA's parallel computing platform) or ROCm (for AMD GPUs) to offload computations to GPUs (see the sketch after this list).
  • Convolutional Neural Networks (CNNs): These networks, commonly used in image recognition tasks, involve convolutions that are computationally intensive. GPUs can perform these convolutions across images much faster than CPUs.
  • Recurrent Neural Networks (RNNs): For sequence data like text or time series, GPUs can process multiple sequences or time steps concurrently.
  • Generative Models: For tasks like generative adversarial networks (GANs) or transformers for language generation, GPUs help in handling the large-scale computations required to generate and evaluate data.
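As a rough illustration of the framework bullet above, here is a sketch of a tiny convolutional network in PyTorch (assuming PyTorch is installed and a supported GPU is present); PyTorch's ROCm builds expose AMD GPUs through the same "cuda" device name, so the code looks the same either way:

    import torch
    import torch.nn as nn

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A toy CNN; the convolutions are the compute-heavy part that GPUs accelerate.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(16, 10),
    ).to(device)

    # A batch of 32 fake RGB images; all 32 are convolved concurrently on the GPU.
    images = torch.randn(32, 3, 224, 224, device=device)
    logits = model(images)
    print(logits.shape)  # torch.Size([32, 10])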
How It Works:
  1. Data Transfer: Data is moved from the CPU to the GPU's memory (VRAM). This step can be a bottleneck if not managed well due to the latency of data transfer between CPU and GPU (the full cycle is sketched in code after this list).
  2. Kernel Execution: Once data is on the GPU, the AI framework sends "kernels" (small programs) to the GPU which define the operations to be performed. These kernels are executed in parallel across the GPU's many cores.
  3. Synchronization: After computations, results might need to be synchronized or brought back to the CPU for further processing or for output.
  4. Optimization: Techniques like batch processing, where multiple data points are processed at once, leverage GPU power for efficiency. Also, frameworks automatically optimize operations for GPU execution.
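A minimal sketch of that cycle in PyTorch (again assuming a CUDA-capable GPU; the framework issues the kernel launches for you):

    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # 1. Data transfer: copy a whole batch (256 samples at once, per the
    #    batch-processing advice in step 4) from host RAM to GPU memory (VRAM).
    batch_cpu = torch.randn(256, 1024)       # lives in ordinary CPU memory
    batch_gpu = batch_cpu.to(device)

    # 2. Kernel execution: each operation below launches parallel GPU kernels.
    weights = torch.randn(1024, 1024, device=device)
    out = torch.relu(batch_gpu @ weights)

    # 3. Synchronization: kernels run asynchronously; wait, then copy results
    #    back to the CPU for further processing or output.
    if device.type == "cuda":
        torch.cuda.synchronize()
    result = out.cpu()
    print(result.shape)  # torch.Size([256, 1024])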
Benefits:
  • Speed: Dramatically reduces the time for training complex models.
  • Scalability: Multiple GPUs can be used together for even larger models or datasets, scaling up performance (see the sketch after this list).
  • Energy Efficiency: For certain tasks, GPUs provide more computations per watt than CPUs, though they consume more power individually.
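For the scalability point, a hedged sketch of the simplest multi-GPU option in PyTorch, nn.DataParallel, which splits each batch across all visible GPUs and gathers the outputs (DistributedDataParallel is the usual choice for serious training, but it needs more setup):

    import torch
    import torch.nn as nn

    model = nn.Linear(1024, 10)
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)   # replicate the model on every visible GPU
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model = model.to(device)

    # One batch of 512 samples; DataParallel scatters slices of it to each GPU,
    # runs them in parallel, and gathers the per-GPU outputs back together.
    x = torch.randn(512, 1024, device=device)
    print(model(x).shape)  # torch.Size([512, 10])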
Challenges:
  • Memory Management: GPUs have limited memory compared to CPU systems, which can limit the size of models or batches.
  • Programming Complexity: Writing code that efficiently uses GPU capabilities can be more complex than CPU programming.
  • Cost: High-performance GPUs are expensive, though cloud services offer GPU computing as a more accessible option.
In summary, AI uses GPUs to leverage their parallel processing capabilities, which are crucial for speeding up the computationally intensive tasks involved in training and deploying AI models.
 