...Where the error occurs is in the assumption about initial amounts: that somehow we know how much uranium was there at the beginning...
...Another assumption is that the rate of decay has been the same over very long times, i.e. hundreds of millions or billions of years. No one can observe this directly; it is based on exponential decay...
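The exponential decay law referred to here is N(t) = N0 * e^(-lambda*t), with lambda = ln(2) / half-life. Here is a minimal Python sketch (illustrative numbers only, and assuming the simplest case of no daughter atoms present at formation) of how a measured daughter-to-parent ratio translates into an age:

```python
import math

def age_from_ratio(daughter_per_parent, half_life_years):
    """Age implied by the decay law N(t) = N0 * exp(-lam * t).

    Simplest case, for illustration only: if no daughter atoms were
    present initially, then N0 = N + D, and solving for t gives
    t = ln(1 + D/N) / lam, where lam = ln(2) / half-life.
    """
    lam = math.log(2) / half_life_years
    return math.log(1.0 + daughter_per_parent) / lam

# Illustrative numbers: a rock with 0.01 Sr-87 atoms per Rb-87 atom,
# using the 50-billion-year Rb-87 half-life cited below.
print(f"{age_from_ratio(0.01, 50e9):.2e} years")  # ~7.18e+08 years
```

The "initial amounts" objection targets the N0 = N + D step; the point of the independent methods listed below is that they would all have to fail in the same coordinated way.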
There are a number of other radiometric dating methods that are all quite different from one another. You would have to find fault with all of them. Here are a few (compared numerically in the sketch after the list):
Alpha decay of samarium-147 to neodymium-143, with a half-life of 1.06 x 10^11 years.
Electron capture or positron decay of potassium-40 to argon-40, with a half-life of 1.3 billion years.
Beta decay of rubidium-87 to strontium-87, with a half-life of 50 billion years.
Alpha decay of uranium-234 to thorium-230; thorium-230 in turn has a half-life of about 80,000 years.
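Because these half-lives span several orders of magnitude, the clocks tick at very different rates, so agreement among them on the same sample is a nontrivial cross-check. A small sketch (half-lives taken from the list above; the 500-million-year interval is arbitrary) of the surviving parent fraction N/N0 = 2^(-t / t_half):

```python
# Half-lives (years) for the parent isotopes cited above
HALF_LIVES = {
    "Sm-147": 1.06e11,
    "K-40":   1.3e9,
    "Rb-87":  50e9,
}

def surviving_fraction(t_years, half_life_years):
    # N/N0 = 2^(-t / t_half), equivalent to exp(-lam * t)
    return 2.0 ** (-t_years / half_life_years)

t = 5e8  # an arbitrary 500-million-year interval
for isotope, t_half in HALF_LIVES.items():
    print(f"{isotope}: {surviving_fraction(t, t_half):.4f}")
# Sm-147: 0.9967   K-40: 0.7660   Rb-87: 0.9931
```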
Electron capture and beta decay rates involve the "fine structure constant", alpha. Alpha has been measured by examining the spectral lines of distant stars and found to be no different from what it is today.
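For reference, alpha is a dimensionless combination of fundamental constants, alpha = e^2 / (4 * pi * eps0 * hbar * c) ~ 1/137, which is why any drift in those constants would show up in spectral lines. A quick check with CODATA values:

```python
import math

# CODATA values (SI units)
e = 1.602176634e-19      # elementary charge, C (exact)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 299792458.0          # speed of light, m/s (exact)

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.9f} ~ 1/{1/alpha:.3f}")  # ~ 1/137.036
```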
If the physical constants involved in nuclear or atomic phenomena had varied over the millennia, it would be obvious from observations of distant galaxies, since their light shows us physics as it was millions or billions of years ago. It doesn't take much of a change in the physical constants to radically alter the nature or stability of stars and everyday substances. So the radiometric methods cannot be dismissed on the grounds that the physics has changed: that physics has been confirmed at galactic scales, and therefore across deep time.