"That work is completely open to the public and has been peer reviewed."
This part isn't true. NOAA does not make public its raw data or the manipulations its models apply to that data.
Even the ARGO data is modeled: there aren't nearly enough buoys to cover all the oceans, so the gaps between floats have to be filled in (rough numbers below).
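To put ballpark figures on that coverage (my own assumptions, not from any official source: roughly 4,000 active floats and about 360 million km² of ocean surface):

```python
# Back-of-the-envelope ARGO coverage. Both numbers below are assumptions
# for illustration: ~4,000 active floats, ~360 million km^2 of ocean.
OCEAN_AREA_KM2 = 3.6e8
FLOAT_COUNT = 4_000

area_per_float = OCEAN_AREA_KM2 / FLOAT_COUNT   # ~90,000 km^2 per float
box_side_km = area_per_float ** 0.5             # ~300 km on a side

print(f"~{area_per_float:,.0f} km^2 per float")
print(f"about one float per {box_side_km:.0f} km x {box_side_km:.0f} km box")
```

One profile per roughly 300 km x 300 km box is the kind of sparseness that has to be interpolated into a global field.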
There was no warming between 2000 and 2010. This was a big problem, because the models do not allow surface temps to stay flat for 10 years while CO2 keeps rising.
NOAA "corrected" the ARGO data by artificially hiking the temperature recorded on every buoy.
The reason they gave was that the historical data didn't diverge enough from current temps to show the trend the models predict, so the data had to be changed. The excuse was that the historical readings, taken from ships at sea, must have read warmer than the water really was. Maybe the samples were taken in the engine room.
They had no empirical data to support that assumption; they just needed the current ARGO sea surface temps corrected to conform with the models.
Even if you accept that the bias was real, the proper fix would be to lower the historical averages by an adjustment factor, not raise the current data (which is much higher quality anyway).
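To make that concrete, here is a minimal sketch with made-up numbers, assuming a single constant ship-buoy offset (the real adjustment procedure is more involved). Either direction of adjustment gives the same warming across the splice; what changes is the absolute values assigned to the modern, higher-quality record.

```python
import numpy as np

# Made-up SST series and an assumed constant offset, for illustration only.
ship_era = np.array([15.00, 15.05, 15.10])   # older, ship-based readings
buoy_era = np.array([15.02, 15.06, 15.11])   # newer, buoy/ARGO-era readings
offset = 0.12                                # assumed ship-minus-buoy bias, deg C

# Option A: raise the buoy data up to the ship baseline.
merged_a = np.concatenate([ship_era, buoy_era + offset])
# Option B: lower the ship data down to the buoy baseline.
merged_b = np.concatenate([ship_era - offset, buoy_era])

# The change across the splice is identical either way...
print(merged_a[-1] - merged_a[0])   # ~0.23
print(merged_b[-1] - merged_b[0])   # ~0.23
# ...but the absolute values reported for the modern record differ.
print(merged_a[-1], merged_b[-1])   # ~15.23 vs ~15.11
```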
The stratospheric cooling that must accompany surface warming was not reflected in the satellite data, because the warming from 2000 to 2010 didn't happen.
That alone disproves the models.
The very best supercomputer models we have for complex non-linear systems are the hurricane models. They are pretty good in the short term, 24 to 48 hours, but the further out you try to predict, the larger the margin of error becomes; that's just the nature of chaotic systems.
Predicting the earth's climate is that same problem, multiplied a gazillion times over by the added complexity (the toy example below shows the basic mechanism).
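A toy Python sketch of that error growth, using the logistic map rather than any actual weather or climate model, just to show how two nearly identical starting states diverge in a chaotic system:

```python
# Toy chaos demo: the logistic map at r = 4 is chaotic. Two initial states
# differing by one part in a billion end up completely different within
# a few dozen steps.
def step(x, r=4.0):
    return r * x * (1.0 - x)

x_a, x_b = 0.400000000, 0.400000001
for n in range(1, 51):
    x_a, x_b = step(x_a), step(x_b)
    if n % 10 == 0:
        # The gap grows roughly exponentially, so the usable forecast horizon
        # stays short no matter how well the model itself is tuned.
        print(f"step {n:2d}: |difference| = {abs(x_a - x_b):.2e}")
```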
CO2 has increased as the temperature warms; that is basic solubility physics. As the oceans warm, dissolved gases become less soluble (Henry's law) and escape into the atmosphere. Very predictable.
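The temperature dependence can be sketched with Henry's law plus a van 't Hoff correction; the constants below are rough textbook values for CO2 in fresh water, not real ocean carbonate chemistry:

```python
import math

KH_298 = 3.4e-2       # mol/(L*atm), approximate Henry's solubility of CO2 at 25 C
VANT_HOFF_C = 2400.0  # K, approximate temperature coefficient for CO2

def henry_kh(temp_c):
    """Approximate Henry's law solubility of CO2 at a given water temperature."""
    t_k = temp_c + 273.15
    return KH_298 * math.exp(VANT_HOFF_C * (1.0 / t_k - 1.0 / 298.15))

# Warmer water holds less CO2 at the same partial pressure, hence outgassing.
for temp_c in (5, 15, 25):
    print(f"{temp_c:2d} C: kH ~ {henry_kh(temp_c):.3f} mol/(L*atm)")
```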
Using 1850 as the "starting point" is disingenuous, because that was the end of the Little Ice Age (LIA), and temps naturally rebounded from that cool period.
If there is a human signal in the data, it's obscured in the noise of natural variability. Yes, the surface is warming. No, it cannot be attributed to CO2, which I like to call the "magic thermostat".
IOW, the AGW theory is based on the notion that a trace gas making up 0.04% of the atmosphere is the control knob. If we could somehow push that trace gas down to, say, 0.038%, we would have our fingers on the earth's thermostat.
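For reference, the same numbers in the units CO2 concentration is usually quoted in:

```python
# Converting the percentages above into parts per million (ppm).
for pct in (0.04, 0.038):
    ppm = pct / 100 * 1_000_000
    print(f"{pct}% of the atmosphere = {ppm:.0f} ppm")
# 0.04%  -> 400 ppm
# 0.038% -> 380 ppm, i.e. a 20 ppm (about 5%) reduction in the CO2 itself
```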
It's hubris, and asinine, and non-scientific in the extreme.
Your solar irradiance graph shows two cooling periods, the Maunder Minimum and the Dalton Minimum, the latter marking the end of the LIA. The rest of the time (including the present) it's flat.
Geologically speaking, we are in a cool period. It is called the Holocene interglacial (meaning we are between glacial periods).