So your chart is generated by a computer model fudged to fit the TIM satellite data, which only started in 2003; it is not actual measured data plotted on a graph. The deniers say computer models are worthless because you can program them to give you any result you want. Below is your chart alongside a collection of actual measured proxy data. Why doesn't the model-manufactured Maunder minimum match the beryllium-10 measured proxy data?
From the SORCE/TIM site:
The values from their SATIRE model have been offset a small amount (-0.30 W/m2) to match the latest SORCE/TIM measurements during years of overlap and then extended using SORCE/TIM annual averages from 2003 onward.
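The splicing procedure described in that quote — shift the model by a constant so it agrees with the satellite record over the overlap years, then carry on with the satellite data — can be sketched as follows. This is a minimal illustration with made-up annual values; the variable names and numbers are my assumptions, not SORCE's actual processing code.

```python
import numpy as np

# Hypothetical annual TSI values (W/m^2) for overlap years 2003-2010.
years = np.arange(2003, 2011)
satire_model = np.array([1361.1, 1361.0, 1360.9, 1360.8,
                         1360.7, 1360.6, 1360.6, 1360.7])
tim_measured = satire_model - 0.30  # pretend TIM reads 0.30 W/m^2 lower

# Offset the model so its mean matches TIM's mean over the overlap period.
offset = np.mean(tim_measured) - np.mean(satire_model)
satire_adjusted = satire_model + offset

print(round(offset, 2))  # -0.3, matching the quoted adjustment
```

Note that the offset is a single constant: it aligns the absolute level of the two series during the overlap but does nothing to validate the model's shape before 2003.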
From the source of the model:
The time series of accurate irradiance measurements are, however, relatively short and limit the assessment of the solar contribution to climate change. Here we reconstruct solar total and spectral irradiance in the range 115–160,000 nm since 1610. The evolution of the solar photospheric magnetic flux, which is a central input to the model, is appraised from the historical record of the sunspot number using a simple but consistent physical model. The model predicts an increase of 1.25 W/m2, or about 0.09%, in the 11-year averaged solar total irradiance since the Maunder minimum. Also, irradiance in individual spectral intervals has generally increased during the past four centuries, the magnitude of the trend being higher toward shorter wavelengths. In particular, the 11-year averaged Ly-α irradiance has increased by almost 50%. An exception is the spectral interval between about 1500 and 2500 nm, where irradiance has slightly decreased (by about 0.02%).
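As a quick sanity check, the quoted 1.25 W/m2 increase and the quoted ~0.09% are mutually consistent if the modern total solar irradiance baseline is roughly 1361 W/m2 (the SORCE/TIM-era value — an assumption on my part, not stated in the quote):

```python
# Check that 1.25 W/m^2 is about 0.09% of an assumed ~1361 W/m^2 baseline.
baseline = 1361.0   # W/m^2, assumed modern TSI level
increase = 1.25     # W/m^2, from the quoted abstract
percent = 100 * increase / baseline
print(round(percent, 2))  # 0.09
```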
This figure shows two different proxies of solar activity during the last several hundred years. In red is shown the Group Sunspot Number (Rg) as reconstructed from historical observations by Hoyt and Schatten (1998a, 1998b) [1]. In blue is shown the beryllium-10 concentration (10⁴ atoms per gram of ice) as measured in an annually layered ice core from Dye-3, Greenland.
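One way to make the "does the model match the Be-10 proxy?" question quantitative is to correlate the two series. The numbers below are made up for illustration — they merely stand in for the Hoyt & Schatten group sunspot number and the Dye-3 beryllium-10 concentrations; a real check would use the published annual series.

```python
import numpy as np

# Hypothetical annual values for two solar-activity proxies (illustrative only).
sunspots = np.array([0.0, 5.0, 20.0, 60.0, 90.0, 70.0, 40.0, 15.0])
be10 = np.array([5.2, 5.0, 4.6, 3.9, 3.5, 3.8, 4.3, 4.9])  # 10^4 atoms/g ice

# Be-10 production is anticorrelated with solar activity (a stronger solar
# magnetic field deflects more of the cosmic rays that produce Be-10), so a
# good match shows up as a strong *negative* Pearson correlation.
r = np.corrcoef(sunspots, be10)[0, 1]
print(r < -0.9)  # True for these illustrative numbers
```

A reconstruction that tracks the sunspot record but diverges from the Be-10 record during the Maunder minimum would show this correlation breaking down exactly where the author says it does.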