Total solar irradiance using a traceable solar spectroradiometer
Abstract. Accurate, precise, and traceable measurements of total and spectral solar irradiance are fundamental for solar energy applications, climate studies, and satellite validation. In this study, we assess the performance and data quality of a commercially available, compact BTS spectroradiometer system by comparing its spectrally integrated total solar irradiance (TSI) values with those of an electrical substitution cavity radiometer (PMO2), which is traceable to the World Radiometric Reference (WRR). The resulting ratio between the BTS spectroradiometer system and the WRR-traceable TSI is 0.9975 with a standard deviation of 0.0050. Applying a correction factor of −0.34 % to PMO2, accounting for the known offset between the WRR and the International System of Units (SI), results in a relative difference between the BTS-derived TSI and PMO2 of +0.09 % with a standard deviation of 0.0050, demonstrating good consistency between the BTS-derived TSI and the cavity radiometer.
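As a quick arithmetic check of the quoted figures, the sketch below assumes the −0.34 % WRR-to-SI correction is applied multiplicatively to PMO2 (an assumption, since the abstract does not state the exact form of the correction):

```python
# Check that the 0.9975 BTS/PMO2 (WRR) ratio and the -0.34 % WRR-to-SI
# correction reproduce the stated +0.09 % relative difference.
ratio_wrr = 0.9975        # BTS-integrated TSI / PMO2 on the WRR scale
wrr_to_si = -0.0034       # assumed multiplicative WRR-to-SI offset

ratio_si = ratio_wrr / (1.0 + wrr_to_si)   # BTS / PMO2 on the SI scale
rel_diff_percent = (ratio_si - 1.0) * 100.0
print(round(rel_diff_percent, 2))  # → 0.09
```

This confirms that the three numbers quoted in the abstract are mutually consistent under the multiplicative reading.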
This comparison confirms the precision and accuracy of the BTS spectroradiometer system and its capability to deliver SI-traceable TSI from spectrally resolved solar irradiance measurements. Its spectral resolution enables accurate measurements of spectral solar irradiance, which are essential not only for determining total solar irradiance but also for retrieving key atmospheric constituents such as water vapor, ozone, and aerosols, establishing its relevance as a compact instrument for atmospheric and climate research.
Comments on “Total solar irradiance using a traceable solar spectroradiometer” by Jaine, Groebner, and Finsterle https://doi.org/10.5194/egusphere-2025-4030
This paper compares direct normal solar irradiance measured with a WRR-traceable absolute cavity radiometer to spectrally integrated solar spectra between 280 and 5000 nm, a range that includes at least 99.5 % of the spectrum at their site in Davos, Switzerland. The range from 280 to 2150 nm is measured with a Bi-Tec Sensor (BTS) spectroradiometer, and the remainder (2150–5000 nm) is modeled using inputs of aerosol optical depth and its wavelength dependence, ozone, water vapor, carbon dioxide, and the instantaneous atmospheric pressure and solar zenith angle. After correction for scattered light in the cavity measurements, the agreement between the two quantities is within 0.1 %. As the authors state, this lends a great deal of validity to the spectral measurements, which can be used to characterize trace constituents in the atmospheric column.
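The TSI assembly described above (a measured band plus a modeled long-wavelength tail) can be sketched as follows; the function names and the flat toy spectrum are illustrative placeholders, not the authors' data or code:

```python
# Sketch of the TSI assembly described above: integrate the measured BTS
# band (280-2150 nm) and add the modeled tail (2150-5000 nm).
# All inputs here are illustrative placeholders, not the authors' data.

def trapz(y, x):
    """Trapezoidal integration of y (W m^-2 nm^-1) over x (nm)."""
    return sum(0.5 * (y[i] + y[i + 1]) * (x[i + 1] - x[i])
               for i in range(len(x) - 1))

def total_solar_irradiance(wl_bts, spec_bts, wl_model, spec_model):
    """Sum the measured and modeled spectral contributions to get TSI."""
    return trapz(spec_bts, wl_bts) + trapz(spec_model, wl_model)

# Toy example: a flat 1 W m^-2 nm^-1 spectrum in both ranges.
wl_bts, spec_bts = [280.0, 2150.0], [1.0, 1.0]
wl_model, spec_model = [2150.0, 5000.0], [1.0, 1.0]
print(total_solar_irradiance(wl_bts, spec_bts, wl_model, spec_model))  # → 4720.0
```

In practice the measured and modeled arrays would be finely sampled in wavelength; the two-point arrays here only exercise the bookkeeping.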
I accept the paper after a few items are clarified in the manuscript.
In Section 2.1:
What is the FOV of the BTS spectroradiometer? Does it match the 5 deg FOV of the PMO2?
How often is the BTS calibrated? If only initially, how do you guarantee that it is stable?
In Section 3:
The caption for Fig. 2 is incorrect in that the grey area does not represent 90 % of the TSI.
Perhaps an inset that enlarges the 4000–5000 nm region in Fig. 2 would clarify the points made in lines 133 and 134.
I did not understand the necessity of a machine learning approach, since one needs the model inputs (Eqn. 4) to estimate the 2150–5000 nm contribution whether using machine learning or the model itself; why not just run the model to calculate the contribution?
In Section 4:
Fig. 6 is difficult to examine. Perhaps an enlargement of just one vertical grouping would show the degree of agreement more clearly. I think you could eliminate the left panel (a) of this figure.
Other:
Line 44: change “gases constituents” to “gases”.
Look for “could” where it should be “cloud”, in at least two places (lines 156 and 178).
In the Fig. 2 caption: change “grey vertical” to “vertical”.