the Creative Commons Attribution 4.0 License.
Rain on snow (ROS) understudied in sea ice remote sensing: a multi-sensor analysis of ROS during MOSAiC (Multidisciplinary drifting Observatory for the Study of Arctic Climate)
Julienne Stroeve
Vishnu Nandan
Rosemary Willatt
Ruzica Dadic
Philip Rostosky
Michael Gallagher
Robbie Mallett
Andrew Barrett
Stefan Hendricks
Rasmus Tonboe
Michelle McCrystall
Mark Serreze
Linda Thielke
Gunnar Spreen
Thomas Newman
John Yackel
Robert Ricker
Michel Tsamados
Amy Macfarlane
Henna-Reetta Hannula
Martin Schneebeli
Download
- Final revised paper (published on 11 Oct 2022)
- Supplement to the final revised paper
- Preprint (discussion started on 07 Feb 2022)
- Supplement to the preprint
Interactive discussion
Status: closed
- RC1: 'Review of tc-2021-383 by Stroeve, J., et al.', Anonymous Referee #1, 14 Mar 2022
Review of
Rain-on-Snow (ROS) Understudied in Sea Ice Remote Sensing: A
Multi-Sensor Analysis of ROS during MOSAiC, by Stroeve, J., et al.
Summary:
Rain falling on snow changes its physical properties, thereby influencing its microwave signature, with potentially far-reaching consequences for the retrieval of several geophysical parameters from satellite microwave measurements. The present manuscript presents and investigates multi-sensor observations of the impact of rain-on-snow (ROS) events in mid-September 2020 on the High-Arctic sea ice cover. Observations comprise ground-based passive and active microwave measurements, including altimeter-type measurements, supported by a comprehensive set of in-situ measurements of snow and meteorological parameters in the framework of the MOSAiC expedition. The manuscript provides a good overview of the various measurements, offers a set of interpretations of these measurements, and attempts to put these observations into a wider context, for instance by comparing them with satellite observations and estimating the impact of ROS on the retrieval of sea-ice concentration, snow depth on sea ice and sea-ice freeboard. The manuscript is a valuable contribution to the current state of knowledge and should be published in "The Cryosphere".
The manuscript would benefit from a number of clarifications and improvements, though, which I list in my general comments and detail further in my specific comments. In addition, there are several editorial comments I would like the authors to pay attention to.
General comments (GC):
GC1: The manuscript should be improved regarding the motivation and the suitability of using a late-summer / fall case as a surrogate for ROS events and their impact during winter / spring. The need for an improvement is given by:
1) Current sea-ice freeboard (and hence thickness) retrieval using satellite radar altimetry typically begins half a month to a month later than the case investigated here.
2) The current motivation is built around winter/spring conditions.
3) The environmental conditions encountered during the case investigated differ considerably from those during winter / spring, which to an unknown degree limits the relevance of the work presented here.
GC2: The manuscript would benefit from relating the observations made to what has been published elsewhere. This applies to the presented microwave measurements themselves (are these typical and/or realistic for the conditions encountered?), and it applies to the presented impact on various sea-ice parameters. Given that conditions in the Southern Ocean are likely even more conducive to ROS events, I highly recommend including the other hemisphere in the discussion of your results. In this category I would also like to mention that the authors should stress that their investigation of the impact on snow depth retrieval is hypothetical because the floe is not a first-year ice floe.
GC3: The credibility of the results and their impact would benefit a lot from a better consideration and critical review of the uncertainties and limitations involved in the measurements themselves and in their interpretation. Examples of this are
1) Unclear location of the SSL and the SWE measurements with respect to it.
2) Unclear representativity of the KuKa-radar / SBR measurement site with respect to other snow measurement sites, as well as a lack of information on how that site actually looked during the measurements.
3) Sub-optimal treatment and discussion of the limitations of the KuKa-radar measurements in nadir-looking mode and of the interpretation of the results obtained. While I acknowledge that ROS events such as the one observed certainly have an impact on radar altimeter measurements (and this is also stated sufficiently clearly in the manuscript - but was kind of known before), the respective measurements and their interpretation do not back this up very well the way they are presented, leaving a lot of doubt in the capabilities of the instrument and experimental set-up to detect the stated vertical displacement of the main scattering layer reliably.
GC4: I do understand that the authors would like to emphasize the importance of their findings and therefore, for instance, discuss issues like the potential impact on satellite active microwave (AMW) (scatterometer / SAR) observations and their interpretation, and show reanalysis-based trends in ROS events. However, the impact on AMW observations and their interpretation is not overly well elaborated, and the ROS event trend analysis appears to be quite global, not taking into account that the case made here is from September while that trend analysis is for winter in general. Finally, the results presented are not so overwhelmingly convincing that emphasizing their value the way it is done appears to be a well-selected element for the discussion.
Specific comments:
L58-62: There are a few approaches to derive melt onset on sea ice in the Southern Ocean based on satellite microwave imagery that should perhaps be mentioned here as well (Willmes, S., et al., 2009, doi:10.1029/2008JC004919 and Arndt, S., et al., 2016, doi:10.1002/2015JC011504).
L67-70: Between this last paragraph and the previous paragraphs, or at the end, you should perhaps write something about the fact that most (if not all) of the studies you cited so far were dealing with cold-season / winter and/or winter-spring transition conditions. In contrast, the data you are dealing with during MOSAiC are from a completely different season, with also completely different physical properties of the sea ice underneath. Here you are dealing with the end of summer / commencement of the fall freeze-up. I am sure you will get back to this inconsistency in environmental conditions later in the paper. But it would be very helpful if you prepared us, the readers, for the fact that you attempt to further knowledge about ROS events by using late summer / early fall conditions as a surrogate for winter/spring conditions.
Figure 1: This is a busy figure that contains a lot more information than is relevant for this paper. I suggest removing all the unnecessary information in order to concentrate on the conditions encountered at the RS site.
L91/92: You refer to a calibration of the KuKa radar during leg 2 here. How about during leg 5? Is the radar that stable that it did not need a re-calibration even though it was unmounted during leg 4 and then deployed again for leg 5?
- You are pointing out the antenna's far field. Where does that begin? Possibly close enough to the antenna that both regular measurements and calibration measurements were carried out in the far field?
- Given the height of the antennas above the ice surface of about 1.6 m (see A1) I can guess that the calibration measurements were carried out by pointing the antennas such that they looked parallel to the surface and that the corner reflector was mounted on a tripod at exactly the same height as the antennas such that it opened into the direction of the antennas. Is that correct? It would not hurt to mention this detail, I think.
L96: "at nadir and at 45 degrees"
What is the motivation to focus on an angle of 45 degrees? Is this the common angle currently used by spaceborne Ku-Band scatterometers? It might be useful to tell the reader.
L117: What is the resulting height of the antenna above the surface then?
L127/128: What was different in the calibration of the 89 GHz SBR channel using the absorber between the attempts during leg 3 and leg 5, such that those during leg 5 were not usable? Or, in other words, what made the calibration during leg 3 reliable?
L130/131: I note that the data gap is substantially longer for the SBR than for the KuKa radar; I suggest reformulating this accordingly. While the power outage of the KuKa radar coincides with the worst ROS conditions, that of the SBR is not linked to them.
- In addition, I note that this power outage is not reflected in Table A2, whose content suggests continuous data acquisition from Sep 12 to 15. This should be changed for consistency.
L132/133: Would it be helpful for other scientists to learn what you consider "unstable" in this context?
L137-139: One more sentence describing how this pluviometer deals with the different forms of precipitation and what the measurement principle is (Is it heated? Is it just detecting the impact of the precipitation particles?) would be appreciated in addition to the reference Wagner et al. [2021].
L152-159: Please provide the grid resolution and the forecast interval that you used. I assume ERA5 provides a 6-hourly forecast of the precipitation? I assume you used all four and computed the daily total?
- I note that Leg 5 took place in August/September, while here you refer to cold-season and/or wintertime precipitation. You should perhaps provide a better link between the observations carried out during MOSAiC Leg 5 and the ERA5 data used for it on the one hand, and this investigation of cold-season ROS events based on ERA5 on the other hand; please define clearly what you mean by "wintertime" or "cold-season".
L164: "around the MOSAiC floe" ... or "on the MOSAiC floe"?
- I note that, aside from showing Figure 1, you did not comment on and/or describe the surface conditions the SBR and the KuKa radar were looking at. It hence remains unclear how representative the snow pit measurements are with respect to the conditions within the fields-of-view of the instruments used.
- You state "routine snow pit observations", but I missed information about the sampling interval; sub-daily? daily? every 3 days? Depending on / triggered by precipitation events?
- Didn't you perform any observations of the crystal structure of the snow following the Colbeck classification?
L166/167: Sorry to ask but how was the SWE measured? Was it measured for the 3 cm cut-out samples? What happened (in your case of a 7 cm thick snow cover) with the bottom 1 cm?
L203-206: What is the scientific rationale to include the surface scattering layer (SSL) into the SWE measurement? Did you cross-check the SWE measurements by simply computing SWE using density and depth of the snow? I get about 12 mm SWE and 14 mm SWE for a 7 cm thick snow cover and the bulk densities given by you.
- Apart from that, I doubt that the comparably small increase in density visible in the respective panels of Fig. 4 is responsible for a doubling of the SWE. It is kind of clear that almost 3 hours of rain has caused a certain mass gain but why is that not yet visible in density or SSA? One could hypothesize that the classical way to estimate SWE from snow depth and density fails because there is too much interstitial liquid water between the snow grains.
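The cross-check suggested above is straightforward arithmetic; a minimal sketch, where the two bulk densities are assumed values chosen only to reproduce the ~12 mm and 14 mm figures, not numbers taken from the manuscript:

```python
# Cross-check: SWE (mm water equivalent) from snow depth and bulk density.
# SWE = depth * rho_snow / rho_water. The densities below are assumed
# illustrative values, not the manuscript's measurements.
RHO_WATER = 1000.0  # kg/m^3

def swe_mm(depth_cm: float, bulk_density_kg_m3: float) -> float:
    """Water equivalent (mm) of a snow layer of given depth and bulk density."""
    return depth_cm * 10.0 * bulk_density_kg_m3 / RHO_WATER

# A 7 cm snow cover at bulk densities of ~170 and ~200 kg/m^3
for rho in (170.0, 200.0):
    print(f"rho = {rho:.0f} kg/m^3 -> SWE = {swe_mm(7.0, rho):.1f} mm")
```

At these assumed densities the classical depth-times-density estimate gives roughly 12 and 14 mm, which is why a doubling of SWE without a corresponding change in density or depth calls for an explanation.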
L211/212: Why did the thickness of the SSL increase under the warmer temperatures? How warm did the SSL (or ice/snow interface) get? In L166 you write that you measured the ice/snow interface temperature. Also: How do you know that the SSL thickness increased? Do you have Micro-CT measurements that go as deep below the ice-snow interface as in the middle profile in Fig. 4 also for the two left profiles (ROV, ALB)?
L214: Such considerable differences in SWE are also observed earlier between the KuKa radar PIT and the coring site; you don't explain those. Why?
- I might be wrong but, despite the fact that the SWE measurements could be really helpful to understand differences in the mass accumulation at the surface, my impression is that it is not sufficiently clear how much of the SSL underneath the snow cover is included in the SWE measurements, for basically all examples shown.
- What I also observe is that the profiles in Figure 4 appear to be, frankly speaking, randomly placed with respect to where the snow cover begins (i.e. 0). It is actually not clear how thick the SSL is. It is completely absent in the 4th profile (FLUX), while it is hard to delineate where the SSL begins in the 5th profile (RS). This is associated with a substantial difference in snow depths: 6 cm for FLUX (at least, because we don't see the SSL) and 2 cm for RS (if the location of where the SSL begins is correct).
L220-223: I am sorry but I neither see a clear indication of that higher-density layer about 1 cm above the snow ice interface in the profiles shown in Fig. 4 nor do I understand from these what may indicate internal ponding. Again the question pops up what the snow-ice interface temperature might have been.
- I suggest removing the salinity observations as long as these are not of further relevance for the study, because the jump between the left 3 and the right 2 measurements is visually quite large. But you write that it is small and falls within the accuracy with which it could be measured. These salinity measurements therefore seem a bit misleading here.
L231/232: "porous ice layers in the snow volume" --> What makes you think that at the frequencies used here, for reasonably dry snow conditions before the ROS event, the majority of the backscatter is not (also) caused by the sea ice underneath the only 7 cm thick snow cover? Also: which ice layers in the snow volume are you referring to here?
L239-242: I don't think that what you write here is well illustrated by the data shown. Firstly, I doubt you can speak of the funicular regime here (I guess Garrity, 1992, in the book "Microwave Remote Sensing of Sea Ice" mentioned that from her field work). The micro-CT images do not show that. In particular, the 2nd and 3rd micro-CT profiles essentially show the same distribution of bluish gaps near the surface, and there is not overly much difference further inside the snow pack.
- Secondly, for such a considerable change of the snow-internal structure during the 2nd ROS event one would also expect a considerable amount of rain. But this is not the case: according to Figure 2, the precipitation intensity during the second ROS event was much smaller than towards the end of, and after, the 1st ROS event.
L242-245: What could explain a decrease in backscatter that is larger at nadir than at 45 degrees incidence angle? Could it be that a pond or a slush layer developed right below the KuKa radar because of rain water dripping from antennas and equipment onto the snow?
L246-256: As noted by you, quite a bit of what is written in this paragraph is speculative. I am not sure whether the snow property observations and the quality of the remote sensing data you have at hand justify all these detailed speculations.
- I don't see "evidence of a percolation channel". I also note that none of the micro-CTs are from the immediate vicinity of the KuKa-Radar, are they?
- You write of "volume scattering from the refrozen surface crust" --> How thick is that crust that you can have volume scattering being dominant over surface scattering?
- You write of a "glazed surface crust" but I could not see that from your earlier results which point towards rain entering the snow, percolating it, not leaving a hard crust at the surface as would be typical for a freezing rain event with air-temperatures remaining below 0 deg C.
- Why should pores and channels that were just filled with rain water percolating through the snow (and potentially also some snow melt water) just become air-filled? Would you consider the ice underneath as being that permeable that this water leaves the snow due to gravity drainage? Is this reasonable given the (unknown) ice/snow interface and ice temperatures? Aren't the density measurements for Sep. 15 (Fig. 2) suggesting that densities remain high after refreezing?
- I finally note that the vertical scale of Figure 5 and the way you plotted the observations is not ideal for all the interpretations made. Using thinner lines and avoiding dashed lines when plotting a time series which has gaps anyways are potential solutions for improvement.
Figure 5: Your main interest is in the response during the two ROS events. I therefore strongly suggest to focus on these events more by showing days Sep. 11 through 15, i.e. 96 hours of data. That way you would be able to show much better how especially the Ku and Ka-Band radar data changed during the 2nd ROS event.
- What explains the jump in Ku-Band nadir backscatter and 19 GHz TB on Sep. 15 after the data gap?
- I suggest to narrow down the TB range for which SBR measurements are shown to something like 150 K to 280 K. I also suggest to use thinner lines. That way it would become clearer how large TBs actually get around the ROS events and how low 89 GHz TBs get after the ROS events.
- So far, I found little evidence that the text makes use of the densities shown on the right-hand side of the figure. I suggest either deleting those or, in case you decide to refer to them more, equipping them also with TB and/or sigma0 value axes to be better able to quantify the differences between before and after the ROS events.
Figure 6: The zooms are an excellent idea. Still I vote for reducing the time period shown in panels a) and b) to the same period I suggested to show in Fig. 5 (Sep. 11 through including Sep. 15). This would have the advantage that you don't need to discuss the snow dune issue.
- So you indeed have a data gap when the precipitation of the first ROS event was strongest. Did it perhaps cause the failure of the instrument?
- What these data suggest is: if rain increases snow wetness a radar can look deeper into the snow ... is this backed up by theory?
- I suggest to use markers at the top and bottom axes (e.g. down- and upward pointing filled triangles) instead of using bars to indicate the locations in the echogram for which you show the profiles on the right. You could connect these markers with thin dotted lines of the same color.
- Can you add a 6th profile from, e.g. Sep. 13, 0 UTC from clearly before the onset of the 1st ROS event? I am curious to see whether the strange shape of the black profile compared to all other profiles isn't in fact caused by some beginning failure of the radar. Such a profile could be used much better to illustrate the change in the echograms from before the ROS events to after the ROS events.
- In addition, simply because there are quite a few jumps of the yellow line in the echograms, particularly at Ku-Band, which appear to be caused (according to your writing earlier) by a temporary relocation of the KuKa radar, I suggest plotting a 7th profile from when the signals have stabilized again and any relocation effects have ceased, e.g. for Sep. 15, 23:50 UTC or so.
- I note that by far not every change in these echograms and the yellow line is understandable. Why, for instance, does the Ka-Band range of maximum relative power decrease (less) than the Ku-Band one at the beginning of the relocation (around 6 UTC on Sep. 15) while there is no change at the end of the relocation (around 21 UTC on Sep. 15) at Ka-Band but Ku-Band jumps back to almost the same range as before? I have to admit that I do not necessarily trust the vertical displacements in the range associated with the maximum relative power right after the 2nd ROS event; more explanation might help here.
- For me, looking at Fig. 6 as it is, the main conclusion is that the location of the maximum relative power gets a bit closer to the radar at Ku-Band (slightly lower range) but a bit farther away from the radar at Ka-Band (slightly larger range). Given that Ka-Band observes at the smaller wavelength of the two, I don't understand this immediately, as it would mean that after the ROS the Ka-Band penetrates deeper into the snow-ice system than the Ku-Band.
L271/272: You are talking about a vertical shift of between 0.5 and 1 cm? How sure are you that this is not just some kind of noise? I note that the range bin resolution is about 0.5 and 0.8 cm for Ka- and Ku-Band, respectively.
L276/277: Moving the instrument should impact both frequencies, right? But we only see a shift in the Ku-Band. Why?
L312: "because volume scattering in the snow is larger at 89GHz" --> Are you sure this is the reason? What is the ice type below the just 7 cm thick snow layer, which is comparably cold and potentially more or less transparent at both frequencies? Wouldn't it be more likely that the observed difference in the observed TBs results from the emissivity of the underlying ice? Please check with earlier experiments and then provide 1-2 references for your (revised?) statement.
- Please put the observed value (L313) into context of existing knowledge.
L315: I agree that the emissivity of a wet snow pack is nearly 1 ... but how near is nearly? According to Figure 2, the snow surface temperature was 0 degC, i.e. about 273 K. Observing a TB of 274 K would require the emissivity to be larger than 1, which does not appear realistic unless you prove otherwise. Even for an emissivity as high as 0.99 the observed TB would be 270.4 K. Not knowing how accurate and reliable the calibration of the SBR actually is, I suggest not putting too much emphasis on the discussion of these perhaps artificially high TB values.
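The consistency argument here is simple arithmetic; a minimal sketch, neglecting the reflected sky contribution (which would only lower TB further):

```python
# TB = emissivity * physical temperature. A melting snow surface at
# 273.15 K therefore cannot produce TB = 274 K for any emissivity <= 1.
# (The reflected sky term is neglected here; including it lowers TB.)
T_MELTING_SNOW = 273.15  # K

def brightness_temperature(emissivity: float, t_phys: float = T_MELTING_SNOW) -> float:
    return emissivity * t_phys

print(f"e = 1.00 -> TB = {brightness_temperature(1.00):.1f} K")  # upper bound
print(f"e = 0.99 -> TB = {brightness_temperature(0.99):.1f} K")
```

Even a blackbody snow surface at the melting point tops out at 273.15 K, so a reported 274 K points to calibration rather than geophysics.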
Figure 8: I like the delay in the pulse peakiness between CS2 and KuKa-radar data with CS2 showing an earlier increase; this seems to be in line with the direction from which the cyclone responsible for the ROS event was moving into the region of interest - aka from the South / Southeast; hence it arrived earlier at the location of the CS2 overpasses than the MOSAiC floe.
I recommend, though, checking the KuKa radar sigma0 values prior to the ROS events in Fig. 8, because Fig. 5 shows values around 0 dB here; hence there is a discrepancy between the values shown in the two figures.
L316/317: How thick does a wet snow cover need to be to mask the emission of the underlying sea ice? Is the observation that both frequencies show a similarly high TB enough evidence that the ENTIRE snowpack is wet?
L318: "TBs drop to cold conditions again" --> what you write in the following lines misses the observation that the TBs drop to values considerably lower than before the ROS event. Please quantify this change, and also quantify the changes in the PD before and after the ROS events.
- What explains the extremely low 89 GHz TB values, almost as low as 160 K, at H-polarization? Has there been evidence for such low values in the published literature before?
- What explains the fluctuations in 89 GHz TB after the ROS? Is this real or an artifact of the SBR?
L322: "grain size increased throughout the snowpack" --> one could speculate about how much these 4 cm of apparently icy snow (see RS profile in Fig. 4e) still resemble snow, and how one could distinguish the ice layers (see L321) that such a snow pack / refrozen slush cover can contain.
Figure 9: Please provide information in the text on how you co-located these AMSR2 data with the MOSAiC floe location (see further below).
- I suggest to omit the 23 and 36 GHz data and instead, similarly to the comparison between KuKa-radar and CS-2 (Fig. 8), plot the daily average TB-values. This would make the comparison more consistent. Please check, like you did for CS-2, the respective AMSR2 overpass times to figure out how you could optimally compare the SBR observations with the AMSR2 data.
- Why are no data shown for Sep. 15/16? I assume this is because of the observation gap around the North Pole. If this is the case, then I recommend, instead of using the TB of a single 12.5 km grid cell (which I assume has been done), computing the mean TB of a slightly larger area. Why not use the same radius as you used for CS-2?
- While the PD at 89 GHz before the ROS events is broadly in line with the SBR observations, the PD at 18.7 GHz is, at 25-30 K, about 10 K larger than the one observed by the SBR. Why? I also note that SBR 19 GHz TBs are higher than AMSR2 TBs, while SBR 89 GHz TBs are lower. Why?
L336/337: I have difficulties finding this statement in the cited paper. Therein it is clearly described that there are so-called track point differences between Ku- and Ka-Band radar altimetry, and that penetration into and attenuation / scattering within the snow are a function of snow depth and grain size, but this clear statement is not made. I suggest using a different reference if you want to keep the statement as written. Also, for Ku-Band it has been shown in the published literature that the main scattering horizon can be located anywhere from within the upper centimetres of the sea ice, through the entire snow pack, up to the snow surface itself.
L339-342: What is written here regarding the change of the elevation of the main scattering horizon appears to be quite speculative. What is the vertical resolution of the two radars when looking at nadir? How precisely could the altitude of the antennas above the snow surface be tracked? You state yourself that the sled might have sunk into the snow a bit (millimetres? centimetres?). In addition, please see my comment with respect to the echograms when viewed from "a greater distance", suggesting that the Ku- and Ka-Band main scattering horizons actually changed differently from before to after the ROS. I suggest some reconsideration of the results and, depending on that, some clarification of the writing here, taking the involved uncertainties and limitations into account more rigorously.
L364: "reducing the elevation of the air/snow ..." --> Which effect on the elevation of the air/snow interface relative to a radar sensor is more important: the compaction of the formerly dry snow becoming wet or even slushy, and hence a much decreased depth, or the weighing effect you describe?
L365/366: How did you compute that an increase of the existing SWE by 11.5 mm would cause the explicitly stated change in elevation of 13.6 mm? This is not clear to me. Also, when you say "reduction of the air/snow interface elevation" then you take the ice/snow interface as the reference? Hence, in other words, snow depth decreases by 13.6 mm?
L370/371: This is the only place where I find the observations of (very slightly) elevated basal snow layer salinities after the ROS very interesting, because they seem to point towards some brine wicking from the underlying ice due to the refreezing process. However, the ice underneath is MYI and should be fresh in its upper few centimetres, and you state that your values fall within the measurement uncertainty anyway. But what if the sea ice underneath were saline FYI and some brine in the basal snow layer had been flushed downwards by the rain ... would the refreezing then suck this brine back upwards again, perhaps at an even larger concentration? I don't find your argumentation here overly conclusive.
L378-393: "For satellite-based ... can this lead to permanent geophysical and ... microstructure [Colbeck, 1982]." --> I suggest switching the order of the information here: ROS events during winter can create certain geophysical changes at various scales. These can then impact active microwave measurements in various ways.
- I find the two paragraphs that follow very speculative and not necessarily well backed up by your results. For instance, you refer to ROS as an event that might happen in spring and be erroneously interpreted as melt onset ... but your example is from September. I am wondering whether you could delete Section 5.2 completely without losing too much information.
Figure 10: The information given here is partly equivalent to Fig. 8 but the way computations and perhaps also co-locations are done seem to be different. I suggest to either combine Fig. 8 and 10 or, if you keep Fig. 8 then I suggest to use the same co-location and computation procedures for both figures with respect to AMSR2 TBs and SBR TBs. In any case you should describe how the co-location between AMSR2 and SBR data was done.
L403-408: In order to understand that the PD decreases when both V- and H-pol emissivities are close to 1, one needs to state that for dry snow the emissivities differed from 1 by an amount that differs between the polarizations.
- Obviously, because the ASI SIC is above 100% anyway (almost all the time) before and after the ROS for both AMSR2 and SBR, you are in the saturation regime of that algorithm, i.e. the sensitivity of the PD to changes in SIC is comparably small and changes non-linearly. Is it correct to assume that, because of this non-linearity, which occurs here in contrast to the linear change of PD with SIC at lower SIC values, you get the larger increase in SIC due to the ROS event, from 82% to 90%? I suggest stating more clearly the reason for these different numbers.
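The saturation behaviour can be illustrated with a cubic tie-point interpolation of the kind the ASI algorithm uses (Spreen et al., 2008). The tie points below follow that paper; the two boundary slopes are assumed values chosen only to show how the sensitivity dSIC/dPD can differ between the tie points, and are not the algorithm's actual coefficients.

```python
import numpy as np

# Illustrative ASI-style cubic mapping PD -> SIC between two tie points.
# Tie points P0 (open water) and P1 (100% ice) follow Spreen et al. (2008);
# the boundary slopes S0, S1 are ASSUMED for illustration only.
P0, P1 = 47.5, 11.7      # K, 89 GHz polarization difference tie points
S0, S1 = -0.010, -0.050  # assumed dSIC/dPD (1/K) at P0 and P1

# Cubic Hermite conditions: match value and slope at both tie points.
A = np.array([
    [P0**3, P0**2, P0, 1.0],
    [P1**3, P1**2, P1, 1.0],
    [3*P0**2, 2*P0, 1.0, 0.0],
    [3*P1**2, 2*P1, 1.0, 0.0],
])
b = np.array([0.0, 1.0, S0, S1])
c3, c2, c1, c0 = np.linalg.solve(A, b)

def sic(pd: float) -> float:
    return ((c3*pd + c2)*pd + c1)*pd + c0

def dsic_dpd(pd: float) -> float:
    return (3*c3*pd + 2*c2)*pd + c1

# With these assumed slopes, a small PD change near the 100% tie point moves
# the retrieved SIC five times more than the same change near open water.
print(f"|dSIC/dPD| near P1: {abs(dsic_dpd(P1)):.3f} /K")
print(f"|dSIC/dPD| near P0: {abs(dsic_dpd(P0)):.3f} /K")
```

The point of the sketch is only that a non-linear tie-point interpolation makes the PD-to-SIC sensitivity depend strongly on where along the curve the observations sit, which is why changes near saturation translate into different SIC increments than the same PD changes at lower concentrations.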
L409-413: I suggest seeing this in a more differentiated way. Before the ROS, the NT SIC is at 90%. After the first, much more intense ROS event, the NT SIC is close to 100%. It is only after the second (weak) ROS event that the NT SIC drops to 70%. What causes the upswing to near 100%?
- I note that you show GR3719V but not PD19. Why?
- You state that the response lasts long after refreeze ... but if one compares SIC values, then one has an NT SIC of around 80% on Sep. and then again on Sep. 16. So while I agree that there certainly is some longer-lasting change in the microwave signature, I am not so sure whether the SIC is a good indicator here. In order to understand this better, a graph showing PD19 could help. What I do note is that the decrease of GR3719V to even more negative values is very clear, which would also mean an increase in the MYI concentration when retrieved with the NT approach.
- Did you recognize that GR3719V values approach 0 during the ROS, hence making the MYI look like 100% FYI?
L414: "multiyear ice [Rostosky et al., 2018]" --> this reads as if this paper only deals with snow depth retrieval over MYI which is certainly not the case. You might want to rephrase your statement.
- On another note: you are standing on an ice floe in mid-September ... hence the ice certainly is not FYI but at least second-year ice. Therefore, I recommend reformulating your statements in this paragraph accordingly - aka: "If we assume that the ice floe is FYI, then we could retrieve snow depth using the approach of ...". The fact that you retrieve a snow depth which is much too high (20-30 cm instead of the 7 cm measured before, or 4 cm measured after, the ROS) supports this notion very well. That way you would also demonstrate to the reader that these considerations are purely hypothetical.
L423/424: In the first moment I would think similarly. However, there are (at least) two things fundamentally different here: 1) The snow layer on the ice is quite thin in your case but would be much thicker in a normal winter case. Therefore the water entering the snow would have a different effect. One thing that is not sufficiently discussed here is the possibility that the underlying sea ice was still quite warm and potentially contributed to the observed change in the vertical snow structure. What were the ice-snow interface temperatures? 2) The bulk and initial snow surface temperatures would be substantially lower, and it is reasonable to expect that a winter-time ROS event would deliver freezing rain and therefore a different microphysical environment and hence microwave signature of the snow. I therefore suggest rephrasing this statement.
Figure 11: Is the area for which this trend is computed in panel a) identical to the region shown in panel b)? You write that this is "over sea ice", while panel b) shows all areas, i.e. land, ice-free ocean and sea ice.
- What is the unit of the slope in panel a)?
- I note that here you sub-sum also "wet snow" under wet precipitation in contrast to Figure A3 where "ice pellets and snow" do not contribute. Which one is correct? Did you apply the same selection criterion for both figures?
- The map in panel b) appears to show the trend in mean wet precipitation for the entire period, i.e. the total change and not the change per year. Is this correct? Or do I have to read the map such that there are vast regions where the winter-time wet precipitation increased by 40 mm/day within the 42-year period?
- I note that there are vast regions where the trend is not covered adequately by the legend and suggest changing this.
- I also suggest indicating the sea ice extent.
- While I can guess that the hatched area has something to do with significance, you did not mention this in the caption.
- The title of panel b) is a bit misleading because you are not only referring to rainfall but also to wet snow (which can have a completely different effect on an existing snow cover - as can freezing rain) and the mixed rain/snow events. To what extent this "mean wet precipitation" therefore matches the conditions you experienced during MOSAiC in September remains unclear.
- Finally, while the MOSAiC observations, for which you apparently had a good agreement between observed and modeled precipitation type (otherwise you would not dare to carry out such a trend analysis), are from September, these results here are for winter, specifically for winter conditions, which in terms of precipitation phase might be more of a challenge for the reanalysis. Hence my question: what is your idea of how credible these results are?
- Provocative question at the end: how about during legs 1 to 4 - how often did you have ROS events?
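The ice-type sensitivity of the NT algorithm raised in the comments above can be illustrated with a minimal sketch of the spectral gradient ratio. The brightness-temperature values and the thresholds below are illustrative assumptions for this review, not the operational NASA Team coefficients:

```python
def gradient_ratio(tb37v, tb19v):
    """Spectral gradient ratio GR(37V/19V) used by NASA Team-type algorithms."""
    return (tb37v - tb19v) / (tb37v + tb19v)

# Illustrative brightness temperatures (K): dry MYI volume-scatters strongly at
# 37 GHz, so GR is clearly negative; a wet (ROS-affected) surface emits almost
# like a blackbody at both frequencies, pushing GR toward 0, i.e. toward the
# FYI signature -- the effect noted in the comment on GR3719V above.
gr_dry_myi = gradient_ratio(190.0, 220.0)   # negative -> MYI-like
gr_wet     = gradient_ratio(250.0, 251.0)   # near zero -> FYI-like
assert gr_dry_myi < -0.05 and abs(gr_wet) < 0.01
```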
Editorial comments / typos:
L55/56: Please provide the full name of all sensors upon their first mention in the text. This also applies to all sensors mentioned later, such as AMSR2 (L59), SSM/I and others.
L77: "VV, HH, ..." - I guess you need to explain once what is meant by V, H, VV, HH, and also HV, VH - also in view of the next subsection.
L77: Here you name the instrument "KuKa radar". I recommend that you use this name throughout the paper; the usage of other acronyms further down ("KuKa system", "KuKa instrument" or just "KuKa") is confusing and does not read overly well.
Table 2: Please check whether row "Receive Noise" should read "receiver noise" and whether the unit is really K and not mK.
- Since you provide the center frequency with 1/10 GHz precision you might want to do that for the bandpass as well.
L113: Typo: SSMI --> SSM/I
L124-126: Please check these two sentences; "Physical temperatures were also made of the absorber pads ..." reads strangely, as does the next sentence.
L182: Please check this sentence. "vertical temperatures above zero" reads strange. See also L186 please.
Figure 2 caption: "Symbols denote ..." --> Perhaps better: "Different symbols denote different snowpits in panels d) and e)."
- I find it a bit unfortunate to have different colors representing different parameters AND different methods in panel d). Perhaps you could move SSA to panel e) should you decide that there is no need to show the snow salinity.
L200: I would not consider 8:30 UTC "shortly" after it had started to rain (5 UTC, see L179).
L224: "were generally similar across the floe" ... I am not sure the reader gets this information from the figures just discussed. Would it make sense to summarize the key changes in the snow properties caused by this ROS event in 2 sentences?
Figure 3: Please explain the devices that are seen in the photographs.
L263: "samples" reads strange. Please see my comment to Figure 6.
L279: "the peakiness" --> Just for my understanding ... usually one speaks of "pulse peakiness", here you use "peakiness" or "waveform peakiness". Are these terms that can be used interchangeably?
L281/282: "in the Arctic" --> perhaps better "in the region shown" because the region shown in the maps of Fig. 7 appears to be a rather small sub-set of the Arctic.
L290: Typo: "included" --> "include"
L296: Please check my comments to Fig. 6 in this context.
Figure 7: I suggest deleting the images at the beginning and end (Sep. 9 and 18) and using the space created to illustrate how the area shown maps with respect to land and sea-ice distribution for Sep. 13/14. Information about latitudes and longitudes (i.e. numbers) would help as well to geolocate the region shown.
L311: "less" --> here perhaps better "lower"
L314: Please note my comment regarding the SBR data shown in Figure 5.
L321: "potentially ... snowpack" --> would it make sense to refer to a) the increase in overall snow density and to b) the observed vertical density gradient (see Fig. 4 e), RS)?
L325: Figure 7 needs to read Figure 9
L326: "temporal averaging ..." --> This sentence reads as if AMSR2 only provides this kind of data (daily averaged). I suggest being more precise here and stating that the data product you used comprises daily averaged TB observations of ascending orbits, and that even though you used this product instead of single swaths you are able to detect the ROS impact on the TBs.
L332/333: "the ROS covered quite a large region" --> you could make a cross-reference from the comparison of KuKa-radar and CS-2 data to this statement because there you needed to assume that the ROS event covers a comparably large region.
L345-347: This looks like a perfect place to cite the work of Landy et al., JGR, 2020, https://doi.org/10.1029/2019JC015820
L348-352: "Kuka data combined ... scale satellite footprints." --> This all sounds very good and reasonable - but the experiment whose results you showed and discussed here appears not to be optimally suited to follow that path yet. You could clarify this in the text and suggest what needs to be improved.
L353-354: The sentence referring to Laxon, 1994 should perhaps include the notion that leads actually INCREASE the peakiness because of their specular return. This would make it easier to understand why you then move over to water on top of the sea ice in the form of melt ponds, also causing specular returns and hence an increase in pulse peakiness, before you then come to your results, representing a kind of melt-pond precursor: wet or more or less saturated snow, triggered here by ROS events instead of summer melt.
L363: "the introduction ..." --> Perhaps better: "by adding rainwater to an existing snow cover, its density and also its SWE are increased"
Figure 10:
Typo in caption: "teh" --> "the"
L431: "Appendix" --> please refer to the specific figure(s) in the appendix.
Citation: https://doi.org/10.5194/tc-2021-383-RC1
- AC1: 'Reply on RC1', Julienne Stroeve, 10 Apr 2022
-
RC2: 'Comment on tc-2021-383', Anonymous Referee #2, 30 Apr 2022
Summary
The authors use a time series of atmospheric and surface geophysical observations to document the effect of rain-on-snow events on passive and active microwave remote sensing emission, backscatter, and radar waveform. The examined passive microwave frequencies (19 and 89 GHz) focus on two of the frequencies commonly used in sea ice concentration retrieval algorithms, and the active data are at the Ku- and Ka-bands found in current and planned radar altimeter missions used to infer sea ice thickness from sea ice buoyancy. The authors highlight the strong effect that a ROS event has on these remote sensing signals, and how changes in snow structure caused by ROS events are pervasive in their impact on passive emission and radar waveforms. They argue that there is an increase in ROS events on sea ice and that the topic is under-studied in that the community does not understand how these events contribute to sea ice geophysical retrieval errors. Data used are from the large, multidisciplinary MOSAiC drift campaign that took place in the central Arctic in 2019-2020. The paper focuses specifically on data collected in late August and early September 2020. The paper is original and relevant to TC, and it should be of interest to the sea ice and snow readership. Major and minor comments are as follows.
Major
(1) Speculation about emission and scattering mechanisms: The authors use a large volume of data to document the rain event and the changes in snow properties that occurred during and after it. The MOSAiC project affords this opportunity, and the authors should be lauded for putting together such a detailed picture of the event as it happened. The impact of the event on remote sensing signals is well documented. However, despite the effort to incorporate so much detail, many statements made about the connections between snow property and microwave emission/backscatter/waveform are speculative. Examples include: on lines 242-245, where the downward percolation of water in snow is "likely" attributed to a 12-15dB decrease in backscatter; and lines 246-250, where snow porosity is related to an increase in volume scattering, yet porosity isn't examined and the authors express the need for more analysis. These speculative comments are not well enough substantiated by the data at hand, or by microwave scattering and emission theory and/or a modelling framework. While it is understandable that there isn't a lot of well-established microwave interaction theory dealing with such a complex scenario, there are still basic principles that would help drive the interpretation. For example, does it make sense that the drainage of the absorbing water during the second rain event should lead to such a dramatic backscatter decline? Isn't the snow being wetted by absorbing rain? What is the expected penetration depth? What is the surface roughness contribution? Those look like structure from motion / photogrammetry targets in Figure 3; perhaps data on surface roughness are available and, if so, should be used.
In particular the paper needs to be focused more on the basic mechanisms driving the observed changes in backscatter and waveforms. For the active case, establishing the surface roughness and dielectric properties, and the relative contributions of surface and volume backscatter are important. MOSAiC datasets that help with this should be better utilized. Otherwise, so much of detailed analysis of various MOSAiC datasets, as interesting as it is, misses the mark in terms of guiding the interpretation and much of what we gain from the extensive analysis is consistent with what is already known to be the case from studies of terrestrial snow (i.e. what is introduced in lines 27-36).
(2) Cryosat-2 data usage: The authors compare their surface observations to Cryosat-2 backscatter and peakiness data to, as they suggest, see how their results scale up to the satellite scale. However, the explanations in lines 300-307 point to how the surface data do not scale up, and the comparison is confusing overall. The observation that the satellite-based waveforms also change is correct, but given the unexplained discrepancy between what's observed at the surface and in the satellite data, it does not add much value to the paper.
(3) Winter ERA5 winter precipitation time series analysis: The authors use precipitation amount and type from ERA5 data to, as they state, expand the study beyond the time-period of the studied ROS event (i.e., winter period). Though the question of whether or not more ROS events in winter are occurring is important, the analysis doesn’t effectively offer an answer to the question. The authors find an increase in the amount of rainfall during cold periods over the period of 1980-2020, but the amount of rainfall is, as stated, relatively small in magnitude. In order to make a connection to the studied ROS event, which took place in late summer and not during winter, the authors need to define an “ROS event” in terms of time period (e.g. number of consecutive days) and rainfall magnitude, then use the ERA5 data to assess whether or not these “ROS events” occur in the winter, and how much they have been changing over time. It is unclear from the precipitation amounts presented whether or not we would expect any impacts on snow properties and microwave scattering and emission behaviors that are comparable to the studied late summer event.
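The event definition asked for in this comment could be operationalized along the following lines: choose a daily rainfall threshold and a minimum run length, then count qualifying runs per winter and examine their trend. A minimal sketch; the threshold, minimum duration, and function names are illustrative assumptions, not values proposed by the manuscript:

```python
def count_ros_events(daily_rain_mm, threshold=1.0, min_days=1):
    """Count ROS events in a daily rainfall series: an event is a run of at
    least `min_days` consecutive days with rainfall (assumed to fall on an
    existing snowpack) of at least `threshold` mm/day."""
    events, run = 0, 0
    for rain in daily_rain_mm:
        if rain >= threshold:
            run += 1
        else:
            if run >= min_days:
                events += 1
            run = 0
    if run >= min_days:   # series may end mid-event
        events += 1
    return events

# Example: two separate events in one winter's daily series
series = [0.0, 1.5, 2.0, 0.0, 0.0, 3.2, 0.0]
assert count_ros_events(series) == 2
```

Counting events per winter in this way, rather than trending mean precipitation amount, would let the trend analysis speak directly to the frequency of ROS events comparable to the one studied.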
(4) Inferences about time series changes in snow properties during the ROS: In Section 3 there are a lot of inferences made about time series changes in snow properties, using data collected from different positions on the sea ice floe. The authors acknowledge this on lines 218-220, where they state "This highlights potential spatial variability in snow conditions, yet it is difficult to separate spatial variability from temporal changes since the snow pits were not sampled at the same time." On line 224 the authors then state that conditions are generally similar across the floe. Overall it reads like the authors are choosing to use spatial variability to explain some of the observed changes and homogeneity to explain others. As such, it is not very convincing what role the ROS events played in altering the snow physics relative to how much sampling spatial variability plays a role. An unbiased approach to the analysis is needed.
Minor Comments (by line number)
45: Delete “surface”
81: There are a lot of undefined locations in the Figure 1 map. Define them or, if they are not important, remove them.
91: The calibration was done several months before. Does this have any impact on the analyzed data, e.g. due to instrument drift?
106: HV, VV, and HH data are used in the analysis.
118: Choose better wording than “seeing”.
124: “..thick microwave absorber…”
130: physical temperature not absolute temperature
132: delete “zenith”
146: Was a manual weather observation program implemented during MOSAIC? Manual weather observations are a useful complement to these more sophisticated sensor-based techniques and add confidence to the estimations from them. With the stated goal of straightforward interpretation on line 150, manual weather observations would be very useful.
153: It would be better to clarify what time period is of interest earlier here (1980-2020).
164-175: Indicate how reliable the snow data are when sampled during melting conditions.
177: Before it was referred to as a ROS event. Now it is events. Clarify.
184: 13 September
199: Clarify what you mean by “below the snow/ice interface”, i.e. what the SSL is in relation to surface snow and sea ice volumes.
205: define SSL earlier.
224: Explain the headings in Figure 4 in the caption (ROV, ALBEDO, etc.).
230-232: See major comment: would we expect VV>HH when backscatter is dominated by volume scattering?
242: See major comment: if the absorbing water is now drained then how does the backscatter decline so much in its absence? Wouldn't we expect an increase from water drainage, when the dielectric constant reduces and air and snow particles are now scattering above the wet basal layer? Or is the wetness of the air-snow interface during the second rain event causing this effect? What is the expected penetration depth?
247: It is unclear what is meant by increasing porosity in snow pore spaces. Do you simply mean the pores are filled with water (during rain) then air (after refreezing)?
254: How does a glazed surface crust increase the dielectric constant? What is the increase compared to, cold snow? What about the surface roughness contribution to the observed change in backscatter?
265: Clarify what the green samples are.
278: That is not scaling up, which implies some kind of scaling function and consideration of spatial heterogeneity. It is comparing two different scales.
279: Define peakiness.
308: Section 4.2. is out of place since it refers back to Figure 5. Move the SBR data to this section, in a new figure, or move this analysis to earlier in the paper.
371: It is better to say “with a reduced brine volume” because a ROS event wouldn’t necessarily completely flush the snow of brine.
423: See major comment about ERA5 analysis.
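Regarding the request on line 279 to define peakiness: pulse peakiness is commonly computed as the waveform's peak power divided by its mean power (equivalently, N times max over sum for N range bins, following Laxon, 1994). A minimal sketch with illustrative, invented waveforms:

```python
def pulse_peakiness(waveform):
    """Pulse peakiness: peak power over mean power of the echo waveform,
    i.e. N * max / sum for a waveform with N range bins."""
    return len(waveform) * max(waveform) / sum(waveform)

# A specular return (lead, melt pond, or wet/saturated snow surface)
# concentrates power in a few bins and yields high peakiness; a diffuse
# return from rough, scattering ice spreads power out and yields low peakiness.
specular = [0.0, 0.1, 10.0, 0.2, 0.1, 0.0, 0.0, 0.0]
diffuse  = [1.0, 1.2, 1.5, 1.4, 1.3, 1.1, 1.0, 0.9]
assert pulse_peakiness(specular) > pulse_peakiness(diffuse)
```

This is the mechanism behind the review comments on leads and ROS-wetted snow both increasing waveform peakiness.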
Citation: https://doi.org/10.5194/tc-2021-383-RC2
- AC2: 'Reply on RC2', Julienne Stroeve, 15 Jun 2022