mankoff/sankey

On Ice Sheet Mass Flow

Table of contents

Introduction

Sankey diagrams for mass flow in Greenland and Antarctica.

Abstract & Methods

See ./ms.tex

Results

Note

All figures are width-proportional within and between each other.

./fig_aq_gl.png

./fig_aq_parts.png

Processes and flow

flowchart.png

Implementation

Greenland

| ID | Term | Value | IO | Period | Source | Comment |
|----|------|-------|----|--------|--------|---------|
| RF | Rainfall | 40 | Ig | 2010-2019 | fettweis_2020 | |
| CD | Condensation | 5 | Ig | 2010-2019 | fettweis_2020 | |
| DP | Deposition | 10 | Ig | 2010-2019 | fettweis_2020 | |
| SF | Snowfall | 680 | Ig | 2010-2019 | fettweis_2020 | |
| EV | Evaporation | 5 | Og | 2010-2019 | fettweis_2020 | |
| RU | Runoff | 435 | Og | 2010-2019 | fettweis_2020 | |
| SU | Sublimation | 60 | Og | 2010-2019 | fettweis_2020 | |
| RFZ | Refreezing | 200 | - | 2010-2019 | fettweis_2020 | Melt + rainfall - runoff |
| BMB | Grounded basal melting | 20 | Og | steady | karlsson_2021 | Grounded basal mass balance |
| DISCH | Discharge | 485-20+5 | Og | 2010-2019 | mankoff_2020_solid, kochtitzky_2023, bollen_2023 | Discharge (less SMB below gates) = Frontal melting + calving + peripheral |
| CALV | Calving | 235 | Oo | | enderlin_2013 | 50 % of discharge |
| FRONTMELT | Frontal melting | 235 | Oo | | enderlin_2013 | 50 % of discharge |
| SHELFMELT | Sub-shelf melting | 25 | Oo | 2013-2022 | wang_2024 | Assume 10 Gt is steady state in FRONTMELT. Remaining 25 Gt is net loss |
| SHELFFREEZE | Sub-shelf freeze-on | 5 | Io | 2013-2022 | wang_2024 | |
| GZRET | Grounding line retreat | 5 | Og | | | Estimate |
| FRONTRET | Frontal retreat | 50 | Oo | 2010-2020 | kochtitzky_2023 | |
| FRONTADV | Frontal advance | 0 | Oo | 2010-2020 | kochtitzky_2023 | None in Greenland |

Exported gl_baseline to ./dat/gl_baseline.csv

Reset

trash G_GL
[[ -e ./G_GL ]] || grass -e -c EPSG:3413 ./G_GL

Discharge

From total mass balance product

!date
import pandas as pd

fname="https://thredds.geus.dk/thredds/fileServer/MassBalance/MB_SMB_D_BMB.csv"
df = pd.read_csv(fname, index_col=0, parse_dates=True)[['D','D_err']]
df = df\
    .resample('1D').interpolate().resample('YS').sum()

print("2010s")
print(df['2010-01-01':'2019-12-31'].mean())
Wed Dec 18 07:11:05 AM EST 2024
2010s
D        485.373414
D_err     44.951388
dtype: float64

Then, subtract 15 from 485 based on citet:kochtitzky_2023, who report, in Section 3.3, 17 ± 6.8 and 14.5 ± 5.8, but caution that “[b]ecause our fluxgates were typically located tens to hundreds of meters lower than those in the similar studies (King et al., 2018; Mankoff et al., 2020), the melt correction for these studies would be higher than values presented herein, although it is beyond the scope of the current study to determine what those values would be.”

Peripheral discharge (Bollen 2023)

Where are these glaciers
grass ./G_GL/PERMANENT
g.mapset -c Bollen_2023

cat "${DATADIR}/Bollen_2023/GreenlandGIC_discharge_timeseries - Ellyn Enderlin.csv" \
    | cut -d, -f1-3 \
    | v.in.ascii input=- output=bollen_2023 separator=, skip=1 x=2 y=3 z=1
How much do they contribute?
import pandas as pd
data_root='/home/kdm/data'
path='Bollen_2023'
fname='GreenlandGIC_discharge_timeseries - Ellyn Enderlin.csv'
df = pd.read_csv(f"{data_root}/{path}/{fname}", index_col=0, header=[0])
df = df.sum(axis='rows')
df = df / 1E9 # per email from Ellyn, units are m^3/year. Convert to Gt.
df = df['2010':'2018']
df.mean()
5.209345977852399

Basal melt

GZ retreat

From Millan (2022) http://doi.org/10.5194/tc-16-3021-2022

  • Gz retreat is ~0.13 km/yr (Fig. 3a)
  • Ice velocity is ~1200 m/yr (Fig. 3b) (not needed)
  • 20 km wide

Rates are higher per Ciraci (2023) http://doi.org/10.1073/pnas.2220924120, but

  • Ice surface close to flotation near GZ, and shelf is ~500 m thick, so estimate 600 m ice.

Therefore, gz retreat in Gt/year is width * thick * retreat rate * density

frink "0.13 km/yr * 20 km * 600 m * 917 kg/m^3 -> Gt/yr"
1.43052

Assume similar from other ice shelves too, for a total of ~5 Gt/yr GZ retreat in Greenland.
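The frink one-liner above can be reproduced in plain Python (same numbers; the variable names are mine):

```python
# Grounding-zone retreat mass flux: retreat rate * width * thickness * density
retreat_m_yr = 0.13e3   # 0.13 km/yr retreat (Millan 2022, Fig. 3a)
width_m = 20e3          # ~20 km wide
thickness_m = 600       # ~600 m ice near the grounding zone
rho = 917               # kg/m^3, ice density
gt_per_yr = retreat_m_yr * width_m * thickness_m * rho / 1e12  # kg/yr -> Gt/yr
print(gt_per_yr)  # ~1.43 Gt/yr
```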

SMB

g.mapset -c MAR

ncdump -v TIME dat/MARv3.12-GRD-15km-annual.nc4 # 30-39 = 2010-2019
ncra --overwrite -d TIME,30,39 dat/MARv3.12-GRD-15km-annual.nc4 tmp/MAR_GL.nc

ncdump -v X10_110 tmp/MAR_GL.nc # 101
ncdump -v Y20_200 tmp/MAR_GL.nc # 181
g.region w=$(( -645000 - 7500 )) e=$(( 855000 + 7500 )) s=$(( -3357928 - 7500 )) n=$((-657928 + 7500 )) res=15000 -p

var=SF # debug
for var in SF RF RU SU ME SMB EVA CON DEP SUB MSK AREA; do
  r.in.gdal -o input=NetCDF:tmp/MAR_GL.nc:${var} output=${var}
  r.region -c map=${var}
done

r.mapcalc "GL_ice_all = (MSK > 50) & ((x()-y()) > 520000)" # Limit to ice and remove Canada
# r.clump input=GL_ice output=clumps --o
# main_clump=$(r.stats -c -n clumps sort=desc | head -n2 | tail -n1 | cut -d" " -f1)
# r.mapcalc "GL_ice = if(clumps == ${main_clump}, 1, null())"
# r.mask raster=GL_ice --o
r.mapcalc "MASK = if(GL_ice_all == 1)" --o

# if only X % of a cell is ice, scale by that.
r.mapcalc "scale_mask = (GL_ice_all * MSK) / 100"

# scale
## units are mm.w.eq. per grid cell. Grid cell areas are in km^2
## + mm.w.eq. -> m w.eq.: /1E3
## + m w.eq -> kg: *1E3
## + area in km^2 -> m^2: *1E3*1E3
## + kg -> Gt: /1E12
# ds = ds/1E3 * 1E3 * ds['AREA']*1E3*1E3 / 1E12
for var in SF RF RU SU ME SMB EVA CON DEP SUB; do
  r.mapcalc "${var} = (${var}/1000) * 1000 * (AREA * 1000*1000) * scale_mask / exp(10,12)"
done
r.mask -r

r.mapcalc "RFZ = ME + RF - RU"
for var in SF RF RU ME SMB EVA CON DEP SUB RFZ; do
  echo ${var} $(r.univar -g ${var} | grep sum)
done
SF sum=678.472341306034
RF sum=41.0073369748482
RU sum=433.411271134275
ME sum=594.819117205514
SMB sum=232.245706856329
EVA sum=7.43645901936729
CON sum=2.02922271273767
DEP sum=12.3770587084991
SUB sum=60.0712550947222
RFZ sum=202.41518304609
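As a sanity check, a minimal Python sketch of the mm w.eq. → Gt conversion (for one hypothetical 15 × 15 km cell) and of the refreezing closure RFZ = ME + RF − RU, using the sums printed above:

```python
# Unit check: 1000 mm w.eq. over one 15 x 15 km cell (AREA is in km^2 in MAR)
mm_weq, area_km2 = 1000, 15 * 15
gt = (mm_weq / 1e3) * 1e3 * (area_km2 * 1e3 * 1e3) / 1e12  # mm -> m w.eq. -> kg -> Gt
print(gt)  # 0.225

# Refreezing closure, using the sums printed above
ME, RF, RU = 594.819117205514, 41.0073369748482, 433.411271134275
print(ME + RF - RU)  # ~202.415, matches the RFZ sum
```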

Shelf melt and freezing

grass ./G_GL/PERMANENT
g.mapset -c Wang_2024
tif_list=$(find ~/data/Wang_2024 -name "????.tif")
t=$(echo $tif_list | tr ' ' '\n' | head -n1) # debug
for t in ${tif_list}; do
  dirname=$(basename $(dirname ${t}))
  fname=$(basename ${t})
  fname=${fname%.*}
  tname=g_${dirname}_${fname} # add g_ because "79N" is not a valid name
  r.in.gdal input=${t} output=${tname}
done
g.region raster=$(g.list type=raster sep=,) -pa

r.series input=$(g.list type=raster sep=,) output=melt method='average'
r.colors -a map=melt color=viridis

r.mapcalc "area = area()"

## Melt data is m/year
## Multiply by area (m^2) to get m^3/year, then by 1000 kg/m^3 to get kg/year; /1E12 -> Gt/year
r.mapcalc "melt = melt * 1000 * area / exp(10,12)" --o

r.mapcalc "melt_on = if(melt > 0, melt, null())"
r.mapcalc "freeze_on = if(melt < 0, melt, null())"

Stats

echo "NET"
r.univar -gt map=melt | cut -d"|" -f11

echo ""
echo "FREEZE_ON"
r.univar -gt map=freeze_on | cut -d"|" -f11

echo ""
echo "MELT_OFF"
r.univar -gt map=melt_on | cut -d"|" -f11
NET
sum
33.4127947245078

FREEZE_ON
sum
-2.68199438110646

MELT_OFF
sum
36.094789105614
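The three sums can be cross-checked: the net sub-shelf balance should equal melt plus freeze-on. A quick sketch with the values printed above:

```python
melt_on = 36.094789105614      # MELT_OFF sum above
freeze_on = -2.68199438110646  # FREEZE_ON sum above
net = melt_on + freeze_on
print(net)  # ~33.413, matches the NET sum
```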

MB

GRACE ESA

import xarray as xr
ds = xr.open_dataset("~/data/Dohne_2023/GIS_GMB_grid.nc")
ds['dm'] = ds['dm'] * ds['area']
ds = ds.sel({'time':slice('2010-01-01','2019-12-31')})
ds = ds['dm'].to_dataset()
ds = ds['dm'].sum(dim=['x','y'])/1E12
ds = ds - ds.values[0]
_ = ds.plot()
ds = ds.resample({'time':'YS'}).mean()
ds = ds.diff(dim='time')
print(ds.mean())
<xarray.DataArray 'dm' ()> Size: 8B
array(-250.12027707)

./figs_tmp/51f7e7c93a2146c72173d08304aa80e49389d237.png

GRACE JPL

import pandas as pd
from datetime import datetime, timedelta

df = pd.read_csv("~/data/GRACE/greenland_mass_200204_202409.txt",
                 comment="H", parse_dates=True, sep=r"\s+", header=None,
                 names=['year','mass','err'])

# Function to convert year.frac to ISO format (YYYY-MM-DD)
def year_frac_to_iso(year_frac):
    year = int(year_frac)
    frac = year_frac - year
    start_of_year = datetime(year, 1, 1)
    days_in_year = (datetime(year + 1, 1, 1) - start_of_year).days
    date = start_of_year + timedelta(days=frac * days_in_year)
    return pd.to_datetime(date.strftime('%Y-%m-%d'))

# Apply the conversion to the 'Year' column
df['date'] = df['year'].apply(year_frac_to_iso)
df = df.drop(columns=['year'])
df = df.set_index('date')
df = df['mass']

# df.resample('D').mean().interpolate()
df = df['2010-01-01':'2019-12-31']
df = df - df.max()

# df.head()
_ = df.plot()
print(df.resample('YS').mean().diff().mean())
-264.96212962962966

./figs_tmp/bfcfdbb8d4a3e42ea23a749f439bc8dddbd3e743.png

Mankoff 2021

!date
fname="https://thredds.geus.dk/thredds/fileServer/MassBalance/MB_SMB_D_BMB.csv"
df = pd.read_csv(fname, index_col=0, parse_dates=True)[['MB','MB_err']]
df = df\
    .resample('1D').interpolate().resample('YS').sum()

print("2010s")
print(df['2010-01-01':'2019-12-31'].mean())
Fri Dec 20 09:56:38 AM EST 2024
2010s
MB       -246.172157
MB_err     94.196209
dtype: float64
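The three independent 2010-2019 Greenland mass-balance estimates above (GRACE ESA, GRACE JPL, and the Mankoff 2021 input-output product) agree to within ~20 Gt/yr; a small comparison sketch using the means computed above:

```python
# 2010-2019 mean MB estimates, Gt/yr, from the outputs above
estimates = {'GRACE ESA': -250.12, 'GRACE JPL': -264.96, 'Mankoff 2021': -246.17}
spread = max(estimates.values()) - min(estimates.values())
print(spread)  # ~19 Gt/yr spread across methods
```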

Antarctica

MB per GRACE is -140 (https://svs.gsfc.nasa.gov/31166), but that is the long-term trend; the recent (2010-2019) trend is unclear.

SMB with islands (57) but without shelves = 2231 from MAR 2010-2019. D from Rignot: 2231 + 77 (islands) = 2308; this includes non-shelf discharge, which is 330 Gt/year. MB = SMB - D => 2232 - 2308 = -76. Adding the missing -70 BMB gives -146. Good match!

D to shelves from Davison: 1935.082208. Add 77 (Rignot islands) + 331 (Rignot non-shelf D): 1935 + 77 + 331 = 2343. MB = SMB - D => 2232 - 2343 = -111. Adding the missing -70 BMB gives -181. Decent match.
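The budget arithmetic above, as a sketch (rounded Gt/yr; the variable names are mine):

```python
smb = 2232        # MAR grounded SMB incl. islands, 2010-2019
bmb = 70          # grounded basal melt (van-liefferinge_2013)
d_rignot = 2308   # 2231 + 77 (islands)
d_davison = 2343  # 1935 + 77 (islands) + 331 (non-shelf D)
print(smb - d_rignot - bmb)   # -146: good match to GRACE
print(smb - d_davison - bmb)  # -181: decent match
```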

All Antarctica (aq_baseline):

| ID | Term | Value | IO | Period | Source | Comment |
|----|------|-------|----|--------|--------|---------|
| RF | Rainfall | 5 | I | 2010-2019 | fettweis_2020 | |
| CD | Condensation | 5 | I | 2010-2019 | fettweis_2020 | |
| DP | Deposition | 80 | I | 2010-2019 | fettweis_2020 | |
| SF | Snowfall | 2775 | I | 2010-2019 | fettweis_2020 | |
| RFZ | Refreezing | 105 | - | 2010-2019 | fettweis_2020 | |
| EV | Evaporation | 5 | O | 2010-2019 | fettweis_2020 | |
| RU | Runoff | 10 | O | 2010-2019 | fettweis_2020 | |
| SU | Sublimation | 230 | O | 2010-2019 | fettweis_2020 | |
| BMB | Grounded basal melting | 70 | O | - | van-liefferinge_2013 | |
| DISCH | Discharge | 1935+331+77+2 | O | 2008-2019 | davison_2023 (to shelves) + rignot_2019 (grounded + islands) | |
| CALV | Calving | 1411+331+77+1 | O | 2010-2019 | greene_2022 + rignot_2019 discharge (grounded + islands) | |
| FRONTMELT | Frontal melting | 0 | O | | | |
| SHELFMELT | Sub-shelf melting | 1380 | O | 2010-2017 | paolo_2023 | |
| SHELFFREEZE | Sub-shelf freeze-on | 405 | I | 2010-2017 | paolo_2023 | |
| GZRET | Grounding line retreat | 45 | O | 1997-2021 | davison_2023 (only Pine Island, Thwaites, Crosson, and Dotson) | |
| FRONTRET | Frontal retreat | 400 | O | 2010-2021 | greene_2022 | |
| FRONTADV | Frontal advance | 195 | O | 2010-2021 | greene_2022 | |

East (aq_east):

| ID | Term | Value | IO | Period | Source | Comment |
|----|------|-------|----|--------|--------|---------|
| RF | Rainfall | 5 | I | 2010-2019 | fettweis_2020 | |
| CD | Condensation | 5 | I | 2010-2019 | fettweis_2020 | |
| DP | Deposition | 40 | I | 2010-2019 | fettweis_2020 | |
| SF | Snowfall | 1535+20 | I | 2010-2019 | fettweis_2020 | 1535 + East AQ Islands |
| RFZ | Refreezing | 40 | IO | 2010-2019 | fettweis_2020 | |
| EV | Evaporation | 5 | O | 2010-2019 | fettweis_2020 | |
| RU | Runoff | 5 | O | 2010-2019 | fettweis_2020 | |
| SU | Sublimation | 170 | O | 2010-2019 | fettweis_2020 | |
| BMB | Grounded basal melting | 45 | O | - | van-liefferinge_2013 | |
| DISCH | Discharge | 924+177+23+1 | - | 2008-2019 | davison_2023 (to shelves) + rignot_2019 (grounded + islands) | |
| CALV | Calving | 694+177-1 | O | 2010-2019 | greene_2022 + rignot_2019 discharge (grounded + islands) | |
| FRONTMELT | Submarine melting | 0 | | | | |
| SHELFMELT | Sub-shelf melting | 515 | O | 2010-2017 | paolo_2023 | |
| SHELFFREEZE | Sub-shelf freeze-on | 200 | I | 2010-2017 | paolo_2023 | |
| GZRET | Grounding line retreat | 5 | O | 1997-2021 | davison_2023 | |
| FRONTRET | Frontal retreat | 70 | O | 2010-2019 | greene_2022 | |
| FRONTADV | Frontal advance | 190 | O | 2010-2019 | greene_2022 | |

West (aq_west):

| ID | Term | Value | IO | Period | Source | Comment |
|----|------|-------|----|--------|--------|---------|
| RF | Rainfall | 5 | I | 2010-2019 | fettweis_2020 | |
| CD | Condensation | 5 | I | 2010-2019 | fettweis_2020 | |
| DP | Deposition | 30 | I | 2010-2019 | fettweis_2020 | |
| SF | Snowfall | 865+30 | I | 2010-2019 | fettweis_2020 | 865 + West AQ Islands |
| RFZ | Refreezing | 15 | IO | 2010-2019 | fettweis_2020 | |
| EV | Evaporation | 5 | O | 2010-2019 | fettweis_2020 | |
| RU | Runoff | 5 | O | 2010-2019 | fettweis_2020 | |
| SU | Sublimation | 40 | O | 2010-2019 | fettweis_2020 | |
| BMB | Grounded basal melting | 20 | O | - | van-liefferinge_2013 | |
| DISCH | Discharge | 857+23+45 | - | 2008-2019 | davison_2023 (to shelves) + rignot_2019 (grounded + islands) | |
| CALV | Calving | 567+23 | O | 2010-2019 | greene_2022 + rignot_2019 discharge (grounded + islands) | |
| FRONTMELT | Frontal melting | 0 | | | | |
| SHELFMELT | Sub-shelf melting | 665 | O | 2010-2017 | paolo_2023 | |
| SHELFFREEZE | Sub-shelf freeze-on | 145 | I | 2010-2017 | paolo_2023 | |
| GZRET | Grounding line retreat | 50 | O | 1997-2021 | davison_2023 | |
| FRONTRET | Frontal retreat | 205 | O | 2010-2019 | greene_2022 | |
| FRONTADV | Frontal advance | 5 | O | 2010-2019 | greene_2022 | |

Peninsula (aq_peninsula):

| ID | Term | Value | IO | Period | Source | Comment |
|----|------|-------|----|--------|--------|---------|
| RF | Rainfall | 5 | I | 2010-2019 | fettweis_2020 | |
| CD | Condensation | 5 | I | 2010-2019 | fettweis_2020 | |
| DP | Deposition | 5 | I | 2010-2019 | fettweis_2020 | |
| SF | Snowfall | 320+5 | I | 2010-2019 | fettweis_2020 | 320 + Peninsula Islands |
| RFZ | Refreezing | 45 | IO | 2010-2019 | fettweis_2020 | |
| EV | Evaporation | 5 | O | 2010-2019 | fettweis_2020 | |
| RU | Runoff | 5 | O | 2010-2019 | fettweis_2020 | |
| SU | Sublimation | 15 | O | 2010-2019 | fettweis_2020 | |
| BMB | Grounded basal melting | 5 | O | - | van-liefferinge_2013 | |
| DISCH | Discharge | 153+131+9-1 | - | 2008-2019 | davison_2023 (to shelves) + rignot_2019 (grounded + islands) | |
| CALV | Calving | 104+131 | O | 2010-2019 | greene_2022 + rignot_2019 discharge (grounded + islands) | |
| FRONTMELT | Frontal melting | 0 | | | | |
| SHELFMELT | Sub-shelf melting | 155 | O | 2010-2017 | paolo_2023 | |
| SHELFFREEZE | Sub-shelf freeze-on | 10 | I | 2010-2017 | paolo_2023 | |
| GZRET | Grounding line retreat | 5 | O | 1997-2021 | davison_2023 | |
| FRONTRET | Frontal retreat | 125 | O | 2010-2019 | greene_2022 | |
| FRONTADV | Frontal advance | 5 | O | 2010-2019 | greene_2022 | |


Export tables to CSV

Exported aq_baseline to ./dat/aq_baseline.csv
Exported aq_east to ./dat/aq_east.csv
Exported aq_west to ./dat/aq_west.csv
Exported aq_peninsula to ./dat/aq_peninsula.csv

Reset

trash G_AQ
[[ -e ./G_AQ ]] || grass -e -c EPSG:3031 ./G_AQ

Masks: East, West, Peninsula, Islands, Grounded and Shelves

grass ./G_AQ/PERMANENT

v.in.ogr input=${DATADIR}/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp output=basins

g.region vector=basins res=10000 -pas

v.db.select map=basins|head
v.db.select -c map=basins columns=Regions | sort | uniq # East West Peninsula Islands
v.db.select -c map=basins columns=TYPE | sort | uniq # FL GR IS (float, ground, island)

v.to.rast input=basins output=east use=val val=1 where='(Regions == "East")'
v.to.rast input=basins output=west use=val val=2 where='(Regions == "West")'
v.to.rast input=basins output=peninsula use=val val=3 where='(Regions == "Peninsula")'
v.to.rast input=basins output=islands use=val val=4 where='(Regions == "Islands")'
r.patch input=east,west,peninsula,islands output=basins
r.category basins separator=":" rules=- << EOF
1:East
2:West
3:Peninsula
4:Islands
EOF
r.colors map=basins color=viridis

v.to.rast input=basins output=ground use=val val=1 where='(TYPE == "GR") or (TYPE == "IS")'
v.to.rast input=basins output=ground_noisland use=val val=1 where='(TYPE == "GR")'

Label islands to nearest region (east,west,peninsula)

Rignot 2019 provides discharge for Islands, but not by region. Here, determine island area per region. This assumes all islands have the same discharge per unit area.

r.patch input=east,west,peninsula output=main_ice
r.colors map=main_ice color=viridis
r.grow.distance input=main_ice value=main_ice_grow

r.mapcalc "islands_near = if(islands, main_ice_grow)"
r.stats --q -a input=islands_near
1-1.007843 41700000000.000000
1.996078-2.003922 80300000000.000000
2.992157-3 17400000000.000000
* 23004600000000.000000

Area = 417 + 803 + 174 = 1394
East = 417 / 1394 = 29.9 % ≈ 30 %
West = 803 / 1394 = 57.6 % ≈ 58 %
Peninsula = 174 / 1394 = 12.5 % ≈ 12 %
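The same percentages as a short sketch (areas from the r.stats output above, in units of 1e9 m^2):

```python
# Island area per nearest region, used to apportion Rignot's island discharge
areas = {'East': 417, 'West': 803, 'Peninsula': 174}
total = sum(areas.values())  # 1394
shares = {k: round(100 * v / total) for k, v in areas.items()}
print(shares)  # {'East': 30, 'West': 58, 'Peninsula': 12}
```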

Discharge

  • Discharge is “grounded discharge”
    • Input to ice shelves where ice shelves exist
    • Calving (similar to Greenlandic discharge) where ice shelves do not exist.

Rignot 2019 (Shelf, non-shelf, and Island)

Load
import numpy as np
import pandas as pd

df = pd.read_excel("~/data/Rignot_2019/pnas.1812883116.sd01.xlsx", index_col=0)

##############################################################################
###
### cleanup
###
df = df.loc[df.index.dropna()]

for i in [0,0,0]: # drop Excel rows 2,3,4
    df = df.drop(index=df.index[i])

# Drop super-shelves and rename indented sub-shelves
super_shelf = ["LarsenB", "Wordie", "Ronne", "Ross West", "Ross East", "Amery_Ice_Shelf", "Filchner", "AP", "WAIS", "EAIS", "TOTAL SURVEYED"]
df = df.drop(index=super_shelf)
for i in df.index: 
    if i[0] == ' ':  df = df.rename(index={i: i.strip()})

for c in df.columns: # Drop extra columns
    if 'Unnamed' in str(c):
        df = df.drop(columns=c)
df = df.drop(columns=["Basin.1", "σ SMB", "σ D", "D type"]) # Drop unused columns
##############################################################################

# Green color = no ice shelf
noshelf = ["West_Graham_Land", "Eastern_Graham_Land", "Hektoria_Headland", "Evans_Headland", "Drygalski_Headland", "LarsenA", "Rydberg_Peninsula", "Zonda_Eureka", "Cape_Jeremy", "Wilkins_George_VI", "Wilkins_Island", "Thomson", "Fox", "Cooke", "Walgreen_Coast", "Lucchitta_Velasco", "Jackson-Perkins", "Frostman-Lord-Shuman-Anandakri", "Shirases_Coast", "Saunders_Coast", "Ross_East1", "Ross_East2", "Ross_East3", "Ross_East4", "Ross_East5", "Dry_Valleys", "Icebreaker-Fitzgerald", "Victoria_Land", "Oates_Coast", "Wilkes_Land", "Adelie_Coast", "Sabrina_Coast", "Clarie_Coast", "Law_Dome", "Budd_Coast", "Knox_Coast", "Ingrid_Christensen_Coast", "Wilhelm_II_Coast", "Enderby_Land", "Prince_Olav_Coast", "Mawson_Coast"]
df['shelf'] = 1
df.loc[noshelf, 'shelf'] = 0

# Sum numeric columns
df.loc['Sum'] = np.nan
for c in df.columns: # convert to numbers
    try: df[c] = pd.to_numeric(df[c])
    except: df.loc['Sum',c] = 'All'

cols = df.select_dtypes(include=[np.number]).columns.drop('shelf')
df.loc['Sum', cols] = df[cols].sum(axis='rows')

cols = df.columns[0:10].to_list()
cols.insert(3,'shelf')
df[cols].tail(10)
| Glacier name | Basin | Region | Subregion | shelf | SMB | D | 1979 | 1980 | 1981 | 1982 | 1983 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Stancomb_Wills | K-A | East | Stancomb_Wills_Ice_Shelf | 1 | 22.03 | 21.39 | 25.261 | 24.7291 | 24.1972 | 23.6653 | 23.1333 |
| Princess_Martha | K-A | East | Princess_Martha_Coast | 1 | 0.21 | 0.21 | 0.21 | 0.21 | 0.21 | 0.21 | 0.21 |
| Coats_Coast | K-A | East | Coats_Coast | 1 | 6.43 | 6.43 | 6.43 | 6.43 | 6.43 | 6.43 | 6.43 |
| Academy | J”-K | East | Filchner_Ice_Shelf | 1 | 24.27 | 24.1 | 24.27 | 24.2505 | 24.231 | 24.2115 | 24.1921 |
| Support_Force | J”-K | East | Filchner_Ice_Shelf | 1 | 9.72 | 9.361 | 9.72 | 9.73057 | 9.74115 | 9.75172 | 9.7623 |
| Recovery | J”-K | East | Filchner_Ice_Shelf | 1 | 41.05 | 41.05 | 41.05 | 41.1242 | 41.1983 | 41.2725 | 41.3467 |
| Slessor | J”-K | East | Filchner_Ice_Shelf | 1 | 26.11 | 24.916 | 26.11 | 26.1256 | 26.1412 | 26.1568 | 26.1724 |
| Bailey | J”-K | East | Filchner_Ice_Shelf | 1 | 8.98 | 8.61 | 8.98 | 9.00126 | 9.02252 | 9.04378 | 9.06504 |
| Islands | nan | Islands | Islands | 1 | 76.9899 | 76.9899 | 76.9899 | 76.9899 | 76.9899 | 76.9899 | 76.9899 |
| Sum | All | All | All | nan | 2097.57 | 2236.96 | 2126.64 | 2133.49 | 2140.25 | 2147.01 | 2153.77 |

Shelf vs Non-shelf discharge

WARNING: Using shelf vs. non-shelf is important and can be done for all AQ, but Island discharge (~77 Gt/year) doesn’t provide enough metadata to break down by east/west/peninsula.

<<load_rignot>>
c = np.arange(2010,2017+1)

dd = df.groupby(['shelf','Region']).sum().drop(columns=['Basin','Subregion'])[c].mean(axis='columns')
dd.loc['Non-shelf discharge'] = dd.loc[0,:].sum()
dd.loc['shelf discharge'] = dd.loc[1,:].sum()
dd['Total discharge'] = dd.loc[['Non-shelf discharge','shelf discharge']].sum()
dd
# df_shelf = df[df['shelf'] == 1][c].mean(axis='columns')
# df_noshelf = df[df['shelf'] == 0][c].mean(axis='columns')

# df_shelf
# print("Total discharge: ", df[df['shelf'] >= 0][c].mean(axis='columns').sum())
# print('Shelf discharge: ', df_shelf.sum())
# print('Non-shelf discharge: ', df_noshelf.sum())
shelf                Region   
0.0                  East          177.140000
                     Peninsula     131.398873
                     West           23.003920
1.0                  East          926.544384
                     Islands        76.989900
                     Peninsula     205.140504
                     West          768.695078
Non-shelf discharge                331.542793
shelf discharge                   1977.369866
Total discharge                   2308.912659
dtype: float64
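Sanity check: shelf and non-shelf discharge should sum to the total printed above:

```python
non_shelf = 331.542793   # Non-shelf discharge above
shelf = 1977.369866      # Shelf discharge above
print(non_shelf + shelf) # ~2308.91, matches Total discharge
```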
NOTDONE Mass loss

Don’t use mass estimates from Rignot 2019 because SMB is not provided with constituent terms, nor with temporal resolution aligned with this effort.

disch = df.loc['Sum'][c].mean()
disch_all = df.loc['Sum'][np.arange(1979,2017+1)].mean()

smb = df['SMB']['Sum']
smb_MAR_1980_2017 = 2267.03809702665
smb_MAR_2010_2019 = 2588.33462654902

print("Discharge (2010-2017)", disch)
print("Discharge (1979-2017)", disch_all)
print("Rignot SMB (1979-2008)", smb)
print("")
print("MB dall - smball", smb - disch_all)
print("MB D2010 - smb MAR 2010", smb_MAR_2010_2019 - disch)
print("MB D2010 - smb MAR 2010", smb_MAR_1980_2017 - disch)
Discharge (2010-2017) 2308.912659378624
Discharge (1979-2017) 2219.589344623233
Rignot SMB (1979-2008) 2097.5676000000003

MB dall - smball -122.02174462323273
MB D2010 - smb MAR 2010 279.4219671703959
MB D2010 - smb MAR 2010 -41.874562351973964

Davison 2023 (Discharge to shelf)

This is steady-state discharge from grounded ice to shelves.

import numpy as np
import pandas as pd

fname = '~/data/Davison_2023/adi0186_table_s2.xlsx'

loc = pd.read_excel(fname, sheet_name='Total mass changes', index_col = 0, usecols = 'B,C,D', skiprows = 4)
loc = loc.drop('Antarctic Ice Shelves')

df = pd.read_excel(fname, sheet_name='Discharge', index_col = 1, skiprows = 3)
df = df[df.columns[1::2]]
df.columns = [np.floor(c).astype(int) for c in df.columns]

df = df.drop(index=df.index[0])
df = df.drop(index='Antarctic Ice Shelves')
df = df[np.arange(2010,2020)].mean(axis='columns')
df.name = 'Mass'
df
Abbot        32.473268
Ainsworth     0.157966
Alison        2.985331
Amery        78.564587
Andreyev      2.207105
               ...    
Withrow       0.480019
Wordie        7.754308
Wylde         0.005026
Zelee          0.42351
Zubchatyy     0.469816
Name: Mass, Length: 162, dtype: object
<<load_davison_discharge>>
df = loc.join(df)

import geopandas as gpd
fname = '~/data/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp'
ew = gpd.read_file(fname)

df = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['longitude'],df['latitude']), crs="EPSG:4326")
df = df.drop(columns=['latitude','longitude'])
df = df.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(df['geometry'], return_all=False)
df['Region'] = ''
for dfidx,ewidx in idx.T:
    arr = df.iloc[dfidx].copy(deep=True)
    arr['Region'] = ew.iloc[ewidx]['Regions']
    df.iloc[dfidx] = arr
    
# df.loc['Total'] = [df['Mass'].sum(), None, 'All']

dd = df[['Mass','Region']].groupby('Region').sum()
dd.loc['Total'] = dd.sum()
dd
| Region | Mass |
|---|---|
| East | 923.794 |
| Islands | 1.28338 |
| Peninsula | 152.536 |
| West | 857.468 |
| Total | 1935.08 |
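As a check, the regional Davison discharge sums should reproduce the printed total:

```python
# Regional discharge-to-shelf sums from the table above, Gt/yr
regions = {'East': 923.794, 'Islands': 1.28338, 'Peninsula': 152.536, 'West': 857.468}
print(sum(regions.values()))  # ~1935.08, matches the Total row
```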

SMB (MAR)

g.mapset -c MAR

ncdump -v TIME dat/MARv3.12-ANT-35km-annual.nc4 # 30-39 = 2010-2019
ncra --overwrite -d TIME,30,39 dat/MARv3.12-ANT-35km-annual.nc4 tmp/MAR_AQ.nc

ncdump -v X tmp/MAR_AQ.nc # 176
ncdump -v Y tmp/MAR_AQ.nc # 148
g.region w=$(( -3010000 - 17500 )) e=$(( 3115000 + 17500 )) s=$(( -2555000 - 17500 )) n=$(( 2590000 + 17500 )) res=35000 -p

var=SF # debug
for var in SF RF RU ME SMB EVA CON DEP SUB MSK AREA; do
  r.in.gdal -o input=NetCDF:tmp/MAR_AQ.nc:${var} output=${var}
  r.region -c map=${var}
done

r.mapcalc "MASK = if(MSK > 50)" --o
r.mapcalc "scale_mask = MSK / 100" # if only X % of a cell is ice, scale by that.

# scale
## units are mm.w.eq. per grid cell. Grid cell areas are in km^2
## + mm.w.eq. -> m w.eq.: /1E3
## + m w.eq -> kg: *1E3
## + area in km^2 -> m^2: *1E3*1E3
## + kg -> Gt: /1E12
# ds = ds/1E3 * 1E3 * ds['AREA']*1E3*1E3 / 1E12
for var in SF RF RU ME SMB EVA CON DEP SUB; do
  r.mapcalc "${var} = (${var}/1000) * 1000 * (AREA * 1000*1000) * scale_mask / exp(10,12)"
done

r.mapcalc "RFZ = ME + RF - RU"

Stats

Main grounded ice (no shelf) SMB
r.mask --o raster=ground@PERMANENT --q
for var in SMB; do
  echo -n "${var}"
  r.univar -gt map=${var} zones=basins@PERMANENT | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
  r.univar -g ${var} | grep sum
  echo "#"; echo "#"
done
r.mask -r --q
SMB
East       1245.91548135353
West       670.071679116414
Peninsula  258.356884650752
Islands    57.2462599089497
sum=2231.59030502963
All locations and all terms
r.mask --o raster=basins@PERMANENT --q
for var in RF CON DEP SF RFZ EVA RU SUB; do
  echo -n "${var}"
  r.univar -gt map=${var} zones=basins@PERMANENT | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
  r.univar -g ${var} | grep sum
  echo "#"; echo "#"
done
r.mask -r --q
RF
East       1.55632116250339
West       0.808106668364101
Peninsula  4.09478244665841
Islands    0.138051253656
sum=6.59726153118188

CON
East       0.00661255569554064
West       0.00272055846697959
Peninsula  0.0419858971970306
Islands    0.0004622996319
sum=0.0517813109914508

DEP
East       41.665876584542
West       27.9828632785555
Peninsula  6.63393628684254
Islands    2.5341322486399
sum=78.8168083985797

SF
East       1533.3202293162
West       863.167047001661
Peninsula  322.457642541089
Islands    55.9839132874438
sum=2774.92883214639

RFZ
East       41.6358385500057
West       15.8325757324401
Peninsula  46.6152538919879
Islands    2.5265341720841
sum=106.610202346517

EVA
East       1.28114312886215
West       0.469596624150337
Peninsula  1.29873172532395
Islands    0.018721805005
sum=3.06819328334145

RU
East       2.98889276511441
West       0.05457606699
Peninsula  4.76209158620797
Islands    0.0014503067633
sum=7.80701072507568

SUB
East       171.750571828021
West       41.4080900874875
Peninsula  15.7626191697665
Islands    1.3719237929954
sum=230.293204878272

[Raster MASK present]

Basal melt

Van Liefferinge (2013) http://doi.org/10.5194/cp-9-2335-2013

Convert MAT file to XYZ for importing into GRASS

import scipy as sp
import numpy as np
import pandas as pd

mat = sp.io.loadmat('/home/kdm/data/Van_Liefferinge_2023/Melt_Mean_Std_15exp.mat')
X = mat['X'].flatten() * 1E3 # convert from km to m
Y = mat['Y'].flatten() * 1E3
m = mat['MeanMelt'].flatten() / 10 # cm to mm
e = mat['StdMelt'].flatten() / 10 # cm to mm

melt = pd.DataFrame(np.array([X,Y,m,e]).T, columns=['x','y','melt','err']).dropna()
melt.to_csv('./tmp/melt.csv', header=False, index=False)
melt.head()
|        | x         | y          | melt        | err         |
|--------|-----------|------------|-------------|-------------|
| 148741 | 1.045e+06 | -2.14e+06  | 1e-09       | 1.71243e-25 |
| 149859 | 1.03e+06  | -2.135e+06 | 0.00146608  | 0.000148305 |
| 149860 | 1.035e+06 | -2.135e+06 | 0.000266042 | 0.000389444 |
| 149861 | 1.04e+06  | -2.135e+06 | 1e-09       | 1.71243e-25 |
| 149862 | 1.045e+06 | -2.135e+06 | 0.00045698  | 0.000668948 |

grass ./G_AQ/PERMANENT
g.mapset -c liefferinge_2023
r.in.xyz input=./tmp/melt.csv output=melt sep=, --o
r.in.xyz input=./tmp/melt.csv output=err z=4 sep=, --o
echo "All: " $(r.univar -g map=melt | grep sum)
echo "All: " $(r.univar -g map=err | grep sum)
echo ""
r.univar -gt map=melt zones=basins | cut -d"|" -f2,13 | column -s"|" -t
echo ""
r.univar -gt map=err zones=basins | cut -d"|" -f2,13 | column -s"|" -t
All:  sum=69.3982306335468

All:  sum=20.0261054475124


label      sum
East       46.7540492694752
West       18.8528624157926
Peninsula  3.18704264192471


label      sum
East       14.784924204035
West       4.90322548927998
Peninsula  0.221183549670513

Relative uncertainty: 20 / 69 ≈ 29 %.

Antarctic Ice shelves

Calving: Greene 2022

import pandas as pd

fname = "/home/kdm/data/Greene_2022/data/greene_Supplementary_Table_1.xlsx"
df = pd.read_excel(fname, index_col=1, skiprows=4)

df = df.drop(index=df.index[0])
df = df.drop(index=['Antarctica'])

df = df[df.columns[[1,2,9]]]
df.columns = ['latitude','longitude','Mass']

import geopandas as gpd
fname = '~/data/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp'
ew = gpd.read_file(fname)

df = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['longitude'],df['latitude']), crs="EPSG:4326")
df = df.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(df['geometry'], return_all=False)
df['Region'] = ''
for dfidx,ewidx in idx.T:
    arr = df.iloc[dfidx].copy(deep=True)
    arr['Region'] = ew.iloc[ewidx]['Regions']
    df.iloc[dfidx] = arr
df = df.drop(columns=['latitude','longitude'])
    
# df.loc['Total'] = [df['Mass'].sum(), None, 'All']
dd = df[['Mass','Region']].groupby('Region').sum()
dd.loc['Total'] = dd.sum(axis='rows')
dd
| Region | Mass |
|---|---|
|  | 44.7053 |
| East | 694.1 |
| Islands | 1.51803 |
| Peninsula | 103.676 |
| West | 566.998 |
| Total | 1411 |

Frontal retreat and advance: Greene 2022

import pandas as pd
import numpy as np

fname = "/home/kdm/data/Greene_2022/data/greene_Supplementary_Table_1.xlsx"
df = pd.read_excel(fname, index_col=1, skiprows=4)

##############################################################################
###
### cleanup
###
df = df.drop(index=df.index[0])
df = df.drop(index=['Antarctica'])

lon = df['Unnamed: 3']
lat = df['Unnamed: 2']

for c in df.columns: # Drop extra columns
    if 'Unnamed' in str(c):
        df = df.drop(columns=c)
    if 'Gt/yr' in str(c):
        df = df.drop(columns=c)
    if ('control run' in str(c)) | ('instantaneous' in str(c)):
        df = df.drop(columns=c)
        
for c in df.columns:
    if type(c) == str: df = df.drop(columns=c)

# df = df.drop(columns=[2000.75])
# df = df.drop(columns=[1997.75])
# df.columns = df.columns.round().astype(int)    
##############################################################################
<<load_greene_2022_adv_ret>>

c = df.columns
diff = df.diff(axis='columns')[c]
diff_gain = diff[diff > 0].sum(axis='columns')
diff_loss = diff[diff < 0].sum(axis='columns')
diff_gain.name = 'Mass'
diff_loss.name = 'Mass'
df_gain = pd.DataFrame(diff_gain)
df_loss = pd.DataFrame(diff_loss)
df_net = df_loss + df_gain

print("Net:")
print('Mass gain', df_net[df_net > 0].sum(axis='rows').values)
print('Mass Loss', df_net[df_net < 0].sum(axis='rows').values)
print('Net mass change', df_net.sum(axis='rows').values)

dt = df.columns[-1] - df.columns[0]
print("\nPer year:")
print('Mass gain', df_net[df_net > 0].sum(axis='rows').values / dt)
print('Mass Loss', df_net[df_net < 0].sum(axis='rows').values / dt)
print('Net mass change', df_net.sum(axis='rows').values / dt)
Net:
Mass gain [4563.264309048649]
Mass Loss [-9434.897965610035]
Net mass change [-4871.633656561384]

Per year:
Mass gain [194.59549292318295]
Mass Loss [-402.3410646315572]
Net mass change [-207.74557170837417]
  • Most numbers here match what’s in the publication
  • Neither the total nor Ronne match.
    • Here, total is 4871 Gt net change.
    • Below, Ronne net loss is 1031
    • From the paper (paragraph under Fig. 2)
      • Total should be 5874 (missing 5874-4871 = 1003)
      • Ronne should be 2034 (missing 2034-1031 = 1003)
      • But Thwaites, Larsen C, and Ross West match paper, so it seems like I’m parsing the dataset correctly.
      • Filchner matches mass gain.

Find the top 10 shelves with net and gross mass gain and loss total (summed) over the period

tmp = pd.DataFrame(index=np.arange(10))

tmp['Net gain: Name'] = df_net.sort_values(by='Mass', ascending=False).head(10).index
tmp['Net gain: Mass'] = df_net.sort_values(by='Mass', ascending=False).head(10)['Mass'].values

tmp['Net loss: Name'] = df_net.sort_values(by='Mass', ascending=True).head(10).index
tmp['Net loss: Mass'] = df_net.sort_values(by='Mass', ascending=True).head(10)['Mass'].values

tmp['Gross gain: Name'] = df_gain.sort_values(by='Mass', ascending=False).head(10).index
tmp['Gross gain: Mass'] = df_gain.sort_values(by='Mass', ascending=False).head(10)['Mass'].values

tmp['Gross loss: Name'] = df_loss.sort_values(by='Mass', ascending=True).head(10).index
tmp['Gross loss: Mass'] = df_loss.sort_values(by='Mass', ascending=True).head(10)['Mass'].values

tmp
|   | Net gain: Name | Net gain: Mass | Net loss: Name | Net loss: Mass | Gross gain: Name | Gross gain: Mass | Gross loss: Name | Gross loss: Mass |
|---|---|---|---|---|---|---|---|---|
| 0 | Filchner | 1796.26 | Thwaites | -1968.41 | Ronne | 2762.7 | Ronne | -3794.31 |
| 1 | Amery | 569.883 | Larsen C | -1166.92 | Ross West | 1968.02 | Ross West | -2897.63 |
| 2 | Cook | 414.571 | Ronne | -1031.61 | Filchner | 1843.53 | Thwaites | -2248.52 |
| 3 | Shackleton | 369.773 | Ross West | -929.617 | Amery | 917.121 | Larsen C | -1619.1 |
| 4 | Brunt Stancomb | 362.787 | Wilkins | -622.156 | Ross East | 857.219 | Pine Island | -1231.67 |
| 5 | West | 278.333 | Larsen B | -530.038 | Pine Island | 780.117 | Ross East | -1135.23 |
| 6 | Jelbart | 169.419 | Pine Island | -451.554 | Brunt Stancomb | 493.663 | Ninnis | -675.229 |
| 7 | Fimbul | 148.221 | Mertz | -381.971 | Shackleton | 493.661 | Wilkins | -642.061 |
| 8 | Riiser-Larsen | 84.376 | Ninnis | -300.936 | Cook | 454.636 | Larsen B | -584.603 |
| 9 | Pourquoi Pas | 64.9883 | Larsen A | -286.377 | Larsen C | 452.177 | Mertz | -571.076 |

Now convert to Gt/year

for col in tmp.columns:
    if 'Mass' in col: tmp[col] = tmp[col] / c.size

tmp    
|   | Net gain: Name | Net gain: Mass | Net loss: Name | Net loss: Mass | Gross gain: Name | Gross gain: Mass | Gross loss: Name | Gross loss: Mass |
|---|---|---|---|---|---|---|---|---|
| 0 | Filchner | 74.8441 | Thwaites | -82.0169 | Ronne | 115.113 | Ronne | -158.096 |
| 1 | Amery | 23.7451 | Larsen C | -48.6219 | Ross West | 82.0007 | Ross West | -120.735 |
| 2 | Cook | 17.2738 | Ronne | -42.9837 | Filchner | 76.8138 | Thwaites | -93.6885 |
| 3 | Shackleton | 15.4072 | Ross West | -38.734 | Amery | 38.2134 | Larsen C | -67.4626 |
| 4 | Brunt Stancomb | 15.1161 | Wilkins | -25.9232 | Ross East | 35.7174 | Pine Island | -51.3196 |
| 5 | West | 11.5972 | Larsen B | -22.0849 | Pine Island | 32.5049 | Ross East | -47.3011 |
| 6 | Jelbart | 7.05912 | Pine Island | -18.8147 | Brunt Stancomb | 20.5693 | Ninnis | -28.1346 |
| 7 | Fimbul | 6.17586 | Mertz | -15.9155 | Shackleton | 20.5692 | Wilkins | -26.7525 |
| 8 | Riiser-Larsen | 3.51567 | Ninnis | -12.539 | Cook | 18.9432 | Larsen B | -24.3585 |
| 9 | Pourquoi Pas | 2.70785 | Larsen A | -11.9324 | Larsen C | 18.8407 | Mertz | -23.7948 |
<<green_2022_mean>> # provides df_net
# df = df_net
df['longitude'] = lon
df['latitude'] = lat

import geopandas as gpd
fname = '~/data/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp'
ew = gpd.read_file(fname)

df = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['longitude'],df['latitude']), crs="EPSG:4326")
df = df.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(df['geometry'], return_all=False)
df['Region'] = ''
for dfidx,ewidx in idx.T:
    arr = df.iloc[dfidx].copy(deep=True)
    arr['Region'] = ew.iloc[ewidx]['Regions']
    df.iloc[dfidx] = arr

df = df.drop(columns=['latitude','longitude','geometry'])
# df.loc['Total'] = [df['Mass'].sum(), None, 'All']

# df.groupby('Region').sum().round()

diff = df[c].diff(axis='columns')
diff_gain = diff[diff > 0].sum(axis='columns')
diff_loss = diff[diff < 0].sum(axis='columns')
diff_gain.name = 'Mass'
diff_loss.name = 'Mass'
df_gain = pd.DataFrame(diff_gain)
df_loss = pd.DataFrame(diff_loss)
df_net = df_loss + df_gain
df_gain['Region'] = df['Region']
df_loss['Region'] = df['Region']
df_net['Region'] = df['Region']
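The gross gain/loss split above works by summing positive and negative column-to-column differences separately, so that `gain + loss` reconciles with the net (last minus first column). A minimal toy check of that pattern:

```python
import pandas as pd

# Toy mass time series: two shelves (rows), three epochs (columns).
df = pd.DataFrame({'t0': [10.0, 5.0], 't1': [12.0, 3.0], 't2': [11.0, 4.0]})

diff = df.diff(axis='columns')             # change between consecutive epochs
gain = diff[diff > 0].sum(axis='columns')  # gross gain per row
loss = diff[diff < 0].sum(axis='columns')  # gross loss per row
net = gain + loss                          # equals t2 - t0 per row
```

Row 0 steps +2 then -1, so gross gain 2, gross loss -1, net +1; summing masked diffs skips the NaN in the first column automatically.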
Net:
Mass gain [4563.264309048649]
Mass Loss [-9434.897965610035]
Net mass change [-4871.633656561384]

Per year:
Mass gain [194.59549292318295]
Mass Loss [-402.3410646315572]
Net mass change [-207.74557170837417]
for loc in ['East','West','Peninsula']:
    print("\n", loc)
    sub = (df_net['Mass'] > 0) & (df_net['Region'] == loc)
    print('Mass gain', df_net[sub].drop(columns='Region').sum().values/dt)
    sub = (df_net['Mass'] < 0) & (df_net['Region'] == loc)
    print('Mass loss', df_net[sub].drop(columns='Region').sum().values/dt)
 East
Mass gain [192.48919217062863]
Mass loss [-69.4663382466161]

 West
Mass gain [1.923849262408337]
Mass loss [-206.1686424710849]

 Peninsula
Mass gain [0.17175689689132026]
Mass loss [-124.70409821345615]

Submarine melt

import pandas as pd

fname = '~/data/Davison_2023/adi0186_table_s2.xlsx'

loc = pd.read_excel(fname, sheet_name='Total mass changes', index_col = 0, usecols = 'B,C,D', skiprows = 4)
loc = loc.drop('Antarctic Ice Shelves')

df = pd.read_excel(fname, sheet_name='Steady-state',
                   index_col = 0, skiprows = 4, usecols=((1,4)))

df.columns = ['Mass']

df = loc.join(df)

import geopandas as gpd
fname = '~/data/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp'
ew = gpd.read_file(fname)

df = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['longitude'],df['latitude']), crs="EPSG:4326")
df = df.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(df['geometry'], return_all=False)
df['Region'] = ''
for dfidx,ewidx in idx.T:
    arr = df.iloc[dfidx].copy(deep=True)
    arr['Region'] = ew.iloc[ewidx]['Regions']
    df.iloc[dfidx] = arr
df = df.drop(columns=['latitude','longitude'])
    
df.loc['Total'] = [df['Mass'].sum(), None, 'All']

df[['Mass','Region']].groupby('Region').sum().drop('Islands').round()
/tmp/ipykernel_2536812/879752930.py:31: FutureWarning: The behavior of DataFrame concatenation with empty or all-NA entries is deprecated. In a future version, this will no longer exclude empty or all-NA columns when determining the result dtypes. To retain the old behavior, exclude the relevant entries before the concat operation.
  df.loc['Total'] = [df['Mass'].sum(), None, 'All']
| Region    | Mass    |
|-----------|---------|
| All       | 902.775 |
| East      | 392.012 |
| Peninsula | 101.994 |
| West      | 408.457 |

Calving

Same as above but reading a different sheet. This block reuses variables (`loc`, `ew`, the imports) from the previous one, so run that first.

fname = '~/data/Davison_2023/adi0186_table_s2.xlsx'

df = pd.read_excel(fname, sheet_name='Steady-state',
                   index_col = 0, skiprows = 4, usecols=((1,6)))

df.columns = ['Mass']

df = loc.join(df)

df = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['longitude'],df['latitude']), crs="EPSG:4326")
df = df.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(df['geometry'], return_all=False)
df['Region'] = ''
for dfidx,ewidx in idx.T:
    arr = df.iloc[dfidx].copy(deep=True)
    arr['Region'] = ew.iloc[ewidx]['Regions']
    df.iloc[dfidx] = arr
df = df.drop(columns=['latitude','longitude'])
    
df.loc['Total'] = [df['Mass'].sum(), None, 'All']

df[['Mass','Region']].groupby('Region').sum().drop('Islands').round()
/tmp/ipykernel_2536812/353247760.py:22: FutureWarning: The behavior of DataFrame concatenation with empty or all-NA entries is deprecated. In a future version, this will no longer exclude empty or all-NA columns when determining the result dtypes. To retain the old behavior, exclude the relevant entries before the concat operation.
  df.loc['Total'] = [df['Mass'].sum(), None, 'All']
| Region    | Mass    |
|-----------|---------|
| All       | 1348.02 |
| East      | 681.734 |
| Peninsula | 103.439 |
| West      | 561.832 |

Frontal Retreat

[greene_Supplementary_Table_1.xlsx](https://github.com/user-attachments/files/15598602/greene_Supplementary_Table_1.xlsx)

I think the data in the attached spreadsheet from [Greene et al., 2022 ](https://doi.org/10.1038/s41586-022-05037-w) is everything needed for ice-shelf mass-change resulting from frontal advance/retreat, so in Excel `=BI189-O189` gives Antarctica’s net retreat from 1997 to 2021. Change the column to adjust the time period.

Column BI (year 2021.2): BI189 = 24596304.0. Column Q (year 2000.2): Q189 = 24597630.0.

(24596304.0 - 24597630.0) / (2021.2-2000.2) = -63.1428571429
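That spreadsheet arithmetic can be checked in a couple of lines (cell values copied from above; 2021.2 and 2000.2 are the decimal-year column labels):

```python
# Net change per year between the 2000.2 and 2021.2 columns of the
# Greene et al. (2022) supplementary table, Antarctica row.
m_2021, m_2000 = 24596304.0, 24597630.0
rate = (m_2021 - m_2000) / (2021.2 - 2000.2)
print(round(rate, 4))
```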

But we need to recreate this in code so we can split by east/west/peninsula

import pandas as pd
import geopandas as gpd
fname = "~/data/Greene_2022/data/greene_Supplementary_Table_1.xlsx"

df = pd.read_excel(fname, sheet_name='greene_iceshelf_area_and_mass',
                    index_col = 1, skiprows = 4)
df = df.rename(columns={'Unnamed: 2':'lat',
                        'Unnamed: 3':'lon'})

# drop uncertainty columns
unc = [c for c in df.columns
       if isinstance(c, str) and c.startswith('Unnamed:')]
df = df.drop(columns=unc)
df = df[['lat','lon',2010.2,2021.2]]
df = df.iloc[1:]

# Remove last two rows
aq = df.loc['Antarctica']
other = df.loc['Other']
df = df.iloc[:-2]
print(df.sum())
print("")
print(aq)
print("")
print(other)
lat       -12882.373098
lon         6279.268331
2010.2    679991.871066
2021.2    681213.775349
dtype: object

lat            -90
lon          every
2010.2    24595086
2021.2    24596304
Name: Antarctica, dtype: object

lat            NaN
lon            NaN
2010.2    23915094
2021.2    23915090
Name: Other, dtype: object
shelf = df.sum()
print("All AQ loss: ", (aq[2021.2] - aq[2010.2]) / (2021-2010))
print("Named shelf loss: ", (shelf[2021.2] - shelf[2010.2]) / (2021-2010))
print("Other loss: ", (other[2021.2] - other[2010.2]) / (2021-2010))
print("Named + Other: ", (((other + shelf)[2021.2] - (other + shelf)[2010.2]) / (2021-2010)))
print("Named %: ", 2.19/63.02*100)
All AQ loss:  110.72727272727273
Named shelf loss:  111.08220751989971
Other loss:  -0.36363636363636365
Named + Other:  110.71857115626335
Named %:  3.4750872738813077
import geopandas as gpd
fname = '~/data/NSIDC/NSIDC-0709.002/1992.02.07/IceBoundaries_Antarctica_v02.shp'
ew = gpd.read_file(fname)
ew.drop(columns=['geometry']).head()
|   | NAME           | Regions   | Subregions | TYPE | Asso_Shelf     |
|---|----------------|-----------|------------|------|----------------|
| 0 | LarsenE        | Peninsula | Ipp-J      | GR   | LarsenE        |
| 1 | Dawson_Lambton | East      | nan        | FL   | nan            |
| 2 | Academy        | East      | Jpp-K      | GR   | Filchner       |
| 3 | Brunt_Stancomb | East      | K-A        | GR   | Brunt_Stancomb |
| 4 | Riiser-Larsen  | East      | K-A        | GR   | Riiser-Larsen  |
gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df['lon'],df['lat']), crs="EPSG:4326")

gdf = gdf.to_crs('epsg:3031')
ew = ew.to_crs('epsg:3031')

idx = ew.sindex.nearest(gdf['geometry'], return_all=False)
gdf['Region'] = ''
for gdfidx,ewidx in idx.T:
     arr = gdf.iloc[gdfidx].copy(deep=True)
     arr['Region'] = ew.iloc[ewidx]['Regions']
     gdf.iloc[gdfidx] = arr

gdf.head()

gdf.loc['Total'] = gdf.drop(columns='geometry').sum(axis='rows')
gdf.loc['Total', 'Region'] = 'All'

gdf['frontal change'] = (gdf[2021.2] - gdf[2010.2]) / (2021.2-2010.2)
pos = gdf[gdf['frontal change'] > 0]
neg = gdf[gdf['frontal change'] <= 0]
# gdf

print('neg', neg[['Region','frontal change']].groupby('Region').sum().round().abs())
print('')
print('pos', pos[['Region','frontal change']].groupby('Region').sum().round().abs())
print('')
print('all', gdf[['Region','frontal change']].groupby('Region').sum().round().abs())
neg           frontal change
Region                  
East           51.964246
Islands         0.134919
Peninsula     115.849899
West          222.842824

pos           frontal change
Region                  
All           111.082208
East          249.493215
Islands         0.003094
Peninsula       5.055953
West          247.321834

all           frontal change
Region                  
All           111.082208
East          197.528968
Islands         0.131825
Peninsula     110.793946
West            24.47901
/tmp/ipykernel_2536812/507091481.py:15: FutureWarning: The behavior of DataFrame concatenation with empty or all-NA entries is deprecated. In a future version, this will no longer exclude empty or all-NA columns when determining the result dtypes. To retain the old behavior, exclude the relevant entries before the concat operation.
  gdf.loc['Total'] = gdf.drop(columns='geometry').sum(axis='rows')

GZ retreat

Email from Davison

| Ice Shelf   | Mass change due to grounding line migration, 1997 to 2021 (Gt) | Error (Gt) |
|-------------|----------------------------------------------------------------|------------|
| Pine Island | 220                                                            | 40         |
| Thwaites    | 230                                                            | 25         |
| Crosson     | 200                                                            | 25         |
| Dotson      | 420                                                            | 80         |

(220+230+200+420)/(2021-1997) = 44.5833333333
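For the regional bookkeeping it is handy to have the same average in code (values in Gt over 1997 to 2021, from the emailed table):

```python
# Mass change due to grounding line migration, per shelf, 1997-2021 (Gt).
gz = {'Pine Island': 220, 'Thwaites': 230, 'Crosson': 200, 'Dotson': 420}
per_year = sum(gz.values()) / (2021 - 1997)  # Gt/yr
print(round(per_year, 2))
```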

Shelf freeze/melt

import xarray as xr
ds = xr.open_mfdataset("~/data/Paolo_2023/ANT_G1920V01_IceShelfMelt.nc")
ds = ds[['melt','melt_err']].sel({'time':slice('2010-01-01','2017-12-31')}).mean(dim='time')

delayed_obj = ds.to_netcdf('tmp/shelf_melt.nc', compute=False)
from dask.diagnostics import ProgressBar
with ProgressBar():
    results = delayed_obj.compute()

print(ds)
[########################################] | 100% Completed | 5.55 s
<xarray.Dataset> Size: 68MB
Dimensions:   (y: 2916, x: 2916)
Coordinates:
  * x         (x) float64 23kB -2.798e+06 -2.796e+06 ... 2.796e+06 2.798e+06
  * y         (y) float64 23kB 2.798e+06 2.796e+06 ... -2.796e+06 -2.798e+06
Data variables:
    melt      (y, x) float32 34MB dask.array<chunksize=(486, 486), meta=np.ndarray>
    melt_err  (y, x) float32 34MB dask.array<chunksize=(486, 486), meta=np.ndarray>
g.mapset -c Paolo_2023

ncdump -v x tmp/shelf_melt.nc # 2916x2916
ncdump -v y tmp/shelf_melt.nc

x0=-2798407.5
x1=2798392.5
y0=-2798392.5
y1=2798407.5

g.region w=$(( -2798407 - 960 )) e=$(( 2798392 + 960 )) s=$(( -2798392 - 960 )) n=$(( 2798407 + 960 )) res=1920 -p
r.mapcalc "area = area()"

r.in.gdal -o input=NetCDF:tmp/shelf_melt.nc:melt output=melt
r.in.gdal -o input=NetCDF:tmp/shelf_melt.nc:melt_err output=err
r.region -c map=melt
r.region -c map=err

## melt [m/yr] * 1000 [kg/m^3] * area [m^2] -> kg/yr; / 1E12 kg/Gt -> Gt/yr
r.mapcalc "melt = melt * 1000 * area / exp(10,12)" --o
r.mapcalc "err = err * 1000 * area / exp(10,12)" --o

r.mapcalc "melt_on = if(melt > 0, melt, null())"
r.mapcalc "err_on = if(melt > 0, err, null())"
r.mapcalc "melt_off = if(melt < 0, melt, null())"
r.mapcalc "err_off = if(melt < 0, err, null())"

r.colors -ae map=melt color=difference
r.colors -ge map=melt_on color=viridis
r.colors -ge map=melt_off color=viridis

# d.rast melt
# d.rast melt_on
# d.rast melt_off

r.mapcalc "basins = if((basins@PERMANENT == 1) | (basins@PERMANENT == 11), 1, 0)"
r.mapcalc "basins = if((basins@PERMANENT == 2) | (basins@PERMANENT == 12), 2, basins)"
r.mapcalc "basins = if((basins@PERMANENT == 3) | (basins@PERMANENT == 13), 3, basins)"
r.colors map=basins color=viridis
r.category basins separator=":" rules=- << EOF
1:East
2:West
3:Peninsula
EOF
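The unit conversion in the `r.mapcalc` expressions above assumes the melt field is a water-equivalent rate (the x1000 density factor implies this); the factor chain (x 1000 kg m^-3, x cell area, / 10^12 kg per Gt) can be sketched as:

```python
def melt_to_gt_per_yr(melt_m_we: float, cell_area_m2: float) -> float:
    """Melt rate in m w.e./yr over one cell -> Gt/yr (1 Gt = 1e12 kg)."""
    kg_per_m2_yr = melt_m_we * 1000.0          # water density: 1000 kg/m^3
    return kg_per_m2_yr * cell_area_m2 / 1e12  # kg/yr -> Gt/yr

# One 1920 m grid cell melting 1 m w.e./yr:
print(melt_to_gt_per_yr(1.0, 1920.0**2))
```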

Stats

echo "NET"
r.univar -gt map=melt zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -gt map=err zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -g melt | grep sum

echo ""
echo "FREEZE_ON"
r.univar -gt map=melt_on zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -gt map=err_on zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -g melt_on | grep sum

echo ""
echo "MELT_OFF"
r.univar -gt map=melt_off zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -gt map=err_off zones=basins | cut -d"|" -f2,13 | column -s"|" -t | sed 's/label.*//'
r.univar -g melt_off | grep sum
NET
East       -327.749700568986
West       -476.184170767834
Peninsula  -140.389218858182
East       1261.7643241234
West       1501.03281071649
Peninsula  225.909323225931
sum=-977.26950720935

FREEZE_ON
East       204.212024518552
West       174.058868541697
Peninsula  16.7004224367821
East       449.068552754539
West       579.006947989805
Peninsula  50.2702348521852
sum=404.684807702491

MELT_OFF
East       -531.961725087539
West       -650.243039309524
Peninsula  -157.089641294966
East       812.695771368885
West       922.025862726728
Peninsula  175.639088373746
sum=-1381.95431491188
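The `cut -d"|" -f2,13` in the pipelines above pulls the zone label and `sum` fields from the pipe-separated `r.univar -gt` table. The same parse could be sketched in Python; the sample string and column layout here are assumptions, with sums taken from the NET table above:

```python
# Hypothetical sample of `r.univar -gt map=melt zones=basins` output:
# field 2 is the zone label, field 13 the zonal sum (1-based, as in cut).
sample = """zone|label|non_null_cells|null_cells|min|max|range|mean|mean_of_abs|stddev|variance|coeff_var|sum|sum_abs
1|East|1|0|0|0|0|0|0|0|0|0|-327.749700568986|327.749700568986
2|West|1|0|0|0|0|0|0|0|0|0|-476.184170767834|476.184170767834
3|Peninsula|1|0|0|0|0|0|0|0|0|0|-140.389218858182|140.389218858182"""

rows = [line.split('|') for line in sample.splitlines()[1:]]
sums = {r[1]: float(r[12]) for r in rows}  # label -> zonal sum (Gt/yr)
print(sums)
```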

GRACE

import xarray as xr
ds = xr.open_dataset("~/data/Groh_2021/AIS_GMB_grid.nc")
ds['dm'] = ds['dm'] * ds['area']
ds = ds['dm'].sum(dim=['x','y'])/1E12
df = ds.to_dataframe()

# df = df.loc['2002-01-01':'2024-12-31']
df = df - df.max()
_ = df.plot()
# df = df.resample('D').mean().interpolate().resample('YS').mean()
df = df.resample('YS').mean()
df = df.diff()
print(df.mean())
dm   -79.146737
dtype: float64

./figs_tmp/0f945242fb8fcfd039ef097240df737ac32412ca.png
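The annual-rate estimate above (yearly means of the cumulative anomaly, then first differences) can be checked on synthetic data with a known trend:

```python
import pandas as pd

# Synthetic cumulative mass anomaly losing 80 Gt/yr, sampled monthly.
t = pd.date_range('2005-01-01', '2014-12-01', freq='MS')
dm = pd.Series(-80.0 * (t - t[0]).days / 365.25, index=t)

# Annual means, then year-to-year differences, averaged: the trend in Gt/yr.
rate = dm.resample('YS').mean().diff().mean()
print(round(rate, 1))
```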

Misc

Export tables to CSVs

(save-excursion
  (goto-char (point-min))
  (re-search-forward (concat "^#\\+name: " tbl) nil t)
  (next-line)
  (org-table-export (concat "./dat/" tbl ".csv") "orgtbl-to-csv")
  ;;(shell-command-to-string (concat "head " tbl ".csv"))
  (message (concat "Exported " tbl " to " (concat "./dat/" tbl ".csv")))
  )

Convert PDFs to PNG
