What happened?
I noticed that for some large values (not really that large) and lots of samples, `data.std()` yields different values than `np.std(data)`. This seems to be related to the magnitude. See the attached code here, and the results are:

So I guess this is related to the magnitude, but I am not sure. Has anyone seen a similar issue?
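The attached code is not included above, so as a hedged, hypothetical reconstruction of the effect (not the original script): the sketch below shows how a naive one-pass variance formula in float32 loses the signal entirely once the values carry a large constant offset, while NumPy's two-pass `np.std` (subtract the mean first, then average squared deviations) stays accurate. This is one plausible mechanism for the discrepancy — note that bottleneck 1.3.5 is installed in the environment below, and xarray dispatches reductions to bottleneck when it is available, so `data.std()` and `np.std(data)` may not run the same code path.

```python
import numpy as np

# Hypothetical data (the original attached code is not shown):
# three float32 samples shifted by a large constant.
base = np.array([0.0, 1.0, 2.0], dtype=np.float32)
shifted = base + np.float32(1e5)

# NumPy's std subtracts the mean first (two-pass), so it stays accurate:
print(np.std(shifted))  # ~0.8165, same as np.std(base)

# A naive one-pass formula, E[x^2] - E[x]^2, loses all precision in
# float32 once the values are large: the true variance (~0.67) is far
# below the float32 rounding error of values near 1e10.
mean_sq = np.mean(shifted**2, dtype=np.float32)
sq_mean = np.mean(shifted, dtype=np.float32) ** 2
naive = np.sqrt(max(mean_sq - sq_mean, np.float32(0.0)))
print(naive)  # far from 0.8165
```

The same data without the offset gives matching results from both formulas; only the magnitude of the offset breaks the one-pass version.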
What did you expect to happen?
Adding or subtracting a constant should not change the standard deviation.
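Mathematically that invariance holds; numerically it only holds if the algorithm is stable under large offsets. A single-pass method with that property is Welford's algorithm — a minimal sketch for illustration (not xarray's or bottleneck's actual implementation):

```python
import numpy as np

def welford_std(values):
    """Single-pass, numerically stable standard deviation (Welford's algorithm)."""
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    n = 0
    for x in values:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return (m2 / n) ** 0.5  # population std, matching np.std's default ddof=0

# A large constant offset does not disturb the result:
data = np.array([0.0, 1.0, 2.0]) + 1e8
print(welford_std(data))  # matches np.std(data), ~0.8165
```

Because each update works with the deviation from the running mean rather than with raw squared values, the large offset cancels at every step instead of accumulating into the sums.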
See the screenshot here for what the data look like:
Minimal Complete Verifiable Example
No response
MVCE confirmation
Relevant log output
No response
Anything else we need to know?
No response
Environment
xarray: 2022.6.0
pandas: 1.4.4
numpy: 1.22.3
scipy: 1.8.1
netCDF4: 1.6.1
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.2
nc_time_axis: 1.4.1
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.5
dask: 2022.9.0
distributed: 2022.9.0
matplotlib: 3.5.2
cartopy: 0.21.0
seaborn: None
numbagg: None
fsspec: 2022.10.0
cupy: None
pint: None
sparse: 0.13.0
flox: None
numpy_groupies: None
setuptools: 65.5.0
pip: 22.2.2
conda: None
pytest: None
IPython: 8.6.0
sphinx: None