satpy.utils module

Module defining various utilities.

exception satpy.utils.PerformanceWarning[source]

Bases: Warning

Warning raised when there is a possible performance impact.

class satpy.utils._WarningManager[source]

Bases: object

Class to handle switching warnings on and off.

filt = None
satpy.utils._all_dims_same_size(data_arrays: tuple[DataArray, ...]) bool[source]
satpy.utils._check_file_protocols(filenames, storage_options)[source]
satpy.utils._check_file_protocols_for_dicts(filenames, storage_options)[source]

satpy.utils._check_import(module_names)[source]

Import the specified modules and provide status.

satpy.utils._check_yaml_configs(configs, key)[source]

Get a diagnostic for the yaml configs.

key is the section to look for to get a name for the config at hand.

satpy.utils._filenames_to_fsfile(filenames, storage_options)[source]

Compute the maximum chunk size from PYTROLL_CHUNK_SIZE.

satpy.utils._get_first_available_item(data_dict, possible_keys)[source]
satpy.utils._get_prefix_order_by_preference(prefixes, preference)[source]
satpy.utils._get_sat_altitude(data_arr, key_prefixes)[source]
satpy.utils._get_sat_lonlat(data_arr, key_prefixes)[source]

satpy.utils._get_satpos_from_platform_name(cth_dataset)[source]

Get satellite position if no orbital parameters in metadata.

Some cloud top height datasets lack orbital parameter information in metadata. Here, orbital parameters are calculated based on the platform name and start time, via Two Line Element (TLE) information.

Needs pyorbital, skyfield, and astropy to be installed.

satpy.utils.angle2xyz(azi, zen)[source]

Convert azimuth and zenith to cartesian.
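The conversion is the standard spherical-to-cartesian mapping on the unit sphere. A minimal scalar sketch (the real function operates elementwise on arrays; the axis convention used here is an assumption for illustration, not taken from the source):

```python
import math

def angle2xyz(azi_deg, zen_deg):
    # Convert azimuth/zenith angles in degrees to unit-sphere cartesian.
    # Axis convention (an assumption): azimuth measured from the +y axis
    # toward +x, zenith measured down from the vertical +z axis.
    azi = math.radians(azi_deg)
    zen = math.radians(zen_deg)
    x = math.sin(zen) * math.sin(azi)
    y = math.sin(zen) * math.cos(azi)
    z = math.cos(zen)
    return x, y, z
```

With this convention, a zenith angle of 0 maps to the vertical (0, 0, 1), and azimuth 90°, zenith 90° maps to (1, 0, 0).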

satpy.utils.atmospheric_path_length_correction(data, cos_zen, limit=88.0, max_sza=95.0)[source]

Perform Sun zenith angle correction.

This function uses the correction method proposed by Li and Shibata (2006).

The correction is limited to limit degrees (default: 88.0 degrees). For larger zenith angles, the correction is the same as at the limit if max_sza is None. The default behavior is to gradually reduce the correction past limit degrees up to max_sza where the correction becomes 0. Both data and cos_zen should be 2D arrays of the same shape.
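The Li and Shibata (2006) parameterization multiplies the data by a path-length factor that grows as the Sun approaches the horizon. A sketch of that factor (the constants are from the published parameterization and are an assumption about this function's internals, not confirmed from the source):

```python
import math

def path_length_factor(cos_zen):
    # Li and Shibata (2006) atmospheric path length parameterization.
    # Returns the multiplicative correction factor for a given cosine of
    # the Sun zenith angle; equals 1.0 exactly at zenith (cos_zen == 1).
    return 24.35 / (2.0 * cos_zen + math.sqrt(498.5225 * cos_zen ** 2 + 1.0))
```

At the sub-solar point the factor is 1 (no correction); at a zenith angle of 80° it is already well above 1, which is why the function caps and tapers the correction beyond the configured limit.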

satpy.utils.check_satpy(readers=None, writers=None, extras=None)[source]

Check the satpy readers and writers for correct installation.

  • readers (list or None) – Limit readers checked to those specified

  • writers (list or None) – Limit writers checked to those specified

  • extras (list or None) – Limit extras checked to those specified

Returns: bool

True if all specified features were successfully loaded.

satpy.utils.convert_remote_files_to_fsspec(filenames, storage_options=None)[source]

Check filenames for transfer protocols, convert to FSFile objects if possible.


satpy.utils.debug(deprecation_warnings=True)[source]

Context manager to temporarily set debugging on.


>>> with satpy.utils.debug():
...     code_here()

deprecation_warnings (Optional[bool]) – Switch on deprecation warnings. Defaults to True.


satpy.utils.debug_off()[source]

Turn debugging logging off.

This disables both debugging logging and the global visibility of deprecation warnings.


satpy.utils.debug_on(deprecation_warnings=True)[source]

Turn debugging logging on.

Sets up a StreamHandler to sys.stderr at debug level for all loggers, so that all debug messages (and log messages with higher severity) are logged to the standard error stream.

By default, since Satpy 0.26, this also enables the global visibility of deprecation warnings. This can be suppressed by passing a false value.


deprecation_warnings (Optional[bool]) – Switch on deprecation warnings. Defaults to True.




satpy.utils.deprecation_warnings_off()[source]

Switch off deprecation warnings.


satpy.utils.deprecation_warnings_on()[source]

Switch on deprecation warnings.

satpy.utils.find_in_ancillary(data, dataset)[source]

Find a dataset by name in the ancillary vars of another dataset.

  • data (xarray.DataArray) – Array for which to search the ancillary variables

  • dataset (str) – Name of ancillary variable to look for.
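The lookup can be illustrated with plain dicts standing in for DataArray objects (a hypothetical simplification of the real attrs handling; the error behavior shown is an assumption, not confirmed from the source):

```python
def find_in_ancillary(data_attrs, name):
    # data_attrs stands in for DataArray.attrs; each ancillary variable
    # is represented here by a dict with a "name" key.
    candidates = [
        av for av in data_attrs.get("ancillary_variables", [])
        if av.get("name") == name
    ]
    if not candidates:
        raise ValueError(f"Could not find ancillary variable {name!r}")
    if len(candidates) > 1:
        raise ValueError(f"Multiple ancillary variables named {name!r}")
    return candidates[0]
```

For example, a reflectance array carrying a quality-flags array in its `ancillary_variables` list can be queried by the flag dataset's name.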

satpy.utils.get_chunk_size_limit(dtype=<class 'float'>)[source]

Compute the chunk size limit in bytes given dtype (float by default).

It is derived first from PYTROLL_CHUNK_SIZE if defined (although deprecated), then from dask's array.chunk-size configuration. It defaults to 128 MiB.


Returns: The recommended chunk size in bytes.
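A byte limit translates into a per-dimension chunk size via the dtype's item size. A sketch of that arithmetic for square 2D chunks (a hypothetical helper, not the function's actual implementation):

```python
import math

def square_chunk_side(limit_bytes=128 * 1024 ** 2, itemsize=8):
    # Side length of the largest square 2D chunk that fits in limit_bytes.
    # itemsize is the dtype's size in bytes (8 for float64, 4 for float32).
    max_elements = limit_bytes // itemsize
    return int(math.sqrt(max_elements))
```

With the 128 MiB default and float64 data, this gives a 4096 x 4096 chunk; halving the item size to float32 allows proportionally larger chunks.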


Get the dask configured chunk size in bytes.


satpy.utils.get_legacy_chunk_size()[source]

Get the legacy chunk size.

This function should only be used while waiting for code to be migrated to use satpy.utils.get_chunk_size_limit instead.


satpy.utils.get_logger(name)[source]

Return logger with null handler added if needed.

satpy.utils.get_satpos(data_arr: DataArray, preference: str | None = None, use_tle: bool = False) tuple[float, float, float][source]

Get satellite position from dataset attributes.

  • data_arr – DataArray object to access .attrs metadata from.

  • preference

    Optional preference for one of the available types of position information. If not provided or None then the default preference is:

    • Longitude & Latitude: nadir, actual, nominal, projection

    • Altitude: actual, nominal, projection

    The provided preference can be any one of these individual strings (nadir, actual, nominal, projection). If the preference is not available then the original preference list is used. A warning is issued when projection values have to be used because nothing else is available and it wasn’t provided as the preference.

  • use_tle – If true, try to obtain position via satellite name and TLE if it can’t be determined otherwise. This requires pyorbital, skyfield, and astropy to be installed and may need network access to obtain the TLE. Note that even if use_tle is true, the TLE will not be used if the dataset metadata contain the satellite position directly.


Returns: Geodetic longitude, latitude, altitude [km]
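The documented preference order can be illustrated with a plain-dict lookup (a hypothetical simplification: the key naming scheme below is uniform for clarity and is an assumption, not the exact `orbital_parameters` key set):

```python
PREFIX_ORDER = ("nadir", "actual", "nominal", "projection")

def lookup_lonlat(orbital_parameters, preference=None):
    # Pick sub-satellite longitude/latitude by preference order.
    order = PREFIX_ORDER
    if preference in PREFIX_ORDER:
        # Preferred prefix first, then the default order as fallback.
        order = (preference,) + tuple(p for p in PREFIX_ORDER if p != preference)
    for prefix in order:
        lon_key = f"satellite_{prefix}_longitude"
        if lon_key in orbital_parameters:
            return (orbital_parameters[lon_key],
                    orbital_parameters[f"satellite_{prefix}_latitude"])
    raise KeyError("no satellite position in orbital_parameters")
```

If only "actual" and "nominal" positions are present, the default order selects "actual", while `preference="nominal"` overrides that choice.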


satpy.utils.get_storage_options_from_reader_kwargs(reader_kwargs)[source]

Read and clean storage options from reader_kwargs.


satpy.utils.ignore_invalid_float_warnings()[source]

Ignore warnings generated for working with NaN/inf values.

Numpy and dask sometimes don’t like NaN or inf values in normal function calls. This context manager hides/ignores them inside its context.


Use around numpy operations that you expect to produce warnings:

>>> with satpy.utils.ignore_invalid_float_warnings():
...     code_here()

satpy.utils.ignore_pyproj_proj_warnings()[source]

Wrap operations that we know will produce a PROJ.4 precision warning.

Only to be used internally to Pyresample when we have no other choice but to use PROJ.4 strings/dicts. For example, serialization to YAML or other human-readable formats or testing the methods that produce the PROJ.4 versions of the CRS.


satpy.utils.import_error_helper(module_name)[source]

Give more info on an import error.


satpy.utils.in_ipynb()[source]

Check if we are in a jupyter notebook.


satpy.utils.logging_off()[source]

Turn logging off.


satpy.utils.logging_on()[source]

Turn logging on.

satpy.utils.lonlat2xyz(lon, lat)[source]

Convert lon lat to cartesian.

For a sphere with unit radius, convert the spherical coordinates longitude and latitude to cartesian coordinates.

  • lon (number or array of numbers) – Longitude in °.

  • lat (number or array of numbers) – Latitude in °.


Returns: (x, y, z) Cartesian coordinates [1]
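The conversion is the usual spherical-to-cartesian mapping on the unit sphere. A minimal scalar sketch (the real function operates elementwise on numbers or arrays):

```python
import math

def lonlat2xyz(lon_deg, lat_deg):
    # Longitude/latitude in degrees to cartesian coordinates on the
    # unit sphere: x toward (0, 0), y toward (90, 0), z toward the pole.
    lon = math.radians(lon_deg)
    lat = math.radians(lat_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))
```

The point (lon=0, lat=0) maps to (1, 0, 0), and the north pole (lat=90) maps to (0, 0, 1).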

satpy.utils.normalize_low_res_chunks(chunks: tuple[int | Literal['auto'], ...], input_shape: tuple[int, ...], previous_chunks: tuple[int, ...], low_res_multipliers: tuple[int, ...], input_dtype: dtype[Any] | None | type[Any] | _SupportsDType[dtype[Any]] | str | tuple[Any, int] | tuple[Any, SupportsIndex | Sequence[SupportsIndex]] | list[Any] | _DTypeDict | tuple[Any, Any]) tuple[int, ...][source]

Compute dask chunk sizes based on data resolution.

First, chunks are computed for the highest resolution version of the data. This is done by multiplying the input array shape by the low_res_multiplier and then using Dask’s utility functions and configuration to produce a chunk size to fit into a specific number of bytes. See Chunks for more information. Next, the same multiplier is used to reduce the high resolution chunk sizes to the lower resolution of the input data. The end result of reading multiple resolutions of data is that each dask chunk covers the same geographic region. This also means replicating or aggregating one resolution and then combining arrays should not require any rechunking.

  • chunks – Requested chunk size for each dimension. This is passed directly to dask. Use "auto" for dimensions that should have chunks determined for them, -1 for dimensions that should be whole (not chunked), and 1 or any other positive integer for dimensions that have a known chunk size beforehand.

  • input_shape – Shape of the array to compute dask chunk size for.

  • previous_chunks – Any previous chunking or structure of the data. This can also be thought of as the smallest number of high (fine) resolution elements that make up a single “unit” or chunk of data. This could be a multiple or factor of the scan size for some instruments and/or could be based on the on-disk chunk size. This value ensures that chunks are aligned to the underlying data structure for best performance. On-disk chunk sizes should be multiplied by the largest low resolution multiplier if it is the same between all files (e.g. the 500m file has a chunk size of 226, the 1km file has a chunk size of 226, etc.). Otherwise, the resulting low resolution chunks may not be aligned to the on-disk chunks. For example, if dask decides on a chunk size of 226 * 3 for the 500m data, that becomes 226 * 3 / 2 for the 1km data, which is not aligned to the on-disk chunk size of 226.

  • low_res_multipliers – Number of high (fine) resolution pixels that fit in a single low (coarse) resolution pixel.

  • input_dtype – Dtype for the final unscaled array. This is usually 32-bit float (np.float32) or 64-bit float (np.float64) for non-category data. If this doesn’t represent the final data type of the data then the final size of chunks in memory will not match the user’s request via dask’s array.chunk-size configuration. Sometimes it is useful to keep this as a single dtype for all reading functionality (ex. np.float32) in order to keep all read variable chunks the same size regardless of dtype.


Returns: A tuple where each element is the chunk size for that axis/dimension.
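The high-to-low resolution relationship described above reduces to integer division by the multiplier. A sketch of that arithmetic (a hypothetical helper illustrating the geometry, not the function itself):

```python
def low_res_chunks(high_res_chunks, multipliers):
    # Shrink high-resolution chunk sizes to the low-resolution grid so
    # that each low-resolution chunk covers the same geographic region
    # as the corresponding high-resolution chunk.
    return tuple(c // m for c, m in zip(high_res_chunks, multipliers))
```

For instance, 500m chunks of 452 rows with a multiplier of 2 become 226-row chunks at 1km, matching the on-disk chunk size from the example above.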


satpy.utils.proj_units_to_meters(proj_str)[source]

Convert projection units from kilometers to meters.

satpy.utils.recursive_dict_update(d, u)[source]

Recursive dictionary update.

Copied from:
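The standard recursive-update recipe looks like this (a sketch consistent with the description above, not necessarily the exact copied code):

```python
import collections.abc

def recursive_dict_update(d, u):
    # Update dict d with dict u in place, descending into nested
    # mappings instead of replacing them wholesale.
    for key, value in u.items():
        if isinstance(value, collections.abc.Mapping):
            d[key] = recursive_dict_update(d.get(key, {}), value)
        else:
            d[key] = value
    return d
```

Unlike `dict.update`, this preserves sibling keys inside nested dicts: updating `{"a": {"b": 1, "c": 2}}` with `{"a": {"b": 9}}` keeps `"c"` intact.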


satpy.utils.trace_on()[source]

Turn trace logging on.

satpy.utils.unify_chunks(*data_arrays: DataArray) tuple[DataArray, ...][source]

Run xarray.unify_chunks() if input dimensions are all the same size.

This is mostly used in satpy.composites.CompositeBase to safeguard against running dask.array.core.map_blocks() with arrays of different chunk sizes. Doing so can cause unexpected results or errors. However, xarray’s unify_chunks will raise an exception if the dimensions of the provided DataArrays have different sizes. This is a common case in Satpy. For example, the “bands” dimension may be 1 (L), 2 (LA), 3 (RGB), or 4 (RGBA) for most compositor operations that combine other composites together.

satpy.utils.xyz2angle(x, y, z, acos=False)[source]

Convert cartesian to azimuth and zenith.

satpy.utils.xyz2lonlat(x, y, z, asin=False)[source]

Convert cartesian to lon lat.

For a sphere with unit radius, convert cartesian coordinates to spherical coordinates longitude and latitude.

  • x (number or array of numbers) – x-coordinate, unitless

  • y (number or array of numbers) – y-coordinate, unitless

  • z (number or array of numbers) – z-coordinate, unitless

  • asin (optional, bool) – If true, use arcsin for calculations. If false, use arctan2 for calculations.


Returns: Longitude and latitude in °.

Return type: (lon, lat)
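A minimal scalar sketch of the inverse mapping (the real function operates elementwise on numbers or arrays):

```python
import math

def xyz2lonlat(x, y, z, asin=False):
    # Unit-sphere cartesian coordinates to longitude/latitude in degrees.
    lon = math.degrees(math.atan2(y, x))
    if asin:
        # arcsin form: valid when (x, y, z) lies exactly on the unit sphere.
        lat = math.degrees(math.asin(z))
    else:
        # arctan2 form: more robust for points not exactly normalized.
        lat = math.degrees(math.atan2(z, math.hypot(x, y)))
    return lon, lat
```

This inverts lonlat2xyz: (0, 1, 0) maps back to lon=90, lat=0, and (0, 0, 1) to the north pole.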