webknossos.dataset.layer
Layer
Layer(dataset: Dataset, properties: LayerProperties)
A Layer consists of multiple MagViews, which store the same data in different magnifications.
Do not use this constructor manually. Instead use Dataset.add_layer() to create a Layer.
category
property
category: LayerCategoryType
default_view_configuration
property
writable
default_view_configuration: Optional[LayerViewConfiguration]
dtype_per_channel
property
dtype_per_channel: dtype
dtype_per_layer
property
dtype_per_layer: str
name
property
writable
name: str
num_channels
property
num_channels: int
path
property
path: Path
read_only
property
read_only: bool
add_copy_mag
add_copy_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], extend_layer_bounding_box: bool = True, chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[Vec3IntLike, int]] = None, compress: Optional[bool] = None, executor: Optional[Executor] = None) -> MagView
Copies the data at foreign_mag_view_or_path, which can belong to another dataset, to the current dataset. The relevant information from the datasource-properties.json of the other dataset is copied as well.
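A minimal usage sketch, assuming two existing local datasets; the paths and the layer name "color" are placeholders:

from webknossos import Dataset, Mag

# Open the target dataset and the layer that should receive the copy
# (paths and layer names below are assumptions).
dataset = Dataset.open("path/to/target_dataset")
layer = dataset.get_layer("color")

# Copy Mag(1) of the corresponding layer from another dataset; the relevant
# entries from its datasource-properties.json are copied along with the data.
foreign_mag = Dataset.open("path/to/other_dataset").get_layer("color").get_mag(Mag(1))
copied_mag = layer.add_copy_mag(foreign_mag, compress=True)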
add_fs_copy_mag
add_fs_copy_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], extend_layer_bounding_box: bool = True) -> MagView
Copies the data at foreign_mag_view_or_path, which belongs to another dataset, to the current dataset via the filesystem. The relevant information from the datasource-properties.json of the other dataset is copied as well.
add_mag
add_mag(mag: Union[int, str, list, tuple, ndarray, Mag], chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[int, Vec3IntLike]] = None, compress: bool = False, *, chunk_size: Optional[Union[Vec3IntLike, int]] = None, block_len: Optional[int] = None, file_len: Optional[int] = None) -> MagView
Creates a new mag called mag and adds it to the layer.
The parameters chunk_shape, chunks_per_shard and compress can be specified to adjust how the data is stored on disk.
Note that writing compressed data which is not aligned with the blocks on disk may result in diminished performance, as full blocks will automatically be read to pad the write actions. Alternatively, you can call mag.compress() after all the data has been written.
The return type is webknossos.dataset.mag_view.MagView.
Raises an IndexError if the specified mag already exists.
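A minimal sketch of adding an empty magnification with explicit storage settings; the chunk and shard sizes are just illustrative values, and 'layer' is an existing Layer as in the example above.

from webknossos import Mag

# Create Mag(1) with 32^3 voxel chunks, 32x32x32 chunks per shard and
# compression enabled ('layer' is an existing Layer as above).
mag1 = layer.add_mag(
    Mag(1),
    chunk_shape=(32, 32, 32),
    chunks_per_shard=(32, 32, 32),
    compress=True,
)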
add_mag_for_existing_files
Creates a new mag based on already existing files.
Raises an IndexError if the specified mag does not exist.
add_symlink_mag
add_symlink_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], make_relative: bool = False, extend_layer_bounding_box: bool = True) -> MagView
Creates a symlink to the data at foreign_mag_view_or_path, which belongs to another dataset. The relevant information from the datasource-properties.json of the other dataset is copied to this dataset.
Note: If the other dataset modifies its bounding box afterwards, the change does not affect these properties (or vice versa).
If make_relative is True, the symlink is made relative to the current dataset path.
Symlinked mags can only be added to layers on local file systems.
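A minimal sketch, assuming both datasets live on the same local filesystem; paths and layer names are placeholders:

from webknossos import Dataset, Mag

# Link Mag(1) of a layer from another local dataset instead of copying the data
# (paths and layer names are assumptions).
dataset = Dataset.open("path/to/target_dataset")
layer = dataset.get_layer("color")
foreign_mag = Dataset.open("path/to/other_dataset").get_layer("color").get_mag(Mag(1))
linked_mag = layer.add_symlink_mag(foreign_mag, make_relative=True)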
delete_mag
delete_mag(mag: Union[int, str, list, tuple, ndarray, Mag]) -> None
Deletes the MagView from the datasource-properties.json and the data from disk.
This function raises an IndexError if the specified mag does not exist.
downsample
downsample(from_mag: Optional[Mag] = None, coarsest_mag: Optional[Mag] = None, interpolation_mode: str = 'default', compress: bool = True, sampling_mode: Union[str, SamplingModes] = SamplingModes.ANISOTROPIC, align_with_other_layers: Union[bool, Dataset] = True, buffer_shape: Optional[Vec3Int] = None, force_sampling_scheme: bool = False, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mags: bool = False, executor: Optional[Executor] = None) -> None
Downsamples the data starting from from_mag until a magnification is >= max(coarsest_mag).
There are three different sampling_modes:
- 'anisotropic' - The next magnification is chosen so that the width, height and depth of a downsampled voxel become as similar as possible. For example, if the z resolution is worse than the x/y resolution, z won't be downsampled in the first downsampling step(s). As a basis for this method, the voxel_size from the datasource-properties.json is used.
- 'isotropic' - Each dimension is downsampled equally.
- 'constant_z' - The x and y dimensions are downsampled equally, but the z dimension remains the same.
See downsample_mag for more information.
Example:

from webknossos import Mag, SamplingModes
# ...
# let 'layer' be a `Layer` with only `Mag(1)`
assert "1" in layer.mags.keys()
layer.downsample(
    coarsest_mag=Mag(4),
    sampling_mode=SamplingModes.ISOTROPIC
)
assert "2" in layer.mags.keys()
assert "4" in layer.mags.keys()
downsample_mag
downsample_mag(from_mag: Mag, target_mag: Mag, interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mag: bool = False, executor: Optional[Executor] = None) -> None
Performs a single downsampling step from from_mag to target_mag.
The supported interpolation_modes are:
- "median"
- "mode"
- "nearest"
- "bilinear"
- "bicubic"
If allow_overwrite is True, an existing Mag may be overwritten.
If only_setup_mag is True, the magnification is created, but left empty. This parameter can be used to prepare for parallel downsampling of multiple layers while avoiding parallel writes with outdated updates to the datasource-properties.json file.
executor can be passed to allow distributed computation, parallelizing across chunks. args is deprecated.
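A minimal sketch of a single downsampling step, assuming the layer already holds data in Mag(1) ('layer' is set up as in the earlier examples):

from webknossos import Mag

# Downsample one step from Mag(1) to Mag(2) using the default interpolation mode.
layer.downsample_mag(
    from_mag=Mag(1),
    target_mag=Mag(2),
    compress=True,
)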
downsample_mag_list
downsample_mag_list(from_mag: Mag, target_mags: List[Mag], interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mags: bool = False, executor: Optional[Executor] = None) -> None
Downsamples the data starting at from_mag to each magnification in target_mags iteratively.
See downsample_mag for more information.
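For example, a sketch that produces several coarser magnifications in one call (same 'layer' setup assumed):

from webknossos import Mag

# Downsample from Mag(1) into Mag(2), Mag(4) and Mag(8), one target after another.
layer.downsample_mag_list(
    from_mag=Mag(1),
    target_mags=[Mag(2), Mag(4), Mag(8)],
)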
get_mag
Returns the MagView called mag of this layer. The return type is webknossos.dataset.mag_view.MagView.
This function raises an IndexError if the specified mag does not exist.
get_or_add_mag
get_or_add_mag(mag: Union[int, str, list, tuple, ndarray, Mag], chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[Vec3IntLike, int]] = None, compress: Optional[bool] = None, *, chunk_size: Optional[Union[Vec3IntLike, int]] = None, block_len: Optional[int] = None, file_len: Optional[int] = None) -> MagView
Creates a new mag and adds it to the dataset if it does not exist yet, then returns the mag.
See add_mag for more information.
redownsample
redownsample(interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, executor: Optional[Executor] = None) -> None
Use this method to recompute downsampled magnifications after mutating data in the base magnification.
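A minimal sketch, assuming the Mag(1) data of an existing layer has just been modified and coarser magnifications already exist; the path, layer name, dtype and written region are assumptions:

import numpy as np
from webknossos import Dataset, Mag

# Overwrite a small region in the base magnification
# (path, layer name, dtype and coordinates are assumptions).
dataset = Dataset.open("path/to/dataset")
layer = dataset.get_layer("color")
layer.get_mag(Mag(1)).write(
    np.zeros((64, 64, 64), dtype="uint8"),
    absolute_offset=(0, 0, 0),
)
# Recompute the existing coarser magnifications from the updated base data.
layer.redownsample()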
upsample
upsample(from_mag: Mag, finest_mag: Mag = Mag(1), compress: bool = False, sampling_mode: Union[str, SamplingModes] = SamplingModes.ANISOTROPIC, align_with_other_layers: Union[bool, Dataset] = True, buffer_shape: Optional[Vec3Int] = None, buffer_edge_len: Optional[int] = None, args: Optional[Namespace] = None, executor: Optional[Executor] = None, *, min_mag: Optional[Mag] = None) -> None
Upsamples the data starting from from_mag as long as the magnification is >= finest_mag.
There are three different sampling_modes:
- 'anisotropic' - The next magnification is chosen so that the width, height and depth of a downsampled voxel become as similar as possible. For example, if the z resolution is worse than the x/y resolution, z won't be downsampled in the first downsampling step(s). As a basis for this method, the voxel_size from the datasource-properties.json is used.
- 'isotropic' - Each dimension is downsampled equally.
- 'constant_z' - The x and y dimensions are downsampled equally, but the z dimension remains the same.
min_mag is deprecated, please use finest_mag instead.
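A minimal sketch, assuming the layer only contains data in Mag(2), for example a segmentation produced at a coarser magnification:

from webknossos import Mag

# Upsample from Mag(2) down to the finest magnification Mag(1)
# ('layer' is set up as in the earlier examples).
layer.upsample(
    from_mag=Mag(2),
    finest_mag=Mag(1),
)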
SegmentationLayer
SegmentationLayer(dataset: Dataset, properties: LayerProperties)
Bases: Layer
Do not use this constructor manually. Instead use Dataset.add_layer() to create a Layer.
category
property
category: LayerCategoryType
default_view_configuration
property
writable
default_view_configuration: Optional[LayerViewConfiguration]
dtype_per_channel
property
dtype_per_channel: dtype
dtype_per_layer
property
dtype_per_layer: str
largest_segment_id
property
writable
largest_segment_id: Optional[int]
name
property
writable
name: str
num_channels
property
num_channels: int
path
property
path: Path
read_only
property
read_only: bool
add_copy_mag
add_copy_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], extend_layer_bounding_box: bool = True, chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[Vec3IntLike, int]] = None, compress: Optional[bool] = None, executor: Optional[Executor] = None) -> MagView
Copies the data at foreign_mag_view_or_path, which can belong to another dataset, to the current dataset. The relevant information from the datasource-properties.json of the other dataset is copied as well.
add_fs_copy_mag
add_fs_copy_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], extend_layer_bounding_box: bool = True) -> MagView
Copies the data at foreign_mag_view_or_path, which belongs to another dataset, to the current dataset via the filesystem. The relevant information from the datasource-properties.json of the other dataset is copied as well.
add_mag
add_mag(mag: Union[int, str, list, tuple, ndarray, Mag], chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[int, Vec3IntLike]] = None, compress: bool = False, *, chunk_size: Optional[Union[Vec3IntLike, int]] = None, block_len: Optional[int] = None, file_len: Optional[int] = None) -> MagView
Creates a new mag called mag and adds it to the layer.
The parameters chunk_shape, chunks_per_shard and compress can be specified to adjust how the data is stored on disk.
Note that writing compressed data which is not aligned with the blocks on disk may result in diminished performance, as full blocks will automatically be read to pad the write actions. Alternatively, you can call mag.compress() after all the data has been written.
The return type is webknossos.dataset.mag_view.MagView.
Raises an IndexError if the specified mag already exists.
add_mag_for_existing_files
Creates a new mag based on already existing files.
Raises an IndexError if the specified mag does not exist.
add_symlink_mag
add_symlink_mag(foreign_mag_view_or_path: Union[PathLike, str, MagView], make_relative: bool = False, extend_layer_bounding_box: bool = True) -> MagView
Creates a symlink to the data at foreign_mag_view_or_path, which belongs to another dataset. The relevant information from the datasource-properties.json of the other dataset is copied to this dataset.
Note: If the other dataset modifies its bounding box afterwards, the change does not affect these properties (or vice versa).
If make_relative is True, the symlink is made relative to the current dataset path.
Symlinked mags can only be added to layers on local file systems.
delete_mag
delete_mag(mag: Union[int, str, list, tuple, ndarray, Mag]) -> None
Deletes the MagView from the datasource-properties.json and the data from disk.
This function raises an IndexError if the specified mag does not exist.
downsample
downsample(from_mag: Optional[Mag] = None, coarsest_mag: Optional[Mag] = None, interpolation_mode: str = 'default', compress: bool = True, sampling_mode: Union[str, SamplingModes] = SamplingModes.ANISOTROPIC, align_with_other_layers: Union[bool, Dataset] = True, buffer_shape: Optional[Vec3Int] = None, force_sampling_scheme: bool = False, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mags: bool = False, executor: Optional[Executor] = None) -> None
Downsamples the data starting from from_mag until a magnification is >= max(coarsest_mag).
There are three different sampling_modes:
- 'anisotropic' - The next magnification is chosen so that the width, height and depth of a downsampled voxel become as similar as possible. For example, if the z resolution is worse than the x/y resolution, z won't be downsampled in the first downsampling step(s). As a basis for this method, the voxel_size from the datasource-properties.json is used.
- 'isotropic' - Each dimension is downsampled equally.
- 'constant_z' - The x and y dimensions are downsampled equally, but the z dimension remains the same.
See downsample_mag for more information.
Example:

from webknossos import Mag, SamplingModes
# ...
# let 'layer' be a `Layer` with only `Mag(1)`
assert "1" in layer.mags.keys()
layer.downsample(
    coarsest_mag=Mag(4),
    sampling_mode=SamplingModes.ISOTROPIC
)
assert "2" in layer.mags.keys()
assert "4" in layer.mags.keys()
downsample_mag
downsample_mag(from_mag: Mag, target_mag: Mag, interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mag: bool = False, executor: Optional[Executor] = None) -> None
Performs a single downsampling step from from_mag to target_mag.
The supported interpolation_modes are:
- "median"
- "mode"
- "nearest"
- "bilinear"
- "bicubic"
If allow_overwrite is True, an existing Mag may be overwritten.
If only_setup_mag is True, the magnification is created, but left empty. This parameter can be used to prepare for parallel downsampling of multiple layers while avoiding parallel writes with outdated updates to the datasource-properties.json file.
executor can be passed to allow distributed computation, parallelizing across chunks. args is deprecated.
downsample_mag_list
downsample_mag_list(from_mag: Mag, target_mags: List[Mag], interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, allow_overwrite: bool = False, only_setup_mags: bool = False, executor: Optional[Executor] = None) -> None
Downsamples the data starting at from_mag to each magnification in target_mags iteratively.
See downsample_mag for more information.
get_mag
Returns the MagView called mag of this layer. The return type is webknossos.dataset.mag_view.MagView.
This function raises an IndexError if the specified mag does not exist.
get_or_add_mag
get_or_add_mag(mag: Union[int, str, list, tuple, ndarray, Mag], chunk_shape: Optional[Union[Vec3IntLike, int]] = None, chunks_per_shard: Optional[Union[Vec3IntLike, int]] = None, compress: Optional[bool] = None, *, chunk_size: Optional[Union[Vec3IntLike, int]] = None, block_len: Optional[int] = None, file_len: Optional[int] = None) -> MagView
Creates a new mag and adds it to the dataset if it does not exist yet, then returns the mag.
See add_mag for more information.
redownsample
redownsample(interpolation_mode: str = 'default', compress: bool = True, buffer_shape: Optional[Vec3Int] = None, args: Optional[Namespace] = None, executor: Optional[Executor] = None) -> None
Use this method to recompute downsampled magnifications after mutating data in the base magnification.
refresh_largest_segment_id
refresh_largest_segment_id(chunk_shape: Optional[Vec3Int] = None, executor: Optional[Executor] = None) -> None
Sets the largest segment id to the highest value in the data.
largest_segment_id is set to None if the data is empty.
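A minimal sketch, assuming the voxel data of a segmentation layer has changed; the dataset path and layer name are placeholders:

from webknossos import Dataset

# Recompute the largest segment id from the stored voxel data
# (path and layer name are assumptions; the layer is a SegmentationLayer).
dataset = Dataset.open("path/to/dataset")
segmentation_layer = dataset.get_layer("segmentation")
segmentation_layer.refresh_largest_segment_id()
print(segmentation_layer.largest_segment_id)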
upsample
upsample(from_mag: Mag, finest_mag: Mag = Mag(1), compress: bool = False, sampling_mode: Union[str, SamplingModes] = SamplingModes.ANISOTROPIC, align_with_other_layers: Union[bool, Dataset] = True, buffer_shape: Optional[Vec3Int] = None, buffer_edge_len: Optional[int] = None, args: Optional[Namespace] = None, executor: Optional[Executor] = None, *, min_mag: Optional[Mag] = None) -> None
Upsamples the data starting from from_mag as long as the magnification is >= finest_mag.
There are three different sampling_modes:
- 'anisotropic' - The next magnification is chosen so that the width, height and depth of a downsampled voxel become as similar as possible. For example, if the z resolution is worse than the x/y resolution, z won't be downsampled in the first downsampling step(s). As a basis for this method, the voxel_size from the datasource-properties.json is used.
- 'isotropic' - Each dimension is downsampled equally.
- 'constant_z' - The x and y dimensions are downsampled equally, but the z dimension remains the same.
min_mag is deprecated, please use finest_mag instead.