Working With Containers
Recursive Conversions
- fannypack.utils.to_device(x: torch.Tensor, device: torch.device, detach: bool = False) → torch.Tensor
- fannypack.utils.to_device(x: fannypack.utils._conversions.BoundContainer, device: torch.device, detach: bool = False) → fannypack.utils._conversions.BoundContainer
Move a torch tensor, list, tuple (standard or named), or dict of tensors to a different device. Recursively applied for nested containers.
- Parameters
x (torch.Tensor, list, tuple (standard or named), or dict) – Tensor or container of tensors to move.
device (torch.device) – Target device.
detach (bool, optional) – If set, detaches tensors after moving. Defaults to False.
- Returns
torch.Tensor, list, tuple (standard or named), or dict – Output, type will mirror input.
- fannypack.utils.to_numpy(x: torch.Tensor) → numpy.ndarray
- fannypack.utils.to_numpy(x: List[torch.Tensor]) → List[numpy.ndarray]
- fannypack.utils.to_numpy(x: List) → List
- fannypack.utils.to_numpy(x: Tuple[torch.Tensor, ...]) → Tuple[numpy.ndarray, ...]
- fannypack.utils.to_numpy(x: Tuple) → Tuple
- fannypack.utils.to_numpy(x: Dict[fannypack.utils._conversions.Key, torch.Tensor]) → Dict[fannypack.utils._conversions.Key, numpy.ndarray]
- fannypack.utils.to_numpy(x: Dict[fannypack.utils._conversions.Key, Any]) → Dict[fannypack.utils._conversions.Key, Any]
Converts a tensor, list, tuple (standard or named), or dict of tensors for use in NumPy. Recursively applied for nested containers.
- Parameters
x (torch.Tensor, list, tuple (standard or named), or dict) – Tensor or container of tensors to convert to NumPy.
- Returns
np.ndarray, list, tuple (standard or named), or dict – Output, type will mirror input.
- fannypack.utils.to_torch(x: numpy.ndarray, device: str = 'cpu', convert_doubles_to_floats: bool = True) → torch.Tensor
- fannypack.utils.to_torch(x: List[numpy.ndarray], device: str = 'cpu', convert_doubles_to_floats: bool = True) → List[torch.Tensor]
- fannypack.utils.to_torch(x: List, device: str = 'cpu', convert_doubles_to_floats: bool = True) → List
- fannypack.utils.to_torch(x: Tuple[numpy.ndarray, ...], device: str = 'cpu', convert_doubles_to_floats: bool = True) → Tuple[torch.Tensor, ...]
- fannypack.utils.to_torch(x: Tuple, device: str = 'cpu', convert_doubles_to_floats: bool = True) → Tuple
- fannypack.utils.to_torch(x: Dict[fannypack.utils._conversions.Key, numpy.ndarray], device: str = 'cpu', convert_doubles_to_floats: bool = True) → Dict[fannypack.utils._conversions.Key, torch.Tensor]
- fannypack.utils.to_torch(x: Dict[fannypack.utils._conversions.Key, Any], device: str = 'cpu', convert_doubles_to_floats: bool = True) → Dict[fannypack.utils._conversions.Key, Any]
Converts a NumPy array, list, tuple (standard or named), or dict of NumPy arrays for use in PyTorch. Recursively applied for nested containers.
- Parameters
x (np.ndarray, list, tuple (standard or named), or dict) – Array or container of arrays to convert to torch tensors.
device (torch.device, optional) – Torch device to create tensors on. Defaults to "cpu".
convert_doubles_to_floats (bool, optional) – If set, converts 64-bit floats to 32-bit. Defaults to True.
- Returns
torch.Tensor, list, tuple (standard or named), or dict – Output, type will mirror input.
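The `convert_doubles_to_floats` flag is a per-leaf dtype rule: 64-bit float arrays are demoted to 32-bit as they become tensors. A stdlib-only analogue of that rule, using `array` typecodes in place of NumPy dtypes (`convert_leaf` is a hypothetical helper, not part of fannypack):

```python
import array

def convert_leaf(values: array.array, convert_doubles_to_floats: bool = True) -> array.array:
    """Demote 64-bit floats to 32-bit, mirroring convert_doubles_to_floats.

    Typecode 'd' is a C double (64-bit); 'f' is a C float (32-bit).
    """
    if convert_doubles_to_floats and values.typecode == "d":
        return array.array("f", values)
    return values
```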
Grouped Slicing
- class fannypack.utils.SliceWrapper(data: fannypack.utils._slice_wrapper.WrappedType)
Bases: Iterable, Generic[fannypack.utils._slice_wrapper.WrappedType]
A wrapper class for creating a unified interface for slicing and manipulating:
Lists
Tuples
Torch tensors
Numpy arrays
Dictionaries containing a same-length group of any of the above
This makes it easy to read, slice, and index into blocks of data organized into dictionaries.
Nominally:
```python
dataset = SliceWrapper(
    {
        "features": features,
        "labels": labels,
    }
)
train_count = 100
train_dataset = dataset[:train_count]
val_dataset = dataset[train_count:]
```
would be equivalent to:
```python
train_count = 100
train_dataset = {
    "features": features[:train_count],
    "labels": labels[:train_count],
}
val_dataset = {
    "features": features[train_count:],
    "labels": labels[train_count:],
}
```
For convenience, a transparent interface is provided for iterables that are directly wrapped. Thus:
```python
SliceWrapper([1, 2, 3])[::-1]
```
would return:
```python
[1, 2, 3][::-1]
```
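The dictionary and pass-through behaviors described above can be sketched with a toy wrapper. `MiniSliceWrapper` is illustrative only, assuming dict-of-lists or plain-sequence data; the real `SliceWrapper` also handles tensors, arrays, and shape validation:

```python
from typing import Any, Callable, Dict, Union

class MiniSliceWrapper:
    """Toy SliceWrapper: dicts are indexed value-wise, other data directly."""

    def __init__(self, data: Union[Dict[Any, Any], Any]) -> None:
        self.data = data

    def map(self, function: Callable[[Any], Any]) -> Any:
        # Dicts: apply value-wise; direct iterables: apply to the data itself.
        if isinstance(self.data, dict):
            return {k: function(v) for k, v in self.data.items()}
        return function(self.data)

    def __getitem__(self, index: Any) -> Any:
        # __getitem__ is shorthand for map(lambda v: v[index]).
        return self.map(lambda v: v[index])

    def __len__(self) -> int:
        if isinstance(self.data, dict):
            (length,) = {len(v) for v in self.data.values()}  # must all match
            return length
        return len(self.data)
```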
- __getitem__(index: Any) → Any
Unified interface for indexing into our wrapped object; shorthand for SliceWrapper.map(lambda v: v[index]).
For wrapped dictionaries, this returns a new (un-wrapped) dictionary with the index applied value-wise. Thus:
```python
SliceWrapper(
    {
        "a": a,
        "b": b,
    }
)[index]
```
would return:
```python
{
    "a": a[index],
    "b": b[index],
}
```
For iterables that are directly wrapped, this is equivalent to evaluating data[index]. Thus:
```python
SliceWrapper([1, 2, 3])[::-1]
```
would return:
```python
[1, 2, 3][::-1]
```
- Parameters
index (Any) – Index. Can be a slice, tuple, boolean array, etc.
- Returns
Any – Indexed value. See overall function docstring.
- __iter__()
Iterable iter() interface.
- __len__() → int
Unified interface for evaluating the length of a wrapped object.
Equivalent to SliceWrapper.shape[0].
- Returns
int – Length of wrapped object.
- __next__()
Iterable next() interface.
- append(other: Any) → None
Append to the end of our data object.
Only supported for wrapped lists and dictionaries containing lists.
For wrapped lists, this is equivalent to data.append(other).
For dictionaries, other should be a dictionary. Behavior example:
```python
# Data before append
{"a": [1, 2, 3, 4], "b": [5, 6, 7, 8]}

# Value of other
{"a": 5, "b": 3}

# Data after append
{"a": [1, 2, 3, 4, 5], "b": [5, 6, 7, 8, 3]}
```
- Parameters
other (Any) – Object to append.
- data: fannypack.utils._slice_wrapper.WrappedType
Wrapped data.
- Type
list, tuple, torch.Tensor, np.ndarray, or dict
- extend(other: fannypack.utils._slice_wrapper.WrappedType) → None
Extend to the end of our data object.
Only supported for wrapped lists and dictionaries containing lists.
For wrapped lists, this is equivalent to data.extend(other).
For dictionaries, other should be a dictionary. Behavior example:
```python
# Data before extend
{"a": [1, 2, 3, 4], "b": [5, 6, 7, 8]}

# Value of other
{"a": [5], "b": [3]}

# Data after extend
{"a": [1, 2, 3, 4, 5], "b": [5, 6, 7, 8, 3]}
```
- Parameters
other (dict or list) – Object to extend with.
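Both behavior examples above reduce to the same value-wise pattern for dicts of lists. A minimal sketch (`dict_append` and `dict_extend` are hypothetical helper names, not fannypack APIs):

```python
def dict_append(data: dict, other: dict) -> None:
    """Value-wise append for a dict of lists, in place."""
    assert data.keys() == other.keys()
    for key, value in other.items():
        data[key].append(value)

def dict_extend(data: dict, other: dict) -> None:
    """Value-wise extend for a dict of lists, in place."""
    assert data.keys() == other.keys()
    for key, values in other.items():
        data[key].extend(values)
```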
- map(function: Callable[[Any], fannypack.utils._slice_wrapper.MapOutputType]) → Dict[Any, fannypack.utils._slice_wrapper.MapOutputType]
- map(function: Callable[[Any], fannypack.utils._slice_wrapper.MapOutputType]) → fannypack.utils._slice_wrapper.MapOutputType
- map(function: Callable[[Any], fannypack.utils._slice_wrapper.MapOutputType]) → Union[fannypack.utils._slice_wrapper.MapOutputType, Dict[Any, fannypack.utils._slice_wrapper.MapOutputType]]
Apply a function to all iterables within our wrapped data object.
For iterables that are directly wrapped (e.g., lists), this is equivalent to evaluating:
```python
slice_wrapper: SliceWrapper[List]
function(slice_wrapper.data)
```
For dictionaries, function is applied value-wise. Thus, an input of:
```python
SliceWrapper(
    {
        "a": [1, 2, 3],
        "b": [2, 4, 5],
    }
)
```
would return:
```python
{
    "a": function([1, 2, 3]),
    "b": function([2, 4, 5]),
}
```
- Parameters
function (Callable) – Function to map.
- property shape: Tuple[int, ...]
Unified interface for polling the shape of our wrapped object.
For lists and tuples, this evaluates to (len(data),).
For NumPy arrays and torch tensors, we get data.shape.
For dictionaries, we return a tuple containing all shared dimensions between our wrapped values, starting from the leftmost dimension.
- Returns
Tuple[int, …]
Generic Squeeze
- fannypack.utils.squeeze(x: Any, axis: Optional[Union[int, Tuple[int, ...]]] = None) → Any
Generic squeeze function, for all sliceable objects with a shape field.
Designed for fannypack.utils.SliceWrapper, but should also work with NumPy arrays, torch tensors, etc.
- Parameters
x (Any) – Object to squeeze. Must have a shape attribute and be indexable with slices.
axis (Union[int, Tuple[int, ...]], optional) – Axis or axes to squeeze along. If None (default), squeezes all dimensions with value 1.
- Returns
Any – Squeezed object.
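One way such a generic squeeze can work is to build an index tuple from the object's shape and then evaluate x[index]; this only needs a shape attribute and numpy-style slice indexing, matching the requirements above. `squeeze_index` is an illustrative sketch, not fannypack's implementation:

```python
def squeeze_index(shape, axis=None):
    """Build the index tuple a generic squeeze would apply to an object.

    Index 0 drops a squeezed axis; slice(None) keeps the others intact.
    """
    if axis is None:
        axes = {i for i, dim in enumerate(shape) if dim == 1}
    elif isinstance(axis, int):
        axes = {axis}
    else:
        axes = set(axis)
    for i in axes:
        if shape[i] != 1:
            raise ValueError(f"cannot squeeze axis {i} with size {shape[i]}")
    return tuple(0 if i in axes else slice(None) for i in range(len(shape)))
```

Evaluating `x[squeeze_index(x.shape, axis)]` then mimics `fannypack.utils.squeeze` for any object supporting tuple indexing.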