qimpy.io.Checkpoint

class Checkpoint(filename, *, writable=False, rotate=True)

Bases: File

Helper for checkpoint load/save from HDF5 files.

Parameters:
  • filename (str) –

  • writable (bool) –

  • rotate (bool) –

__init__(filename, *, writable=False, rotate=True)

Open an HDF5 checkpoint file filename for read or write based on writable.

In write mode, if rotate (the default), the file is first written to filename + ‘.part’ and then rotated into filename upon closing (with a previously existing filename moved into filename + ‘.bak’). This prevents corruption of the checkpoint if the job is terminated, e.g. due to a time limit, while the checkpoint is being written.

Parameters:
  • filename (str) –

  • writable (bool) –

  • rotate (bool) –

Return type:

None
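The rotation scheme described above can be sketched with only the standard library; the function below is an illustrative stand-in for what Checkpoint does on close, not qimpy's internal implementation:

```python
import os

def save_with_rotation(filename: str, payload: bytes) -> None:
    """Write payload to filename + '.part', then rotate it into place,
    keeping any previous file as filename + '.bak' (illustrative sketch)."""
    part = filename + ".part"
    with open(part, "wb") as f:
        f.write(payload)  # an interrupted job leaves only the .part file
    if os.path.exists(filename):
        os.replace(filename, filename + ".bak")  # preserve previous checkpoint
    os.replace(part, filename)  # rename is atomic on POSIX filesystems
```

After two consecutive saves, the latest data is in filename, the previous version in filename + ‘.bak’, and no ‘.part’ file remains; a job killed mid-write only ever damages the ‘.part’ file.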

Methods

__init__

Open an HDF5 checkpoint file filename for read or write based on writable.

build_virtual_dataset

Assemble a virtual dataset in this group.

clear

close

Close the file, and perform rotations if applicable.

copy

Copy an object or group.

create_dataset

Create a new HDF5 dataset

create_dataset_complex

Create a dataset at path suitable for a complex array of size shape.

create_dataset_like

Create a dataset similar to other.

create_dataset_real

Create a dataset at path suitable for a real array of size shape.

create_group

Create and return a new subgroup.

create_virtual_dataset

Create a new virtual dataset in this group.

flush

Tell the HDF5 library to flush its buffers.

get

Retrieve an item or other information.

items

Get a view object on member items

keys

Get a view object on member names

move

Move a link to a new location in the file.

pop

Remove specified key and return the corresponding value; if key is not found, d is returned if given, otherwise KeyError is raised.

popitem

Remove and return some (key, value) pair as a 2-tuple; but raise KeyError if D is empty.

read_slice

Read a slice of data from dataset dset in the file, starting at offset and of length size in each dimension.

read_slice_complex

Same as read_slice(), but for complex data.

require_dataset

Open a dataset, creating it if it doesn't exist.

require_group

Return a group, creating it if it doesn't exist.

setdefault

update

Update the mapping from E and F. If E is present and has a .keys() method, does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.

values

Get a view object on member objects

visit

Recursively visit all names in this group and subgroups.

visititems

Recursively visit names and objects in this group.

write_slice

Write a slice of data to dataset dset at offset offset from data (taking care of transfer to CPU if needed).

write_slice_complex

Same as write_slice(), but for complex data.

Attributes

attrs

Attributes attached to this object

driver

Low-level HDF5 file driver used to open file

file

Return a File instance associated with this object

filename

File name on disk

id

Low-level identifier appropriate for this object

libver

File format version bounds (low, high)

meta_block_size

Meta block size (in bytes)

mode

Python mode used to open file

name

Return the full name of this object.

parent

Return the parent group of this object.

ref

An (opaque) HDF5 reference to this object

regionref

Create a region reference (Datasets only).

swmr_mode

Controls single-writer multiple-reader mode

userblock_size

User block size (in bytes)

writable

Whether file has been opened for writing

filename_move

If non-empty, move to this filename upon closing

close()

Close the file, and perform rotations if applicable.

create_dataset_complex(path, shape, dtype=torch.complex128)

Create a dataset at path suitable for a complex array of size shape. This creates a real array with a final dimension of length 2 (holding the real and imaginary parts). This format is used by write_slice_complex() and read_slice_complex().

Parameters:
  • path (str) –

  • shape (tuple[int, ...]) –

  • dtype (dtype) –

Return type:

Any
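The real-storage layout described above (a trailing axis of length 2 holding real and imaginary parts) can be illustrated with numpy as a stand-in for the HDF5 dataset; the variable names here are illustrative, not qimpy's API:

```python
import numpy as np

# A complex array of shape (2, 3) to be checkpointed.
z = np.arange(6, dtype=np.complex128).reshape(2, 3) * (1 + 2j)

# Stored as a real array of shape (2, 3, 2): last axis is (real, imag).
stored = np.stack([z.real, z.imag], axis=-1)
assert stored.shape == (2, 3, 2) and stored.dtype == np.float64

# Reading back: reassemble the complex values from the trailing axis.
restored = stored[..., 0] + 1j * stored[..., 1]
assert np.array_equal(restored, z)
```

This is the standard interleaved-pair representation for complex data in formats (like plain HDF5 datasets) that only define real numeric types.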

create_dataset_real(path, shape, dtype=torch.float64)

Create a dataset at path suitable for a real array of size shape. Additionally, dtype is translated from torch to numpy for convenience.

Parameters:
  • path (str) –

  • shape (tuple[int, ...]) –

  • dtype (dtype) –

Return type:

Any

read_slice(dset, offset, size)

Read a slice of data from dataset dset in the file, starting at offset and of length size in each dimension. Returns data on the CPU or GPU, as specified by qimpy.rc.device.

Parameters:
  • dset (Any) –

  • offset (tuple[int, ...]) –

  • size (tuple[int, ...]) –

Return type:

Tensor
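The offset/size convention can be sketched with plain numpy indexing; the function name is hypothetical and the numpy array stands in for the HDF5 dataset (qimpy returns a torch.Tensor on qimpy.rc.device instead):

```python
import numpy as np

def read_slice_like(dset, offset, size):
    """Read a hyperslab starting at `offset` and extending `size`
    along each dimension (illustrative stand-in for read_slice)."""
    index = tuple(slice(o, o + s) for o, s in zip(offset, size))
    return dset[index]

data = np.arange(24).reshape(4, 6)
block = read_slice_like(data, offset=(1, 2), size=(2, 3))  # rows 1:3, cols 2:5
assert block.shape == (2, 3)
```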

read_slice_complex(dset, offset, size)

Same as read_slice(), but for complex data. Converts data from real storage as created by create_dataset_complex() to a complex tensor on output.

Parameters:
  • dset (Any) –

  • offset (tuple[int, ...]) –

  • size (tuple[int, ...]) –

Return type:

Tensor

write_slice(dset, offset, data)

Write a slice of data to dataset dset at offset offset from data (transferring to the CPU if needed). Note that all of data is written, so pass in exactly the slice to be written from the current process. This may be called independently from any subset of MPI processes, as no metadata modification (such as dataset creation) is done here.

Parameters:
  • dset (Any) –

  • offset (tuple[int, ...]) –

  • data (Tensor) –

Return type:

None
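The hyperslab-assignment semantics (all of data written at offset, with the extent taken from data's own shape) can be sketched in numpy; the function name is hypothetical and the in-memory array stands in for the HDF5 dataset. In the MPI setting, each process would call this on its own disjoint offset:

```python
import numpy as np

def write_slice_like(dset, offset, data):
    """Write all of `data` into `dset` starting at `offset` in each
    dimension (illustrative stand-in for write_slice)."""
    index = tuple(slice(o, o + n) for o, n in zip(offset, data.shape))
    dset[index] = data

dset = np.zeros((4, 6))
write_slice_like(dset, offset=(1, 2), data=np.ones((2, 3)))
```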

write_slice_complex(dset, offset, data)

Same as write_slice(), but for complex data. Converts data to real storage compatible with create_dataset_complex().

Parameters:
  • dset (Any) –

  • offset (tuple[int, ...]) –

  • data (Tensor) –

Return type:

None

filename_move: str

If non-empty, move to this filename upon closing

writable: bool

Whether file has been opened for writing