
Commit

More docstrings
coretl committed Feb 3, 2025
1 parent c7230ac commit ca0ff7f
Showing 15 changed files with 227 additions and 117 deletions.
4 changes: 2 additions & 2 deletions docs/_templates/custom-module-template.rst
@@ -2,8 +2,8 @@

{%- set filtered_members = [] %}
{%- for item in members %}
{%- if item in functions + classes + exceptions + attributes %}
{% set _ = filtered_members.append(item) %}
{%- if fullname + "." + item not in modules and not item.startswith("_") %}
{%- set _ = filtered_members.append(item) %}
{%- endif %}
{%- endfor %}

9 changes: 7 additions & 2 deletions docs/conf.py
@@ -83,6 +83,8 @@
# ('envvar', 'LD_LIBRARY_PATH').
nitpick_ignore = [
("py:class", "ophyd_async.core._utils.T"),
("py:class", "ophyd_async.core._utils.V"),
("py:class", "ophyd_async.core._device.DeviceT"),
# # builtins
# ("py:class", "NoneType"),
# ("py:class", "'str'"),
@@ -258,8 +260,11 @@
# Don't show config summary as it's not relevant
autodoc_pydantic_model_show_config_summary = False

# Show the fields in source order
autodoc_pydantic_model_summary_list_order = "bysource"
# Don't show JSON schema
autodoc_pydantic_model_show_json = False

# Don't show field summary, as links break in reimported models
autodoc_pydantic_model_show_field_summary = False

# Where to put Ipython savefigs
ipython_savefig_dir = "../build/savefig"
6 changes: 3 additions & 3 deletions docs/how-to/implement-ad-detector.md
@@ -13,7 +13,7 @@ The first stage is to make a module in the `ophyd-async` repository to put the c

## Add an IO class for the PV interface

Now you need an IO class that subclasses [](#ADBaseIO). This should add the detector-driver-specific PVs that are required to set up triggering.
Now you need an IO class that subclasses [](#adcore.ADBaseIO). This should add the detector-driver-specific PVs that are required to set up triggering.

For example for ADAravis this is in the file `_aravis_io.py`:
```{literalinclude} ../../src/ophyd_async/epics/adaravis/_aravis_io.py
@@ -22,7 +22,7 @@ For example for ADAravis this is in the file `_aravis_io.py`:
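
As a rough sketch of the same idea (not the ADAravis code included above): a hypothetical `MyDriverIO` adds one trigger-related record on top of [](#adcore.ADBaseIO). The class names, enum values and PV suffix are made up, and the exact signal factory import may differ between ophyd-async versions:

```python
from ophyd_async.core import StrictEnum
from ophyd_async.epics import adcore
from ophyd_async.epics.core import epics_signal_rw_rbv


class MyTriggerSource(StrictEnum):
    # Hypothetical values - they must match the strings the IOC produces
    FREERUN = "Freerun"
    LINE1 = "Line1"


class MyDriverIO(adcore.ADBaseIO):
    """Hypothetical driver IO adding the trigger PVs on top of ADBaseIO."""

    def __init__(self, prefix: str, name: str = "") -> None:
        # Extra record needed to set up triggering on this detector
        self.trigger_source = epics_signal_rw_rbv(
            MyTriggerSource, prefix + "TriggerSource"
        )
        super().__init__(prefix, name=name)
```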

## Add a Controller that knows how to set up the driver

Now you need a class that subclasses [](#ADBaseController). This should implement at least:
Now you need a class that subclasses [](#adcore.ADBaseController). This should implement at least:
- `get_deadtime()` to give the amount of time required between triggers for a given exposure
- `prepare()` to set the camera up for a given trigger mode, number of frames and exposure
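
As a minimal sketch (building on the hypothetical `MyDriverIO` above; the exact base-class signatures and attribute names may differ in your ophyd-async version):

```python
from ophyd_async.core import DetectorTrigger, TriggerInfo
from ophyd_async.epics import adcore


class MyDriverController(adcore.ADBaseController):
    """Hypothetical controller that configures MyDriverIO for each TriggerInfo."""

    def get_deadtime(self, exposure: float | None) -> float:
        # Minimum gap the hardware needs between triggers (made-up value)
        return 0.0001

    async def prepare(self, trigger_info: TriggerInfo) -> None:
        # Pick a trigger source matching the requested trigger mode
        if trigger_info.trigger is DetectorTrigger.INTERNAL:
            await self.driver.trigger_source.set(MyTriggerSource.FREERUN)
        else:
            await self.driver.trigger_source.set(MyTriggerSource.LINE1)
        # Delegate exposure and frame count to the base class prepare(),
        # assuming your version implements it
        await super().prepare(trigger_info)
```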

@@ -37,7 +37,7 @@ Now you need to make a [](#StandardDetector) subclass that uses your IO and Cont
- `prefix`: The PV prefix for the driver and plugins
- `path_provider`: A [](#PathProvider) that tells the detector where to write data
- `drv_suffix`: A PV suffix for the driver, defaulting to `"cam1:"`
- `writer_cls`: An [](#ADWriter) class to instantiate, defaulting to [](#adcore.ADHDFWriter)
- `writer_cls`: An [](#adcore.ADWriter) class to instantiate, defaulting to [](#adcore.ADHDFWriter)
- `fileio_suffix`: An optional PV suffix for the fileio; if not given, it defaults to the writer class's default
- `name`: An optional name for the device
- `config_sigs`: Optionally the signals to report as configuration
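
For example, assuming a hypothetical `MyDetector` class with exactly that constructor (the beamline prefix, suffixes and paths below are made up), it could be instantiated in mock mode like this:

```python
import asyncio
from pathlib import Path

from ophyd_async.core import StaticFilenameProvider, StaticPathProvider, init_devices
from ophyd_async.epics import adcore


async def make_detector():
    # Tell the detector where to write data
    path_provider = StaticPathProvider(StaticFilenameProvider("data"), Path("/tmp"))
    # mock=True so no real IOC is needed; the variable name becomes the device name
    async with init_devices(mock=True):
        det = MyDetector(
            "BLxxI-EA-DET-01:",
            path_provider,
            drv_suffix="cam1:",
            writer_cls=adcore.ADHDFWriter,
            fileio_suffix="HDF1:",
        )
    return det


det = asyncio.run(make_detector())
```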
3 changes: 2 additions & 1 deletion docs/tutorials/implementing-devices.md
@@ -1,3 +1,4 @@
(implementing-devices)=
# Implementing Devices

In [](./using-devices.md) we learned how to instantiate some existing ophyd-async Devices. These Devices were ophyd level simulations, so did not talk to any underlying control system. In this tutorial we will instantiate some demo Devices that talk to underlying control system implementations, then explore how the Devices themselves are implemented.
@@ -166,7 +167,7 @@ Let's start with the lowest level sort of Device, a single channel of our point
We specify to the Device baseclass that we would like a Signal of a given type (e.g. `SignalR[int]`) via a type hint, and it will create that signal for us in a control system specific way. The type of `value` is the python builtin `int`, and the type of `mode` is an [enum](#StrictEnum) we have declared ourselves, where the string values must exactly match what the control system produces.

```{seealso}
[](#SignalDatatype) defines the list of all possible datatypes you can use for Signals
[](#SignalDatatypeT) defines the list of all possible datatypes you can use for Signals
```

We also [annotate](#typing.Annotated) this type hint with some additional information, like [`Format`](#StandardReadableFormat). This will tell the [](#StandardReadable) baseclass which Signals are important in a plan like `bp.grid_scan`. In this case we specify that `mode` should be reported as a [configuration parameter](#StandardReadableFormat.CONFIG_SIGNAL) once at the start of the scan, and `value` should be [fetched without caching and plotted](#StandardReadableFormat.HINTED_UNCACHED_SIGNAL) at each point of the scan.
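
Putting those pieces together, a condensed sketch of such a channel might look like the following (the real demo code may differ; the EPICS-specific `EpicsDevice`/`PvSuffix` declarative style and the PV suffixes are assumptions here):

```python
from typing import Annotated as A

from ophyd_async.core import SignalR, SignalRW, StandardReadable, StrictEnum
from ophyd_async.core import StandardReadableFormat as Format
from ophyd_async.epics.core import EpicsDevice, PvSuffix


class EnergyMode(StrictEnum):
    # The string values must exactly match what the control system produces
    LOW = "Low Energy"
    HIGH = "High Energy"


class PointDetectorChannel(StandardReadable, EpicsDevice):
    """One channel of the point detector (sketch)."""

    # Reported once at the start of a scan as a configuration parameter
    mode: A[SignalRW[EnergyMode], PvSuffix("Mode"), Format.CONFIG_SIGNAL]
    # Fetched without caching and plotted at each point of the scan
    value: A[SignalR[int], PvSuffix("Value"), Format.HINTED_UNCACHED_SIGNAL]
```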
2 changes: 1 addition & 1 deletion outline.md
@@ -18,7 +18,7 @@
x design goals - differences to ophyd sync
- devices, signals and their backends
x declarative vs procedural devices
- where should device logic live
x where should device logic live
x device connection strategies


130 changes: 72 additions & 58 deletions src/ophyd_async/core/__init__.py
@@ -4,7 +4,6 @@

from ._detector import (
DetectorController,
DetectorControllerT,
DetectorTrigger,
DetectorWriter,
StandardDetector,
@@ -100,85 +99,100 @@


__all__ = [
"DetectorController",
"DetectorControllerT",
"DetectorTrigger",
"DetectorWriter",
"StandardDetector",
"TriggerInfo",
# Device
"Device",
"DeviceConnector",
"init_devices",
"DeviceVector",
"DeviceFiller",
"StandardFlyer",
"FlyerController",
"HDFDataset",
"HDFFile",
"config_ophyd_async_logging",
"MockSignalBackend",
"AsyncConfigurable",
"DeviceVector",
"init_devices",
# Protocols
"AsyncReadable",
"AsyncConfigurable",
"AsyncStageable",
"AutoIncrementFilenameProvider",
"AutoIncrementingPathProvider",
"FilenameProvider",
"NameProvider",
"PathInfo",
"PathProvider",
"DatasetDescriber",
"StaticFilenameProvider",
"StaticPathProvider",
"UUIDFilenameProvider",
"YMDPathProvider",
"ConfigSignal",
"HintedSignal",
"StandardReadable",
"StandardReadableFormat",
"Settings",
"SettingsProvider",
"Watcher",
# Status
"AsyncStatus",
"WatchableAsyncStatus",
"WatcherUpdate",
"completed_status",
# Signal
"Signal",
"SignalConnector",
"SignalR",
"SignalRW",
"SignalW",
"SignalRW",
"SignalX",
"observe_value",
"observe_signals_value",
"set_and_wait_for_value",
"set_and_wait_for_other_value",
"soft_signal_r_and_setter",
"soft_signal_rw",
"wait_for_value",
"walk_rw_signals",
"Array1D",
"DTypeScalar_co",
"SignalBackend",
"make_datakey",
"StrictEnum",
"SubsetEnum",
"SignalConnector",
# Signal Types
"SignalDatatype",
"SignalDatatypeT",
"DTypeScalar_co",
"Array1D",
"StrictEnum",
"SubsetEnum",
"Table",
"SignalMetadata",
# Soft signal
"SoftSignalBackend",
"AsyncStatus",
"WatchableAsyncStatus",
"DEFAULT_TIMEOUT",
"CalculatableTimeout",
"Callback",
"soft_signal_r_and_setter",
"soft_signal_rw",
# Mock signal
"LazyMock",
"MockSignalBackend",
# Signal utilities
"observe_value",
"observe_signals_value",
"wait_for_value",
"set_and_wait_for_value",
"set_and_wait_for_other_value",
"walk_rw_signals",
# Readable
"StandardReadable",
"StandardReadableFormat",
# Detector
"StandardDetector",
"TriggerInfo",
"DetectorTrigger",
"DetectorController",
"DetectorWriter",
# Path
"PathInfo",
"PathProvider",
"StaticPathProvider",
"AutoIncrementingPathProvider",
"YMDPathProvider",
"FilenameProvider",
"StaticFilenameProvider",
"AutoIncrementFilenameProvider",
"UUIDFilenameProvider",
# Datatset
"NameProvider",
"DatasetDescriber",
"HDFDataset",
"HDFFile",
# Flyer
"StandardFlyer",
"FlyerController",
# Settings
"Settings",
"SettingsProvider",
"YamlSettingsProvider",
# Utils
"config_ophyd_async_logging",
"CALCULATE_TIMEOUT",
"CalculatableTimeout",
"DEFAULT_TIMEOUT",
"Callback",
"NotConnected",
"Reference",
"Table",
"WatcherUpdate",
"gather_dict",
"get_dtype",
"get_enum_cls",
"get_unique",
"in_micros",
"make_datakey",
"wait_for_connection",
"completed_status",
"YamlSettingsProvider",
"Watcher",
# Back compat - delete before 1.0
"ConfigSignal",
"HintedSignal",
]
18 changes: 10 additions & 8 deletions src/ophyd_async/core/_detector.py
@@ -49,14 +49,16 @@ class TriggerInfo(BaseModel):
"""Minimal set of information required to setup triggering on a detector"""

#: Number of triggers that will be sent, (0 means infinite) Can be:
# - A single integer or
# - A list of integers for multiple triggers
# Example for tomography: TriggerInfo(number=[2,3,100,3])
#: This would trigger:
#: - 2 times for dark field images
#: - 3 times for initial flat field images
#: - 100 times for projections
#: - 3 times for final flat field images
#:
#: - A single integer or
#: - A list of integers for multiple triggers
#: Example for tomography: ``TriggerInfo(number=[2,3,100,3])``.
#: This would trigger:
#:
#: - 2 times for dark field images
#: - 3 times for initial flat field images
#: - 100 times for projections
#: - 3 times for final flat field images
number_of_triggers: NonNegativeInt | list[NonNegativeInt]
#: Sort of triggers that will be sent
trigger: DetectorTrigger = Field(default=DetectorTrigger.INTERNAL)
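
For instance, the tomography example from the docstring could be constructed like this (only the fields shown above are set; everything else keeps its default):

```python
from ophyd_async.core import DetectorTrigger, TriggerInfo

# 2 darks, 3 initial flats, 100 projections, 3 final flats,
# all using the detector's internal trigger
trigger_info = TriggerInfo(
    number_of_triggers=[2, 3, 100, 3],
    trigger=DetectorTrigger.INTERNAL,
)
```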
40 changes: 25 additions & 15 deletions src/ophyd_async/core/_device.py
@@ -35,6 +35,11 @@ def create_children_from_annotations(self, device: Device):
"""

async def connect_mock(self, device: Device, mock: LazyMock):
"""Used during `Device.connect` with ``mock=True``.
This is called when there is no cached connect done in ``mock=True``
mode. It connects the Device and all its children in mock mode.
"""
# Connect serially, no errors to gather up as in mock mode
exceptions: dict[str, Exception] = {}
for name, child_device in device.children():
@@ -46,11 +51,10 @@ async def connect_mock(self, device: Device, mock: LazyMock):
raise NotConnected.with_other_exceptions_logged(exceptions)

async def connect_real(self, device: Device, timeout: float, force_reconnect: bool):
"""Used during ``Device.connect``.
"""Used during `Device.connect` with ``mock=False``.
This is called when a previous connect has not been done, or has been
done in a different mock more. It should connect the Device and all its
children.
This is called when there is no cached connect done in ``mock=False``
mode. It connects the Device and all its children in real mode in parallel.
"""
# Connect in parallel, gathering up NotConnected errors
coros = {
@@ -149,14 +153,20 @@ async def connect(
) -> None:
"""Connect self and all child Devices.
Contains a timeout that gets propagated to child.connect methods.
Successful connects will be cached so subsequent calls will return
immediately. Contains a timeout that gets propagated to child.connect
methods.
Parameters
----------
mock:
If True then use ``MockSignalBackend`` for all Signals
If True then use `MockSignalBackend` for all Signals. If passed a
`LazyMock` then pass this down for use within the Signals, otherwise
create one.
timeout:
Time to wait before failing with a TimeoutError.
force_reconnect:
If True then force a reconnect, even if the last connect succeeded
"""
if not hasattr(self, "_connector"):
msg = (
@@ -205,11 +215,11 @@ async def connect(


class DeviceVector(MutableMapping[int, DeviceT], Device):
"""
Defines device components with indices.
"""Defines a dictionary of Device children with arbitrary integer keys.
In the below example, foos becomes a dictionary on the parent device
at runtime, so parent.foos[2] returns a FooDevice.
See Also
--------
:ref:`implementing-devices`
"""

def __init__(
@@ -340,8 +350,8 @@ def init_devices(
connect=True,
mock=False,
timeout: float = 10.0,
) -> DeviceProcessor:
"""Auto initialise top level Device instances to be used as a context manager
):
"""Auto initialise top level Device instances: to be used as a context manager
Parameters
----------
Expand All @@ -358,9 +368,9 @@ def init_devices(
timeout:
How long to wait for connect before logging an exception
Notes
-----
Example usage::
Example
-------
To connect and name 2 motors in parallel::
[async] with init_devices():
t1x = motor.Motor("BLxxI-MO-TABLE-01:X")
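
A slightly fuller version of that example (assuming the `ophyd_async.epics.motor.Motor` class and made-up PV prefixes; `mock=True` avoids needing a real IOC):

```python
import asyncio

from ophyd_async.core import init_devices
from ophyd_async.epics import motor


async def make_motors():
    # Devices created in this block are named after their variables and
    # connected in parallel when the block exits
    async with init_devices(mock=True):
        t1x = motor.Motor("BLxxI-MO-TABLE-01:X")
        t1y = motor.Motor("BLxxI-MO-TABLE-01:Y")
    return t1x, t1y


t1x, t1y = asyncio.run(make_motors())
```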