
Commit 62f91e8

Merge branch 'main' into remove_i22_tet_plugins
2 parents e5e72a9 + 67fa60a commit 62f91e8

File tree: 150 files changed, +746 −515 lines changed


conftest.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -15,7 +15,7 @@
     PathProvider,
 )
 from tests.devices.i10.test_data import LOOKUP_TABLE_PATH
-from tests.devices.unit_tests.test_daq_configuration import MOCK_DAQ_CONFIG_PATH
+from tests.devices.test_daq_configuration import MOCK_DAQ_CONFIG_PATH
 from tests.test_data import (
     TEST_DISPLAY_CONFIG,
     TEST_OAV_ZOOM_LEVELS_XML,
```
Lines changed: 27 additions & 0 deletions
New file:

# 6. Handle devices shared between multiple endstations

Date: 2025-08-27

## Status

Proposed

## Context

Some beamlines have multiple endstations that share hardware in the optics or experiment hutch, and these endstations could potentially try to control it at the same time. Any device in the common hutch should be fully controlled by only one endstation at a time - the one that is taking data - but should still be readable from the other endstations.

## Decision

The current solution is to have a separate blueapi instance for the shared hutch, in order to control access to all the devices defined there.
For all hardware in the shared optics hutch, the architecture should follow this structure:

- There is a base device in dodal that sends a REST call to the shared blueapi with the plan and device names, as well as the name of the endstation performing the call.
- There are read-only versions of the shared devices in the endstation blueapi which inherit from the base device above and set up the request parameters.
- The real settable devices are only defined in the shared blueapi and should never be called directly from a plan.
- The shared blueapi instance also has an ``AccessControl`` device that reads from a PV which endstation is in use for beamtime.
- Every plan should then be wrapped in a decorator that reads the ``AccessControl`` device, checks which endstation is making the request, and only allows the plan to run if the two values match.

:::{seealso}
[Optics hutch implementation on I19](https://diamondlightsource.github.io/i19-bluesky/main/explanations/decisions/0004-optics-blueapi-architecture.html) for an example.
:::
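The access-check decorator described in the decision can be sketched as a minimal, framework-free Python example. All names here (`require_beamtime_endstation`, `read_endstation_in_use`, `move_shared_mirror`) are illustrative assumptions, not the real dodal or blueapi API; in practice the check would read the ``AccessControl`` device from within a bluesky plan.

```python
# Minimal sketch of the access-control decorator idea. All names are
# illustrative; the real implementation would read the AccessControl
# device inside a bluesky plan rather than call a plain function.

def require_beamtime_endstation(read_endstation_in_use):
    """Only allow the wrapped plan to run when the requesting endstation
    matches the endstation currently granted beamtime."""

    def decorator(plan):
        def wrapper(requesting_endstation, *args, **kwargs):
            in_use = read_endstation_in_use()  # stand-in for reading the PV
            if in_use != requesting_endstation:
                raise PermissionError(
                    f"{requesting_endstation!r} may not run plans while "
                    f"{in_use!r} is taking data"
                )
            return plan(*args, **kwargs)

        return wrapper

    return decorator


# Hypothetical shared-hutch "plan": EH1 currently has beamtime.
@require_beamtime_endstation(lambda: "EH1")
def move_shared_mirror(position):
    return f"mirror moved to {position}"
```

With this sketch, `move_shared_mirror("EH1", 1.5)` runs normally, while `move_shared_mirror("EH2", 1.5)` raises `PermissionError` because EH2 is not the endstation in use.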

docs/how-to/create-beamline.rst renamed to docs/how-to/create-beamline.md

Lines changed: 6 additions & 7 deletions
````diff
@@ -1,13 +1,11 @@
-Creating a new beamline
-=======================
+# Creating a new beamline
 
 A beamline is a collection of devices that can be used together to run experiments, they may be read-only or capable of being set.
 They include motors in the experiment hutch, optical components in the optics hutch, the synchrotron "machine" and more.
 
-Beamline Modules
-----------------
+## Beamline Modules
 
-Each beamline should have its own file in the ``dodal.beamlines`` folder, in which the particular devices for the
+Each beamline should have its own file in the ``dodal.beamlines`` folder, in which the particular devices for the
 beamline are instantiated. The file should be named after the colloquial name for the beamline. For example:
 
 * ``i03.py``
@@ -16,14 +14,14 @@ beamline are instantiated. The file should be named after the colloquial name fo
 
 Beamline modules (in ``dodal.beamlines``) are code-as-configuration. They define the set of devices and common device
 settings needed for a particular beamline or group of similar beamlines (e.g. a beamline and its digital twin). Some
-of our tooling depends on the convention of *only* beamline modules going in this package. Common utilities should
+of our tooling depends on the convention of *only* beamline modules going in this package. Common utilities should
 go somewhere else e.g. ``dodal.utils`` or ``dodal.beamlines.common``.
 
 The following example creates a fictitious beamline ``w41``, with a simulated twin ``s41``.
 ``w41`` needs to monitor the status of the Synchrotron and has an AdAravisDetector.
 ``s41`` has a simulated clone of the AdAravisDetector, but not of the Synchrotron machine.
 
-.. code-block:: python
+```python
 
 from ophyd_async.epics.adaravis import AravisDetector
 
@@ -85,3 +83,4 @@ The following example creates a fictitious beamline ``w41``, with a simulated tw
     drv_suffix=CAM_SUFFIX,
     fileio_suffix=HDF5_SUFFIX,
 )
+```
````

docs/how-to/move-code.md

Lines changed: 94 additions & 0 deletions
New file:

# Moving code from another repo

In the process of writing code in other DLS repos you may come to realise that it makes more sense for it to be in ``dodal``. It is a good idea to keep the history for this code, which you can do as follows (we will use moving devices from https://github.com/DiamondLightSource/hyperion as an example):

* Clone the codebase you are copying from:

```bash
git clone git@github.com:DiamondLightSource/hyperion.git clone_for_history
cd clone_for_history/
```

* Remove the remote to avoid any mistaken pushes:

```bash
git remote rm origin
```

* Filter out only the directories/files you want to move:

```bash
pip install git-filter-repo
git-filter-repo --path file/to/move --path other/file/to/move -f
```

* Clean everything up:

```bash
git reset --hard
git gc --aggressive
git prune
git clean -fd
```

* Add a note to every commit message to mention it's been moved:

```bash
git filter-branch --msg-filter 'sed "$ a \
NOTE: Commit originally came from https://github.com/DiamondLightSource/hyperion"' -f -- --all
```

* If you have been using GitHub [issue references](https://docs.github.com/en/get-started/writing-on-github/working-with-advanced-formatting/autolinked-references-and-urls#issues-and-pull-requests) in the old repository, modify these to be more explicit (note that this assumes the old repo uses ``#123`` notation and only ever references issues from its own repo):

```bash
git filter-branch -f --msg-filter 'sed "s|#[0-9]\+|DiamondLightSource/hyperion&|g"' -- --all
```

* Prepare the code to be in the correct structure for dodal:

```bash
mkdir -p src/dodal/devices
mv path/to/device src/dodal/devices/
```

* At this point it's a good idea to check the log (``git log``) and the general directory structure to ensure it looks mostly correct.

* Add and commit this (locally):

```bash
git add .
git commit -m "Prepare for import into dodal"
```

* In a different folder clone ``dodal``, remove the origin (for now) to be safe and create a branch:

```bash
git clone git@github.com:DiamondLightSource/dodal.git
cd dodal
git remote rm origin
git checkout -b add_code_from_hyperion
```

* Add the source repo as a remote for ``dodal``:

```bash
git remote add source /path/to/source/old_repo/.git
```

* Pull from the source repo:

```bash
git pull --no-rebase source main --allow-unrelated-histories
```

* This is another point where it's a good idea to check the log (``git log``) and the general directory structure to ensure it looks mostly correct.

* Remove the source remote and re-add origin:

```bash
git remote rm source
git remote add origin git@github.com:DiamondLightSource/dodal.git
```

* Tidy up the code so that it fits into the ``dodal`` repo, e.g. in the Hyperion case we had to change the tests to import from ``dodal`` instead of ``hyperion`` and add some more dependencies.
docs/how-to/move-code.rst

Lines changed: 0 additions & 73 deletions
This file was deleted.

docs/how-to/write-tests.md

Lines changed: 3 additions & 0 deletions
```diff
@@ -7,6 +7,9 @@ Testing is essential to maintain the integrity and reliability of the codebase.
 - **Unit Tests**: Place unit tests for individual components in the `tests` directory, but take care to mirror the file structure of the `src` folder with the corresponding code files. Use the `test_*.py` naming convention for test files.
 - **System Tests**: Tests that interact with DLS infrastructure, network, and filesystem should be placed in the top-level `systems_test` folder. This separation ensures that these tests are easily identifiable and can be run independently from unit tests.
 
+Useful functions for testing that can be reused across multiple tests for common devices and for external plan repositories belong in the `dodal/testing` directory. For example, when mocking a `Motor` device, all of the signals will default to zero, which will cause errors when trying to move. The `patch_motor` and `patch_all_motors` functions, found in `dodal.testing`, will populate the mocked motor with useful default values for the signals so that it can still be used in tests.
+
+
 ## Writing a test for a device
 We aim for high test coverage in dodal with small, modular test functions. To achieve this, we need to test the relevant methods by writing tests for the class/method we are creating or changing, checking for the expected behaviour. We shouldn't need to write tests for parent classes unless we alter their behaviour.
```

Lines changed: 23 additions & 26 deletions
````diff
@@ -1,32 +1,27 @@
-Zocalo Interaction
-==================
+# Zocalo Interaction
 
-.. image:: ../assets/zocalo.png
-   :alt: Diagram of zocalo
+![Diagram of zocalo](../assets/zocalo.png)
 
-Zocalo jobs are triggered based on their ISPyB DCID using the ``ZocaloTrigger`` class in a callback subscribed to the
-Bluesky plan or ``RunEngine``. These can trigger processing for any kind of job, as zocalo infers the necessary
+Zocalo jobs are triggered based on their ISPyB DCID using the ``ZocaloTrigger`` class in a callback subscribed to the
+Bluesky plan or ``RunEngine``. These can trigger processing for any kind of job, as zocalo infers the necessary
 processing from data in ISPyB.
 
-Results are received using the ``ZocaloResults`` device, so that they can be read into a plan and used for
-decision-making. Currently the ``ZocaloResults`` device is only made to handle X-ray centring results. It subscribes to
+Results are received using the ``ZocaloResults`` device, so that they can be read into a plan and used for
+decision-making. Currently the ``ZocaloResults`` device is only made to handle X-ray centring results. It subscribes to
 a given zocalo RabbitMQ channel the first time that it is triggered.
 
-Zocalo Service
-==============
+# Zocalo Service
 
-The Zocalo service processes incoming messages using recipes which describe routing of messages between processing
-steps. You can see `source for the recipes here`_.
-
-.. _source for the recipes here: https://gitlab.diamond.ac.uk/scisoft/zocalo/-/tree/master/recipes
+The Zocalo service processes incoming messages using recipes which describe routing of messages between processing
+steps. You can see [source for the recipes here](https://gitlab.diamond.ac.uk/scisoft/zocalo/-/tree/master/recipes).
 
 You can find more information about Zocalo at https://confluence.diamond.ac.uk/display/SSCC/How+to+create+an+MX+processing+pipeline
 
-Gridscans
----------
+## Gridscans
 
-The Zocalo Service receives messages of the following form for both the xy and xz gridscans::
+The Zocalo Service receives messages of the following form for both the xy and xz gridscans:
 
+```
 {
     'recipes': ['mimas'],
     'parameters': {
@@ -39,9 +34,11 @@
     'guid': 'd6f117bb-c856-4df8-b9bc-2d3c625e9fd5'
     }
 }
+```
 
-Zocalo is then sent stop messages::
+Zocalo is then sent stop messages:
 
+```
 {
     'recipes': ['mimas'],
     'parameters': {
@@ -50,22 +47,22 @@
     'guid': '9a96e59c-da30-494c-8380-c7a5c828c2c9'
     },
 }
+```
 
 these tell zocalo that the data is now ready to be processed.
 
 Zocalo then uses the ISPyB DataCollection ID to fetch the corresponding info from ISPyB
 
-The messages that zocalo receives can be found in Graylog in the Zocalo stream, from there you can find the log of
+The messages that zocalo receives can be found in Graylog in the Zocalo stream, from there you can find the log of
 the recipe processing using the path to the logbook that comes from messages like these::
 
     Message saved in logbook at /dls/tmp/zocalo/dispatcher/2024-12/cb/62d35b-7cc9-4f1b-868b-712e82aa0271
 
-From the zocalo graylog you can also see that once the gridscan nexus file is picked up (for the CPU gridscan) it
-starts a recipe::
+From the zocalo graylog you can also see that once the gridscan nexus file is picked up (for the CPU gridscan) it
+starts a recipe:
 
+```
 {'recipes': ['per-image-analysis-gridscan-i03-no-really'], 'parameters': {'ispyb_dcid': 16085803, 'filename': '(unknown)', 'start_frame_index': '{start_frame_index}', 'number_of_frames': '{number_of_frames}', 'message_index': '{message_index}', 'guid': 'd10f8b8c-57a5-4dc4-acaf-a22f8d2bbf60'}, 'recipe': <workflows.recipe.recipe.Recipe object at 0x7f0587f13110>}
+```
 
-
-If you look at the recipe json, Zocalo then runs Per-Image-Analysis on each frame and then assembles the results in
-the `DLS X-Ray Centring service`_.
-
-.. _DLS X-Ray Centring service: https://github.com/DiamondLightSource/python-dlstbx/blob/a8fcbd30335bf13f5e35df78badfc60397500535/src/dlstbx/services/xray_centering.py
+If you look at the recipe json, Zocalo then runs Per-Image-Analysis on each frame and then assembles the results in
+the [DLS X-Ray Centring service](https://github.com/DiamondLightSource/python-dlstbx/blob/a8fcbd30335bf13f5e35df78badfc60397500535/src/dlstbx/services/xray_centering.py).
````
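The start and stop messages in the Zocalo document above share one shape, which can be sketched as a small helper. This is illustrative only: `make_gridscan_message` is not part of any Zocalo API, and the parameter keys beyond `guid` are assumptions based on the examples in the text.

```python
# Illustrative sketch of the message shape shown in the Zocalo docs above:
# a 'mimas' recipe plus a parameters dict carrying the ISPyB DCID and a
# fresh GUID. Not part of the real Zocalo/workflows API.
import uuid


def make_gridscan_message(ispyb_dcid, **extra_parameters):
    parameters = {
        "ispyb_dcid": ispyb_dcid,       # assumed key, from the recipe example
        "guid": str(uuid.uuid4()),      # unique id, as in the examples
        **extra_parameters,
    }
    return {"recipes": ["mimas"], "parameters": parameters}


msg = make_gridscan_message(16085803)
```

A start and a stop message would then differ only in the extra parameters passed, which matches how the two examples in the document differ.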
