Daiquiri
Along with workflow/sidecar!56 (merged):
from blissoda.id13.daiquiri_xrpd_processor import DaiquiriXrpdProcessor as _XrpdProcessor

xrpd_processor = _XrpdProcessor(enable_plotter=False)


def loopscan_kmap(points, count_time, x_points, *detectors, datacollectionid=None):
    scan_info = {
        "daiquiri_datacollectionid": datacollectionid,
        "instrument": {
            "kmap_parameters": {
                "x_nb_points": int(x_points),
                "y_nb_points": int(points / x_points),
                "@NX_class": "NXcollection",
            }
        },
    }
    return loopscan(points, count_time, *detectors, scan_info=scan_info)
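For illustration only (the detector name, point counts, and datacollectionid below are hypothetical placeholders, not from the original setup), a call from the bliss shell might look like:

# hypothetical call: a 100 x 50 kmap with a 10 ms count time on a placeholder detector object
loopscan_kmap(5000, 0.01, 100, eiger, datacollectionid=1234)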
Along with ui/daiquiri!830 and ui/daiquiri-ui!796 (merged), you now get a generic object which can be connected to a blissoda "processor".
In beacon you create a processor (say processor.yml); this is generic and can be used for any future type of processor:
- name: processor
  plugin: bliss
  package: blissoda.daiquiri.object
  class: DaiquiriProcessor
  processor_package: blissoda.id13.daiquiri_xrpd_processor
  processor_class: DaiquiriXrpdProcessor
  processor_class_options:
    enable_plotter: false
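As a minimal sketch (assuming the standard bliss static-config API, with the object name matching the YAML above), the wrapped processor can then be loaded like any other beacon object:

from bliss.config.static import get_config

config = get_config()
processor = config.get("processor")  # name from processor.yml above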
This wraps the blissoda processor and provides a consistent interface that daiquiri can use to provide a ui. The blissoda processor needs to mix in DaiquiriProcessorMixin to provide the needed schema for the exposed parameters; each "parameter" should map to a property on the blissoda processor.
import os
import json

from marshmallow import fields, ValidationError

from ..daiquiri.mixin import DaiquiriProcessorMixin, ExposedParameter


def exists(value):
    if not os.path.exists(value):
        raise ValidationError(f"File `{value}` does not exist")


def exists_valid_json(value):
    exists(value)
    if value.endswith(".json"):
        try:
            with open(value) as f:
                json.load(f)
        except json.decoder.JSONDecodeError as ex:
            raise ValidationError(f"Could not decode JSON: {str(ex)}")


class DaiquiriXrpdProcessor(Id13XrpdProcessor, DaiquiriProcessorMixin):
    EXPOSED_PARAMETERS = [
        ExposedParameter(
            parameter="workflow_with_saving_diffmap",
            field_type=fields.Str,
            title="Workflow",
        ),
        ExposedParameter(
            parameter="pyfai_config",
            field_type=fields.Str,
            title="PyFAI Config File",
            field_options={"validate": exists_valid_json},
        ),
        ExposedParameter(
            parameter="integration_options",
            field_type=fields.Dict,
            field_options={"keys": fields.Str()},
            title="PyFAI Options",
        ),
        ...
    ]

    ...
Parameters can be picked off the processor with EXPOSED_PARAMETERS. This defines the parameter typing, ui title, and validation, which can do things like check that a file exists and that the json is valid (useful for people manually modifying the poni json).
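For illustration, these validators behave like ordinary marshmallow validators, so a quick manual check (with a placeholder path) would be:

from marshmallow import ValidationError

try:
    exists_valid_json("/path/to/missing/pyfai.json")  # placeholder path
except ValidationError as exc:
    print(exc)  # reports that the file does not exist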
The high level bliss object then provides channels for each of the exposed parameters, so that daiquiri is aware of changes. Changes are reflected bidirectionally, so modifications on the cli are sent to daiquiri and vice versa. The object also polls ewoksjob.client.get_workers() to tell you whether the processor is ready to accept a job, and it looks at the active queue to tell you whether the worker is currently processing.
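As a rough sketch of the readiness side of this (ewoksjob.client.get_workers() comes from the description above; treating its return value as a truthy collection of workers is an assumption):

from ewoksjob.client import get_workers


def processor_ready() -> bool:
    # Assumes get_workers() returns an empty/None result when no ewoks
    # worker is online, and a non-empty collection otherwise.
    try:
        workers = get_workers()
    except Exception:
        return False
    return bool(workers)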
Two outstanding questions:
- Right now PersistentParameters are stored relative to the session name. In bliss, generally speaking, parameters on an object are stored relative to the object name (which has to be unique, enforced by beacon). This allows objects and their parameters to be shared across sessions. So right now, if you load the daiquiri bliss object in a different session from the one where you previously used your processor, your parameters are not available. We sometimes do this when testing, for example importing an object into an empty bliss session.
- Second, because PersistentParameters are not a Cache, there are no associated events, which essentially means that when the daiquiri bliss object wraps the processor it creates another copy of the parameters in redis. In use I don't think this is really a problem, but it is duplicating data.