NeXus compliant external writer process

BLISS offers the possibility to run a separate process (on the system level) for saving and archiving the acquired data. The code of this external NeXus writer is maintained by the ESRF Data Analysis Unit (DAU) to ensure seamless integration with data analysis tools provided by the DAU.

To start a session writer as a process inside an environment where BLISS is installed:

    $ NexusSessionWriter test_session --log=info

To allow for a proper NeXus structure, add these lines to the session's user script (strongly recommended but not strictly necessary):

    from nexus_writer_service import metadata
    metadata.register_all_metadata_generators()

Warning

Currently the external NeXus writer is in an experimental state, and the protocol to ensure the completeness of the saved data still needs to be put in place.

To test the external writer without using the file saving mechanism provided by BLISS itself, enter the following line in the BLISS shell:

    SCAN_SAVING.writer = 'null'

To start a session writer as a service inside an environment where BLISS is installed:

    $ NexusWriterService

Example: using the BLISS data API for HDF5 saving

Note

The example script discussed here is provided in the BLISS repository at scripts/external_saving_example/external_saving_example.py. To set up a minimal working BLISS environment, have a look at the installation notes and the test_configuration setup.

Listening to a BLISS session

When running the script

    python scripts/external_saving_example/external_saving_example.py

it listens for new scans in the BLISS test_session:

    listen_to_session_wait_for_scans("test_session")

Connect to the node 'test_session' in Redis:

    session_node = get_session_node(session)

The walk_on_new_events() function is used with filter="scan" (limiting the walk to nodes with node.type == "scan") in order to handle new events on scan nodes:

  • NEW_NODE when a new scan is launched
  • END_SCAN when a scan terminates.

    # wait for new events on scan
    for event_type, node in session_node.iterator.walk_on_new_events(
            filter="scan", from_next=True):

Receiving events from a scan

In the example script a new instance of the class HDF5_Writer is created for each scan that is started. Following the initialisation, a gevent greenlet is spawned to run the actual listener in a non-blocking way. Inside def run(self) a second iterator is started, walking through all events emitted by the scan (see the data structure section):

    for event_type, node in self.scan_node.iterator.walk_events():
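
A condensed sketch of this structure, assuming the spawned greenlet is kept in an attribute; the event dispatch branches are filled in below:

    import gevent

    class HDF5_Writer:
        def __init__(self, scan_node):
            self.scan_node = scan_node
            # run the listener without blocking the session-level loop
            self._listener = gevent.spawn(self.run)

        def run(self):
            # walk all events emitted by this scan
            for event_type, node in self.scan_node.iterator.walk_events():
                if event_type.name == "NEW_NODE" and node.type == "channel":
                    pass  # create the corresponding hdf5 dataset
                elif event_type.name == "NEW_DATA_IN_CHANNEL":
                    pass  # append the newly published data (see below)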

Hint

Data generated by scans in BLISS is emitted through a structure called a channel. Further reading can be found in the BLISS documentation on channels.

Once an event is received, it can be categorized by the event type:

  • NEW_NODE
  • NEW_DATA_IN_CHANNEL

and by node type:

  • channel
  • lima

Currently, 0D and 1D data is kept directly in Redis and published through channels (node.type == "channel"). For each new channel a corresponding HDF5 dataset is created and filled with the data emitted on each "NEW_DATA_IN_CHANNEL" event.
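
A minimal sketch of how such a growing dataset could be handled with h5py; the node.get(...) call for fetching the newly published data is an assumption modelled on the example script:

    import h5py
    import numpy

    def append_to_dataset(h5file, name, new_data):
        # create a resizable dataset on first use, extend it afterwards
        data = numpy.atleast_1d(numpy.asarray(new_data))
        if name not in h5file:
            h5file.create_dataset(
                name, data=data, maxshape=(None,) + data.shape[1:])
        else:
            dset = h5file[name]
            n = dset.shape[0]
            dset.resize(n + data.shape[0], axis=0)
            dset[n:] = data

    # on a NEW_DATA_IN_CHANNEL event (inside HDF5_Writer.run):
    # new_data = node.get(last_index, -1)   # assumption: channel node API
    # append_to_dataset(self.h5file, node.name, new_data)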

2D data (e.g. Lima images) is not saved in Redis itself, but can be retrieved through references (method get_image of class LimaDataView). However, for this example we do not want to deal with the 2D data itself, but only resolve the final saving destination.

In this example, data from channels (not Lima) is written to the HDF5 file as soon as the event is received, while references to images are only saved in HDF5 once the scan has ended (nothing prevents doing this on the fly as well).
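
How such references could end up in the file, sketched with h5py; how the filenames are resolved from the LimaDataView is glossed over here:

    import h5py

    def save_image_references(h5file, path, filenames):
        # store references to externally saved lima images as
        # variable-length strings, e.g. "scan0001_lima.edf"
        h5file.create_dataset(
            path, data=filenames, dtype=h5py.string_dtype())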

Finalizing the scan dataset on “END_SCAN”

Once the END_SCAN event is received, the finalize method of the HDF5_Writer instance is called to (as sketched below)

  1. stop listening to events of the scan concerned,
  2. perform a final synchronization for all datasets of the scan, and
  3. write instrument and metadata entries to HDF5.
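
A minimal sketch of these three steps; the attribute names (_listener, h5file) and the write_metadata helper are assumptions carried over from the sketches above:

    def finalize(self):
        # 1) stop the event listener greenlet of this scan
        self._listener.kill()
        # 2) final synchronization: flush pending data to disk
        self.h5file.flush()
        # 3) write instrument and metadata entries (see next section)
        self.write_metadata()   # hypothetical helper
        self.h5file.close()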

Metadata and instrument dataset

Each scan has an attached scan_info structure (a nested dict) which, among other things, contains metadata entries that also have to be written to HDF5. BLISS provides a dicttoh5 function, derived from its counterpart in silx.io.dictdump, that sets the correct h5dataset.attrs["NX_class"] attributes when converting the Python dict structure into HDF5 datasets.
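
A sketch of what this amounts to, using silx's dicttoh5 directly and setting one NX_class attribute by hand; the entry name "1.1" and the scan_info content are made up for illustration:

    import h5py
    from silx.io.dictdump import dicttoh5

    # illustrative scan_info fragment (real scans carry much more)
    scan_info = {
        "title": "ascan roby 0 1 10 0.1",
        "instrument": {"positioners": {"roby": 0.5}},
    }

    with h5py.File("scan.h5", "a") as h5file:
        # map the nested dict onto hdf5 groups and datasets
        dicttoh5(scan_info, h5file, h5path="/1.1")
        # the BLISS variant sets NX_class attributes automatically;
        # done by hand here for the instrument group
        h5file["/1.1/instrument"].attrs["NX_class"] = "NXinstrument"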

Examples of complex scans

To provide test cases for more demanding scans when working with the presented API, a script file is provided that can be executed inside the BLISS shell:

    TEST_SESSION [1]:   exec(open("scripts/external_saving_example/some_scans.py").read())

The same scans can also be executed using BLISS in library mode (TANGO_HOST to be chosen according to the running server…):

    BEACON_HOST=localhost TANGO_HOST=localhost:20000 python scripts/external_saving_example/some_scans_bliss_as_library.py