prepare_weights_double: output file is overwritten when processing different entries.
The following code does not handle this use case:
- we process a first dataset XXX stored at fileA / entry0000
  - -> this creates the file
- we process a second dataset YYY stored at fileA / entry0001
  - -> this overwrites the existing file, when it could / should create another entry.
This comes from the following lines:
```python
def create_heli_maps(profile, process_file_name, entry_name, transition_width):
    profile = profile / profile.max()
    profile = profile.astype("f")
    profile = gaussian_filter(profile, 10)
    if os.path.exists(process_file_name):
        fd = h5py.File(process_file_name, "r+")
    else:
        fd = h5py.File(process_file_name, "w")
```
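A minimal sketch of one possible fix, under two assumptions: the maps are currently written at a fixed path inside the file (so a second entry clobbers the first), and the dataset name `"profile"` used below is a placeholder for whatever the real function writes. The idea is to open the file in append mode (`"a"` creates it if missing, opens it read/write otherwise) and namespace the output under a per-entry HDF5 group, so each processed entry gets its own node:

```python
import h5py
from scipy.ndimage import gaussian_filter


def create_heli_maps(profile, process_file_name, entry_name, transition_width):
    profile = profile / profile.max()
    profile = profile.astype("f")
    profile = gaussian_filter(profile, 10)
    # "a" creates the file if needed and opens it read/write otherwise,
    # which collapses the os.path.exists branches into one call.
    with h5py.File(process_file_name, "a") as fd:
        # Namespace the maps under the entry (e.g. "entry0000", "entry0001")
        # so processing a second entry adds a sibling group instead of
        # overwriting what was written for the first one.
        group = fd.require_group(entry_name)
        # "profile" is a placeholder dataset name for this sketch.
        if "profile" in group:  # allow re-processing the same entry
            del group["profile"]
        group.create_dataset("profile", data=profile)
```

With this layout, processing fileA / entry0000 and then fileA / entry0001 leaves both groups side by side in the same file.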