:py:mod:`orcanet.backend`
=========================

.. py:module:: orcanet.backend

.. autoapi-nested-parse::

   Code for training and validating NNs, as well as evaluating them.


Module Contents
---------------

Functions
~~~~~~~~~

.. autoapisummary::

   orcanet.backend.train_model
   orcanet.backend.validate_model
   orcanet.backend.weighted_average
   orcanet.backend.h5_inference
   orcanet.backend.make_model_prediction


.. py:function:: train_model(orga, model, epoch, batch_logger=False)

   Train a model on one file and return the history.

   :Parameters:

       **orga** : orcanet.core.Organizer
           Contains all the configurable options in the OrcaNet scripts.

       **model** : keras.Model
           A compiled keras model.

       **epoch** : tuple
           Current epoch and the number of the file to train on.

       **batch_logger** : bool
           Use the orcanet batchlogger to log the training.

   :Returns:

       **history** : dict
           The history of the training on this file. A record of training
           loss values and metrics values.


.. py:function:: validate_model(orga, model)

   Validate a model on all validation files and return the history.

   :Parameters:

       **orga** : orcanet.core.Organizer
           Contains all the configurable options in the OrcaNet scripts.

       **model** : keras.Model
           A compiled keras model.

   :Returns:

       **history** : dict
           The history of the validation on all files. A record of
           validation loss values and metrics values.


.. py:function:: weighted_average(histories, f_sizes)

   Average multiple histories, weighted with the file size.

   Each history can have multiple metrics, which are averaged separately.

   :Parameters:

       **histories** : List
           List of histories, one for each file. Each history is also a
           list: each entry is a different loss or metric.

       **f_sizes** : List
           List of the file sizes, in the same order as the histories,
           i.e. the file of histories[0] has the length f_sizes[0].

   :Returns:

       **wgtd_average** : List
           The weighted averaged history.
           Has the same length as each history in the histories list,
           i.e. one entry per loss or metric.


.. py:function:: h5_inference(orga, model, files_dict, output_path, samples=None, use_def_label=True)

   Let a model predict on all samples in an h5 file, and save the result as an h5 file.

   By default, the output h5 file will contain a datagroup y_values taken
   straight from the given files, as well as two datagroups per output
   layer of the network, which hold the labels and the predicted values,
   respectively, as numpy arrays.

   :Parameters:

       **orga** : orcanet.core.Organizer
           Contains all the configurable options in the OrcaNet scripts.

       **model** : keras.Model
           Trained Keras model of a neural network.

       **files_dict** : dict
           Dict mapping model input names to h5 file paths.

       **output_path** : str
           Name of the output h5 file containing the predictions.

       **samples** : int, optional
           Don't use all events in the file, but only the given number.

       **use_def_label** : bool
           If True and no label modifier is given by the user, use the
           default label modifier instead of none.


.. py:function:: make_model_prediction(orga, model, epoch, fileno, samples=None)

   Let a model predict on all validation samples, and save the result as an h5 file.

   By default, the output h5 file will contain a datagroup y_values taken
   straight from the given files, as well as two datagroups per output
   layer of the network, which hold the labels and the predicted values,
   respectively, as numpy arrays.

   :Parameters:

       **orga** : orcanet.core.Organizer
           Contains all the configurable options in the OrcaNet scripts.

       **model** : keras.Model
           A compiled keras model.

       **epoch** : int
           Epoch of the last model training step in the (epoch, file_no) tuple.

       **fileno** : int
           File number of the last model training step in the (epoch, file_no) tuple.

       **samples** : int or None
           Number of events that should be predicted. If samples=None,
           the whole file will be used.
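The file-size weighting used by ``weighted_average`` can be sketched as follows. This is a minimal pure-Python sketch matching the docstring above, not OrcaNet's actual implementation; it assumes each inner list holds one scalar per loss or metric:

.. code-block:: python

   def weighted_average(histories, f_sizes):
       """Average histories entry-wise, weighting each file by its size.

       histories : list of lists, one inner list per file; each inner
           list holds one value per loss/metric.
       f_sizes : list of file sizes, in the same order as histories.
       """
       total = sum(f_sizes)
       n_metrics = len(histories[0])
       # For each metric, sum value * file size over all files,
       # then divide by the total number of samples.
       return [
           sum(hist[m] * size for hist, size in zip(histories, f_sizes)) / total
           for m in range(n_metrics)
       ]

   # Example: two files with per-file [loss, accuracy]; the second file
   # is three times larger, so its values carry three times the weight.
   print(weighted_average([[1.0, 0.5], [3.0, 0.7]], [100, 300]))
   # → [2.5, 0.65]

The returned list has one entry per loss or metric, matching the shape of each input history, so it can be written to the summary log in the same format as a single-file history.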