airpack.tf1.fileio

Module Contents

airpack.tf1.fileio.bytes2signal(inset, dtype=tf.int16, scalar=1.0)

Decodes a set of raw data (bytes) to dtype and normalizes the result if dtype is an integer type.

Parameters
  • inset (bytes) – Raw data (bytes) to decode

  • dtype (tensorflow.dtypes.DType) – dtype to convert raw data to

  • scalar (float) – Scalar gain value (linear units)

Returns

A Tensor object storing the decoded bytes.

Return type

tensorflow.Tensor

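Example usage (a minimal sketch, not part of the original documentation; the file path is hypothetical and the snippet assumes TensorFlow 1 graph mode):

import tensorflow as tf
from airpack.tf1 import fileio

# Read raw bytes from a signal file and decode them to a normalized tensor
with open('data/train/0/example.bin', 'rb') as f:  # hypothetical file
    raw = f.read()
signal = fileio.bytes2signal(raw, dtype=tf.int16, scalar=1.0)

with tf.compat.v1.Session() as sess:
    samples = sess.run(signal)  # NumPy array of decoded, scaled samples
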
airpack.tf1.fileio.pars_folder(datapath, shuffle=True)

Recursively find all files with the “.bin” extension in the data path and shuffle the data set if requested.

Note

The numeric label is the name of the bottom-most folder in the tree of data: e.g., for file data/train/x/y/file.bin, the label is y.

Parameters
  • datapath (Union[str, os.PathLike]) – Directory that the data resides in

  • shuffle (bool) – Whether to shuffle the data set

Returns

A tuple (filenames, labels) of file paths and their corresponding numeric labels

Return type

Tuple[List[str], List[int]]

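Example usage (a minimal sketch, not part of the original documentation; the data path is hypothetical):

from airpack.tf1 import fileio

# List all .bin files under the data path along with their numeric labels,
# where each label is the name of the file's bottom-most folder
filenames, labels = fileio.pars_folder('data/train', shuffle=True)
print(filenames[0], labels[0])  # e.g. 'data/train/3/file0.bin' and 3
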
airpack.tf1.fileio.datareader(sess, data_folder, input_len, output_len, n_epoch, batch_size, dtype=tf.int16, nthread=4, buffer_size=16, interweaved=True, scalar=1.0)

Data pipeline optimized for reading signal data (I/Q) and feeding it to a deep learning model.

Example usage:

import tensorflow as tf
from airpack.tf1 import fileio

data_path = 'data/train'  # hypothetical data directory
sess = tf.compat.v1.Session()

# Initialize the TensorFlow session
init = tf.compat.v1.global_variables_initializer()
sess.run(init)
# Build the data pipeline
batch_x, batch_y, nfiles = fileio.datareader(
    sess,
    data_path,
    input_len=2048,
    output_len=12,
    n_epoch=10,
    batch_size=128
)
# Read the data in a while loop
try:
    while True:
        data, labels = sess.run([batch_x, batch_y])
except tf.errors.OutOfRangeError:  # raised once n_epoch is exceeded
    pass
Parameters
  • sess (tensorflow.compat.v1.Session) – TensorFlow Session

  • data_folder (Union[str, os.PathLike]) – Directory that the data resides in

  • input_len (int) – Number of complex samples as the input to the neural network.

  • output_len (int) – Number of labels possible (defines output layer length)

  • n_epoch (int) – Number of epochs before tf.errors.OutOfRangeError is thrown

  • batch_size (int) – Number of samples read in each batch (iteration)

  • dtype (tensorflow.dtypes.DType) – dtype of the data in datafiles

  • nthread (int) – Number of CPU threads to use when pipelining data reads

  • buffer_size (int) – Buffer size used for shuffling the data

  • interweaved (bool) – Whether the data is interleaved I/Q

  • scalar (float) – Scalar multiplied by the signal for data normalization

Returns

(batch_x, batch_y, nfiles), where batch_x is an iterator over batches of training data, batch_y is an iterator over the corresponding labels, and nfiles is the total number of files

Return type

Tuple

class airpack.tf1.fileio.TfSaver(saver_dir)

Class that saves the TensorFlow 1 session during training.

Once created, the save() method can be used to save the model periodically during training and again once the model is fully trained.

Example usage:

import tensorflow as tf
from airpack.tf1 import fileio

model_save_folder = 'saved_models'  # hypothetical output directory
sess = tf.compat.v1.Session()

# Initialize the TensorFlow session
init = tf.compat.v1.global_variables_initializer()
sess.run(init)
saver = fileio.TfSaver(model_save_folder)
while True:
    # <train model here>
    saver.save(sess)  # periodically checkpoint the session
Parameters

saver_dir – Directory in which to save the TensorFlow model

save(self, tf_session)

Save the state of a TensorFlow session.

Parameters

tf_session (tensorflow.compat.v1.Session) – TensorFlow Session to save the state of.

airpack.tf1.fileio.sess2uff(sess, in_node_name='input/IteratorGetNext', out_node_name='output/networkout', filename='saved_model.uff', addsoftmax=False, quiet=False)

Convert a TensorFlow session to a UFF file.

This function is used to freeze a TensorFlow 1 model and export it in a format that TensorRT can read and convert to an optimized .plan file for deployment on the AIR-T.

Note

The values for in_node_name and out_node_name are defined in the TensorFlow 1 model using tf.name_scope().

Parameters
  • sess (tensorflow.compat.v1.Session) – TensorFlow Session

  • in_node_name (str) – Input node name

  • out_node_name (str) – Output node name

  • filename (Union[str, os.PathLike]) – Name of file to export

  • addsoftmax (bool) – Append a tf.nn.softmax() operation to the output. Some training methods incorporate the softmax elsewhere (e.g., in the loss function), so it must be added to the exported graph after training

  • quiet (bool) – Suppress verbose output

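Example usage (a minimal sketch, not part of the original documentation; it assumes sess is a trained TensorFlow 1 session and uses the default node names):

from airpack.tf1 import fileio

# Freeze the trained session and export it as a UFF file for TensorRT
fileio.sess2uff(
    sess,  # assumed to be a trained tf.compat.v1.Session
    in_node_name='input/IteratorGetNext',
    out_node_name='output/networkout',
    filename='saved_model.uff',
    addsoftmax=True  # append a softmax if the trained graph outputs raw logits
)
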
airpack.tf1.fileio.sess2onnx(sess, in_node_name='input/IteratorGetNext', out_node_name='output/networkout', filename='saved_model.onnx', addsoftmax=False)

Convert a TensorFlow session to an ONNX file.

This function is used to freeze a TensorFlow 1 model and export it in a format that TensorRT can read and convert to an optimized .plan file for deployment on the AIR-T.

Note

The values for in_node_name and out_node_name are defined in the TensorFlow 1 model using tf.name_scope().

Parameters
  • sess (tensorflow.compat.v1.Session) – TensorFlow Session

  • in_node_name (str) – Input node name

  • out_node_name (str) – Output node name

  • filename (Union[str, os.PathLike]) – Name of file to export

  • addsoftmax (bool) – Append a tf.nn.softmax() operation to the output. Some training methods incorporate the softmax elsewhere (e.g., in the loss function), so it must be added to the exported graph after training

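Example usage (a minimal sketch, not part of the original documentation; it mirrors the sess2uff example above and assumes sess is a trained TensorFlow 1 session):

from airpack.tf1 import fileio

# Freeze the trained session and export it as an ONNX file for TensorRT
fileio.sess2onnx(sess, filename='saved_model.onnx', addsoftmax=True)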