MLPNetwork

class MLPNetwork(n_layers: int = 3, n_units: int | list[int] = 500, activation: str | list[str] = 'relu', dropout_rate: float | list[float] | None = None, dropout_last: float | None = None, use_bias: bool = True)

Establish the network structure for an MLP.

Adapted from the implementation used in [1].

Parameters:
n_layers : int, optional (default=3)

The number of dense layers in the MLP.

n_units : int or list[int], optional (default=500)

Number of units in each dense layer.

activation : str or list[str], optional (default='relu')

Activation function(s) for each dense layer.

dropout_rate : float or list[float], optional (default=None)

Dropout rate(s) for each dense layer. If None, a default rate of 0.2 is used for every layer except the first, which uses 0.1. Dropout rates should be numbers in the interval [0, 1].

dropout_last : float, optional (default=None)

The dropout rate of the last layer. If None, a default rate of 0.3 is used.

use_bias : bool, optional (default=True)

Whether to use bias values in the dense layers.
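A minimal usage sketch of the constructor. The import path from aeon.networks is an assumption; adjust it to your installation. The list values below spell out the documented defaults explicitly.

from aeon.networks import MLPNetwork  # import path is an assumption

# Scalar arguments are broadcast to all layers; lists give per-layer values.
network = MLPNetwork(
    n_layers=3,
    n_units=[500, 500, 500],       # one entry per dense layer
    activation="relu",             # applied to all three layers
    dropout_rate=[0.1, 0.2, 0.2],  # documented defaults: 0.1 first, then 0.2
    dropout_last=0.3,              # dropout after the final dense layer
    use_bias=True,
)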

Notes

Adapted from the source code at https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/mlp.py
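For orientation, here is a plain Keras sketch of the topology the defaults describe: the input is flattened, then passed through three Dense(500, relu) blocks, each preceded by dropout, with a final dropout of 0.3. This follows the dl-4-tsc reference implementation and is illustrative, not the library's exact internals.

from tensorflow import keras

def mlp_baseline(input_shape):
    """Illustrative Keras equivalent of the default MLPNetwork topology."""
    input_layer = keras.layers.Input(input_shape)
    x = keras.layers.Flatten()(input_layer)
    # The first block uses the lighter default dropout rate of 0.1.
    x = keras.layers.Dropout(0.1)(x)
    x = keras.layers.Dense(500, activation="relu")(x)
    # The remaining blocks use the default rate of 0.2.
    for _ in range(2):
        x = keras.layers.Dropout(0.2)(x)
        x = keras.layers.Dense(500, activation="relu")(x)
    # dropout_last (default 0.3) is applied after the final dense layer.
    output_layer = keras.layers.Dropout(0.3)(x)
    return input_layer, output_layer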

References

[1] Wang et al. Time series classification from scratch with deep neural networks: A strong baseline, IJCNN, 2017.

Methods

build_network(input_shape, **kwargs)

Construct a network and return its input and output layers.

build_network(input_shape, **kwargs)

Construct a network and return its input and output layers.

Parameters:
input_shape : tuple of shape (n_timepoints (m), n_channels (d))

The shape of the data fed into the input layer.

Returns:
input_layer : a keras layer

output_layer : a keras layer
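A sketch of how the returned layers can be assembled into a trainable model. build_network returns only the network body; the classification head, class count, and compile settings below are illustrative additions, not part of build_network itself, and the import path is again an assumption.

from tensorflow import keras
from aeon.networks import MLPNetwork  # import path is an assumption

network = MLPNetwork()
# e.g. series with 100 time points and 1 channel
input_layer, output_layer = network.build_network(input_shape=(100, 1))

# Add a task-specific head on top of the returned output layer.
n_classes = 3  # hypothetical number of target classes
head = keras.layers.Dense(n_classes, activation="softmax")(output_layer)
model = keras.Model(inputs=input_layer, outputs=head)
model.compile(optimizer="adam", loss="categorical_crossentropy")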