neuraxle.api

Neuraxle’s API classes

Neuraxle’s high-level API classes. Useful for building complex Deep Learning pipelines with only a few calls.

Classes

DeepLearningPipeline(pipeline, …[, …])

Adds an epoch loop, a validation split, and mini-batching to a pipeline.

class neuraxle.api.DeepLearningPipeline(pipeline: Union[neuraxle.base.BaseStep, List[Union[Tuple[str, BaseStep], BaseStep]]], validation_size: float = None, batch_size: int = None, batch_metrics: Dict[str, Callable] = None, shuffle_in_each_epoch_at_train: bool = True, seed: int = None, n_epochs: int = 1, epochs_metrics: Dict[str, Callable] = None, scoring_function: Callable = None, cache_folder: str = None, print_epoch_metrics=False, print_batch_metrics=False)[source]

Adds an epoch loop, a validation split, and mini-batching to a pipeline. It also tracks batch metrics and epoch metrics.

Example usage:

p = DeepLearningPipeline(
    pipeline,
    validation_size=VALIDATION_SIZE,
    batch_size=BATCH_SIZE,
    batch_metrics={'mse': to_numpy_metric_wrapper(mean_squared_error)},
    shuffle_in_each_epoch_at_train=True,
    n_epochs=N_EPOCHS,
    epochs_metrics={'mse': to_numpy_metric_wrapper(mean_squared_error)},
    scoring_function=to_numpy_metric_wrapper(mean_squared_error),
)

# fit_transform returns the fitted pipeline and the transformed outputs.
p, outputs = p.fit_transform(data_inputs, expected_outputs)

# Metrics recorded while fitting can be read back per batch and per epoch,
# for both the train and the validation split.
batch_mse_train = p.get_batch_metric_train('mse')
epoch_mse_train = p.get_epoch_metric_train('mse')
batch_mse_validation = p.get_batch_metric_validation('mse')
epoch_mse_validation = p.get_epoch_metric_validation('mse')
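
The example refers to a few names it does not define (pipeline, the constants, the metric wrapper, and the data). A minimal, hypothetical setup is sketched below; the SKLearnWrapper step and the to_numpy_metric_wrapper helper shown are illustrative assumptions, not the canonical definitions:

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from neuraxle.pipeline import Pipeline
from neuraxle.steps.sklearn import SKLearnWrapper  # assumed location

VALIDATION_SIZE = 0.15
BATCH_SIZE = 32
N_EPOCHS = 10

# Inner pipeline to be wrapped by DeepLearningPipeline (here a simple
# scikit-learn regressor stands in for a real deep learning model).
pipeline = Pipeline([SKLearnWrapper(Ridge())])


def to_numpy_metric_wrapper(metric_fn):
    # Hypothetical helper: adapt a scikit-learn metric so it accepts the
    # (data_inputs, expected_outputs) pair flowing through the pipeline.
    def metric(data_inputs, expected_outputs):
        return metric_fn(np.array(expected_outputs), np.array(data_inputs))
    return metric


# Toy data for the example above.
data_inputs = np.random.random((64, 3))
expected_outputs = np.random.random((64, 1))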

It uses EpochRepeater, ValidationSplitWrapper, and MiniBatchSequentialPipeline, as sketched below.
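
The sketch below shows roughly how such a composition could be assembled by hand from those wrappers. The import paths and constructor arguments are assumptions made for illustration and may differ from DeepLearningPipeline’s actual internals; metric tracking is omitted for brevity.

from neuraxle.pipeline import MiniBatchSequentialPipeline
from neuraxle.metaopt.random import ValidationSplitWrapper  # assumed location
from neuraxle.steps.loop import EpochRepeater  # assumed location


def wrap_by_hand(pipeline, validation_size, batch_size, n_epochs, scoring_function):
    # Innermost: run the wrapped pipeline on mini-batches of size batch_size.
    step = MiniBatchSequentialPipeline([pipeline], batch_size=batch_size)

    # Split the data into train / validation and score with scoring_function.
    step = ValidationSplitWrapper(
        step, test_size=validation_size, scoring_function=scoring_function)

    # Outermost: repeat the whole fit for n_epochs epochs.
    return EpochRepeater(step, epochs=n_epochs)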

get_score()[source]

Get the latest score. This method is defined so that hyperparameter optimization steps can score the pipeline.

Returns

score

Return type

float
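
Continuing the example above, the latest overall score (computed with the scoring_function passed to the constructor) can be read back after fitting, which is what hyperparameter optimization steps rely on:

latest_mse = p.get_score()
print('latest mse:', latest_mse)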

get_score_train() → float[source]

Get the latest training score.

Returns

score

Return type

float

get_score_validation()[source]

Get the latest validation score.

Returns

score

Return type

float
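
As a follow-up to the example above, the train and validation scores can be compared for a quick overfitting check (the threshold below is illustrative):

train_mse = p.get_score_train()
validation_mse = p.get_score_validation()

# A validation score much worse than the train score suggests overfitting.
if validation_mse > 1.5 * train_mse:
    print('warning: validation mse {:.4f} is much higher than train mse {:.4f}'.format(
        validation_mse, train_mse))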