Creates a TabTransformer model for tabular omics data. TabTransformer applies self-attention to model interactions between input features.
Usage
make_tabtransformer_learner(
n_heads = 4L,
n_layers = 2L,
d_model = 64L,
dropout = 0.3,
batch_size = 32L,
epochs = 100L,
learning_rate = 1e-04
)
Arguments
- n_heads
Number of attention heads (default: 4)
- n_layers
Number of transformer layers (default: 2)
- d_model
Hidden dimension (default: 64)
- dropout
Dropout rate (default: 0.3)
- batch_size
Training batch size (default: 32)
- epochs
Number of epochs (default: 100)
- learning_rate
Learning rate (default: 0.0001)
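A minimal sketch of constructing the learner, using smaller-than-default settings for a quick run. Only the constructor shown in Usage above is assumed here; how the resulting learner is trained and used for prediction depends on the rest of the package's API and is not shown.

```r
# Build a compact TabTransformer learner for a fast sanity check.
# All arguments are optional; omitted ones fall back to the defaults
# listed under Arguments.
learner <- make_tabtransformer_learner(
  n_heads       = 2L,    # fewer attention heads than the default 4
  n_layers      = 1L,    # single transformer layer
  d_model       = 32L,   # smaller hidden dimension
  dropout       = 0.1,
  batch_size    = 16L,
  epochs        = 10L,   # short run; raise toward 100 for real training
  learning_rate = 1e-3
)
```

Note that `d_model` should be divisible by `n_heads`, since multi-head attention splits the hidden dimension evenly across heads.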