
Option to use SELU for the activation layer in mlp via keras #1127

Closed
obgeneralao opened this issue Jun 28, 2024 · 2 comments · Fixed by #1244

@obgeneralao

Is it possible to add the "selu" activation function for the multilayer perceptron via keras? Also, how can I use the "Adamax" optimizer instead of the default "Adam"?

I am using keras==2.15 and tensorflow==2.15 and parsnip==1.2.1.

Thanks a lot.

@EmilHvitfeldt
Member

You can set the optimizer as an engine argument via set_engine():

library(tidymodels)

# Pass the optimizer as an engine argument; parsnip forwards it to keras.
model <- 
  mlp() %>%
  set_engine("keras", optimizer = "Adamax") %>%
  set_mode("classification")

model_fit <- fit(model, class ~ ., sim_classification(1000))

# Confirm that the fitted keras model uses Adamax
model_fit$fit$optimizer
#> <keras.src.optimizers.legacy.adamax.Adamax object at 0x342a737d0>

Created on 2025-01-29 with reprex v2.1.1
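
For the activation question: once the fix tracked in #1244 lands, requesting SELU should be a matter of passing it to mlp()'s activation argument. A minimal sketch, assuming "selu" becomes an accepted activation value (current releases may reject it):

library(tidymodels)

# Sketch only: assumes "selu" is an allowed activation value after #1244.
model <-
  mlp(hidden_units = 10, activation = "selu") %>%
  set_engine("keras") %>%
  set_mode("classification")

model_fit <- fit(model, class ~ ., sim_classification(1000))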


This issue has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex: https://reprex.tidyverse.org) and link to this issue.

@github-actions github-actions bot locked and limited conversation to collaborators Feb 13, 2025