Fix for issue #21118: inconsistent behavior across callbacks #21275
Conversation
…d logic used in EarlyStopping.
Codecov Report
Additional details and impacted files
@@ Coverage Diff @@
## master #21275 +/- ##
==========================================
+ Coverage 82.57% 82.60% +0.02%
==========================================
Files 564 565 +1
Lines 54677 54772 +95
Branches 8500 8508 +8
==========================================
+ Hits 45152 45243 +91
- Misses 7435 7439 +4
Partials 2090 2090
Flags with carried forward coverage won't be shown.
Thanks for the PR!
@@ -197,6 +183,44 @@ def __init__(
            f"filepath={self.filepath}"
        )

    def _set_monitor_op(self):
Can we refactor this logic into a standalone function that we could reuse across all callbacks that need this functionality?
I added a MonitorCallback base class for all the callbacks that use this functionality (currently EarlyStopping, ReduceLROnPlateau, and ModelCheckpoint). This ensures consistent behavior across callbacks and reduces code duplication.
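For context, here is a minimal sketch of what such a shared base class can look like. It is not the code merged in this PR: the class name MonitorCallbackSketch, the _is_improvement helper, and the name-suffix heuristic for "auto" mode are simplifying assumptions made for illustration.

```python
import warnings

import numpy as np
from keras import callbacks


class MonitorCallbackSketch(callbacks.Callback):
    """Shared metric-monitoring logic for EarlyStopping-style callbacks."""

    def __init__(self, monitor="val_loss", mode="auto", min_delta=0.0):
        super().__init__()
        if mode not in ("auto", "min", "max"):
            warnings.warn(f"Unknown mode: {mode!r}, falling back to 'auto'.")
            mode = "auto"
        self.monitor = monitor
        self.mode = mode
        self.min_delta = abs(min_delta)
        self.monitor_op = None
        self._set_monitor_op()

    def _set_monitor_op(self):
        # Resolve the direction of improvement once, so every subclass
        # (EarlyStopping, ReduceLROnPlateau, ModelCheckpoint) agrees on it.
        if self.mode == "min":
            self.monitor_op = np.less
        elif self.mode == "max":
            self.monitor_op = np.greater
        else:
            # "auto": metrics whose names suggest "higher is better"
            # (accuracy, AUC, ...) are maximized; everything else, such as
            # losses, is minimized. This heuristic is deliberately simplified.
            if self.monitor.endswith(("acc", "accuracy", "auc")):
                self.monitor_op = np.greater
            else:
                self.monitor_op = np.less
        # Flip the sign of min_delta so the improvement check below works
        # in both directions with a single expression.
        if self.monitor_op is np.less:
            self.min_delta *= -1

    def _is_improvement(self, current, best):
        # The first observed value always counts as an improvement.
        if best is None:
            return True
        return bool(self.monitor_op(current - self.min_delta, best))
```

Subclasses can then share a single improvement check instead of each re-implementing the min/max logic, which is the kind of duplication this PR removes.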
…ss all callbacks that needs it
from keras.src.trainers import compile_utils


@keras_export("keras.callbacks.MonitorCallback")
The refactoring into a shared base class makes sense, but please do not export it to the public API.
I removed it. However, I do think it could be useful as a public API — perhaps in a follow-up PR with some adjustments. It would be helpful for users who want to create custom callbacks that monitor a metric — for example, plotting something whenever the loss decreases.
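As a concrete illustration of that use case, here is a hedged sketch of such a custom callback, written against only the public keras.callbacks.Callback API; the ImprovementLogger name and its behavior are hypothetical and not part of this PR.

```python
import numpy as np
from keras import callbacks


class ImprovementLogger(callbacks.Callback):
    """Reacts (here: prints) whenever the monitored metric improves."""

    def __init__(self, monitor="val_loss", mode="min"):
        super().__init__()
        self.monitor = monitor
        self.monitor_op = np.less if mode == "min" else np.greater
        self.best = None

    def on_epoch_end(self, epoch, logs=None):
        current = (logs or {}).get(self.monitor)
        if current is None:
            return
        if self.best is None or self.monitor_op(current, self.best):
            self.best = current
            # Plot, log, or save artifacts here on every improvement.
            print(f"Epoch {epoch}: {self.monitor} improved to {current:.4f}")
```

It would be passed to model.fit(..., callbacks=[ImprovementLogger()]) like any other callback; a public MonitorCallback base class would let it drop the hand-rolled monitor_op/best bookkeeping.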
LGTM, thank you for the updates!
#21118