Add a callback to show elapsed and remaining time #2082
kyouma wants to merge 3 commits into lululxvi:master
Conversation
When you say "multiple outputs during training", are you referring to things like TensorFlow C-level printing, or just normal printing from DeepXDE?
I tried replacing all … Also, I think that we can't guarantee that in the future no …
In this case, your design choice for a custom class is correct, in my opinion.
echen5503 left a comment
Can you also show some logs, after running on some of the DeepXDE examples, to make sure this will display properly?
```python
self.starting_epoch = self._get_iteration()
self.last_display_epoch = self._get_iteration()

def _get_iteration(self):
```
If you can `_get_iteration`, why not `_get_iterations` (total iterations) as well, and reduce overhead in the model.py code?
It is an argument of the `train()` and `_train_sgd()` methods of `Model`, but not a property of `Model`, so callbacks do not have access to it.
A similar callback parameter-setting approach is already used in the `_train_tensorflow_compat_v1_scipy()` method, so I have followed the same pattern.
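As a rough, self-contained illustration of the pattern being discussed (the trainer injects the iteration count onto the callback before training, because it is an argument of `train()` rather than a model attribute) — note all class and method names below are hypothetical, not DeepXDE's actual API:

```python
class Callback:
    """Minimal callback base; real frameworks attach the model similarly."""

    def set_model(self, model):
        self.model = model


class IterationAwareCallback(Callback):
    def on_train_begin(self):
        # `num_iterations` is not a model attribute; it was injected by
        # the trainer just before training started.
        print(f"training for {self.num_iterations} iterations")


class Model:
    def train(self, iterations, callback):
        callback.set_model(self)
        # Mirror of the parameter-setting pattern: forward the train()
        # argument to the callback, since it is not stored on the model.
        callback.num_iterations = iterations
        callback.on_train_begin()
        for _ in range(iterations):
            pass  # one SGD step would go here
```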
Ah, I see. You are right.
Here it is. The training log (…)
OK, looks good. We should think about improving the clarity of multiple logging callbacks in a future PR, #2084
Hello.
There is no estimation of remaining training time, so I have made a callback to print it in tqdm style. Integrating tqdm itself seems very easy (change 1 line in `_train_sgd()`), but is actually impossible due to multiple outputs during training that break the progress bar even with `tqdm.tqdm.write()`.
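A minimal sketch of the underlying estimate (not the PR's actual implementation; the class and method names here are hypothetical): track elapsed wall-clock time, divide by iterations completed to get a rate, and project the remainder, formatted `elapsed<remaining` as tqdm does.

```python
import time


def format_hms(seconds):
    """Format a duration in seconds as H:MM:SS, tqdm-style."""
    m, s = divmod(int(seconds), 60)
    h, m = divmod(m, 60)
    return f"{h:d}:{m:02d}:{s:02d}"


class EstimateRemainingTime:
    """Hypothetical callback: prints elapsed and estimated remaining time."""

    def __init__(self, total_iterations, display_every=1000):
        self.total = total_iterations
        self.display_every = display_every
        self.start = None

    def on_train_begin(self):
        self.start = time.perf_counter()

    def on_iteration_end(self, iteration):
        if iteration % self.display_every != 0:
            return
        elapsed = time.perf_counter() - self.start
        rate = elapsed / max(iteration, 1)  # seconds per iteration so far
        remaining = rate * (self.total - iteration)
        print(f"[{format_hms(elapsed)}<{format_hms(remaining)}] "
              f"{iteration}/{self.total}")
```

The estimate assumes a roughly constant cost per iteration, which holds for plain SGD loops but not for training phases with different per-step costs.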