Is it possible to set time for running each notebook cell with nbval? #151
Comments
So my understanding of this error is that it is slightly different from the nbconvert case. Here, a different timeout is used for the earlier execution steps (and that one is already configurable); I'm not entirely sure why the messages sent after execution has finished take so long to send/process. We could certainly make this second timeout configurable as well, but I just want to make sure it is the right fix. For now, could you try increasing this value (line 336 at commit 3597e0b) in your local copy, to (a) confirm that it actually fixes the problem, and (b) give some indication of how long is needed?
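For anyone following along, the value in question governs how long nbval waits for kernel messages once a cell has run. Below is a minimal, self-contained sketch of that kind of message-wait loop using jupyter_client directly; it is not nbval's actual code, and the cell source and timeout values are placeholders.

```python
# Minimal sketch (not nbval's implementation) of executing one cell through
# jupyter_client and waiting for its messages with explicit timeouts.
from queue import Empty

from jupyter_client.manager import start_new_kernel

km, kc = start_new_kernel(kernel_name="python3")
try:
    msg_id = kc.execute("import time; time.sleep(2); print('done')")

    # Wait for the execute_reply on the shell channel: the per-cell
    # execution timeout.
    reply = kc.get_shell_msg(timeout=60)
    assert reply["parent_header"]["msg_id"] == msg_id

    # Drain output messages on the iopub channel. This second, shorter
    # timeout is the kind of value the comment above suggests increasing
    # when post-execution messages are slow to arrive.
    while True:
        try:
            msg = kc.get_iopub_msg(timeout=5)
        except Empty:
            break  # no further output within the message timeout
        if (msg["msg_type"] == "status"
                and msg["content"]["execution_state"] == "idle"):
            break  # kernel is idle again, so the cell has finished
finally:
    kc.stop_channels()
    km.shutdown_kernel()
```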
I was checking the
See the response from the JupyterLab folks, specifically jupyterlab/jupyterlab#12018 (comment). Is there any plan to fix it or support it? We're interested in continuing to use nbval.
@jhlegarreta Does
@mikemhenry thanks for the suggestion. Here is the attempt; I'll keep an eye on it to see whether the flag is being taken into account. Edit: the CI jobs are failing due to an unrelated issue, so it will be some time before I get a chance to have a look at this.
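For reference, a CI run along those lines can be driven from Python through pytest's entry point. The per-cell timeout flag below is an assumption (the exact option suggested above is not visible in the comment), and the notebooks directory is a placeholder; check `pytest --help` for the option your installed nbval actually provides.

```python
# Sketch of invoking nbval from a CI script through pytest's Python API.
# The per-cell timeout option name is an assumption; verify it against
# `pytest --help` for your installed nbval version.
import sys

import pytest

exit_code = pytest.main([
    "--nbval",                    # validate stored notebook outputs with nbval
    "--nbval-cell-timeout=600",   # assumed per-cell timeout option, in seconds
    "examples/",                  # placeholder path to the notebooks under test
])
sys.exit(exit_code)
```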
Hello nbval folks, first of all thanks a lot for your plugin! :)
We've decided to use it for ipycytoscape, but we found a problem.
Currently we're hitting a timeout error on macOS.
The way libraries like nbconvert deal with this is by letting the user pass a flag that tells Jupyter to wait a given amount of time for the cell to finish running, e.g. `--ExecutePreprocessor.timeout=600` (see the docs). Is it possible to do something similar in nbval?
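For comparison, the nbconvert flag mentioned above corresponds to the `timeout` parameter of nbconvert's `ExecutePreprocessor`; here is a minimal sketch of the programmatic equivalent, with the notebook filename being a placeholder.

```python
# Rough Python equivalent of:
#   jupyter nbconvert --to notebook --execute \
#       --ExecutePreprocessor.timeout=600 example.ipynb
# The notebook filename is a placeholder.
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

nb = nbformat.read("example.ipynb", as_version=4)
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")  # 600 s per cell
ep.preprocess(nb, {"metadata": {"path": "."}})                # run every cell
nbformat.write(nb, "example.executed.ipynb")
```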
Thanks a lot, please let me know if I can help somehow :)