Opened 8 years ago
Closed 7 years ago
#27818 closed Cleanup/optimization (fixed)
Use contextlib.suppress to suppress exceptions.
Reported by: | Mads Jensen | Owned by: | |
---|---|---|---|
Component: | Core (Other) | Version: | dev |
Severity: | Normal | Keywords: | |
Cc: | Adam Johnson | Triage Stage: | Accepted |
Has patch: | yes | Needs documentation: | no |
Needs tests: | no | Patch needs improvement: | no |
Easy pickings: | no | UI/UX: | no |
Description
As per a comment in #23919, the pattern

    try:
        ...
    except ...:
        pass

can be replaced with contextlib.suppress(...), which is valid as of Python 3.4. The code looks a bit cleaner.
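For illustration, a minimal before/after sketch (the `settings` object here is a hypothetical stand-in, not taken from the actual patch):

```python
import contextlib


class Settings:
    pass


settings = Settings()  # hypothetical stand-in for illustration

# Before: silently ignore a missing attribute.
try:
    del settings.SOME_OPTION
except AttributeError:
    pass

# After: the same behaviour in one statement (Python 3.4+).
with contextlib.suppress(AttributeError):
    del settings.SOME_OPTION
```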
Change History (11)
comment:2 by , 8 years ago
Component: | Uncategorized → Core (Other) |
---|---|
Triage Stage: | Unreviewed → Accepted |
comment:3 by , 8 years ago
Has patch: | set |
---|---|
Owner: | changed from | to
Patch needs improvement: | set |
comment:4 by , 8 years ago
Patch needs improvement: | unset |
---|
comment:5 by , 8 years ago
Easy pickings: | unset |
---|
comment:7 by , 7 years ago
I only found out about this in a review from Tim the other week. I think it should be reverted because it makes things slower unnecessarily. I measure contextlib.suppress at about 10 times slower than a plain try/except, not to mention that it can trigger garbage collection, since it creates and destroys an object every time:
    In [8]: def foo():
       ...:     try:
       ...:         pass
       ...:     except AttributeError:
       ...:         pass
       ...:

    In [9]: def foo2():
       ...:     with contextlib.suppress(AttributeError):
       ...:         pass
       ...:

    In [10]: %timeit foo()
    The slowest run took 10.24 times longer than the fastest. This could mean that an intermediate result is being cached.
    10000000 loops, best of 3: 111 ns per loop

    In [11]: %timeit foo2()
    The slowest run took 6.15 times longer than the fastest. This could mean that an intermediate result is being cached.
    1000000 loops, best of 3: 1.13 µs per loop
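The gap is easier to understand from what suppress has to do on every use. A simplified sketch of the context manager follows; this is a paraphrase of what the stdlib does, not the exact CPython source:

```python
class suppress:
    # Roughly what contextlib.suppress does under the hood (paraphrase).

    def __init__(self, *exceptions):
        # A new instance is allocated for every `with` statement.
        self._exceptions = exceptions

    def __enter__(self):
        # One extra method call on entry.
        pass

    def __exit__(self, exctype, excinst, exctb):
        # One extra method call on exit; returning True tells the
        # interpreter to swallow the in-flight exception.
        return exctype is not None and issubclass(exctype, self._exceptions)
```

A plain try/except that never fires is handled almost entirely at the bytecode level with no object allocation, whereas each `with contextlib.suppress(...)` allocates an object and makes two method calls even when nothing is raised, which is consistent with the roughly tenfold difference measured above.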
How would we proceed? Should I take this to the mailing list, or can I just submit a revert PR?
comment:8 by , 7 years ago
Resolution: | fixed |
---|---|
Status: | closed → new |
If a change slows things down, please revert it.
comment:9 by , 7 years ago
I'm not against reverting, but before doing so, it could be useful to ask the CPython developers whether the slowness should be a documented caveat of contextlib.suppress().
comment:10 by , 7 years ago
I got this input from #python-dev:
Alex_Gaynor: Without measuring, I'd bet almost anything that the majority of the overhead is in an additional function call, so it's not really avoidable. I'm not sure it makes sense to document; _any_ function call would add the same overhead (assuming I'm right). (For that reason I'd expect it to have no additional overhead on PyPy.)
__ap__: Those kinds of wholesale replacements look like an anti-pattern to me. Most of the time, the un-abstracted version is as readable as the slower abstracted one.
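Alex's hypothesis is straightforward to check in isolation. A quick microbenchmark sketch comparing an inline try/except with the same code behind one extra function call (timings will vary by machine and Python version):

```python
import timeit

# The same no-op try/except, once inline and once behind a call.
INLINE = """
try:
    pass
except AttributeError:
    pass
"""

SETUP = """
def swallow():
    try:
        pass
    except AttributeError:
        pass
"""

N = 1_000_000

print("inline try/except:     ", timeit.timeit(INLINE, number=N))
print("behind a function call:", timeit.timeit("swallow()", setup=SETUP, number=N))
```

If most of the gap reappears with just the extra call, that supports the view that the cost is mostly generic call overhead rather than anything specific to suppress.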
PR to revert.
comment:11 by , 7 years ago
Cc: | added |
---|
Thanks Tim for checking with #python-dev and writing the PR! I forgot to add myself to the CC list, so I didn't get email updates (I'm used to systems where commenting auto-subscribes you).