Opened 8 years ago

Closed 7 years ago

#27818 closed Cleanup/optimization (fixed)

Use contextlib.suppress to suppress exceptions.

Reported by: Mads Jensen
Owned by: Tim Graham <timograham@…>
Component: Core (Other)
Version: dev
Severity: Normal
Keywords:
Cc: Adam Johnson
Triage Stage: Accepted
Has patch: yes
Needs documentation: no
Needs tests: no
Patch needs improvement: no
Easy pickings: no
UI/UX: no

Description

As per a comment on #23919, the following pattern can be replaced with contextlib.suppress(...), which is available as of Python 3.4. The resulting code looks a bit cleaner.

try:
    ...
except ...:
    pass
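
For illustration, the before/after on a concrete attribute access looks like this (a minimal sketch with a made-up object, not code from the actual patch):

import contextlib

obj = object()

# Before: swallow the AttributeError by hand.
try:
    obj.missing_method()
except AttributeError:
    pass

# After: the same intent, stated declaratively.
with contextlib.suppress(AttributeError):
    obj.missing_method()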

Change History (11)

comment:2 by Tim Graham, 8 years ago

Component: Uncategorized → Core (Other)
Triage Stage: Unreviewed → Accepted

comment:3 by Mads Jensen, 8 years ago

Has patch: set
Owner: changed from nobody to Mads Jensen
Patch needs improvement: set

comment:4 by Mads Jensen, 8 years ago

Patch needs improvement: unset

comment:5 by Tim Graham, 8 years ago

Easy pickings: unset

comment:6 by Tim Graham <timograham@…>, 7 years ago

Owner: set to Tim Graham <timograham@…>
Resolution: fixed
Status: new → closed

In 550cb3a:

Fixed #27818 -- Replaced try/except/pass with contextlib.suppress().

comment:7 by Adam Johnson, 7 years ago

I only found out about this change during a review with Tim the other week. I think it should be reverted because it makes things slower unnecessarily. I measured contextlib.suppress as about 10 times slower than a plain try/except, not to mention that it creates and destroys an object every time, which can trigger garbage collection:

In [8]: def foo():
   ...:     try:
   ...:         pass
   ...:     except AttributeError:
   ...:         pass
   ...:

In [9]: def foo2():
   ...:     with contextlib.suppress(AttributeError):
   ...:         pass
   ...:

In [10]: %timeit foo()
The slowest run took 10.24 times longer than the fastest. This could mean that an intermediate result is being cached.
10000000 loops, best of 3: 111 ns per loop

In [11]: %timeit foo2()
The slowest run took 6.15 times longer than the fastest. This could mean that an intermediate result is being cached.
1000000 loops, best of 3: 1.13 µs per loop

How would we proceed? Should I take this to the mailing list, or can I just submit a revert PR?
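
For reference, roughly the same comparison as a standalone script using the timeit module (a sketch; exact numbers will vary by machine and Python version):

import contextlib
import timeit

def plain_try_except():
    try:
        pass
    except AttributeError:
        pass

def with_suppress():
    # Pays for creating a suppress() object plus the __enter__/__exit__ calls.
    with contextlib.suppress(AttributeError):
        pass

for func in (plain_try_except, with_suppress):
    seconds = timeit.timeit(func, number=1000000)
    print("%s: %.3f s per 1,000,000 calls" % (func.__name__, seconds))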

comment:8 by Mads Jensen, 7 years ago

Resolution: fixed (removed)
Status: closed → new

If a change slows things down, please revert it.

comment:9 by Tim Graham, 7 years ago

I'm not against reverting, but before doing so, it could be useful to ask CPython developers whether the slowness should be a documented caveat of contextlib.suppress().

comment:10 by Tim Graham, 7 years ago

I got this input from #python-dev:

Alex_Gaynor: Without measuring, I'd bet almost anything that the majority of the overhead is in an additional function call, so it's not really avoidable. I'm not sure it makes sense to document; _any_ function call would add the same overhead (assuming I'm right). (For that reason I'd expect it to have no additional overhead on PyPy.)

__ap__: Those kinds of wholesale replacements look like an anti-pattern to me. Most of the time, the un-abstracted version is as readable as the slower abstracted one.

PR to revert.
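
To make Alex's point concrete: per PEP 343, the with statement expands to roughly the following (a hand-written sketch, not the exact CPython expansion), so every use pays for an object allocation plus the __enter__/__exit__ calls:

import contextlib
import sys

mgr = contextlib.suppress(AttributeError)  # allocate a context manager object
mgr.__enter__()                            # extra call #1
try:
    pass                                   # the body of the with block
except BaseException:
    # suppress().__exit__() returns True for the listed exception types,
    # which swallows the exception; otherwise it propagates.
    if not mgr.__exit__(*sys.exc_info()):
        raise
else:
    mgr.__exit__(None, None, None)         # extra call #2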

comment:11 by Adam Johnson, 7 years ago

Cc: Adam Johnson added

Thanks Tim for checking with #python-dev and writing the PR! I forgot to add myself to Cc, so I didn't get email updates (I'm used to systems where commenting auto-subscribes you).

comment:12 by GitHub <noreply@…>, 7 years ago

Resolution: fixed
Status: new → closed

In 6e4c6281:

Reverted "Fixed #27818 -- Replaced try/except/pass with contextlib.suppress()."

This reverts commit 550cb3a365dee4edfdd1563224d5304de2a57fda
because try/except performs better.
