On 7/6/20 10:30 AM, Chun-Yu Shei wrote:
> I finally got a chance to try Sid's lru_cache suggestion, and the
> results were really good. Simply adding it to catpkgsplit and moving
> the body of use_reduce into a separate function (one that accepts
> tuples instead of unhashable lists/sets) and decorating it with
> lru_cache gets a similar 40% overall speedup for the upgrade case I
> tested. It seems that even a relatively small cache size (1,000
> entries) gives quite a speedup, even though in the use_reduce case
> the cache eventually reaches almost 20,000 entries if no limit is
> set. With these two changes, adding caching to match_from_list didn't
> seem to make much, if any, difference.
|
That's great! |
|
> The catch is that lru_cache only became available in Python 3.2, so
> would it make sense to add a dummy lru_cache implementation for
> Python < 3.2 that does nothing? There is also a
> backports-functools-lru-cache package that's already available in the
> Portage tree, but that would add an additional external dependency.
>
> I agree that refactoring could yield an even bigger gain, but
> hopefully this can be implemented as an interim solution to speed up
> the common emerge case of resolving upgrades. I'm happy to submit new
> patches for this, if someone can suggest how best to handle the
> Python < 3.2 case. :)
>
> Thanks,
> Chun-Yu
|
We can safely drop support for Python < 3.6 at this point.
Alternatively, we could add a compatibility shim for Python 2.7 that
does not perform any caching, but I really don't think it's worth the
trouble to support it any longer.
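For reference, such a shim would only be a few lines; a minimal sketch (the fallback mirrors the decorator-factory shape of functools.lru_cache, and `square` is just a throwaway example to show usage):

```python
try:
    from functools import lru_cache
except ImportError:
    # Python < 3.2 fallback: accept the same arguments as the real
    # decorator factory, but perform no caching at all.
    def lru_cache(maxsize=128, typed=False):
        def decorator(func):
            return func
        return decorator


@lru_cache(maxsize=1000)
def square(n):
    return n * n
```

On any modern interpreter the `try` branch wins and real caching is used; only on ancient Pythons does the no-op decorator kick in.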
--
Thanks,
Zac