On 7/6/20 11:03 AM, Zac Medico wrote:
> On 7/6/20 10:30 AM, Chun-Yu Shei wrote:
>> I finally got a chance to try Sid's lru_cache suggestion, and the
>> results were really good. Simply adding it on catpkgsplit and moving
>> the body of use_reduce into a separate function (that accepts tuples
>> instead of unhashable lists/sets) and decorating it with lru_cache
>> gets a similar 40% overall speedup for the upgrade case I tested. It
>> seems like even a relatively small cache size (1000 entries) gives
>> quite a speedup, even though in the use_reduce case, the cache size
>> eventually reaches almost 20,000 entries if no limit is set. With
>> these two changes, adding caching to match_from_list didn't seem to
>> make much/any difference.
>
> That's great!
>
>> The catch is that lru_cache is only available as of Python 3.2, so
>> would it make sense to add a dummy lru_cache implementation for
>> Python < 3.2 that does nothing? There is also a
>> backports-functools-lru-cache package that's already available in the
>> Portage tree, but that would add an additional external dependency.
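A no-op fallback of the kind floated above could be as small as the following sketch. The `catpkgsplit_cached` helper and its parsing are made up for illustration; Portage's real catpkgsplit does full category/package/version splitting:

```python
try:
    from functools import lru_cache  # available since Python 3.2
except ImportError:
    # Python < 3.2 fallback: same decorator interface, no caching.
    def lru_cache(maxsize=128, typed=False):
        def decorator(func):
            return func
        return decorator

@lru_cache(maxsize=1000)
def catpkgsplit_cached(mypkg):
    # Illustrative stand-in: split "category/pkg" into its two parts.
    cat, _, pkg = mypkg.partition("/")
    return (cat, pkg)
```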
>>
>> I agree that refactoring could yield an even bigger gain, but
>> hopefully this can be implemented as an interim solution to speed up
>> the common emerge case of resolving upgrades. I'm happy to submit new
>> patches for this, if someone can suggest how best to handle the
>> Python < 3.2 case. :)
>>
>> Thanks,
>> Chun-Yu
>
> We can safely drop support for Python < 3.6 at this point.
> Alternatively, we could add a compatibility shim for Python 2.7 that
> does not perform any caching, but I really don't think it's worth the
> trouble to support it any longer.

We've dropped Python 2.7, so the minimum version is now Python 3.6.
-- 
Thanks,
Zac