Completing homomorphisms
08 Aug 2017

I sent the PR with the other homomorphism cases a week ago, so about a day after my last post. The work required for the main part of the PR wasn’t really complicated, but it took a while to get merged (earlier today) because some more problems showed up in the rewriting system part.
It started off with Kalevi noticing that in the case of a free abelian group, the list of rewriting rules after initialisation seemed incomplete - it so happened that the test didn’t pick up on it because it didn’t need the missing rules. In itself, that wouldn’t be much of a problem, because the missing rules could be added during the run of `make_confluent`, but `is_confluent` was already `True` - that was definitely wrong. So for one thing, `_check_confluence` wasn’t working properly, and I also thought that the type of rules that wasn’t added during rule initialisation could be added as another case - if it can be done in place, why wait until it’s discovered by the double loop in `make_confluent`? I made a few little changes throughout the code to fix things but ultimately, it was the inadequacy of `add_rule` that was causing problems.
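To make the confluence problem concrete, here is a small self-contained sketch - not the actual SymPy code, just toy string rewriting where capital letters stand for inverses and the function names are my own - of how resolving overlaps between left-hand sides exposes an incomplete rule set for the free abelian group on two generators:

```python
# Toy string rewriting: words are strings over a, b with A, B as inverses;
# rules map a substring (left-hand side) to its replacement.
FREE_REDUCTIONS = {"aA": "", "Aa": "", "bB": "", "Bb": ""}

def reduce_word(word, rules):
    """Apply the rules repeatedly until no rule matches any more."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules.items():
            if lhs in word:
                word = word.replace(lhs, rhs)
                changed = True
    return word

def check_confluence(rules):
    """Resolve every overlap of two left-hand sides and compare the results."""
    for l1, r1 in rules.items():
        for l2, r2 in rules.items():
            for k in range(1, min(len(l1), len(l2))):
                if l1[-k:] == l2[:k]:
                    # the overlapping word l1 + l2[k:] can be rewritten in
                    # two ways; both must reduce to the same word
                    via1 = reduce_word(r1 + l2[k:], rules)
                    via2 = reduce_word(l1[:-k] + r2, rules)
                    if via1 != via2:
                        return False
    return True

# Free abelian group on a, b: only the rule coming directly from the relator...
incomplete = dict(FREE_REDUCTIONS, **{"ba": "ab"})
# ...versus the rules for the mixed products with inverses as well.
complete = dict(FREE_REDUCTIONS,
                **{"ba": "ab", "bA": "Ab", "Ba": "aB", "BA": "AB"})

print(check_confluence(incomplete))  # False: e.g. "Bba" rewrites to both "a" and "Bab"
print(check_confluence(complete))    # True
```

With only the rule coming straight from the relator, some overlaps rewrite to two different irreducible words, so a correct confluence check has to return False - and that is what the real `_check_confluence` was failing to notice.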
When a pair of words is given to `add_rule`, it first multiplies them by the inverse of the first element of the longer word until the length difference is 0, 1 or 2 (greater length differences are redundant when the smaller length differences are in the rules dictionary). Then it does the same on the other (right) side, which leads to a different set of rules. We could obtain even more rules right here, without waiting for `make_confluent`, by allowing switching of sides, i.e. not just repeatedly multiplying on the right or on the left, but performing some left multiplications after several right ones, and so on. This makes `make_confluent` a little more efficient since more rules are discovered at a time, but trying all possible combinations of sides would probably take too much time without actually being productive. At the moment, when the length difference becomes sufficiently small, instead of adding the rule directly, `add_rule` calls itself recursively, which allows for some side switching. Perhaps in the future it will seem fit to try all combinations. A couple of days ago I added a rules cache to prevent repeating work that the function has already done, so maybe that wouldn’t cause too much of a slow-down in practice.
After this, one rule was still missing. I reread the code several times, and it took a while to work out that the problem was what seems quite obvious now. When a pair of words `w1, w2` of the same length is given to `add_rule`, the only rule that was added was `w1: w2` for `w1 > w2`. But another possibility right there is `w2**-1: w1**-1`, provided `w2**-1 > w1**-1`. Normally, this inverse rule doesn’t need to be added because if `len(w1) > len(w2)`, then `w1**-1 > w2**-1` and `w1**-1: w2**-1` is implied by how word reduction is set up. Adding this last case solved the issue.
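In the same toy notation, the equal-length case looks something like this (the ordering here is just shortlex via Python’s string comparison, standing in for the ordering actually used by the rewriting system):

```python
INV = {"a": "A", "A": "a", "b": "B", "B": "b"}  # same convention as before

def word_inverse(word):
    """Inverse of a toy word: reverse it and invert each letter."""
    return "".join(INV[ch] for ch in reversed(word))

def shortlex_key(word):
    # compare by length first, then alphabetically
    return (len(word), word)

def rules_for_equal_length(w1, w2):
    """For equal-length words, orient w1/w2 and also their inverses."""
    rules = {}
    bigger, smaller = sorted((w1, w2), key=shortlex_key, reverse=True)
    rules[bigger] = smaller
    bigger, smaller = sorted((word_inverse(w1), word_inverse(w2)),
                             key=shortlex_key, reverse=True)
    rules[bigger] = smaller
    return rules

# "ba" = "ab" (commutativity) also tells us "BA" = "AB":
print(rules_for_equal_length("ab", "ba"))   # {'ba': 'ab', 'BA': 'AB'}
```

With only `'ba': 'ab'` in the dictionary, `'BA'` and `'AB'` would be two different irreducible words for the same group element, which is exactly the kind of gap the extra case closes.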
There were some other little improvements. For example, `make_confluent` now returns a boolean at all times, not just when checking whether the system is confluent; this can be used to see whether it was successful. I also spotted an error in the kernel computation method that hadn’t come up before only by sheer luck.
Now that all the basic homomorphism functionality is available, I can have a go at extending the `FpGroup` class with `PermutationGroup` methods. I might be able to get it to work without the finite presentation of permutation groups PR (it hasn’t been reviewed yet), but I’m not entirely sure yet.
Another thing on my hands is Sylow subgroups. I actually thought I had got them to work several days ago, but then one of the test groups (`SymmetricGroup(10)`) revealed a bug in the `_strong_gens_slp` attribute. It wasn’t caused by the Sylow method and only comes up after computing a stabilizer or a normalizer - something I only realised yesterday; this bug really confused me for a while. I have fixed it now, but a different problem came up and what worked before no longer does. I don’t see why the bug fix would lead to it, but evidently it did… So I’m still trying to sort it out.
Update: I have just worked out that Sylow thing. It turned out minimal blocks weren’t being computed properly (my fault: I wrote a separate function that should have output all minimal block systems but failed on minimality). So now all that remains is to do some more testing and tidy up the code, and I can send a PR with it in a day or so (if no other bugs turn up, that is).
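To give an idea of the pitfall (this is not the code from the PR), SymPy’s existing `PermutationGroup.minimal_block` can be used to collect candidate block systems; the minimality filtering on top of that is the part my function got wrong:

```python
from sympy.combinatorics.named_groups import DihedralGroup

G = DihedralGroup(6)   # acts imprimitively on {0, ..., 5}

# minimal_block([0, p]) returns the finest block system in which 0 and p
# lie in the same block (as a list with one entry per point of the domain)
candidates = []
for p in range(1, G.degree):
    blocks = tuple(G.minimal_block([0, p]))
    if blocks not in candidates:
        candidates.append(blocks)

# these are only candidates: every minimal block system shows up here, but
# not everything here is minimal, so a minimality check is still needed
for c in candidates:
    print(c)
```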