Issue717

Title: Lama-Synergy: add new implementation
Priority: feature
Status: resolved
Superseder: (none)
Nosy List: cedric, jendrik, malte, salome, silvan
Assigned To: cedric
Keywords: (none)
Optional summary: (none)

Created on 2017-04-25.14:27:37 by cedric, last changed by silvan.

Messages
msg6802 (view) Author: silvan Date: 2018-02-13.15:42:29
Great! I made some additional small changes. The next step will be to remove the
old implementation of the synergy in issue759. Marking this one as resolved.
msg6796 (view) Author: cedric Date: 2018-02-09.12:26:40
The two subsubsections of the subsection "Setting the "cost type" for landmark
heuristics" at OptionCaveats are merged.

I added a comment regarding the cost type "inheritance" to the synergy slave. I
didn't add it to the automatically generated Doc because we cannot set the
transform option for the slave on the command-line and for the master we already
have the link to the OptionCaveats page where this is explained. I think
mentioning the cost type "inheritance" in the master's documentation would just
make it more complicated.
msg6782 (view) Author: silvan Date: 2018-01-25.12:00:29
Ok, then the two subsubsections of the subsection "Setting the "cost type" for
landmark heuristics" at OptionCaveats should be merged to contain the latest
information (we still need the Caveat page because the landmark heuristic is
still not trivial to use in that respect).

I think we should indeed add a comment regarding cost type "inheritance" to the
synergy slave, and likely also to the master (both in the automatically
generated Doc). Also adding a comment to the code is good, I thought we already
had this documented.
msg6781 (view) Author: cedric Date: 2018-01-23.14:55:22
I had a closer look into the cost type for the lama synergy. It turns out that
we do not need the OptionCaveats for the synergy because the LAMA-FF-Synergy
master behaves exactly like a landmark heuristic. The synergy's FF heuristic
always gets the cost type of the transform option of the landmark heuristic
declaration.

So I would remove the OptionCaveats for the lama synergy and add a note to the
landmark heuristic that if using the synergy, the synergy's FF heuristic gets
the cost type from the transform option of the landmark heuristic.

I can also add this as a comment in the code where we set the transform option
for the slave explicitly to nullptr.
msg6780 (view) Author: silvan Date: 2018-01-22.17:20:29
Points 1-4 are addressed now (took me nearly no attempts at all :-) ), and
Cedric will take care of point 5. Once this is done, all that remains to do is
to remove the summary and to add a comment at issue697 about the new state of
the option caveats.
msg6779 (view) Author: malte Date: 2018-01-17.17:30:34
Hi all, looking at the documentation on the website, I think it should still be
improved. Part of this is that the old documentation wasn't great either. ;-)

Some things I would suggest immediately:

In the slave documentation:

1. Make the reference to the master documentation a link.

In the master documentation:

2. Move the example at the start to the part after the main option descriptions,
where we normally have more elaborate examples. I think Silvan has done this
many times and knows how to do it.

3. Format the example the way we normally would, using one line per heuristic.

4. The example should be a concrete example, not use "...".

5. If there are still option caveats (which I think there are), this should be
mentioned here, and we should link to the appropriate page. For example, it is
not clear to me what to do if I want to use different cost transformations for
the FF and LM heuristic (or if this is possible), and where the landmark
heuristic takes its costs from (from the master heuristic, from the landmark
factory, or "it depends on the other options" as previously).

Regarding the option caveats etc., it would be nice to clean this up properly
soon, so that we no longer need the caveat page.
msg6778 (view) Author: silvan Date: 2018-01-17.10:57:22
I had a look at the commits and they are fine. Please merge. But before closing
this issue, please also update http://www.fast-downward.org/OptionCaveats to fit
the new way of using the synergy. Maybe there are no more caveats after all?
msg6777 (view) Author: malte Date: 2018-01-17.10:44:39
Please merge if you're both happy with it.
msg6776 (view) Author: cedric Date: 2018-01-17.10:44:05
I addressed Silvan's comments in the pull request.
msg6775 (view) Author: malte Date: 2018-01-11.18:36:47
Feel free to wrap this up between yourselves, i.e., ready to merge once Silvan
is happy with it.
msg6774 (view) Author: silvan Date: 2018-01-11.16:01:02
> Sorry for the delay. All open questions should now be answered or suggestions
> are implemented.
I replied to the comments.

> Not all of the TODOs from the summary can be done. Since the old implementation
> is still around, we cannot remove "Heuristic::default_options()"
Then this TODO should be moved to the follow-up issue's summary, i.e. issue759.
msg6772 (view) Author: cedric Date: 2018-01-11.14:45:45
Sorry for the delay. All open questions should now be answered or suggestions
are implemented.

Not all of the TODOs from the summary can be done. Since the old implementation
is still around, we cannot remove "Heuristic::default_options()".
msg6771 (view) Author: silvan Date: 2018-01-09.12:42:32
I went over the activity timeline at bitbucket and found a few comments that
haven't been addressed. Cedric, could you please double check for more missing
things? (I went down to May 2017 but not further.) I think a good practice for
not missing any comments is to only move/delete the Bitbucket notification
emails once the code has been updated.

Open comments I found:

After discussing with Jendrik, I understood this comment. Can we add "(in this
method)" at the end of the last sentence? I.e., "... evaluation context together
(in this method)".

What happens if the user disables caching in one or both used heuristics by
setting cache_estimates=false? Is this a problem?

If they are never accessed, why do we need to add them? Also, I think that these
settings should only happen in non-dry-runs, i.e., I'd move them below the
dry-run test.

So it seems like we are nearly there, but not entirely yet :-)
msg6770 (view) Author: malte Date: 2018-01-08.18:58:15
Thanks, Jendrik! Waiting until issue739 is merged is fine, as long as we address
it eventually. As long as cgroups works somewhat reliably, we won't have an
issue anyway, as that is an additional safeguard.
msg6769 (view) Author: jendrik Date: 2018-01-08.18:26:28
The code in the branch for issue739 limits the memory for VAL. Should I apply the 
fix directly in the default branch now or do we just wait until the issue is 
merged?
msg6768 (view) Author: malte Date: 2018-01-08.18:06:40
Experiments look good apart from the Slurm errors.

Jendrik, it looks like we're not limiting memory as we should. Is fixing this
already on the agenda?

Cedric, regarding Silvan's comment, we usually have someone look at the code
again before merging if there were any changes that have not yet been reviewed.
In this case, I don't remember what the comments were, or if the revisions were
trivial or not. If they were nontrivial, it would be good if someone (Jendrik,
Salomé, Silvan) could sign off on the final pull request before merging. And
then you can merge.
msg6767 (view) Author: silvan Date: 2018-01-08.17:42:29
I haven't followed the latest discussion here, but the experiments look fine. So
if Malte's comments at bitbucket have been addressed, this should be good to go.

Also don't forget to deal with the TODOs in the Summary field after merging the
code.
msg6766 (view) Author: cedric Date: 2018-01-08.09:25:55
Here are the experiments. If it is okay, I will merge now.

http://ai.cs.unibas.ch/_tmp_files/geissmann/issue717-v2.html

http://ai.cs.unibas.ch/_tmp_files/geissmann/issue717-v2-comparativereport.html
msg6657 (view) Author: malte Date: 2017-11-30.15:38:51
I'm done with my comments on bitbucket.

I am adding Salomé to the issue because I'm not sure if the changes regarding
compute_heuristic/compute_results interact nastily with the changes that she is
working on regarding multi-path dependency/"lazy heuristics", so it would be good
for the two of you to talk to each other about this.

Other than that and apart from the questions I asked on bitbucket, I think this
looks good. I'd be happier if we could run a second round of experiments and if
we could also run an experiment for a new "new full LAMA" config, i.e., a
comparison lama-first vs. lama-first-new and a comparison lama vs. lama-new. It
may make sense to make these two different experiments because different
attributes make sense for the anytime configurations. I hope Jendrik can give
advice on this.

Apart from this, I think we can merge this. I would say that we don't need to
wait for the experiments before merging because we keep the old implementation
around for the time being anyway.

I've changed the title of this issue because we're not getting rid of the
synergy in this issue yet. I suggest to do this in a follow-up issue rather than
making this issue larger, i.e., I suggest we merge this one as is (but perhaps
after adding the new full LAMA config), and then open a new issue for the next
step (getting rid of synergies). The old issue title was "Lama-Synergy: get rid
of the synergy object", which could be a good title for the new issue.
msg6389 (view) Author: silvan Date: 2017-05-24.12:30:30
I added suggestions for documentation at bitbucket so that Jendrik stops blaming
me for hindering progress with this issue :-)
msg6388 (view) Author: silvan Date: 2017-05-23.12:32:32
I had a look at the new classes and they seem fine. They are easier to
understand than before, but it's still not really trivial. Maybe it would be
worth documenting, e.g. at the master heuristic, how the two classes interact:
how the classes are related, and how the computation of a heuristic value
works (in particular, which methods trigger what other methods, and where
results are stored/cached/recomputed). I imagine that this would ease understanding
for future work on the classes.
msg6387 (view) Author: jendrik Date: 2017-05-22.13:58:24
We think the code is ready now.
msg6366 (view) Author: malte Date: 2017-05-08.17:08:00
Let's finish the new code first. Looking at the pull request, it seems that
there are comments from you that aren't addressed/answered yet.

Also, I see that the new code copies old documentation from the synergy object
in many places (search for "LAMA-FF synergy"), which of course also is not the
correct documentation.

Once you think the code is ready, I can have another look (either immediately or
as part of the FIFO, depending on how complex it turns out).
msg6365 (view) Author: jendrik Date: 2017-05-08.17:02:06
I think the results look good. What do you think Malte, can we start with removing 
the old Synergy object?
msg6364 (view) Author: cedric Date: 2017-05-08.16:57:31
You can find the results here
http://ai.cs.unibas.ch/_tmp_files/geissmann/issue717-lama-synergy.html#summary
msg6353 (view) Author: jendrik Date: 2017-05-08.11:38:21
What's the status here? What do the experiments tell us?
msg6274 (view) Author: jendrik Date: 2017-04-28.16:47:15
Cedric generated a pull request at 
https://bitbucket.org/cgeissmann/downward/pull-requests/4
and we already finished our first round of comments. The next step is to run 
experiments for lama-first with the new and old option syntax.
msg6223 (view) Author: silvan Date: 2017-04-26.13:24:10
To simplify the option parser, get rid of the special treatment of heuristic
creation for lama_ff_synergies. Split the lama_ff_synergy heuristic into two
classes.
History
Date | User | Action | Args
2018-02-13 15:42:29 | silvan | set | status: reviewing -> resolved; messages: + msg6802
2018-02-09 12:34:12 | cedric | set | summary: "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats" -> (empty)
2018-02-09 12:26:40 | cedric | set | messages: + msg6796
2018-01-25 12:00:29 | silvan | set | messages: + msg6782
2018-01-23 14:55:22 | cedric | set | messages: + msg6781
2018-01-22 17:20:29 | silvan | set | messages: + msg6780
2018-01-17 17:30:35 | malte | set | messages: + msg6779
2018-01-17 10:57:22 | silvan | set | messages: + msg6778
2018-01-17 10:44:39 | malte | set | messages: + msg6777
2018-01-17 10:44:05 | cedric | set | messages: + msg6776
2018-01-12 16:01:12 | cedric | set | summary: "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats TODO: After this issue is merged, we can remove "Heuristic::default_options()"." -> "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats"
2018-01-11 18:36:47 | malte | set | messages: + msg6775
2018-01-11 16:01:02 | silvan | set | messages: + msg6774
2018-01-11 14:45:45 | cedric | set | messages: + msg6772
2018-01-09 12:42:32 | silvan | set | messages: + msg6771
2018-01-08 18:58:15 | malte | set | messages: + msg6770
2018-01-08 18:26:28 | jendrik | set | messages: + msg6769
2018-01-08 18:06:40 | malte | set | messages: + msg6768
2018-01-08 17:42:30 | silvan | set | messages: + msg6767
2018-01-08 09:25:55 | cedric | set | messages: + msg6766
2017-11-30 15:38:51 | malte | set | nosy: + salome; messages: + msg6657; title: "Lama-Synergy: get rid of the synergy object" -> "Lama-Synergy: add new implementation"
2017-05-24 12:30:30 | silvan | set | messages: + msg6389
2017-05-23 12:32:32 | silvan | set | messages: + msg6388
2017-05-22 13:58:24 | jendrik | set | status: in-progress -> reviewing; messages: + msg6387; summary: "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats" -> "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats TODO: After this issue is merged, we can remove "Heuristic::default_options()"."
2017-05-08 17:08:00 | malte | set | messages: + msg6366
2017-05-08 17:02:06 | jendrik | set | messages: + msg6365
2017-05-08 16:57:31 | cedric | set | messages: + msg6364
2017-05-08 11:38:21 | jendrik | set | messages: + msg6353
2017-04-28 16:47:15 | jendrik | set | messages: + msg6274
2017-04-28 10:56:56 | jendrik | set | summary: "TODO: When we're done with this issue, update issue697 and http://www.fast-downward.org/OptionCaveats"
2017-04-26 13:24:10 | silvan | set | nosy: + malte, silvan; messages: + msg6223; summary: "To simplify the option parser, get rid of the special treatment of heuristic creation for lama_ff_synergies. Split of the lama_ff_synergy heuristic in two classes." -> (no value)
2017-04-25 14:27:37 | cedric | create |