## Thursday, January 31, 2008

### Response to first thinking topic

If the state space is not a tree, it contains a cycle, and running around the cycle as many times as desired before achieving the goal generates plans of any desired length.

That being said, there tends to be very little interest in generating plans with cycles, so one can always prune any plan with 2^k actions (where k is the number of propositions): such a plan visits 2^k + 1 states, one more than the total number of states, so some state repeats; ergo, the plan is cyclic, and removing the cycle shows that it is non-minimal.
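For concreteness, here is a quick sketch of that pigeonhole cut (the representation is mine, not any particular planner's): given a plan together with its state trajectory, any repeated state brackets a cycle that can be spliced out.

```python
def remove_cycles(states, actions):
    """Splice cycles out of a plan.

    states: the trajectory, len(actions) + 1 hashable states, with
    states[i] -actions[i]-> states[i+1]. Whenever a state repeats, the
    actions between its two occurrences form a cycle and are cut.
    """
    out_states, out_actions = [states[0]], []
    index = {states[0]: 0}        # state -> position in pruned trajectory
    for act, nxt in zip(actions, states[1:]):
        out_actions.append(act)
        out_states.append(nxt)
        if nxt in index:          # revisit: cut the loop just closed
            keep = index[nxt]
            for s in out_states[keep + 1:]:
                index.pop(s, None)
            out_states = out_states[:keep + 1]
            out_actions = out_actions[:keep]
            index[nxt] = keep     # nxt itself stays registered
        else:
            index[nxt] = len(out_actions)
    return out_states, out_actions
```

For instance, the trajectory sI -a-> s1 -b-> s2 -c-> s1 -d-> s3 collapses to sI -a-> s1 -d-> s3, with the b,c loop removed.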

More practically, one keeps a closed list, i.e., a hash table of visited states. There is never a reason to pass through a real state, or a regressed "state", at more than the shortest distance, so one tracks that number and prunes any search branch failing this condition. In some formalisms this is expensive to do, so instead one simply examines the states visited by the current branch, and prunes the branch if it contains any cycles.
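The closed-list idea can be sketched as uniform-cost progression search (illustrative only; the `successors` interface and tuple layouts are my own conventions, not any real planner's):

```python
from heapq import heappush, heappop

def progression_search(init, goal_test, successors):
    """Uniform-cost progression search with a closed list.

    successors(state) yields (action, next_state, cost) triples. A state
    is re-expanded only when reached more cheaply than ever before, so
    each state is passed through at no more than its shortest distance.
    """
    best = {init: 0}                  # closed list: state -> cheapest g seen
    frontier = [(0, 0, init, [])]     # (g, tiebreak, state, plan so far)
    tick = 0
    while frontier:
        g, _, s, plan = heappop(frontier)
        if g > best.get(s, float('inf')):
            continue                  # stale entry: a cheaper path won
        if goal_test(s):
            return plan
        for act, t, c in successors(s):
            if g + c < best.get(t, float('inf')):
                best[t] = g + c
                tick += 1
                heappush(frontier, (g + c, tick, t, plan + [act]))
    return None
```

On a toy graph where 0 -a-> 1 -c-> 2 -d-> 3 costs 3 and 0 -b-> 2 -d-> 3 costs 5, the search returns the cheaper plan and silently discards the stale frontier entry for state 2.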

Removing such restrictions does allow progression and regression planners to find any walk through state space, i.e., all plans, minimal or otherwise. Such planners could enumerate all such walks, if desired, by printing a plan whenever the goal check succeeds (but refusing to terminate, and generating the children of the state even though the goal is already true); the search control does have to be breadth-first or best-first, though [i.e., there must exist some measure on plans with the guarantee that solutions are printed in sorted order with respect to the measure, and that only finitely many plans ever map to any one value of the measure]. These considerations are important to partial satisfaction planning, actually.
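That enumeration scheme can be sketched as a breadth-first generator that reports goal-reaching walks but keeps expanding; on a cyclic graph it yields walks of every length, shortest first (a toy sketch under my own conventions, not any real planner's interface):

```python
from collections import deque

def enumerate_walks(init, goal_test, successors):
    """Breadth-first enumeration of all goal-reaching walks.

    Yields each walk's action sequence in nondecreasing length order.
    It never terminates on a cyclic graph: reaching the goal reports
    the plan, but the state's children are still generated.
    """
    frontier = deque([(init, [])])
    while frontier:
        s, plan = frontier.popleft()
        if goal_test(s):
            yield plan                # report, but do not stop
        for act, t in successors(s):
            frontier.append((t, plan + [act]))
```

On the two-state loop 0 -a-> 1, 1 -b-> 0, with 1 -g-> goal, the generator produces a,g then a,b,a,g then a,b,a,b,a,g and so on forever; a caller would slice off as many as desired.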

Regression can, of course, generate cycles as easily as progression. One can (almost) invert any STRIPS domain; the inverted domain requires disjunctive and negative preconditions (and the original domain has to insist on conjunctive and positive preconditions). The idea is that progression in the inverted domain is exactly regression in the original domain [so general properties of progression are also general properties of regression, and vice versa, unless the properties depend upon purely conjunctive or purely positive preconditions].
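In STRIPS terms a single regression step looks like this (a sketch with actions as hypothetical `(pre, add, delete)` triples of positive propositions; the representation is mine):

```python
def regress(goal, action):
    """Regress a conjunctive positive goal through a STRIPS action.

    action is a (pre, add, delete) triple of frozensets. Returns the
    regressed subgoal, or None when the action is irrelevant to the
    goal or inconsistent with it. Progression through the "inverted"
    action computes exactly this set.
    """
    pre, add, dele = action
    if not (add & goal):   # irrelevant: achieves nothing in the goal
        return None
    if dele & goal:        # inconsistent: destroys part of the goal
        return None
    return (goal - add) | pre
```

Regressing the goal {p, q} through an action with precondition {r}, add list {p}, and delete list {s} yields the subgoal {q, r}: whatever the action achieves is dropped, and its preconditions are picked up.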

Minimizing a plan, by this definition, is going to be exponential -- in the worst case every subset of actions would have to be checked. In many domains, however, a simple search that considers removing one action at a time can terminate early in every branch, leaving polynomial time. If, for example, an action contributes the truth of a proposition that is needed by another action already known to be necessary, and no other action in the plan gives that condition, then this action is also necessary (counting the initial state and the goal as actions, with the goal being the only action initially marked necessary). A simple backwards sweep is then sufficient. Slight optimizations would suffice to minimize Blocksworld plans, and I think the approach as stated works for Rovers plans without cycles in the move operators [which are easy to detect separately].
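The backwards marking sweep might be formalized like so (my own formalization of the sketch above; delete effects are ignored, so it is only a sound minimizer under assumptions like those just described):

```python
def mark_necessary(init, goal, plan):
    """One backward sweep of the marking scheme.

    plan: list of (pre, add) frozenset pairs; delete lists are ignored
    in this sketch. The initial state is treated as a step with no
    preconditions that adds `init`; the goal as a final step with
    precondition `goal`. A step is marked necessary when it is the
    *only* provider of some precondition of an already necessary step.
    Returns the indices of the plan actions that were marked.
    """
    steps = [(frozenset(), init)] + list(plan) + [(goal, frozenset())]
    necessary = {len(steps) - 1}          # only the goal starts marked
    for i in range(len(steps) - 1, -1, -1):
        if i not in necessary:
            continue
        pre, _ = steps[i]
        for p in pre:
            providers = [j for j in range(i) if p in steps[j][1]]
            if len(providers) == 1:       # sole provider => necessary
                necessary.add(providers[0])
    # report in terms of the original plan's action indices
    return sorted(j - 1 for j in necessary if 0 < j < len(steps) - 1)
```

For instance, with initial state {p}, goal {r}, and the three actions p=>q, p=>s, q=>r, the sweep keeps the first and third actions and leaves the s-producer unmarked for removal.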

Different notions of planning define minimality in different ways; progression and regression normally imagine minimality to mean acyclic. Under that notion of minimality, with normal sorts of pruning based on cycle checking or duplicate detection, progression and regression generate minimal plans.

Under the suggested definition of minimality, progression and regression can generate non-minimal plans under any search control that doesn't guarantee optimality [even if employing duplicate detection or cycle elimination]. First note that an optimal plan must be minimal: if some subset of its actions could be removed leaving a sound plan, that plan would be cheaper (assuming positive action costs), contradicting optimality. Consider the following sketch of a state space:

sI -A-> s1 -B-> s2 -A'-> s3 -Z-> sG
sI -A'-> s3 -Z-> sG

An optimal planner will visit s3 by the shortest path first (optimal planners typically guarantee optimality in this recursive manner). A suboptimal planner, however, might choose to do ABA'Z, which is acyclic and non-minimal (A and B can be removed together). Note that if B depends upon A, and A' depends upon B or the initial state (say A deletes a condition that both the initial state and B assert, and that A' needs), then removing either A or B alone leaves an unsound plan [thus showing the need to consider all subsets of a plan].
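To make the sketch concrete, here is a toy STRIPS encoding of that space (my own encoding, chosen so that A deletes a condition p that both the initial state and B assert, and that A' needs):

```python
def apply_plan(state, plan, actions):
    """Execute a STRIPS plan; return the final state, or None if some
    precondition fails along the way."""
    for name in plan:
        pre, add, dele = actions[name]
        if not pre <= state:
            return None
        state = (state - dele) | add
    return state

# A deletes p, B restores it, A' needs p and erases the detour again.
ACTIONS = {
    'A':  (frozenset({'p'}), frozenset({'x'}), frozenset({'p'})),
    'B':  (frozenset({'x'}), frozenset({'p'}), frozenset()),
    "A'": (frozenset({'p'}), frozenset({'y'}), frozenset({'p', 'x'})),
    'Z':  (frozenset({'y'}), frozenset({'g'}), frozenset()),
}
INIT, GOAL = frozenset({'p'}), frozenset({'g'})

def solves(plan):
    final = apply_plan(INIT, plan, ACTIONS)
    return final is not None and GOAL <= final
```

Checking the four variants confirms the point: ABA'Z is sound (and visits five distinct states, so it is acyclic), A'Z is sound, but dropping A alone or B alone breaks a precondition; only removing the pair succeeds.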

A planner that only prunes cyclic search branches (instead of keeping a closed list) will generate such acyclic, non-minimal, sub-optimal solutions with the greatest of ease.

For a much more thorough analysis of a very similar problem, one could look at Bäckström's analysis of reordering and deordering of plans; reordering is the hard problem of determining whether or not a plan has many alternative causal-link proofs [this is related to minimality as defined here].

While optimality implies minimality, and non-minimality implies sub-optimality, minimality does not imply optimality (under either definition of minimality: acyclicity, or that removing any subset breaks the plan).

-Will
