1. Team ID = UW-MRS

2. Team affiliation = University of Washington

3. Contact information

Woodley Packard
sweaglesw@sweaglesw.org

4. ZIP file name = UW-MRS.zip

System evaluation results are in "RESULTS.txt" inside this zip file.

5. System specs

The following is a brief summary; the final writeup will contain considerably more detail.

5.1 Core approach

The commands are parsed using the English Resource Grammar (Flickinger, 2000) to produce MRS representations. The MRSes are crawled using hand-written rules to produce Robot Control Language (RCL). RCL hypotheses are optionally filtered using the spatial planner. A back-off approach is also employed, using the Berkeley parser (Petrov et al., 2006) trained on a phrase-structure-tree conversion of the RCL treebank.

5.2 Supervised

The MRS crawling rules were hand-written by inspecting the training data. The Berkeley parser used for back-off is a supervised system, likewise trained on the provided training data.

5.3 Critical features used

The approach relies on the English Resource Grammar and the task-provided spatial planner.

5.4 Critical tools used

The ACE unification-based parser (http://sweaglesw.org/linguistics/ace/) and the Berkeley parser are both critical to the approach.

5.5 Significant data pre/post processing

For the ERG-based route, none. For the Berkeley parser route, the RCL treebank had to be reformatted to look like phrase structure trees, and components not spanning any input had to be reinserted into the trees after parsing (see the illustrative sketch after the references).

5.6 Other data used

None.

6. References

Flickinger, D. (2000). On building a more efficient grammar by exploiting types. Natural Language Engineering, 6(1), 15-28.

Petrov, S., Barrett, L., Thibaux, R., and Klein, D. (2006). Learning accurate, compact, and interpretable tree annotation. In Proceedings of COLING-ACL.
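
Illustrative sketch for section 5.5. The following Python fragment shows one possible way an RCL-style tree with token alignments could be flattened into a bracketed phrase-structure tree for training the Berkeley parser. It is only a sketch under assumed conventions: the node labels, the (start, end) alignment representation, and the example command are hypothetical, and unaligned components are simply dropped here, whereas the actual system reinserts them into the trees after parsing.

    # Sketch only: node labels, alignment format, and the example command are
    # assumptions, not the actual RCL treebank conventions used by the system.

    def rcl_to_ptb(node, tokens):
        """Render an RCL-style (label, value) node as a bracketed
        phrase-structure string over the aligned tokens."""
        label, value = node
        if isinstance(value, tuple):
            # Leaf aligned to a token span (start, end): emit the covered words.
            start, end = value
            return "(%s %s)" % (label, " ".join(tokens[start:end]))
        # Internal node: recurse, dropping children that span no input tokens.
        kids = [rcl_to_ptb(child, tokens) for child in value]
        kids = [k for k in kids if k is not None]
        return "(%s %s)" % (label, " ".join(kids)) if kids else None

    # Hypothetical command: "move red cube above blue cube"
    tokens = "move red cube above blue cube".split()
    rcl = ("event", [
        ("action", (0, 1)),
        ("entity", [("color", (1, 2)), ("type", (2, 3))]),
        ("destination", [
            ("spatial-relation", [
                ("relation", (3, 4)),
                ("entity", [("color", (4, 5)), ("type", (5, 6))]),
            ]),
        ]),
    ])
    print(rcl_to_ptb(rcl, tokens))
    # -> (event (action move) (entity (color red) (type cube))
    #           (destination (spatial-relation (relation above)
    #                         (entity (color blue) (type cube)))))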