OFAI Technical Reports

OFAI-TR-94-16 (gzipped PostScript, 50 kB)

A Comparison of Pruning Methods for Relational Concept Learning

Johannes Fürnkranz

Pre-pruning and post-pruning are two standard methods for dealing with noise in concept learning. Pre-pruning methods are very efficient, while post-pruning methods are typically more accurate but much slower, because they first have to generate an overly specific concept description. We have experimented with a variety of pruning methods, including two new methods that integrate pre- and post-pruning in order to achieve both accuracy and efficiency. These claims are verified with a series of experiments on a chess position classification task.
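
To make the distinction concrete, the following is a minimal, hypothetical Python sketch, not the algorithms compared in the report. It learns a single conjunctive rule over boolean features: grow_rule with a minimum-gain threshold acts as a pre-pruning stopping criterion, while post_prune simplifies an overfitted rule in reduced-error-pruning style on a separate pruning set. All function names and the toy data set are illustrative assumptions.

# Illustrative sketch only: pre-pruning vs. post-pruning for one conjunctive rule.
import random

def covers(rule, example):
    """A rule is a set of feature names; it covers an example if all are true."""
    return all(example[f] for f in rule)

def purity(rule, data):
    """Fraction of positive examples among those covered by the rule."""
    covered = [label for ex, label in data if covers(rule, ex)]
    return sum(covered) / len(covered) if covered else 0.0

def grow_rule(data, features, min_gain=0.0):
    """Greedily add the literal that most improves purity.
    With min_gain > 0 this acts as a pre-pruning stopping criterion:
    refinement stops as soon as no literal improves purity enough."""
    rule = set()
    while purity(rule, data) < 1.0:
        best, best_gain = None, min_gain
        for f in features - rule:
            gain = purity(rule | {f}, data) - purity(rule, data)
            if gain > best_gain:
                best, best_gain = f, gain
        if best is None:          # stopping criterion fired (pre-pruning)
            break
        rule.add(best)
    return rule

def post_prune(rule, prune_data):
    """Reduced-error-pruning style simplification: greedily delete literals
    as long as accuracy on a separate pruning set does not decrease."""
    def accuracy(r):
        return sum(covers(r, ex) == bool(label) for ex, label in prune_data) / len(prune_data)
    improved = True
    while improved and rule:
        improved = False
        for f in list(rule):
            if accuracy(rule - {f}) >= accuracy(rule):
                rule = rule - {f}
                improved = True
                break
    return rule

if __name__ == "__main__":
    random.seed(0)
    features = {"a", "b", "c", "d"}
    # Toy target concept: a AND b, with 10% class noise.
    def sample():
        ex = {f: random.random() < 0.5 for f in features}
        label = int(ex["a"] and ex["b"])
        if random.random() < 0.1:
            label = 1 - label
        return ex, label
    data = [sample() for _ in range(200)]
    grow, prune = data[:130], data[130:]

    # Pre-pruning: stop refining early, using the whole training set.
    print("pre-pruned rule: ", grow_rule(data, features, min_gain=0.05))
    # Post-pruning: overfit the growing set, then simplify on the pruning set.
    overfit = grow_rule(grow, features, min_gain=0.0)
    print("post-pruned rule:", post_prune(overfit, prune))

The split into a growing set and a pruning set in the sketch mirrors why post-pruning is slower: the overly specific rule must be built first and then simplified against held-out data, whereas the pre-pruning variant needs only a single pass with a stopping criterion.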

Citation: Fürnkranz J.: A Comparison of Pruning Methods for Relational Concept Learning, Proc. AAAI-94 Workshop on Knowledge Discovery in Databases, Seattle, WA, 1994.