DIPARTIMENTO DI INFORMATICA - Università di Torino
The Train Check-Out Problem

The following description of the dataset appeared in "FONN: Combining First Order Logic with Connectionist Learning", presented at ICML-97.

Starting from the well-known train-going-east problem defined by Michalski [1], we extended the dataset by introducing continuous features into the description of each car in a train. In particular, a car is described by a vector of four numerical and five discrete attributes: width, length, height and weight; the presence of lights; the presence of brakes; the load type (no load, passengers, normal materials or special materials); whether the car is an engine (the first car of a train is always an engine; on long trains there is often another engine at the end, and sometimes one in the middle; an engine has a different minimum weight from a standard car and always has lights and brakes); and the number of axles (2, 3, 4 or 6). We designed a procedure to randomly generate instances of trains, each composed of 2 to MAX cars, where MAX is a parameter of the procedure.

Given such a dataset, it is possible to design different classification criteria and define corresponding learning problems to submit to a system. In the following experimentation, we defined the general problem of deciding whether a train cannot transit on a given line (the check-out procedure followed by a railway inspector), depending on the characteristics of the line (for instance, if there are bridges on the line, the train must not weigh more than a bridge can bear). We selected three criteria of increasing complexity by varying the number of cars involved in the decision (from 1 to 3) and the kind of attributes (numerical and discrete). In order to be able to rephrase the defined problems in a propositional setting, we chose a 0-arity concept train-cannot-go() to be learned.

The first criterion (trains1) classifies trains using a simple disjunctive rule based on the value of a numerical attribute of a car: a train cannot transit on a line if and only if it contains at least one car whose weight (or height, length or width, respectively) exceeds some fixed threshold. This task is very simple, as it can easily be translated and solved in a propositional environment. However, the ability to handle structured data directly in the network both simplifies the network itself, by reducing the number of rules needed, and makes it possible to show its effectiveness in refining a manually perturbed knowledge base.

The second criterion (trains2) classifies trains using rules based on numerical attributes of two or three cars: a train cannot transit on a line if it contains two cars that are near each other and whose weights exceed a threshold we_1. Two cars are considered near each other if they are at most three positions apart (for example, the second and the fifth car are near, whereas the third and the seventh are not). This task is more complex than the previous one because the hypotheses depend on two or three cars; it allows us to show that the first-order theory learned by the system remains as compact, with respect to the problem complexity, as in the previous example, whereas a propositional counterpart is much more complicated.
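As an illustration, the car description and the first two criteria can be sketched in Python as follows. This is only a minimal sketch: the attribute names, the encoding of the load type and the threshold values (W1, WE_1) are hypothetical placeholders and are not taken from the actual dataset files.

    # Sketch of the car representation and of the trains1 / trains2 rules.
    # Threshold values W1 and WE_1 are hypothetical placeholders chosen for
    # illustration, not taken from the distributed dataset files.
    from dataclasses import dataclass

    @dataclass
    class Car:
        width: float
        length: float
        height: float
        weight: float
        lights: bool
        brakes: bool
        load: str          # 'none', 'passengers', 'normal', 'special'
        engine: bool       # the first car of a train is always an engine
        axles: int         # 2, 3, 4 or 6

    W1 = 50.0     # hypothetical weight threshold for trains1
    WE_1 = 40.0   # hypothetical weight threshold we_1 for trains2

    def trains1_cannot_go(train):
        # there exists one car whose weight exceeds the threshold
        return any(car.weight > W1 for car in train)

    def near(i, j):
        # two cars are near if they are at most three positions apart
        return abs(i - j) <= 3

    def trains2_cannot_go(train):
        # there exist two near cars whose weights both exceed we_1
        return any(
            near(i, j) and train[i].weight > WE_1 and train[j].weight > WE_1
            for i in range(len(train))
            for j in range(i + 1, len(train))
        )

The trains1 variants based on height, length or width are analogous, and the trains3 rule described below simply adds the condition that neither of the two near cars has brakes.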
The third criterion (trains3) classifies trains using rules based on both numerical and discrete attributes of two cars: a train cannot transit on a line if it contains two near cars, both without brakes and heavier than a threshold we_3. All learning tasks are inherently first-order (all conditions are of the form "there exists one car x such that ..." or "there exist two/three cars x_1 and x_2 (and x_3) such that ...").

In order to test our claim that the system can generalize well over the number of objects (in this case, cars) that constitute a learning instance (in this case, a train), we generated a series of learning sets L_1, L_2 and L_3 composed of 100, 200 and 500 trains respectively, about 50% of them positive, whose length was randomly chosen between 2 and 8 cars. We then built a first test set (test set A) containing 10000 trains whose number of cars was randomly chosen between 2 and 8, as for the learning sets, and a second test set of 10000 trains (test set B) composed of 2 to 15 cars.

In order to compare our system with existing ones, we converted the first-order version of the sets L_i, A and B into a propositional one, where each train is represented by a vector of 135 features, nine per car; trains with fewer than 15 cars were represented by filling the vector with an out-of-range value (-1). This is not only a waste of space: some propositional learners can be disturbed by the large number of missing features, and they may be unable to generalize to trains longer than those seen during learning; that is, such a learner cannot learn from a set of short trains such as L_i, in which only 72 features are significant (8, the maximum length of a train, times 9, the number of features per car), and then apply the acquired knowledge to the structurally more complex set B.

We performed the following experiments: first, C4.5 [2] was run on the propositional definition of each problem; second, handcrafted sets of rules, corresponding to those used to classify the instances, were fed to FONN; third, the multistrategy learning system Smart+ [3] was run on the three problems and the knowledge it learned was refined by FONN.
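The propositional encoding can be sketched as follows, reusing the Car record from the sketch above. The feature ordering and the numeric coding of the discrete attributes are assumptions made for illustration, not the exact layout of the distributed "obj" files.

    # Sketch of the propositional encoding: each train is flattened into a
    # fixed vector of 15 cars x 9 features = 135 values, padding the slots
    # of absent cars with the out-of-range value -1.
    MAX_CARS = 15
    FEATURES_PER_CAR = 9

    # hypothetical numeric coding of the load type
    LOAD_CODE = {'none': 0, 'passengers': 1, 'normal': 2, 'special': 3}

    def car_features(car):
        return [car.width, car.length, car.height, car.weight,
                int(car.lights), int(car.brakes),
                LOAD_CODE[car.load], int(car.engine), car.axles]

    def propositionalize(train):
        vector = []
        for car in train:
            vector.extend(car_features(car))
        # pad the positions of missing cars with -1
        vector.extend([-1] * (FEATURES_PER_CAR * (MAX_CARS - len(train))))
        return vector

With this encoding, a learning-set train of at most 8 cars fills only 72 of the 135 positions, which is the limitation on propositional learners noted above.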
References
Available files (please note that the actual files are a second revision of the original ones, more balanced and renumbered); the "obj" file format is used by Smart+, FONN and NTR.