increase.” If we are using this rule to make predictions,
then there is a misclassification error cost associated with
incorrect predictions. If we are using this rule to intervene
in a manufacturing process, then there is a similar cost
associated with “unwanted achievements” (van Someren
et al., 1997).

Suppose that the rule makes successful predictions for
90% of the cases in which its antecedent ("sensor A has a
value greater than B") is satisfied. If we can give the rule
a causal interpretation, then we may expect that
manipulating the process so that sensor A is always
greater than B will increase the yield of product type C
90% of the time. The remaining 10%, where our
manipulation fails to increase the yield of product type C,
are a cost of using this rule.
These 10% are "unwanted achievements" of the rule (van
Someren et al., 1997). (The terminology "unwanted
achievements" seems somewhat odd, but it is the
terminology used by van Someren et al. (1997), and we
are reluctant to confuse the issue by introducing new
terminology.)

6.1 Constant Unwanted Achievement Cost

If the cost of unwanted achievements is constant, then we
can use a cost matrix, as with the cost of misclassification
errors (Section 2).
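
For concreteness, the sketch below shows how a constant unwanted-achievement cost could be encoded in a cost matrix and combined with the rule's 90% success rate to give an expected cost per action. The matrix entries (5.0 and 10.0) are hypothetical numbers, not values from the paper, and the sketch assumes NumPy is available.

```python
import numpy as np

# Hypothetical cost matrix for the manufacturing example.
# Rows: outcome of the manipulation (0 = yield increases, 1 = unwanted achievement).
# Columns: action (0 = intervene, 1 = do not intervene).
cost = np.array([
    [0.0,  5.0],   # yield would increase: not intervening forgoes a gain worth 5.0
    [10.0, 0.0],   # unwanted achievement: intervening costs 10.0
])

p_success = 0.9                                  # the rule succeeds for 90% of cases
outcome_probs = np.array([p_success, 1.0 - p_success])

expected_cost = outcome_probs @ cost             # expected cost of each action
print(expected_cost)                             # [1.  4.5] -> intervening is cheaper on average
```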

6.2 Conditional Unwanted Achievement Cost

The cost of unwanted achievements may vary with factors
such as the market demand for the unwanted product, the
cost of disposing of it, the cost of repairing or refining it,
or the quantity produced.
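
A minimal sketch of such a conditional cost follows. The pricing terms (market price, per-unit disposal charge) and the particular formula are our own illustration, not drawn from van Someren et al. (1997): the cost is computed from the quantity and the current market conditions rather than read from a constant matrix.

```python
def unwanted_achievement_cost(quantity: float,
                              market_price: float,
                              disposal_cost_per_unit: float) -> float:
    """Context-dependent cost of an unwanted achievement (illustrative only).

    If the unwanted product can be sold, revenue offsets the cost;
    otherwise it must be disposed of at a per-unit charge.
    """
    revenue = quantity * market_price
    disposal = quantity * disposal_cost_per_unit
    return max(disposal - revenue, 0.0)

# The same quantity is cheap when demand is high and costly when it is not.
print(unwanted_achievement_cost(100, market_price=2.0, disposal_cost_per_unit=0.5))  # 0.0
print(unwanted_achievement_cost(100, market_price=0.1, disposal_cost_per_unit=0.5))  # 40.0
```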

7. Cost of Computation

Computers are a limited resource, so it is meaningful to
consider the cost of computation. The various types of
computational complexity are essentially different forms
of cost that we may wish to take into account.

We may distinguish the cost of computation by whether it
is static or dynamic, or by whether it is incurred during
training or during testing.

7.1 Static Complexity

A computer program, considered as a static object, has a
measurable complexity.

7.1.1 SIZE COMPLEXITY

The size complexity of a computer program may be
measured in several ways, such as the number of lines of
code, or the number of bytes. Since the code takes up
memory space in the computer, there is clearly a
corresponding cost.

Turney (1995b) shows how it is possible, under certain
circumstances, to treat size complexity as a kind of test
cost (Section 3). In this case, each feature that is to be
measured corresponds to a block of code that computes
the feature. The cost of measuring a feature is
proportional to the size of the corresponding block of
code. The goal is to minimize the total size of the code,
which is approximately the same as minimizing the total
cost of the features. (It is only approximate, because
blocks of code can combine in non-additive ways.)
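
The sketch below is our own illustration of this idea, not Turney's (1995b) formulation. Each (hypothetical) feature requires a set of code blocks, each with a size in bytes; the size cost of a feature subset is the size of the union of the blocks it needs, which is why costs combine non-additively when blocks are shared.

```python
# Hypothetical mapping from features to the code blocks needed to compute them,
# and from code blocks to their size in bytes.
feature_blocks = {
    "mean_intensity": {"load_image", "sum_pixels"},
    "max_intensity":  {"load_image", "scan_pixels"},
    "edge_density":   {"load_image", "sobel_filter", "scan_pixels"},
}
block_size = {"load_image": 400, "sum_pixels": 120,
              "scan_pixels": 150, "sobel_filter": 900}

def code_size(features) -> int:
    """Size cost of a feature subset: bytes of the union of required blocks."""
    needed = set().union(*(feature_blocks[f] for f in features))
    return sum(block_size[b] for b in needed)

print(code_size(["mean_intensity"]))                   # 520
print(code_size(["mean_intensity", "max_intensity"]))  # 670, not 520 + 550: load_image is shared
```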

7.1.2 STRUCTURAL COMPLEXITY

The structural complexity of a computer program might
be measured by the number of loops in the program, the
depth of nesting of the loops, or the number of recursive
function calls. Structural complexity has a cost; for
example, software with high structural complexity is more
difficult for software engineers to maintain.
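
As one concrete (and hypothetical) way to quantify this for Python source code, the following sketch uses the standard ast module to count loops and the maximum loop-nesting depth; recursive function calls could be counted in a similar fashion.

```python
import ast

def structural_complexity(source: str) -> dict:
    """Count loops and the maximum loop-nesting depth in Python source code."""
    tree = ast.parse(source)
    loop_count = 0
    max_depth = 0

    def visit(node, depth):
        nonlocal loop_count, max_depth
        if isinstance(node, (ast.For, ast.While)):
            loop_count += 1
            depth += 1
            max_depth = max(max_depth, depth)
        for child in ast.iter_child_nodes(node):
            visit(child, depth)

    visit(tree, 0)
    return {"loops": loop_count, "max_nesting": max_depth}

print(structural_complexity("for i in range(3):\n    while i:\n        i -= 1\n"))
# {'loops': 2, 'max_nesting': 2}
```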

7.2 Dynamic Complexity

The dynamic complexity of a program is the execution
time or memory space consumed by the program. Unlike
static complexity, dynamic complexity is a function of the
input to the program.

7.2.1 TIME COMPLEXITY

Time complexity may be measured in many different
ways. Even with a specific architecture, there are many
possible choices. For example, the time complexity of a
Turing machine might be measured by the number of
movements of the read/write head, by the number of
direction changes of the read/write head, or by the number
of state transitions of the finite state machine.

For example, a learning algorithm that discovers new
features may take into account the time complexity of
calculating the new features (Fawcett, 1993). In this
example, time complexity is a kind of test cost
(Section 3).
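
One rough way to obtain such costs empirically is to time each feature's computation and use the measured time as its test cost. The sketch below does this with time.perf_counter; the three feature functions are hypothetical stand-ins, not features from Fawcett (1993).

```python
import math
import time

# Hypothetical derived features over a raw sensor reading x.
features = {
    "square":    lambda x: x * x,
    "log_ratio": lambda x: math.log(1.0 + abs(x)),
    "smoothed":  lambda x: sum(math.sin(x + k) for k in range(1000)) / 1000.0,
}

def time_cost(fn, x, repeats=1000) -> float:
    """Average seconds to compute one feature value; usable as a test cost."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(x)
    return (time.perf_counter() - start) / repeats

costs = {name: time_cost(fn, 3.7) for name, fn in features.items()}
print(costs)  # the expensive "smoothed" feature receives a proportionally higher cost
```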

7.2.2 SPACE COMPLEXITY

The space complexity of a program is the amount of
memory it requires for execution with a given input.
Clearly memory has a cost. In the theory of computational
complexity, there are well-known trade-offs between time
complexity and space complexity.
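
A classic illustration of that trade-off, sketched below as a minimal Python example, is memoization: caching previously computed results spends memory in order to avoid repeated work.

```python
from functools import lru_cache
import time

def fib_plain(n: int) -> int:
    """Exponential time, constant extra space (beyond the call stack)."""
    return n if n < 2 else fib_plain(n - 1) + fib_plain(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n: int) -> int:
    """Linear time, but O(n) memory spent on the cache."""
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

for fn in (fib_plain, fib_cached):
    start = time.perf_counter()
    fn(30)
    print(fn.__name__, round(time.perf_counter() - start, 4), "seconds")
```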

7.3 Training Complexity

The cost of computational complexity may be incurred
during training, when the algorithm is learning to classify.

7.4 Testing Complexity

The cost of computational complexity may be incurred
during testing, when the algorithm is making predictions.
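
To make the distinction between training and testing complexity concrete, the toy instance-based classifier sketched below (our own illustration) has a negligible training cost, since training merely stores the data, but a testing cost that grows with the size of the training set, since every prediction scans all stored examples.

```python
import math

class OneNearestNeighbor:
    """Toy 1-NN classifier: nearly all computation happens at testing time."""

    def fit(self, X, y):
        # Training cost: O(1) work per example, just storing the data.
        self.X, self.y = list(X), list(y)
        return self

    def predict_one(self, x):
        # Testing cost: O(n) distance computations per prediction.
        dists = [math.dist(x, xi) for xi in self.X]
        return self.y[dists.index(min(dists))]

clf = OneNearestNeighbor().fit([(0, 0), (1, 1), (5, 5)], ["low", "low", "high"])
print(clf.predict_one((4, 4)))  # "high"
```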


