A multistate demographic model for firms in the province of Gelderland



The APC model is not an explanatory model but a statistical accounting scheme. To
interpret the period and cohort effects, one must look for attributes of the historical
contexts that brought about the effects; the age effect must be related to attributes of the
firm development cycle over the lifespan. In modern versions of the APC model, to be
employed in this analysis, covariates may be included. These are factors, associated
with each age-period-cohort combination, that test for level and/or slope differences
among segments of the population.

In this paper, the APC model is presented as a special case of a generalized linear model
(GLM). The number of deaths is a random variable associated with a stochastic process.
Model fitting consists of three interrelated steps, following McCullagh and Nelder
(1989): (i) model selection (model specification or identification); (ii) parameter
estimation; and (iii) prediction.

The model relates the outcome of the random process to the parameters of the process.
The outcome is the number of events (deaths) in a particular time interval. In this paper,
we study the trend in death rates, defined as the ratio of the number of deaths to the
population at risk. The number and types of parameters are determined by the type of
data that are available. One parameter is associated with each age, cohort and period.
Models selected to represent the data belong to the family of generalized linear models
(GLMs). An important characteristic of GLMs is that they assume independent
observations. In the case of non-independence, the variances will be larger than under
independence. It is assumed that deaths are generated by a Poisson
process, hence the observed numbers of deaths follow a Poisson distribution. The
Poisson assumption is justified when the death rate is low. In that case the Poisson
assumption is an adequate approximation of the binomial distribution, which describes
binary response data (e.g. deaths/survivors). The assumption that the number of deaths
is an outcome of a Poisson process has become widely accepted in the literature and is
implicit in the loglinear analysis of mortality rates. The dependent variable is the death
rate, which is the ratio of the number of deaths to the total duration during which the
population is exposed to the risk of dying. Since the exposure varies with the death rate,
both the numerator and the denominator of the death rate are random variables and are
interdependent. The dependence complicates the analysis substantially. Therefore it is
generally assumed that the denominator is fixed, i.e. independent of the number of
deaths. If the death rate is small, the assumption is realistic.
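
As a brief numerical illustration of this approximation (the population size and death
rate below are arbitrary example values, not data from this study), the binomial and
Poisson probabilities can be compared directly:

```python
from scipy.stats import binom, poisson

n = 10_000   # population at risk (illustrative)
p = 0.002    # low death rate (illustrative)

# For a small death rate, the binomial distribution of the number of
# deaths is closely approximated by a Poisson distribution with mean n*p.
for deaths in (10, 20, 30):
    print(deaths,
          round(binom.pmf(deaths, n, p), 6),
          round(poisson.pmf(deaths, n * p), 6))
```

The printed pairs of probabilities agree closely, which is the sense in which the
Poisson assumption is adequate when deaths are rare events.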

A major problem in model selection is the choice of variables to be included in the
systematic part of the model. The strategy adopted in this paper is to associate one
parameter with each age, period and cohort category.
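
In symbols, this strategy amounts to a loglinear predictor for the death rate.
Anticipating the notation of the next paragraph, and with α_x, β_t and γ_c as our
labels for the age, period and cohort parameters (they are not named explicitly in the
text), the systematic part reads

\[
\log \mu_{xtc} = \alpha_x + \beta_t + \gamma_c ,
\]

where μ_xtc denotes the death rate at age x, in period t and for cohort c.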

Let n_xtc denote the observed number of deaths at age x, in period t and for cohort c.
Let N_xtc denote independent random variables having a Poisson distribution with
positive parameter λ_xtc. λ_xtc is the product of the death rate and the duration of
exposure to the risk of dying in year t by individuals of age x and cohort c; the
exposure is assumed to be fixed (L_xtc). The true value of λ_xtc consists of two
components: a systematic component, predicted by
the model to be specified, and a random component. To be precise, the random
component must be separated into two parts. One is a part due to our ignorance, i.e. the
absence of a complete observation; the other part is due to the fact that the outcome of
any random process is inherently uncertain even if we have all the necessary data to
predict the outcome. No distinction between the two parts is made in this paper.
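
The following is a minimal sketch of how such a model could be fitted, using simulated
data and the Python statsmodels library (the variable names and example values are
ours, not the paper's). The log of the fixed exposure L_xtc enters as an offset, so the
linear predictor models the log death rate; note that a full age-period-cohort design
additionally needs an identifying constraint, because cohort = period − age makes the
three sets of dummy variables linearly dependent, so the sketch fits age and period
effects only:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated illustration: ages x, periods t, derived cohorts c = t - x.
rng = np.random.default_rng(0)
rows = [(x, t, t - x) for x in range(5) for t in range(2000, 2010)]
df = pd.DataFrame(rows, columns=["age", "period", "cohort"])
df["exposure"] = rng.uniform(500.0, 1500.0, len(df))   # L_xtc, treated as fixed
rate = 0.01 * np.exp(0.05 * df["age"])                 # illustrative death rates
df["deaths"] = rng.poisson(rate * df["exposure"])      # N_xtc ~ Poisson(rate * L)

# Poisson GLM with log link; log(exposure) enters as an offset so the
# linear predictor models the log death rate rather than the log count.
fit = smf.glm(
    "deaths ~ C(age) + C(period)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["exposure"]),
).fit()
print(fit.summary())
```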


