urn:nbn:de:gbv:ilm1-2011imeko-018:1
Joint International IMEKO TC1+ TC7+ TC13 Symposium, August 31st to September 2nd, 2011, Jena, Germany
urn:nbn:de:gbv:ilm1-2011imeko:2

A TECHNOLOGY ROADMAP FOR INTANGIBLE ASSETS METROLOGY

William P. Fisher, Jr. (1), A. Jackson Stenner (2)
(1) Graduate School of Education, University of California - Berkeley, Berkeley, California, USA
(2) MetaMetrics, Inc., Durham, North Carolina, USA

Abstract: Measurement plays a vital role in the creation of markets, one that hinges on efficiencies gained via the universal availability of precise and accurate information on product quantity and quality. Fulfilling the potential of these ideals requires close attention to measurement and the role of technology in science and the economy. The practical value of a strong theory of instrument calibration and metrological traceability stems from the capacity to mediate relationships in ways that align, coordinate, and integrate different firms' expectations, investments, and capital budgeting decisions over the long term. Improvements in the measurement of reading ability exhibit patterns analogous to Moore's Law, which has guided expectations in the microprocessor industry for almost 50 years. The state of the art in reading measurement serves as a model for generalizing the mediating role of instruments in making markets for other forms of intangible assets. These remarks provide only a preliminary sketch of the kinds of information that are both available and needed for making more efficient markets for human, social, and natural capital. Nevertheless, these initial steps project new horizons in the arts and sciences of measuring and managing intangible assets.

Keywords: measurement standards, technology roadmap, intangible assets

1. INTRODUCTION

Standards ensure the performance, conformity, and safety of innovative new products and processes. Manufacturing and the provision of services require standards to coordinate the matching of services (as in telecommunications), the fitting of parts, or the gauging of expectations [1]. Measurement, then, plays an essential economic role in the creation of markets centering on the efficiencies gained from the universal availability of precise, accurate, and uniformly interpretable information on product quantity and quality [2-5]. Clear, fully enforced property rights and transparent representations of ownership are other forms of standards that reduce the costs of transactions further by removing sources of unpredictable variation in social factors [6-9]. When objective measurement is available in the context of enforceable property rights and proof of ownership, economic transactions can be contracted most efficiently in the marketplace [10-11]. The emergence of objective measures of individual abilities, motivations, and health, along with service outcomes, organizational performance and environmental quality, presents a wide array of new potential applications of this principle.

Proven technical capacities for systematic and continuous improvements in the quality of objective measures enable the alignment, coordination, and integration of expectations, investments, and capital budgeting decisions over the long term. The relationship between standards and innovation is complex and dynamic, but a general framework conducive to innovation requires close attention to standards. The trajectory of ongoing improvements in instrumentation in the psychosocial and environmental sciences suggests a basis for a technology roadmap capable of supporting the creation of new efficiencies in human, social, and natural capital markets. New efficiencies are demanded by macroeconomic models that redefine labour and land as human and natural capital, respectively, and that add a fourth form of capital, social, to the usual three-capitals (land, labour, and manufactured) framework.

These models enhance sensitivity to the full complexity of intangible assets, enable the conservation and growth of their irreplaceable value, and frame economics in terms of genuine progress, real wealth, sustainability, and social responsibility not captured in accounting and market indexes restricted to the value of property and manufactured capital. Of special interest is the fact that the technical features of improvements in rigorously defined and realized quantification are likely to be able to support the coordination of capital budgeting decisions in ways analogous to those found in, for instance, the microprocessor industry relative to Moore's Law.

The state of reading measurement [12-13] is sufficiently advanced for it to serve as a model in extrapolating the principle to further developments in the creation of literacy capital markets, and for generalizing the mediating role of instruments in creating markets to other constructs and forms of capital in the psychosocial, health, and environmental sciences.

Instruments, metrological standards, and associated conceptual images play vitally important mediating roles in economic success. For instance, the technology roadmap for the microprocessor industry, based in Moore's Law and its projection of doubled microprocessor speeds every two years, has successfully guided semiconductor market expectations and coordinated research investment decisions for over 40 years [2].
Moore's Law is more than a technical guideline; it has served as a business model for an entire industry for almost 50 years. This paper proposes the form similar laws and technology roadmaps will have to take to be capable of guiding innovation both at the technical level and at the broader level of human, social, and natural capital markets, comprehensively integrated economic models, accounting frameworks, and investment platforms.

The fulfilment of the potential presented by these intentions requires close attention to measurement and the role of technology in linking science and the economy [2-3]. Of particular concern is the capacity of certain kinds of instruments to mediate relationships in ways that align, coordinate, and integrate different firms' expectations, investments, and capital budgeting decisions over the long term.

Instruments capable of mediating relationships in these ways are an object of study in the social studies, history, and philosophy of science and technology. In this work, the usual sense of technology as a product of science is reversed [14-20]. Instead of seeing science as rigidly tied to data and rule-following behaviours, the term technoscience refers to a multifaceted domain of activities in which theory, data, and instruments each in turn serves to mediate the relation of the other two [21-23].

2. MEASUREMENT, MEDIATING INSTRUMENTS, AND MAKING MARKETS

In psychosocial research to date, there has been little recognition of the potential scientific and economic value of universally accessible, uniformly defined, and constant units. This article draws from the history of the microprocessor industry to project a model of how instruments measuring in such units can link science and the economy by coordinating capital budgeting decisions within and between firms. Links between the psychosocial sciences and industries such as education and health care are underdeveloped in large part because of insufficient attention to the mediating role some kinds of instruments are able to play in aligning investments across firms and agencies in an industry.

Instruments capable of mediating relationships do so by telling the story of a shared history and by envisioning future developments reliably enough to reduce the financial risks associated with the large investments required. In the microprocessor industry, for instance, Moore's Law describes a constant and predictable relation between increased functionality and reduced costs. From 1965 on, Moore's Law projected a detailed image of commercially viable applications and products that attracted investments across a wide swath of the economy. When it became clear in the early 1990s that the physical limits of existing technologies might disrupt or even end this improvement cycle, the Semiconductor Industry Association convened a special meeting aimed at creating a detailed common vision, a roadmap, for the next 15 years' developments in semiconductor technology [2].

This roadmap made it possible for the industry to navigate a paradigm shift in its basic technology with no associated economic upheaval and with the continuation of the historically established pattern of increased functionality and lower costs. Education, healthcare, government, and other industries requiring intensive human and social capital investments lack analogous ongoing improvements in their primary products' reliability, precision, and cost control. Where the microprocessor industry is able to reduce costs and improve quality while maintaining or improving profitability, education, healthcare, and social services seem only to always cost more, with little or no associated improvement in objective measures of quality.

To what extent might this be due to the fact that these industries have not yet produced mediating instruments like those available in other industries? If such instruments are necessary for articulating a shared history of past technical improvements and economies, and a shared vision of future ones, should not their development be a high priority? Within any economy, individual actors are able to contribute to the collective estimation of value only insofar as the information they have at hand is sufficient to the task. Ideally, with that information, those demanding higher quality can identify and pursue it, rewarding producers of the higher quality. Without that information, purchasers are unable to distinguish varying levels of quality consistently, so investments in improved products are not only unrewarded, they are discouraged. Philanthropic capital markets have lately been described in these terms [24].

Not yet having satisfactory mediating instruments in industries relying heavily on intangible assets is not proof of the impossibility of obtaining them. There are strong motivations for considering what appropriate mediating instruments would look like in human- and social-capital-intensive industries. Foremost among these motivations is a potential for correcting the significant capital misallocations caused when individual organizations make isolated investment decisions that cannot be coordinated across geographically distant groups' competing proprietary interests and temporally separated inputs and outputs.

The question is one of how to align investment decisions without compromising confidential budgeting processes or dictating choices. Simply sharing data on outcomes is a proven failure [25-26], and was never attractive to for-profit enterprises for which such information is of proprietary value. But instead of focusing on performance measured in locally idiosyncratic units incapable of supporting standard product definitions, might not a better alternative be found in defining a constant unit of increased learning, functionality, or health, and evaluating quality and cost relative to it?
The key to creating coherent industry-wide communities and markets is measurement. Fryback [27-28] succinctly put the point, observing that the U.S. health care industry is

    a $900+ billion [over $2.5 trillion in 2009 [29]] endeavor that does not know how to measure its main product: health. Without a good measure of output we cannot truly optimize efficiency across the many different demands on resources.

Quantification in health care is almost universally approached using methods inadequate to the task, resulting in ordinal and scale-dependent scores that cannot capitalize on the many advantages of invariant, individual-level measures [30]. Though data-based statistical studies informing policy have their place, virtually no effort or resources have been invested in developing individual-level instruments traceable to universally uniform metrics that define the outcome products of health care, education, and other industries heavily invested in human, social, and natural capital markets. It is well recognized that these metrics are key to efficiently harmonizing quality improvement, diagnostic, and purchasing decisions and behaviours [31]. Marshalling the resources needed to develop, implement, and maintain them, however, seems oddly difficult to do until it is recognized that such a project must be conceived and brought to fruition on a collective level and against the grain of cultural presuppositions as to the objective measurability of intangible assets [32-33].

Probabilistic models used in scaling and equating different tests, surveys, and assessments to common additive metrics offer a body of unexamined resources relevant to the need for mediating instruments in the domains of human, social, and natural capital markets [33]. Miller and O'Leary [2] complement the accounting literature's overly narrow perspective on capital budgeting processes with the fruitful lines of inquiry opened up in the history, philosophy, and social studies of science. In this work, mathematical models and instruments are valued for their embodiment of the local and specific material practices through which mediation is realized.

In these practices, instruments capable of serving as reliable and meaningful media must simultaneously represent a phenomenon faithfully and facilitate predictable control over it. Though the philosophy of science has long focused attention on the nature of objective representation, the history and social studies of science have, over the last 30 years or so, shifted attention to the role of technology in theory development and in determining the outcome of experiments. By definition, instruments capable of mediating must exhibit properties of structural invariance across the locally defined contexts of different organizations' particular investments, policies, workforces, and articulations of the relevant issues. It is only through the conjoint processes of representation and intervention that, for instance, the steam engine became the medium facilitating development of work in the sense of engineering mechanics and in the economic sense of a new source of labour [34]. The medium is the message here, in the sense that mediating instruments like the steam engine both represent the lawful regularity of the scientific phenomenon and provide a predictable means of intervening in the production of it.

The unique importance and value of Rasch's models for measurement lie precisely here. Rasch-calibrated instruments have long been in use on a wide scale in applications that combine the representation of measured amounts for accountability purposes with instructional or therapeutic interventions that take advantage of the meaningful mapping of abilities relative to curricular or therapeutic challenges [35-38]. These models are structured as analogies of scientific laws' three-variable multiplicative form [13, 39] and so enable experimental tests of possible causal relations [40-41]. When data fit such a model, demonstrably linear units of measurement may be calibrated and maintained across instrument configurations or brands, and across measured samples. Clear thinking about the measured construct is facilitated by the invariant constancy of the unit of measurement: one more unit always means one more unit of the same size. When instruments measuring the same thing are tuned to the same scale, mediation is achieved in the comparability of processes and outcomes within and across subsamples of measured cases.

Linear performance measures are recommended as essential to outcome-based budgeting [11], and will require structurally invariant units capable of mediating comparisons in this way. Without instruments mediating meaningfully comparable relationships, it is impossible to effectively link science and the economy by coordinating capital budgeting decisions. The lessons so forcefully demonstrated over the course of the history of the microprocessor industry need to be learned and applied in many other industries.

The potential for a new class of mediating instruments resides here, where the autonomy of the actors and agencies forming a techno-economic network is respected and uncompromised. Rasch's parameter separation theorem is a scientific counterpart of Irving Fisher's economic separability theorem [42]. It is essential to realize that Rasch's equations model the stochastically invariant uniformity of behaviours, performances, or decisions of individuals (people, communities, firms, etc.), and are not statistical models of group-level relations and associations between variables [43]. Data fit a Rasch model and mediation is effected so far as the phenomenon measured (an ability, attitude, performance, etc.) retains its properties across samples and instrument brands or configurations. Given this fit, the unit of measurement becomes a common currency for the exchange of value within a market defined by the model parameters [42].
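The dichotomous Rasch model referred to here can be written as P(x = 1) = exp(b - d) / (1 + exp(b - d)), where b is a person measure and d an item calibration, both expressed in logits on the same scale. The sketch below (Python) illustrates the parameter separation just described with invented measures: the difference between two persons' log-odds of success is the same no matter which item is administered. All numbers are hypothetical and chosen only for illustration.

```python
import math

def rasch_p(ability: float, difficulty: float) -> float:
    """Dichotomous Rasch model: probability of a correct response,
    with ability and difficulty expressed in logits."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def log_odds(p: float) -> float:
    return math.log(p / (1.0 - p))

# Illustrative (hypothetical) measures in logits.
person_a, person_b = 1.2, -0.3
items = [-1.0, 0.0, 0.8, 2.0]   # calibrations of four different items

# Parameter separation: the log-odds difference between the two persons
# equals (person_a - person_b) regardless of which item is used.
for d in items:
    diff = log_odds(rasch_p(person_a, d)) - log_odds(rasch_p(person_b, d))
    print(f"item {d:+.1f}: log-odds difference = {diff:.2f}")
# Every line prints 1.50, i.e. person_a - person_b, independent of the item.
```

The same separation holds for item comparisons across persons, which is what allows calibrations to be maintained across instrument configurations and samples when data fit the model.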
How could this implicit and virtual market be made explicit and actual? By devising mediating instruments linking separate actors and arenas in a way that conforms to the requirements of the techno-economic forecasts of a projection like Moore's Law or of a technology roadmap based in such a law. As Miller and O'Leary [2] say,

    Markets are not spontaneously generated by the exchange activity of buyers and sellers. Rather, skilled actors produce institutional arrangements, the rules, roles and relationships that make market exchange possible. The institutions define the market, rather than the reverse.

What are the rules, roles and relationships that skilled actors need to arrange for their institutions to define efficient markets for human, social, and natural capital? What are the rules, the roles, and the relationships that make market exchange possible for these forms of intangible assets? How can standard product definitions for the outcomes of education, healthcare, and social services be agreed upon? Where are the lawful patterns of regularities that can be depended on to remain constant enough over time, space, firms, and individuals to support industry-wide standardizations of measures and products based on them? What trajectories can be mapped that would enable projections accurate enough for firms and agencies to rely on in planning products years in advance?

Answers to questions such as these provide an initial sketch of the kind of grounded, hands-on details of the information that must be obtained if the endless inflationary spirals of human- and social-capital-intensive industries are ever to be brought under control and transformed into profitable producers of authentic value and wealth.

3. THE RASCH READING LAW AND STENNER'S LAW

It is a basic fact of contemporary life that the technologies we employ every day are so complex that hardly anyone understands how they do what they do. Technological miracles are commonplace events, from transportation to entertainment, from health care to industry. And we usually suffer little in the way of adverse consequences from not knowing how automatic transmissions, thermometers, or digital video reproduction work. It is enough to know how to use the tool.

This passive acceptance of technical details beyond our ken extends as well into areas in which standards, methods, and products are much less well defined. And so managers, executives, researchers, teachers, clinicians, and others who need measurement but who are unaware of its technicalities tend to be passive consumers accepting the lowest common denominator of measurement quality.

And just as the mass market of measurement consumers is typically passive and uninformed, in complementary fashion the supply side is fragmented and contentious. There is little agreement among measurement experts as to which quantitative methods set the standard as the state of the art. Virtually any method can be justified in terms of some body of research and practice, so the confused consumer accepts whatever is easily available or is most likely to support a preconceived agenda.

It may be possible, however, to separate the measurement wheat from the chaff. For instance, measurement consumers may value a means of distinguishing among methods that emphasizes their interests in, and reasons for, measuring. Such a continuum of methods could be one that ranges from the least meaningful and generalizable to the most meaningful and generalizable, which is equivalent to ranging from the most to the least dependent on the local particulars of the specific questions asked, sample responding, judges rating, etc.

The aesthetics, simplicity, meaningfulness, rigor, and practical consequences of strong theoretical requirements for instrument calibration provide such criteria for choices as to models and methods [30, 44-49]. These criteria could be used to develop and guide explicit considerations of data quality, construct theory, instrument calibration, quantitative comparisons, measurement standard metrics, etc. along a continuum from the most passive and least objective to the most actively involved and most objective.

The passive approach to measurement typically starts from and prioritizes content validity. The questions asked on tests, surveys, and assessments are considered relevant primarily on the basis of the words they use and the concepts they appear to address. Evidence that the questions actually cohere together and measure the same thing is typically deemed of secondary importance, if it is recognized at all. If there is any awareness of the existence of axiomatically prescribed measurement requirements, these are not considered to be essential. That is, if failures of invariance are observed, they usually provoke a turn to less stringent data treatments instead of a push to remove or prevent them. Little or no measurement or construct theory is implemented, meaning that all results remain dependent on local samples of items and people. Passively approaching measurement in this way is then encumbered by the need for repeated data gathering and analysis, and by the local dependency of the results. Researchers working in this mode are akin to the woodcutters who say they are too busy cutting trees to sharpen their saws.

An alternative, active approach to measurement starts from and prioritizes construct validity and the satisfaction of the axiomatic measurement requirements. Failures of invariance provoke further questioning, and there is significant practical use of measurement and construct theory. Results are then independent of local samples, sometimes to the point that researchers and practical applications are not encumbered with the usual test- or survey-based data gathering and analysis.
3.1. Six developmental stages

As is often the case, this black and white portrayal tells far from the whole story. There are multiple shades of grey in the contrast between passive and active approaches to measurement. The actual range of implementations is much more diverse than the simple binary contrast would suggest. Spelling out the variation that exists could be helpful for making deliberate, conscious choices and decisions in measurement practice.

It is inevitable that we would start from the materials we have at hand, and that we would then move through a hierarchy of increasing efficiency and predictive control as understanding of any given variable grows. Previous considerations of the problem have offered different categorizations for the transformations characterizing development on this continuum. Stenner and Horabin [50] distinguish between 1) impressionistic and qualitative, nominal gradations found in the earliest conceptualizations of temperature, 2) local, data-based quantitative measures of temperature, and 3) generalized, universally uniform, theory-based quantitative measures of temperature.

The latter is prized for the way that thermodynamic theory enables the calibration of individual thermometers with no need for testing each one in empirical studies of its performance. Theory makes it possible to know in advance what the results of such tests would be with enough precision to greatly reduce the burden and expenses of instrument calibration.

Reflecting on the history of psychosocial measurement in this context, it then becomes apparent that these three stages can be further broken down. The distinguishing features for each of six stages in the evolution of measurement systems are expanded from a previously described five-stage conception [12].

In Stage 1, conceptions of measurement are not critically developed, but stem from passively acquired examples. At this level, what you see is what you get, in the sense that item content defines measurement; advanced notions of additivity, invariance, etc. are not tested; the meanings of the scores and percentages that are treated as measures are locally dependent on the particular sample measured and items used; and there is no theory of the construct measured. Data must be gathered and analyzed to have results of any kind.

In Stage 2, measurement concepts are slightly less passively adopted. Additivity, invariance, etc. may be tested, but falsification of these hypotheses effectively derails the measurement effort in favour of statistical models with interaction effects, which are accepted as viable alternatives. Typically little or no attention is paid at this stage to the item hierarchy or the construct definition. An initial awareness of measurement theory is not complemented by any construct specification theory.

In Stage 3, measurement concepts are more actively and critically developed, but instruments still tend to be designed relative to content, not construct, specifications. Additivity and invariance principles are tested, and falsification of the additive hypothesis provokes questions as to why, where, and how those failures occurred. Models with interaction effects are not accepted as viable alternatives, and significant attention will be paid to the item hierarchy and construct definition, but item calibrations remain empirical. Though there is more significant use of measurement theory, construct theory is underdeveloped, so no predictive power is available.

In Stage 4, the conceptualization of measurement becomes more active than passive. Initial efforts to (re-)design an instrument relative to construct specifications occur at this level. Additivity, invariance, etc. are explicitly tested and are built into construct manifestation expectations. The falsification of the additive hypothesis provokes questions as to why and corrective action, models with interaction effects are not accepted as viable alternatives, significant attention is paid to the item hierarchy and construct definition relative to instrument design, but empirical calibrations remain the norm. Some construct theory gives rise to limited predictive power. Commercial applications that are not instrument-dependent (as in computer adaptive implementations) exist at this level.

In Stage 5, all of the Stage 4 features appear in the context of a significantly active approach to measurement. The item hierarchy is translated into a construct theory, and a construct specification equation predicts item difficulties and person measures apart from empirical data. These features are used routinely in commercial applications.

In Stage 6, the most purely active approach to measurement, all of the Stage 4 and 5 features are brought to bear relative to construct specification equations that predict the mean difficulties of ensembles of items each embodying a particular combination of components. Commercial applications of this kind have been in development for several years.
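The construct specification equations mentioned for Stages 5 and 6 are, in the simplest case, predictive equations that derive item difficulty from theoretically motivated item features, so that an instrument can be calibrated before any response data are collected, much as thermodynamic theory calibrates a thermometer. The sketch below (Python) uses two invented text features and invented coefficients purely for illustration; it is not the published or proprietary Lexile specification equation.

```python
# Hypothetical construct specification equation for reading item difficulty.
# Features and coefficients below are invented for illustration only;
# they are not calibrated values from any published or commercial theory.
import math

def predicted_difficulty(mean_log_word_freq: float, log_mean_sentence_length: float) -> float:
    """Theory-based (Stage 5/6) prediction of item difficulty in logits:
    harder items use rarer words (lower log frequency) and longer sentences."""
    intercept, b_freq, b_sent = 0.0, -1.2, 0.9   # illustrative coefficients
    return intercept + b_freq * mean_log_word_freq + b_sent * log_mean_sentence_length

# A theory-calibrated item bank needs no empirical pretest of every item:
items = [
    {"id": "passage_01", "freq": 3.4, "sent": math.log(8.0)},
    {"id": "passage_02", "freq": 2.1, "sent": math.log(19.0)},
]
for it in items:
    d = predicted_difficulty(it["freq"], it["sent"])
    print(f'{it["id"]}: predicted difficulty = {d:+.2f} logits')
```

In Stage 6 terms, the same kind of equation would be applied to ensembles of items sharing a combination of components, with the predicted mean difficulty standing in for empirical calibration.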
    • urn:nbn:de:gbv:ilm1-2011imeko-018:1 Joint International IMEKO TC1+ TC7+ TC13 Symposium August 31st September 2nd, 2011, Jena, Germany urn:nbn:de:gbv:ilm1-2011imeko:2design have very limited psychosocial application proportionate ratios between reading comprehension,frequency in mainstream applications (<0,5 %?) but text complexity, and reader ability [12-13]. As texthave made some significant starts in commercial ap- complexity increases (the words used become lessplications (3-5 %?). Stage 5‘s strong theoretical un- commonly encountered, and sentence length in-derstanding of constructs is virtually unknown in creases), reading comprehension rates decrease rela-mainstream psychosocial application, but has also tive to a fixed reading ability measure. Conversely,begun to see some commercial developments. Stage given a fixed text complexity, reading comprehension6‘s mature theoretical understanding of constructs is rates increase as reading ability increases.only just emerging in some well-supported commer- The practical value of this law is realized insofarcial applications. as it then becomes possible to employ it productively in both (a) representing students‘ reading abilities in 3.2. The Rasch Reading Law summative accountability measures and (b) interven- ing in ways likely to change those measures in forma- Measurement theory sets the stage for thinking tive instructional applications [35-38]. Concerning theabout constructs by focusing attention on the mean- latter, it is well understood that learning is inherently aingfulness of the quantities produced, by facilitating matter of leveraging what is already known (the al-the construction of supporting evidence, by testing phabet, numbers, words, grammar, arithmetical opera-construct hunches, and by supporting theory develop- tions, etc.) to frame and understand what is not yetment. Construct theory then sets the stage for follow- known (new vocabulary, constructions, specific prob-ing through on measurement theory‘s fundamental lems, etc.). It is therefore vitally important to targetprinciples by making it possible to more fully tran- instruction at the sweet spot where enough is knownscend local particulars of respondent and item sam- to support comprehension, but where what is notples. It does so by recognizing that failures of invari- known is still substantial enough to make the lessonance are valuable as anomalous exceptions that challenging. This range along the measurement con-―prove‖ (L. probus, test goodness of) the rule embod- tinuum just above the student‘s measure is known asied in the measurement technology. the Zone of Proximal Development [52] and is valued That is, data-model misfit is not considered to re- for indicating the range of curriculum content thesult from model failure, but from uninterpretable in- student is developmentally ready to learn [53]. Whenconsistencies in the data stemming from under- measures are appropriately targeted, learning is maxi-developed theory and/or low quality data. Thus, fail- mized and measurement error is minimized. The sameure to fit a model of fundamental measurement is not a kind of strategy has proven useful in prescribing reha-sign of the end of the conversation or of the measure- bilitation therapies [36] and likely has other as yetment effort. 
Rather, negative results of this kind pro- unexplored applications.vide needed checks on the strength of the object to Targeting will be a key element in any future tech-withstand the rigors of propagation across media, nology roadmap for education. Though there is nowhich is the ultimate goal of having each different substitute for attention to other substantive aspects ofmanufacturer‘s tool capable of functioning as a me- the educational process, this indicator is of potentiallydium traceable to the same reference standard metric central importance as a summary indicator of how[18, 51]. accurately and precisely educational outcomes are The predictability of a trajectory for the evolution represented, and how efficiently instructional inter-of measurement allows the specification of a law ca- ventions are implemented.pable of shaping fundamental expectations as to in- Rasch measurement isolates and focuses attentioncreases in the power and complexity of psychosocial on empirical and theoretically tractable test item diffi-measurement technology, and the timing of those culty scale orders and positions. Then it estimatesincreases. This practical law is applicable to business student abilities relative to that scale and describesrelationships in a manner analogous to the way the them in terms of the probabilities of successful com-basic law describes scientific relationships. This is so prehension up and down the scale, whether or not alleven if the definition of work in engineering mechan- of the items potentially available have actually beenics is of little immediate interest in gauging the eco- administered. The goal of education, after all, is not tonomic value of labour. Despite the lack of immediate teach students only how to deal with the actual con-relevance, the practical utility of the widely used crete problems encountered in instruction and assess-horsepower measure of engine pulling capacity de- ment. The goal is rather to teach students how to man-pends on the scientific validity of the proportionate age any and all problems of a given type at a givenrelations between mass, force, and acceleration in level of difficulty.Newton‘s laws. Though a dialectic between part and whole is nec- The same simultaneous instantiation of scientific essary, we cheat students and society when educationand economic value must be possible for instruments becomes fixated on particular content and neglects theto mediate relationships in ways that can effectively larger context in which skills are to be applied. Theand efficiently coordinate capital budgeting decisions. overall principle is effectively one of mass customiza-Thus, the Rasch Reading Law describes invariantly tion. Instruction and assessment, or any bidirectional
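The two monotonic relations stated for the Rasch Reading Law can be expressed with the same logistic form used above, with reader ability and text complexity placed on a common scale. The sketch below (Python) uses an arbitrary logit scale and an invented targeting window; it illustrates the structure of the law and of Zone-of-Proximal-Development targeting, not the calibrated Lexile metric or any published comprehension-rate formula.

```python
import math

def comprehension_rate(reader_ability: float, text_complexity: float) -> float:
    """Rasch-style comprehension rate: rises with reader ability,
    falls with text complexity (both in logits on an arbitrary common scale)."""
    return 1.0 / (1.0 + math.exp(-(reader_ability - text_complexity)))

# Fixed reader, increasingly complex texts: comprehension decreases.
for text in (-1.0, 0.0, 1.0, 2.0):
    print(f"reader +0.5 vs text {text:+.1f}: {comprehension_rate(0.5, text):.0%}")

# Fixed text, increasingly able readers: comprehension increases.
for reader in (-1.0, 0.0, 1.0, 2.0):
    print(f"reader {reader:+.1f} vs text +0.5: {comprehension_rate(reader, 0.5):.0%}")

def in_zone_of_proximal_development(reader: float, text: float,
                                    lo: float = 0.0, hi: float = 1.0) -> bool:
    """Illustrative targeting rule: the text sits in a window just above the
    reader's measure (the window width here is an arbitrary assumption)."""
    return lo <= (text - reader) <= hi
```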
Targeting will be a key element in any future technology roadmap for education. Though there is no substitute for attention to other substantive aspects of the educational process, this indicator is of potentially central importance as a summary indicator of how accurately and precisely educational outcomes are represented, and how efficiently instructional interventions are implemented.

Rasch measurement isolates and focuses attention on empirical and theoretically tractable test item difficulty scale orders and positions. Then it estimates student abilities relative to that scale and describes them in terms of the probabilities of successful comprehension up and down the scale, whether or not all of the items potentially available have actually been administered. The goal of education, after all, is not to teach students only how to deal with the actual concrete problems encountered in instruction and assessment. The goal is rather to teach students how to manage any and all problems of a given type at a given level of difficulty.

Though a dialectic between part and whole is necessary, we cheat students and society when education becomes fixated on particular content and neglects the larger context in which skills are to be applied. The overall principle is effectively one of mass customization. Instruction and assessment, or any bidirectional method of simultaneous representation and intervention, benefits from forms of quantification coordinating substantive content with metrics that remain stable and constant no matter which particular test, survey, or assessment items are involved. The same principles apply in any other enterprise focused on intangible outcomes, such as health care, social services, or human resource management. We short-change ourselves by failing to demand mediating instruments enabling a kind of virtual coordination of improvement, purchasing, hiring, and other investment decisions across different individuals, firms, agencies, and arenas in the economy. The architecture of probabilistic models open to the integration of new items and samples embodies the principles of invariance characteristic of the mediating instruments needed for aligning legally and geographically separated firms' decisions within a common inferential framework.

3.3. Stenner's Law

Of course, even though it has been almost 60 years since Rasch [54] first did his foundational research [55-57] on reading, integrating assessment and instruction on the basis of the Rasch Reading Law is not yet the norm in educational practice. Accordingly, most instruction is not integrated with assessment, and few examination results are reported so as to illustrate the alignment of a developmental continuum with the curriculum. Furthermore, and more specifically, most reading instruction is not appropriately targeted at individual students' Zones of Proximal Development. This is problematic, given that reading abilities within elementary school classrooms can easily range from two grade levels below to two grade levels above the reading difficulty of the textbook.

[Fig. 1. Hypothetical Projection of Mean Percentages of Students Comprehending Text at a Rate of 60% or Less by Average Relative Cost of Producing a Single Precision Reading Measure, by Year]

[Fig. 2. Hypothetical Projection of Mean Targeting Accuracy (0-400L) by Average Relative US$ Cost of Producing a Single Precision Reading Measure, by Year]

Fig. 1, modelled on the first of two figures in Moore's original 1965 paper [58-59], shows a hypothetical but not unrealistic projection of the relation of average targeting accuracy with cost, by decade, from 1990 to 2030. Precision measurement is considered here to be realized when the targeted comprehension rate is realized to within 5%. Few reading tests were adaptively administered or well targeted before 1990; though a few were, computerization of test administration was difficult and expensive, as was (and remains) printed test production. Further, even fewer tests were administered for diagnostic or formative purposes before 1990, which is just as well, as few would have been able to provide information useful for those applications.

It is plausible to suppose that, as the quality of testing has improved in the years after 1990, costs have been reduced and the targeting accuracy of assessment items and instructional text has been enhanced, so the difference between the average item difficulty and the average measure of the targeted student approaches 0, to the left. The upper limit of targeting accuracy remains constant because the impact of new methods of test construction and administration is unevenly distributed. Costs are driven down as theory is able to inform the automatic production and administration of targeted text and test items in computerized contexts effectively integrating assessment and instruction.

Costs may be dropping by an order of magnitude every decade, with the rate of reductions in mistargeting slowing as it nears 0. At some future date, accurate targeting may become universal, and the right, off-target end of the range may also drop to near 0.
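The hypothetical trend behind Fig. 1 and Fig. 2, with cost per precision reading measure falling by roughly an order of magnitude per decade while mean targeting error shrinks toward zero, can be sketched numerically. Only the order-of-magnitude-per-decade cost assumption comes from the text; the starting values and the halving rate for targeting error below are invented for the illustration.

```python
# Hypothetical projection in the spirit of Fig. 1 and Fig. 2: relative cost per
# precision reading measure drops about 10x per decade (from the text); the
# starting cost, initial targeting error, and its halving rate are invented here.
start_year, end_year = 1990, 2030
cost_1990 = 1.0          # relative cost per precision measure in 1990 (illustrative)
error_1990 = 400.0       # mean targeting error in 1990, arbitrary 0-400L-style units

print("year  rel. cost   mean mistargeting")
for year in range(start_year, end_year + 1, 10):
    decades = (year - start_year) / 10
    cost = cost_1990 * 10 ** (-decades)        # an order of magnitude cheaper per decade
    error = error_1990 * 0.5 ** decades        # error shrinking toward 0; rate is invented
    print(f"{year}  {cost:9.4f}   {error:8.1f}")
```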
Fig. 2, also patterned on the first figure in Moore [58], is a variation on the same information as that shown in Fig. 1. Mistargeted text and test items may bore able readers encountering material that is much too easy, but poor readers unable to make any headway with readings far too complex for them to comprehend are doomed to learn little or nothing. Fig. 2 is thus intended to convert Fig. 1's targeting information into the implied percentage of students comprehending text at a rate of 60% or less.

[Fig. 3. Rate of Increase in Number of Precision Reading Measures Estimated]

Fig. 3, patterned on the second figure in Moore [58], describes what may be referred to as Stenner's Law: the expectation that the number of precision reading measures estimated will double every two years, with no associated increase in cost. The figure has historical validity in that the line begins not long after the 1960 introduction of Rasch's work in Chicago, is in the range of 350,000 in the 1970s, during the Anchor Test Study [60-61], and is about 20-30 million in the period of 2005-2008, which is approximately how many measures were being produced annually at the time by users of the Lexile Framework for Reading [12].
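Stenner's Law, as stated here, is a Moore's-Law-style doubling rule: the number of precision reading measures estimated per year doubles every two years at no added cost. A minimal sketch (Python) projects that rule forward from the roughly 25 million annual measures the text attributes to the 2005-2008 period; the choice of a 2008 baseline and a 2030 horizon are assumptions made only for this example, not figures from the paper.

```python
# Stenner's Law as described in the text: precision reading measures estimated
# per year double every two years, with no associated increase in cost.
# The 25 million baseline is drawn from the text's 20-30 million range for
# 2005-2008; the 2008 anchor year and 2030 horizon are assumptions.
baseline_year, baseline_measures = 2008, 25_000_000
doubling_period_years = 2

def projected_measures(year: int) -> float:
    doublings = (year - baseline_year) / doubling_period_years
    return baseline_measures * 2 ** doublings

for year in range(baseline_year, 2031, 2):
    print(f"{year}: ~{projected_measures(year):,.0f} precision reading measures/year")
```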
4. A TECHNOLOGY ROADMAP FOR INTANGIBLE ASSETS

New and urgent demands challenged Moore's Law when it was realized in the 1990s that the physical limits of silicon could potentially disrupt the expectations that had allowed the microprocessor industry to coordinate its investment decisions so consistently for over 20 years [62]. The threat of a crisis led to the convening of an industry-wide meeting in 1992 by Gordon Moore, then chairman of the Semiconductor Industry Association's technology committee [2]. This and subsequent meetings of the group resulted in the annual publication of an International Technology Roadmap for Semiconductors.

These charts provided a level of specificity and detail not present in the more bare-bones projections of Moore's Law. The established history of past successes combined with new uncertainties compelled leaders in the field to seek out a basis on which new mediating instruments might be founded. Risks associated with evaluating several different methods of resolving the technical problem of continued reductions in microprocessor size and cost had to be mitigated so that no firms found themselves making large capital investments with no product or customers in sight (which, unfortunately, is the status quo in education, healthcare, and other industries making intensive investments in human, social, and natural capital).

Table 1 presents a reading measurement variation on the 2001 version of the semiconductor industry's roadmap [2]. The basic structure of the table (the columns labelled "Year of first production" and "Technology node", and the subheadings focusing on "Expected shifts in product functionality and cost", Introductory volumes, and Innovations) is identical with the one produced by the semiconductor industry. The remaining elements have been changed to focus on the kinds of functionality, cost, and innovations that have taken place historically in the domain of reading instruction and assessment.

Table 1. Sketch of a Possible Technology Roadmap for Literacy Education (elements to be determined)

Year of first production / Technology node: 1980, 1990, 2000, 2010, 2020, 2030

Expected shifts in product functionality and cost (high-volume targeted items):
  Number of steps in item production and administration
  Multiple per 3-year technology cycle
  Affordable production cost per item at introduction
  Rate of cost reduction per cycle (%)
  Annualized cost reduction per item (%)
  Improvements in instructional targeting accuracy
  Reduction in students reading at <60% comprehension (%)
  Increase in number of precision reading measures (%)

Available item volume at introduction:
  Uncalibrated
  Calibrated bank
  From specification equation

Innovations in item administration:
  Paper and pencil adaptive formats
  Computer adaptive (bank)
  Theory adaptive (specified)

Innovations integrating assessment and instruction:
  Examinations aligned with curricula (%)
  Differentiated instruction aligned with adaptive tests (%)
  Writing assessment included (%)

Some of these suggested elements may prove less important in articulating a shared history and projecting an accessible vision of the future, and others may be needed. The point here is less one of specifying what exactly should be tracked and more one of conveying the general conceptual framework in which new possibilities for coordinating investment decisions in education might be explored. Plainly, it would be essential for the major stakeholding actors and agencies involved in education, from academia to business to government, to themselves determine the actual contents of a roadmap such as this.

5. CONCLUSION

The mediation of individual and organizational levels of analysis, and of the organizational and inter-organizational levels, is facilitated by Rasch measurement. Miller and O'Leary [2] document the use of Moore's Law in the microprocessor industry in the creation of technology roadmaps that lay out the structure, processes, and outcomes that have to be aligned at all three levels to coordinate an entire industry's economic success. Such roadmaps need to be created for each major form of human, social, and natural capital, with the associated alignments and coordinations put in play at all levels of every firm, industry, and government.

It has been suggested that economic recovery in the wake of the Great Recession could be driven by a new major technological breakthrough, one of the size and scope of the IT revolution of the 1990s. This would be a kind of Manhattan Project or international public works program, providing the unifying sense of a mission aimed at restoring and fulfilling the promises of democracy, justice, freedom, and prosperity. Industry-wide systems of metrological reference standards for human, social, and natural capital fit the bill. Such systems would be a new technological breakthrough on the scale of the initial IT revolution. They would also be a natural outgrowth of existing IT systems, an extension of existing global trade standards, and would require large investments from major corporations and governments. In addition, stepping beyond those suggestions that have appeared in the popular press, systematic and objective methods of measuring intangible assets would help meet the widely recognized need for socially responsible and sustainable business practices.

Better measurement will play a vital role in reducing transaction costs, making human, social, and natural capital markets more efficient by facilitating the coordination of autonomous budgeting decisions. It will also be essential to fostering new forms of innovation, as the shared standards and common product definitions made possible by advanced measurement systems enable people to think and act together collectively in common languages.

Striking advances have been made in measurement practice in recent years. Many still assume that assigning numbers to observations suffices as measurement, and that there have been no developments worthy of note in measurement theory or practice for decades. Nothing could be further from the truth.

Theory makes it possible to know in advance what the results of empirical calibration tests would be with enough precision to greatly reduce the burden and expenses associated with maintaining a unit of measurement. There likely would be no electrical industry at all if the properties of every centimetre of cable and every appliance had to be experimentally tested. This principle has been employed in measuring human, social, and natural capital for some time, but has not yet been adopted on a wide scale.

This might change with the introduction of Stenner's Law, and in conjunction with technology roadmaps for literacy capital and for other forms of human, social, and natural capital that project rates of increase in psychosocial measurement functionality and frame an investment appraisal process ensuring the ongoing creation of markets for advanced calibration services for the next 10 to 20 years or more.

A progression of increasing complexity, meaning, efficiency, and utility can be used as a basis for a technology roadmap that will enable the coordination and alignment of various services and products in the domain of intangible assets. A map to the theory and practice of calibrating instruments for the measurement of intangible forms of capital is needed to provide guidance in quantifying constructs such as literacy, health, and environmental quality. We manage what we measure, so when we begin measuring well what we want to manage well, we'll all be better off.

ACKNOWLEDGEMENT

Thanks to Michael Everett for his support of this work. Initial drafts of this paper were composed while William Fisher was an employee of MetaMetrics, Inc.
REFERENCES

[1] R.H. Allen, R.D. Sriram, "The role of standards in innovation", Technological Forecasting and Social Change, vol. 64, pp. 171-181, 2000.
[2] P. Miller, T. O'Leary, "Mediating instruments and making markets: Capital budgeting, science and the economy", Accounting, Organizations and Society, vol. 32, no. 7-8, pp. 701-734, 2007.
[3] M. Callon, "From science as an economic activity to socioeconomics of scientific research: The dynamics of emergent and consolidated techno-economic networks", in P. Mirowski & E.-M. Sent (Eds.), Science bought and sold: Essays in the economics of science, Chicago: University of Chicago Press, 2002.
[4] Y. Barzel, "Measurement costs and the organization of markets", Journal of Law and Economics, vol. 25, pp. 27-48, 1982.
[5] A. Benham, L. Benham, "Measuring the costs of exchange", in C. Ménard (Ed.), Institutions, contracts and organizations: Perspectives from new institutional economics (pp. 367-375), Cheltenham, UK: Edward Elgar, 2000.
[6] C.A. Lengnick-Hall, M.L. Lengnick-Hall, S. Abdinnour-Helm, "The role of social and intellectual capital in achieving competitive advantage through enterprise resource planning (ERP) systems", Journal of Engineering Technology Management, vol. 21, pp. 307-330, 2004.
[7] W.J. Ashworth, "Metrology and the state: Science, revenue, and commerce", Science, vol. 306, no. 5700, pp. 1314-1317, 2004.
[8] G.J. Beges, Drnovsek, L.R. Pendrill, "Optimising calibration and measurement capabilities in terms of economics in conformity assessment", Accreditation and Quality Assurance: Journal for Quality, Comparability and Reliability in Chemical Measurement, vol. 15, no. 3, pp. 147-154, 2011.
[9] J.A. Birch, Benefit of legal metrology for the economy and society (International Committee of Legal Metrology), Hunters Hill, Australia: International Organization of Legal Metrology, http://www.oiml.org/publications/E/birch/E002-e03.pdf (accessed 28 Oct 2008), 2003.
[10] G.P. Baker, R. Gibbons, K.J. Murphy, "Bringing the market inside the firm?", American Economic Review, vol. 91, no. 2, pp. 212-218, 2001.
[11] M.C. Jensen, "Paying people to lie: The truth about the budgeting process", European Financial Management, vol. 9, no. 3, pp. 379-406, 2003.
[12] A.J. Stenner, H. Burdick, E.E. Sanford, D.S. Burdick, "How accurate are Lexile text measures?", Journal of Applied Measurement, vol. 7, no. 3, pp. 307-322, 2006.
[13] D.S. Burdick, M.H. Stone, A.J. Stenner, "The Combined Gas Law and a Rasch Reading Law", Rasch Measurement Transactions, vol. 20, no. 2, pp. 1059-1060, 2006.
[14] R. Bud, S.E. Cozzens (Eds.), Invisible connections: Instruments, institutions, and science, Bellingham, Washington, USA: SPIE Optical Engineering Press, 1992.
[15] D.J. de S. Price, "Of sealing wax and string", in Little Science, Big Science... and Beyond (pp. 237-253), New York: Columbia University Press, 1986.
[16] Y.M. Rabkin, "Rediscovering the instrument: Research, industry, and education", in R. Bud & S.E. Cozzens (Eds.), Invisible connections: Instruments, institutions, and science (pp. 57-82), Bellingham, Washington: SPIE Optical Engineering Press, 1992.
[17] D. Ihde, E. Selinger (Eds.), Chasing technoscience: Matrix for materiality (Indiana Series in Philosophy of Technology), Bloomington, Indiana: Indiana University Press, 2003.
[18] B. Latour, Reassembling the social: An introduction to actor-network-theory, Oxford: Oxford University Press, 2005.
[19] T.L. Hankins, R.J. Silverman, Instruments and the imagination, Princeton, New Jersey: Princeton University Press, 1999.
[20] D. Ihde, "The historical and ontological priority of technology over science", in D. Ihde, Existential technics (pp. 25-46), Albany, New York: State University of New York Press, 1983.
[21] J.R. Ackermann, Data, instruments, and theory: A dialectical approach to understanding science, Princeton, New Jersey: Princeton University Press, 1985.
[22] D. Ihde, Instrumental realism: The interface between philosophy of science and philosophy of technology, Bloomington, Indiana: Indiana University Press, 1991.
[23] D. Ihde, Expanding hermeneutics: Visualism in science, Evanston, Illinois: Northwestern University Press, 1998.
[24] S.H. Goldberg, Billions of drops in millions of buckets: Why philanthropy doesn't advance social progress, New York: Wiley, 2009.
[25] C. Murray, "By the numbers: Acid tests: No Child Left Behind is beyond uninformative. It is deceptive", Wall Street Journal, Tuesday, July 25, 2006.
[26] A.D. Ho, "The problem with 'proficiency': Limitations of statistics and policy under No Child Left Behind", Educational Researcher, vol. 37, no. 6, pp. 351-360, 2008.
[27] D. Fryback, "QALYs, HYEs, and the loss of innocence", Medical Decision Making, vol. 13, no. 4, pp. 271-272, 1993.
[28] D.A. Kindig, "Purchasing population health: Aligning financial incentives to improve health outcomes", Nursing Outlook, vol. 47, no. 1, pp. 15-22, 1999.
[29] "National Health Expenditure Data: NHE Fact Sheet", 2011, https://www.cms.gov/NationalHealthExpendData/25_NHE_Fact_Sheet.asp, Baltimore, MD, US (accessed 30 June 2011).
[30] D. Andrich, "Controversy and the Rasch model: A characteristic of incompatible paradigms?", Medical Care, vol. 42, no. 1, pp. I-7 to I-16, 2004.
[31] D.M. Berwick, B. James, M.J. Coye, "Connections between quality measurement and improvement", Medical Care, vol. 41, no. 1 (Suppl), pp. I30-I38, 2003.
[32] R.D. Cooter, "Law from order: Economic development and the jurisprudence of social norms", in M. Olson and S. Kahkonen (Eds.), A Not-So-Dismal Science: A Broader View of Economies and Societies (pp. 228-244), New York: Oxford University Press, 2000.
[33] W.P. Fisher, Jr., "Invariance and traceability for measures of human, social, and natural capital: Theory and application", Measurement, vol. 42, no. 9, pp. 1278-1287, 2009.
[34] M.N. Wise, "Mediating machines", Science in Context, vol. 2, no. 1, pp. 77-113, 1988.
[35] A.C. Alonzo, J.T. Steedle, "Developing and assessing a force and motion learning progression", Science Education, vol. 93, pp. 389-421, 2009.
[36] W.C. Chang, C. Chan, "Rasch analysis for outcomes measures: Some methodological considerations", Archives of Physical Medicine and Rehabilitation, vol. 76, no. 10, pp. 934-939, 1995.
[37] C.A. Kennedy, M. Wilson, "Using Progress Variables to Interpret Student Achievement and Progress", Berkeley Evaluation & Assessment Research Center Paper Series, No. 2006-12-01, Berkeley: University of California, Berkeley BEAR Center, 51 pp., 2007.
[38] D. Leclercq, "Computerised tailored testing: Structured and calibrated item banks for summative and formative evaluation", European Journal of Education, vol. 15, no. 3, pp. 251-260, 1980.
[39] W.P. Fisher, Jr., "The standard model in the history of the natural sciences, econometrics, and the social sciences", Journal of Physics: Conference Series, vol. 238, no. 1, http://iopscience.iop.org/1742-6596/238/1/012016/pdf/1742-6596_238_1_012016.pdf, 2010.
[40] C.V. Bunderson, V.A. Newby, "The relationships among design experiments, invariant measurement scales, and domain theories", Journal of Applied Measurement, vol. 10, no. 2, pp. 117-137, 2009.
[41] A.J. Stenner, M. Smith III, "Testing construct theories", Perceptual and Motor Skills, vol. 55, pp. 415-426, 1982.
[42] W.P. Fisher, Jr., "Bringing human, social, and natural capital to life: Practical consequences and opportunities", Journal of Applied Measurement, vol. 12, no. 1, pp. 49-66, 2011.
[43] W.P. Fisher, Jr., "Statistics and measurement: Clarifying the differences", Rasch Measurement Transactions, vol. 23, no. 4, pp. 1229-1230, 2010.
[44] D. Andrich, "Understanding resistance to the data-model relationship in Rasch's paradigm: A reflection for the next generation", Journal of Applied Measurement, vol. 3, no. 3, pp. 325-359, 2002.
[45] I.J. Myung, "Importance of complexity in model selection", Journal of Mathematical Psychology, vol. 44, no. 1, pp. 190-204, 2000.
[46] I.J. Myung, M. Pitt, "Model comparison methods", in L. Brand & M.L. Johnson (Eds.), Methods in Enzymology, Evaluation, Testing and Selection, Vol. 383: Numerical computer methods, part D (pp. 351-365), New York: Academic Press, 2004.
[47] J.R. Busemeyer, Y.M. Wang, "Model comparisons and model selections based on generalization criterion methodology", Journal of Mathematical Psychology, vol. 44, no. 1, pp. 171-189, 2000.
[48] B.D. Wright, "A history of social science measurement", Educational Measurement: Issues and Practice, vol. 16, no. 4, pp. 33-45, 52, 1997.
[49] B.D. Wright, "Fundamental measurement for psychology", in S.E. Embretson & S.L. Hershberger (Eds.), The new rules of measurement: What every educator and psychologist should know (pp. 65-104), Hillsdale, New Jersey: Lawrence Erlbaum Associates, 1999.
[50] A.J. Stenner, I. Horabin, "Three stages of construct definition", Rasch Measurement Transactions, vol. 6, no. 3, p. 229, 1992.
[51] B. Latour, Science in action: How to follow scientists and engineers through society, New York: Cambridge University Press, 1987.
[52] L.S. Vygotsky, Mind in Society: The Development of Higher Psychological Processes, Cambridge, Massachusetts: Harvard University Press, 1978.
[53] P. Griffin, "The comfort of competence and the uncertainty of assessment", Studies in Educational Evaluation, vol. 33, pp. 87-99, 2007.
[54] G. Rasch, Probabilistic models for some intelligence and attainment tests, Copenhagen, Denmark: Danmarks Paedagogiske Institut, 1960 (reprint, with Foreword and Afterword by B.D. Wright, Chicago: University of Chicago Press, 1980).
[55] D. Andrich, Rasch models for measurement (Sage University Paper Series on Quantitative Applications in the Social Sciences, series no. 07-068), Beverly Hills, California: Sage Publications, 1988.
[56] T. Bond, C. Fox, Applying the Rasch model: Fundamental measurement in the human sciences, 2nd edition, Mahwah, New Jersey: Lawrence Erlbaum Associates, 2007.
[57] B.D. Wright, "Additivity in psychological measurement", in E. Roskam (Ed.), Measurement and personality assessment (pp. 101-112), North Holland: Elsevier Science Ltd, 1985.
[58] G. Moore, "Cramming more components onto integrated circuits", Electronics, vol. 38, no. 8, pp. 114-117, 1965.
[59] G. Moore, "Progress in digital integrated electronics", International Electron Devices Meeting (IEEE) Technical Digest, vol. 21, pp. 11-13, 1975.
[60] R.M. Jaeger, "The national test equating study in reading (The Anchor Test Study)", Measurement in Education, vol. 4, pp. 1-8, 1973.
[61] R.R. Rentz, E.L. Bashaw, "The National Reference Scale for Reading: An application of the Rasch model", Journal of Educational Measurement, vol. 14, no. 2, pp. 161-179, 1977.
[62] M. Schulz, "The end of the road for silicon?", Nature, vol. 399, pp. 729-730, 1999.

Authors: William P. Fisher, Jr., Berkeley Evaluation and Assessment Research Center, Graduate School of Education, University of California - Berkeley (94720, Berkeley, California, USA), +1-919-599-7245, wfisher@berkeley.edu. A. Jackson Stenner, MetaMetrics, Inc. (27713, Durham, North Carolina, USA), +1-919-547-3402, jstenner@lexile.com.