[¶2.] The previous chapters pointed out the degree to which desktop electronic computers and widespread interconnection by local as well as national and global networks have become accepted as constitutive elements of modern life. Many of the short-term consequences and problems are well known and widely discussed. But this process has been under way for only a short while, and there is little basis for systematic understanding of the nature and implications of the long-term changes that will ensue as computers and networks begin to transform the structure as well as the practice of social, economic, and political life. Such long-term consequences as have attracted widespread attention and public debate have centered on the Internet and other systems of global telecommunications and information, and have focused on arguments as to how, or whether, they will transform the personal and social worlds of individuals.1
[¶3.] The substantive transformations already taking place in economic, business, government, and other social and political organizations are attracting far less notice, even though their long-term consequences are equally uncertain, and probably more significant. Existing organizations, from small businesses to government agencies, are changing both their structure and their practice in response to the opportunities and challenges of computerization, often with little or no attempt to understand the possible long-term effects.
[¶4.] In the 1960s, new technologies of computerized electronic control provided a new basis for the organization of production. The replacement of craft with production work at the turn of the century had transformed the representation of work, gradually moving the worker from being the direct creator of a product to one who operated a machine. The introduction of process-control machinery, numerically controlled machine tools, and, eventually, production line or process computerization, transformed the representation once again, moving the worker one step farther back, from being the operator of the machine that did the work to supervising the controls that operated the machine that did the work.2
[¶5.] As computerization developed further, specification of output was increasingly incorporated into the machine by computerizing it. The first automated machinery incorporated only process into the machine; what was actually to be done was trained into the worker. The second generation removed even that discretion, preprogramming action as well as process into the machinery, or, more properly, into the computer that now operated the controls that guided the machinery.
[¶6.] As computers grow more powerful and networks more extensive, these processes are no longer confined to factories and production, but are extending into the world of knowledge-based work, including that of designers and managers themselves, creating entirely new forms of organizations in which managers do not manage and supervisors do not supervise; rather, they, too, have become the operators of computers that supervise, and manage, through preprogrammed algorithms and models.
[¶7.] The widespread acceptance and use of the earlier revolutionary techniques for information and communication, the printing press and the telephone, not only transformed the social as well as the economic sphere, but also resulted in the invention of entirely new businesses and organizations based on the new capabilities. Whether one believes that computerization amounts to a "third-wave" social and political revolution of equal importance or not, the new power of computers and computer networking is already making possible the creation of entirely new forms of technical systems whose possibilities and risks are not yet fully understood.
[¶8.] In order to understand these effects, and to relate them to the existing literature on the direct effects of workplace computerization and delocalization, much of which has traditionally focused on the skilling and deskilling of workers, it is necessary first to review briefly some of the relevant literature on the effects of automation and computers on business and management. Much of it tends to be normative or exhortative. But there is also a growing body of analytic work that seeks to address the problems and consequences of the widespread introduction of computers into knowledge-based as well as production-oriented segments of the economy, and seeks to explore the importance of increasing functional differentiation, spatial decentralization, and automation of program as well as task on the role and structure of management.
[¶10.] The management and business literature has long pointed out that the tremendous gains in efficiency and productivity that resulted could not have been captured without the developments in information technology that took place in the latter half of the twentieth century.3 The state of coordination and integration of early industrialization has been described as a piecemeal aggregation of individual jobs performed by skilled workers trained largely through apprenticeship.4 Standardization was lacking, diversity reigned, and comparing the output of two workers was at best trying. It was interconnection and communication via the telegraph, and more important, the telephone, that made possible the expansion of the span and scope of managerial control.
[¶11.] There are striking parallels between early forms of industrial organization and the early state of desktop computing in business prior to the development of standardized methods of networking. In early attempts to convert collective workplaces into factories, work was more often divided by product or subassembly than by task, and increases in efficiency as the industrial revolution took hold were sought through the partitioning of labor, narrowing the scope and definition of the individual task, de-emphasizing the importance of "craft" skills, and turning workers into "productive units."5 The genius of Henry Ford was to integrate these into a single productive machine via the assembly line, a method that could be applied to nonmanufacturing firms only with the creation of networks of communication equally hierarchical and equally susceptible to central design and control.
[¶12.] The increase in control and communication in turn both fostered and facilitated the formation of the large, vertically integrated companies and corporations that have dominated industry and business, socialist as well as capitalist, for most of the century.6 But without a theory of management to implement them, it is likely that none of the new techniques would ever have been put to productive use. And the first of these theories, developed around the turn of the century, was the school of scientific management, the first, but not the last, to try to apply rational principles of science and engineering to social organization.
[¶14.] As the creation of large factories or industries that employed large numbers of people to perform tasks by rote on a piecework basis shifted the emphasis from the skilled worker to the simple laborer, the more ambitious, motivated, and intelligent were quick to note that income and social mobility were increasingly associated with soft jobs such as management and sales rather than physical work, however skilled.7 Intelligence and experience were therefore leaving the plant and shop floor even before the first wave of task specialization and standardization narrowed the scope of individual tasks. Those who remained, or who replaced the ones who left, had few incentives to work any harder than was necessary. The new managers, lacking either work experience or a set of reliable standards against which to measure performance, were aware of the decline in morale, and, frequently, the decline in productivity that followed, but found it difficult to devise or enforce measures to increase efficiency in the face of worker recalcitrance.
[¶15.] It was in this context that Frederick W. Taylor was moved to introduce the principles of what he called "scientific" management, derived at first largely from the application of production engineering. As an engineer, Taylor believed that if rational rules and tight control replaced previous disorderly and informal modes of plant organization, management would be better able to combat labor problems such as soldiering and low motivation among workers.8 Many of Taylor's methods have become famous (or infamous); some, such as time-and-motion studies, were widely adopted. But others were too strict, or too mechanical, and Taylor had only moderate success in getting firms to adopt his means and methods.9 What did flourish was his agenda, with its underlying precepts of rational modeling and analysis and its emphasis on finding the correct organizational form for any business or activity.
[¶16.] As later critics have pointed out, the growing movement toward scientific management that resulted was more of an ideology than a management system in the modern sense.10 What was sought was the "one best way" to organize, to coordinate and rationalize not only to reduce conflict within the plant, but also to eliminate all forms of possible discord or disorderliness throughout the entire firm or industry. To the Taylorists, an ideal employee not only did what she was told, but stayed strictly within designated boundaries and task specifications. These in turn were set by managers using superior knowledge and rational methods of analysis.
[¶17.] The principles of Taylorism were never more clearly in evidence than when Henry Ford extended them to develop the first assembly line.11 In the classic "Fordist" model, individual tasks are sequenced together tightly by mechanical means (the line) whose underlying structure and purpose is the increase of productive efficiency. Costs are lowered by reducing slack (available but normally unused resources), buffering capacity (resource storage space between or among different tasks to allow for uneven output rates), and skill duplication; this was made possible by providing central sources of data and coordination, removing the need for interworker negotiation between and among various steps.12
[¶18.] In the classic mass-production plant, increased production was achieved not only by dividing and specializing tasks, but by preprocessing away into the mechanisms of control much of the information contained in the final products.13 The plant was indeed "automated" even before the era of the digital computer.14 Given that not all products are equally adapted to these strategies, items to be manufactured were therefore increasingly selected, and designed, according to the ease with which they could be subjected to the new techniques of mass production.15 Moreover, because sunk costs in a plant were high, every effort was made to reduce the variance between one product, or one generation of a product, and another. In effect, the range of consumer choice was being shifted from the end user to the designer.
[¶19.] The Taylorist ideology and approach were not confined to industries that mass-produced "hardware" such as automobiles or washing machines, with their simple structures and standardized outputs, or to offices that performed a narrow range of simple tasks. The growing sophistication of process at the end of the nineteenth century had resulted in a major internal reorganization of industries whose output was now based largely on scientific and engineering knowledge. In the electrical and chemical industries, for example, the entire means of production was radically revised to coordinate with the shift in emphasis from practical to formal knowledge.16 These firms sought not only to improve and monitor the individual performance of more highly skilled workers, but also to eliminate the wasteful duplication of skills, particularly obsolete ones.
[¶20.] The principles of Taylorism coincided neatly with this agenda, as one of the aims was to remove the control the craft guilds had heretofore retained through their monopoly on experiential knowledge and expertise.17 As craft procedures became increasingly reduced to formally codified individual tasks, craftsmen were to be eased out and replaced by less skilled workers performing a limited range of standardized and simplified procedures.18 As the workplace was socially reorganized to separate the organization of work from its execution, management was also to be reorganized along formal lines, using bureaucracy and technical infrastructure as a means of coordination and control.19
[¶22.] Although traditional mastery had disappeared with the craft guilds, the importance of tacit knowledge had not vanished. The more complex the plant and process, industry or enterprise became, the more important was the role of those who could maintain an integrative view of operations and react quickly to situations without pausing to break them down and analyze them one step at a time.20 Almost every plant or business has had a story equivalent to "old Mike" on the plant floor, who could tell a boiler was going to break down by the way it sounded, "old Harry" in management, who seemed to be able to smell how a certain change would affect sales, or "old Eloise" in accounting, who was the only one who could predict accurately whether operations were to be profitable that quarter.21
[¶23.] The period of the "old hand" is often remembered with some nostalgia, and considerable fondness, but such people were rarely well rewarded either for their value to the firm or for making their knowledge available to those who sought to codify and formalize it. What they lacked was the status and power to defend themselves as independent sources of authority and variation in an increasingly standardized work environment. The scientific management movement's claim to be able to embed their knowledge and judgment into plant and process design carried with it the implication that "old so-and-so," however quaint, was no longer central to efficient and reliable operation.22
[¶24.] To compensate for the loss of broadly skilled workers with integrative knowledge of plant and process, more elaborate and detailed models of work and task were created; control was asserted by increasing the direct, line authority of foremen and other intermediate managerial classes, under the assumption that better data and information would continue to ensure smooth integration.23 As the transition from skilled to standardized labor proceeded, more and more oversight was demanded for process reliability, which in turn increased organizational complexity and the need for managerial coordination. Despite its absence from traditional measures of productivity, the extensive bureaucratization, with the attendant increase in office and managerial staff it required, was now seen as necessary and productive, rather than as an expensive and wasteful consumer of resources.24
[¶25.] Gradually, those who had risen into the white-collar technician class from the plant floor began to be displaced by professional engineers, whose understanding of management practices and objectives, coupled with knowledge of the principles and general theory of their field, was considered more important than detailed knowledge of any particular plant or process.25 As engineers became more professionalized, their familiarity with operational details decreased; as processes became more sophisticated, the range of competence of individual workers was a decreasing fraction of the whole, and as management became a rationalized profession, preference was increasingly being given to those with formal education and training.26 And just as the shift in importance to the plant of machinery rather than workers elevated the status, and importance, of those on the technical staff who designed and specified the production machinery, so did it elevate the status of those who were responsible for formal design and organized plant flows.
[¶27.] The Taylorist agenda, strongly opposed by organized labor, was widely resisted in the 1920s.27 What was emerging instead was a greater emphasis on cooperation and social rewards, aided and augmented by changes in the training of the growing and increasingly professionalized cadre of middle managers. During the 1930s, the work of Mayo and of Barnard emphasizing social roles and group norms completed the agenda of social transformation. Instead of viewing workers as isolated productive units, the organization was viewed as a primarily cooperative, socially integrated enterprise.28
[¶28.] In contrast to the mechanistic and class-ridden approach that had characterized the Taylorist movement, and even its more humane Fordist
[¶29.] As tasks, goals, structures, and social interactions grew more complex, organizations came increasingly to be visualized not as self-contained entities, but as "natural" or "open" systems interacting strongly with their environments.31 But the acknowledgment of the role of uncertainty and the imperfection of models did not much impede the organizational search for as much certainty and predictability as possible. The literature on management continued to focus on rational, closed-system models, emphasizing the importance of formal planning, goal-setting, and the search for control rather than adaptation and negotiation.32 Increasing the control and efficiency of use of information therefore remained as much of a concern to business managers as it was at the turn of the century. Until the Second World War, however, there were few means for systematically addressing either.
[¶31.] Not all of the important developments made during the war were technical. There were a number of complex analytic problems to be solved, such as optimal strategies for hunting submarines, or the allocation of interdependent resources, that resulted in the development of two quantitative socio-technical tools of considerable significance. Although confined during the war primarily to specific military tasks of highest salience, the importance for business and other more general applications of both operations research, with its focus on mathematical strategies for optimization, and systems analysis, as a means for formal analysis of complexity, was perceived from the outset.33 Those who created and developed these tools were eager to find broader, more general applications. They sought to provide a rational and quantitative means for analyzing a wide variety of social phenomena, including business management, often in close conjunction with theoretical developments in information theory and computing.34
[¶32.] In the 1950s, operations researchers and their descendants began to move into industry and business as advisors and consultants, leading to two currents that have shaped contemporary management science. On the theoretical side, the dominant human relations model of management was to be overtaken by the new systems school in the mid-1960s, complementing the rational-actor and decision models being developed by Simon, March, Cyert, and others.35 On the practical side, as the systems perspective moved into the business schools, managers skilled in quantitative analysis and systems thinking moved from the periphery to the center of the discipline, aided by the expansion and availability of computers to perform the calculations.
[¶33.] The transfer of these formal analytic techniques to the business world reinvigorated the movement for quantitative scientific management, fostered and augmented by the rapid development of automated equipment and data processing techniques. The new agenda differed from the old in a major expansion of the scope of analysis; instead of treating the firm as a series of isolated, interacting operations to be integrated from the top, it was now visualized as a single, complex, interrelated pattern of activities, to be analyzed, coordinated, and optimized as a whole.36 At first, progress was slow. What was clearly needed were more and better data, and more comprehensive, less costly means for gathering, storing, and processing them.
[¶34.] The mini- and microcomputer revolutions provided the material and the means. Increased computer power and availability enabled the creation of far more elaborate models and techniques than had ever been imagined, aided and augmented by the availability of relatively simple applications for model-building and parametric adjustment. At the same time, the growing "informating" of the workplace by the introduction of computers for operational and process control was not only generating much more data about performance than had been possible before, but was offering the option of making it available online, in real time, as well as storing it in huge amounts for later analysis.37
[¶35.] In a volume prepared as the report of the Management in the 1990s program organized by MIT's Sloan School of Management, Michael S. Scott Morton, the director of the project, laid out the dramatic changes that were expected:
[¶37.] Others were even more enthusiastic. In the view of Davidow and Malone, for example:
[¶39.] What would be required was not only a recognition of the differences that would be made in the way work was done and the way it was coordinated, but an overhaul of management structure and strategy to accommodate the new possibilities and incorporate them into systematic new means for integration and control.
[¶41.] Although much of the historical focus of the debate over workplace automation and the presumed deskilling of workers was on industrial production, similar changes were taking place in other sectors over roughly the same time period.40 The general decrease in the fraction of product costs attributable to labor and the growing importance of nonmanufacturing industries, businesses, and other forms of regulatory or administrative organizations in the economy drew increasing attention instead to the softer tasks of desk workers. The introduction of magnetic check coding and automatic tallying equipment into banks, for example, had effects on the structure of employment that were not that different from what had been observed in industries such as manufacturing.41 Some occupations were more affected than others, but the pattern was consistent.
[¶43.] By the 1980s, much of the work of performing transactions also became computerized; now the tellers themselves were threatened unless they could convert to computer operators. Long considered a traditional venue for employment, banking replicated in two decades a century of industrial history, moving a large part of its work force two steps back from contact with the actual work.
[¶44.] There were similar stories throughout the economy. Inventory and stock control, price totaling, and invoicing were among the first processes to become extensively computerized. Supermarket checkers went from ringing up prices by hand to electronic cash registers to bar-coded prices, to complete bar-code checkout automation--now frequently including a direct computer link to credit card or bank account. Both inventory and access control in many high-technology firms use bar-coded access cards and computers to keep track of who is where, as my university library does to charge out and track books. Operators answering order phones follow preprogrammed sets of responses on their screens--if they have not been completely replaced by programmed voice mail.
[¶45.] Because of the traditional focus on labor, and in particular industrial labor, a great deal of the analysis, and the debate, concerning the introduction of advanced computers continued to focus on the traditional problems of worker power and the value of labor, extending the argument about deskilling and reskilling in the workplace. The most powerful and direct statement of the deskilling hypothesis in the new context of workplace automation was that of Braverman, whose influential work set the terms of the ensuing debate.43 Braverman also argued that the introduction of new "technology" deskilled workers, but he concluded that the main source lay not in the changing relation between workers and their machinery, but in the increased managerial control being asserted in the name of technical efficiency.44
[¶46.] This was clearly an argument that could easily be extended to office and other white-collar work--including management itself.45 Some analysts and business gurus are even predicting that the advent of machine intelligence and other quasi-intelligent techniques will do for knowledge workers (such as managers, administrators, and other professionals, including, presumably, designers) what earlier and less sophisticated office computerization did for transactions processors (such as clerks and keypunch operators).46 Although the argument for replacement may be a bit exaggerated (and technically too optimistic), it is much more plausible when rephrased in terms of increased control of the workplace.
[¶47.] By the 1970s, many of the new class of "scientific managers," trained in and completely familiar with increasingly powerful computerized tools and methods, were using their quantitative skills to move up the management ladder.47 The attitudes of many of those responsible for the design and implementation of the new systems had decidedly Taylorist resonances, among them the notion carried over from science and engineering that there is one best way to perform any task, including a managerial one. As deeply ingrained, and more operationally important, was the associated belief that excess capacity, the "slack" that provides a buffer against uncertainty or unexpected events, is wasteful, and that new forms of electronic integration can replace organizational buffers as well as inventories by implementing the equivalent of the tightly managed and controlled flows of "just-in-time" strategies.48 These were to be accomplished by the "leveraging of expertise," which translates in practice into preprogramming the workplace through rigorous standardization and adherence to formal models.
[¶48.] The advent of the personal computer in the 1980s and the accompanying dramatic increase in the capabilities and power of all forms of computing equipment allowed these designers and other experts to extend the use of their tools and techniques into all levels of the firm, from the shop floor (when there is one) to the office of the CEO,49 resulting indirectly in reconfiguration of the historical flow of information in the firm. Whether what results is or is not characterizable as "automation" in the traditional sense, new computerized techniques for managing communication, information, and operation are "informating" the workplace, generating vast bodies of new data and information.
[¶49.] In historical, centralized, hierarchical forms of organization, the span and scope of managerial control were strictly limited at any level. Much of the position and power of middle managers lay in their role as collectors, processors, and forwarders of information up the chain.50 The interconnected computerized workplace provides a means for an unprecedented degree of penetration across and through the lines of authority. Managers at all levels, including the most senior, can now collect and process data directly, not only overseeing the work of subordinates but actually monitoring it. They can bypass their subordinates, or work around their peers, by communicating directly with anyone in the firm and by checking the work of anyone over whom they have formal authority. In short, and depending on your perspective, managers can now oversee work directly, intervene, micro-manage, or interfere, even in tasks and processes for which they have
[¶51.] Many elements of the changing workplace, both as reported by empirical research and as analyzed by the growing business literature on computerization, seem uncomfortably close to the goals and aspirations of the classical Taylorists.51 Just as Taylorism eliminated a generation of craft workers, or converted them, plant automation threatens a generation of machine operators unless they are willing and able to acquire the skills (and attitude) needed to work with automated equipment.52 It was the principles of Taylorism that created the typing pool and the single-purpose clerical worker. The same principles seem to have created the full-time word processing specialist, or data processor, and may work in the future to create their managerial equivalent.
[¶52.] In her study of workers ranging from blue-collar manual labor to white-collar international banking, Zuboff bypassed the traditional, machine-oriented debate over deskilling in favor of a more intellectually oriented framework centered on the definition of work as a representation of the body in labor.53 In her work, she found evidence that the "informated" workplace was not so simply analyzed as the automated factory.54 Workers did express concern over the loss of experiential knowledge, and of hands-on control of procedures and
[¶54.] As noted in chapter 3, these are effective means for exerting control through compliance rather than hierarchical authority, allowing for formal decentralization while ensuring that the range of significant deviance from central plan and intent is narrow. Such tactics have a long history of success in social and political affairs.60 It is therefore not a complete surprise that in other studies on the introduction of computers, particularly into businesses and other service-oriented organizations and bureaucracies, users do not see computerization as resulting either in an increase of direct central control or in the displacement of status and authority that many analysts had predicted.61 The question remains as to whether the accompanying increase in the information flows and connectivity that textualizes the work environment actually empowers workers or only enculturates them, whether there are operational gains in status and autonomy or only a new means for continuing the drive toward managerial oversight and control that first nurtured the Taylorist movement.
[¶55.] What is clear is that similar tendencies and trends are at work up and down the scale of employment, from the lowest production or clerical worker to at least the middle levels of management and engineering. According to Zuboff, the historical categories of blue collar ("acting-on") and white collar ("acting-with") are both disappearing as computers are introduced. The blue-collar worker is increasingly an operator or monitor of an automated process; the white-collar worker is personally isolated but technically integrated into an automated office or enterprise.
[¶56.] If this tendency continues over the long term, operators and managers will be deprived of both the context and the meaning of action. What will emerge in the future is a world of work in which operators have evolved into "operatons," placed in a job that seems to require human judgment and human action, but so bounds and circumscribes what they can and cannot do that there is no room for discretion, let alone perfection of skill. Managers will have co-evolved into "manageroids," looking human, acting human, but having no real autonomy or control, and no discretion to evaluate or deviate from the plans and programs of their organizations. This is indeed a bleak vision, not just for workers but for the societies that depend on the products and services they provide and the organizations they staff.62
[¶57.] Those who reject the preceding argument as a simplistic critique of a neo-Taylorist strawman argue instead that computerization can enhance and expand the universe of work. The skills required to make effective use of computers are often of a higher order, engaging workers (and managers) more fully, making better use not only of their education but also of their innate capacity. More important is the "textualization" of the new workplace into an environment where workers read and respond to abstractions of the process and manipulate abstracted symbols and controls, rather than observing and acting on or with the processes themselves. In this view, the increased knowledge of operations required to master the new work environment gives workers more than a mere sense of increased status and control over their own work. In the long term, they have the means to realize both.63
[¶58.] Despite several heroic efforts to clarify arguments, premises, and evidence, the terms and findings of the debate over deskilling remain confused, and somewhat contradictory.64 But is deskilling in fact the central point at all? Some authors think not. Hirschhorn has suggested that the focus on deskilling and reskilling in the workplace confuses integration with discretion, blurring the distinction between reconfiguring the breadth and scale of a task and redefining its nature and scope. The distinction he makes between skill and function is similar to that made in other contexts by Zuboff:
[¶60.] Although the specific notion of "defunctionalization" has also been the subject of some argument, Hirschhorn's description points directly toward the heart of the issue of the new transition.
[¶62.] One of the more remarkable accomplishments of human beings faced with situations of great complexity and difficulty is the ability of some to gain a mastery that is beyond even highly accomplished proficiency. Dreyfus and Dreyfus have characterized as "true expertise" that level of performance that is so rapid, so skilled, and seemingly so intuitive, that it cannot be attributed to rational, incremental accumulation of experience either through practice or through process.66 Whether chess player or pilot, artist or race driver, a true "expert" can be defined as a person who has an acquired instinct for going straight to the solution, the knack for finding a correct path or action without having to pause to logically deconstruct and rationally analyze the problem.67
[¶63.] The terms usually applied in rational and reductionist analyses of good performance of complex tasks--such as competence or skill--are characterized, and measured, in terms of the ability to construct or reconstruct a logically exhaustive deductive chain, and then proceed along it. What many call expertise is in this sense only proficiency, the ability of a well-trained individual to find the shortest path through the logical chain to a correct (or effective) solution. Workers can achieve these levels of performance through practice, through training, and even through simulation, since proficiency can be built through experience, repetition, and the accumulation of knowledge.
[¶64.] But rote learning without the discretion to try, and to err, cannot produce an expert. What is required is to test and experiment, to practice until the cognitive map of action becomes an integrated representation instead of a series of causally linked discrete steps. Consider, for example, photographer Peter Gullers, writing on the subject of expert judgment of light in photography:
[¶66.] It is not only the memories of the actual practice of photography that play a part. The hours spent in the darkroom developing the film, my curiosity about the results, the arduous work of re-creating reality and the graphic world of the picture are also among my memories. A faulty assessment of the strength of the light and the contrast of the subject, the vibrations and tremors become important experience to be called upon next time I face a similar situation. All of these memories and experiences that are stored away over the years only partly penetrate my consciousness when I make a judgment on the light conditions. The thumb and index finger of my right hand turn the camera's exposure knob to a setting that "feels right," while my left hand adjusts the filter ring. This process is almost automatic.68
[¶67.] What Gullers describes as "right feeling" typifies the master craftsman in every field of human endeavor, from artists to politicians, from generals to metal workers, from athletes to theoretical scientists.69 It has its mirror in those workers and operators who over time gained such familiarity with their work that much of what they knew had passed over into the realm of tacit knowledge.70
[¶68.] As will be discussed in greater detail in chapter 7, the pattern of response found in empirical studies of the introduction of computers into nuclear or chemical plant operations, or into other similarly knowledge-based and integrative tasks such as air traffic control or airplane cockpits, has been shown to be far more complex than that found in studies confined to the automation of assembly lines and other technically simple workplaces. What is being replaced, or displaced, is not the direct and tactile involvement of the worker, but tacit knowledge, in the form of expert judgment of the state and condition of operation of plant, process, activity, or firm, and the range of experiential knowledge that allows flexibility in interpretation of rules and procedures when required.
[¶69.] The problem of maintaining formal control is dealt with simply enough--in principle. If designers are clever enough, and their knowledge of the technical system detailed and thorough enough, computer controls can be preprogrammed to detect all possible variance and correct for it automatically. The job of the operator (or manager) in that case would be simply to monitor the automated process and make a report when and if any major deviation or other unexpected event occurred. That would probably suffice for a "jellybean factory,"71 but would not be sufficient for a plant that depended on continuous processing, or for nuclear power plants or dangerous chemical facilities that needed to be under control at all times.72 What would be, and is, expected is that the operators also be able to step in and take over if the plant enters a configuration not programmed into the computer; or if it goes down; or if they see any other deviation that they think might threaten process, production, or plant or public safety.
[¶70.] The question the operators have asked, of themselves and of management, is whether they will be able to maintain the necessary skills and expertise if most of their time is spent nurturing the computerized control system instead of actually running the plant. More to the point, how will their successors and replacements be trained, or gain expertise, without learning through hands-on operation? It is the same sort of question asked by machine workers displaced from direct control of their machines, but at a broader and more publicly consequential level. The irreversible loss of expertise is one of the more serious potential long-term and indirect costs of the penetration and acceptance of computers and computerization into the operation and control even of complex and critical tasks. Concern over the possible consequences is another theme that is pursued through the following chapters.
[¶72.] I suggested earlier that the new computer-based information techniques were not only affecting existing technical systems but creating entirely new ones. Many of these may prove no more consequential, or important, than the jellybean factory. But in the past few years, new forms of technical systems have begun to appear that are created as heterogeneous assemblies of selected functions of pre-existing ones, and that depend completely on the capacity and capabilities of computers and networks for their function.
[¶73.] The most carefully analyzed case to date has been that of Eurotransplant, designed to provide an integrative mechanism for facilitating and coordinating the availability of human organs, surgical teams, and potential recipients.73 The perishability and unpredictability of the supply require that this be done on very tight schedules and in real time. The technical core of the system is a central computer with an international database connected by modem and phone lines to a network of other desktop and minicomputers that contain local information about hospital admissions and records, potential transplant recipients, doctors and medical teams, and so forth. The system thus matches organs to recipients without requiring potential recipients to tie themselves to a hospital for fear that they may otherwise miss the opportunity, and without allowing viable organs to deteriorate before recipients can be found and teams assembled.
[¶74.] Eurotransplant is a system with no infrastructure of its own other than the coordinating and data-managing computers. It draws on other existing multipurpose systems for rapid air and ground transport, database management, telecommunications, and physical facility readiness, all of which must be smoothly coordinated and integrated, often within a matter of minutes. In traditional large technical systems, failure of a coordination or control computer may shut down subsystems, but the main system continues to exist and, if well designed, to operate, even if at reduced capacity. But Eurotransplant is different. If coordination and integration fail, the subunits continue to function (indeed, they may hardly notice), but the overall system ceases to exist.
[¶75.] Because these new, heterogeneous networks have no primary infrastructure other than computer hardware and software, both of which are known in any case to have short productive lifetimes, sunk costs are very small. Even a minor innovation in computer hardware or software, or a new idea about putting existing capabilities to work, can lead to a new way of framing the networking potential, which in turn can be implemented with great rapidity by rearranging and reassembling the component elements. Thomas Hughes has characterized large technical systems as somewhat ponderous, possessed of a "momentum" that carries them forward even in the absence of external stimuli.74 But many of these new systems will be nearly free of encumbering infrastructural and managerial "mass," enabling them to respond rapidly, radically, and perhaps unpredictably, to relatively small changes in integrative capabilities, problem definition, or representation.
[¶76.] As central elements of complex systems based entirely on networking, be they homogeneous or heterogeneous, computers are no longer simply auxiliaries, aids, or replacements for traditional ways of doing things. Because they are capable of storing enormous amounts of data and recalling them automatically, and carrying out remarkably complex and widespread actions in very short times, they create entirely new ways for other systems or individuals to take action. But to whom, or to what, are the coordinating systems and their designers, programmers, and operators accountable?
[¶77.] Who is responsible for supplying safe, reliable, just, or equitable performance of Eurotransplant? Surgeons? Pilots? Ambulance drivers? Can a computer programmer be sued for malpractice for a programming error? Is the hospital responsible if a perishable organ has been sent to the wrong place, or matched to the wrong potential recipient? Can a dispatch system be sued for incorrectly prioritizing ambulance calls, or a financial data trading system penalized if a transaction vanishes between sender and recipient? As the functions of the traditional vertical organizations that have dominated our thinking for the great part of this century are dispersed along and across the new networks of interconnection, we are hard pressed to locate, let alone improvise, means for ensuring accountability. This too is one of the threads that weaves through the following chapters.
[¶79.] The original information technologies, the telegraph and the telephone, made possible the emergence of the mass production-oriented, vertically integrated hierarchical organizations that dominated most of the industrial development of the twentieth century.75 They enabled increasingly gigantic and differentiated firms to retain centralization and hierarchical control while expanding in scope and scale to a degree never before possible.76 Although these organizations may not be rendered as completely obsolete as the enthusiasts of the virtual corporation would have us believe,77 the new techniques are transforming them again in ways that are as pervasive as they are subtle, dispersing function away from the central organization while also supplying a means to retain coordination and control. And just as the telephone and the telegraph made possible the emergence first of entirely new organizational forms, and then of wholly new services for which information technology was central, so the new, computerized techniques are not just forcing change on present organizations and services, but creating entirely new ones.
[¶80.] The power of desktop computers, harnessed to the new techniques for global communication and networking, is making possible the creation of new types of large technical systems that are inherently transboundary where traditional ones were nationally or regionally oriented, differentiated horizontally across strong networks rather than vertically between weak ones, and organized around communications rather than hierarchies, with management and control diffused horizontally through the system instead of flowing up toward the top.78 This is a major and important change.
[¶81.] Unfortunately, the present state of theory and analysis of either large, technically oriented firms or large technical systems in general does not seem adequate to the task of understanding the coming changes. Although the lure of computerization is clear, and the short-term costs and benefits are becoming increasingly well understood, the longer-term consequences for system performance, for the changing social and economic role of existing large-scale systems as they transform, and for the social, economic, and political structures and practices of the societies that depend on them, are still at best only dimly perceived.
[¶82.] Moreover, the tools and methods we have inherited and developed remain grounded in observation and analysis of the historical, hierarchical, vertically oriented institutions and organizations of the precomputer, prenetwork industrial revolution. It is not at all clear how appropriate they are for analyzing (let alone predicting) even the direct effects on social and industrial organization of the newly emergent, horizontally organized and network-coordinated forms that are coming into being, let alone for the more complex and difficult task of trying to understand the longer-term, larger-scale, more diffuse, and more general impacts.
[¶83.] The chapters that follow amount to a set of preliminary case studies bounded and defined by the preceding agenda, and set in contexts and situations where some of these indirect effects have already been consequential--along with some speculative inference on where they might be heading. The list of possibilities is long, and the ultimate consequences in some cases may be as hypothetical as the future capabilities that are promised. There is, however, one instance that already serves to illustrate and illuminate the range of emerging critical problems--the growing international network of trading in currencies, bonds, stocks, and other assets. As The Economist has put it: "How, in an age of computerized trading and borderless financial markets, can regulators ensure the efficient and fair working of a central market equally accessible to all investors big and small?"79 That question will be taken up in chapter 5.
NOTES:
1 See, for example, Negroponte, Being Digital; Stoll, Silicon Snake Oil; Kelly, Out of Control.
2 Thomas, What Machines Can't Do; Zuboff, Age of the Smart Machine; Noble, Forces of Production.
3 Of the many works in this field, those particularly relevant to the arguments developed in this book are Toffler, Third Wave; Hughes, Networks of Power; Beniger, Control Revolution; Chandler, Visible Hand; Scott Morton, Corporation of the 1990s; Chandler, Scale and Scope; Fogel, Railroads; Merkle, Management and Ideology; Thompson, Organizations in Action.
4 Hirschhorn, Beyond Mechanization.
5 Doray, A Rational Madness, 34ff.
6 Classic examples of early integration include not only the well-known case of the railroad, but the creation of the giants of the meatpacking industry. See, for example, Chandler, Visible Hand; Cronon, Nature's Metropolis.
7 Doray, A Rational Madness, 61ff.
8 Taylor, "Principles of Scientific Management."
9 Davidow and Malone, Virtual Corporation, 164ff.
10 Merkle, Management and Ideology; Waring, Taylorism Transformed.
11 Doray, A Rational Madness. Also Harvey, Condition of Postmodernity, 228ff.
13 Beniger, Control Revolution, 298.
14 As so cleverly, and beautifully, expressed by Charlie Chaplin. Chaplin and Goddard, Modern Times.
15 Beniger, Control Revolution, 298-299.
16 Noble, Forces of Production, 16; Rabinbach, Human Motor, 238ff.
17 Beniger, Control Revolution, 295.
18 Doray, A Rational Madness, 34ff.
19 Beniger, Control Revolution.
20 Dreyfus and Dreyfus, Mind Over Machine.
21 This level of expertise and tacit knowledge not only persists, but is nurtured in some modern organizations, particularly those managing the safety-critical operation of highly complex and fast reacting systems. See, for examples, Rochlin and von Meier, "Nuclear Power Operations"; La Porte, "United States Air Traffic System"; Rochlin, "Essential Friction."
22 Zuboff, Age of the Smart Machine.
23 Waring, Taylorism Transformed.
24 Merkle, Management and Ideology, 76. The ability to quantify the value to a firm of such "nonproductive" activities was one of the great motives for the development of transaction cost economics.
25 Rabinbach, Human Motor; Noble, America by Design.
26 In a final twist, the management techniques of General Motors as put forth by Alfred P. Sloan resulted in the gradual disappearance of even those individual entrepreneur-innovator-owners such as Henry Ford in favor of those whose major skill was running large, formal organizations.
27 Harvey points out that Taylorism never made major inroads in Europe between the wars because of the relatively greater strength of labor movements there and the lack of major immigration or movements of workers to provide leverage against them (Condition of Postmodernity, 128).
28 Barnard, Functions of the Executive; Scott, Organizations.
29 Zuboff, Age of the Smart Machine; Merkle, Management and Ideology; Waring, Taylorism Transformed.
30 The most splendid, and durable, presentation is Barnard, Functions of the Executive.
31 Scott, Organizations; Perrow, Complex Organizations.
32 Thompson, Organizations.
33 Baxter, Scientists against Time.
34 The flavor of those early days is wonderfully captured by Emery and Trist, "Socio-Technical Systems." Both the book and the article remain among the most important basic statements of the principles of socio-technical systems analysis.
35 Perrow, Complex Organizations.
36 Waring, Taylorism Transformed, 25. In a sense, this was a completion of the agenda of integration that had preoccupied managers since the nineteenth century, and therefore, by extension, of the search for control as described by Beniger, Control Revolution.
37 The term comes from Zuboff, Age of the Smart Machine, by analogy with and extension of workplace automation by more dedicated and single-purpose computers.
38 Scott Morton, Corporation of the 1990s, 8.
39 Davidow and Malone, Virtual Corporation, 10.
40 See, for examples, Marenco, "Effects of Rationalization"; McColloch, White Collar Workers.
41 McColloch, White Collar Workers.
42 Ibid., 116.
43 Braverman, Labor and Monopoly Capital.
44 Osterman, "Impact of IT on Jobs and Skills," 231ff.
45 Management and control in white-collar and office settings has been studied extensively by the research group at the University of California at Irvine. See, for example, Dunlop and Kling, Computerization and Controversy; Danziger and Kraemer, People and Computers; Danziger and others, Computers and Politics.
46 Monger, Mastering Technology, 84-85.
47 The term "scientific management" is rarely used by those who declare their field to be management science. They sharply contrast their use of models of limited rationality in the open- or natural systems approach with the formal, closed-system perspectives of the historical scientific management school. Nevertheless, the modern focus on "optimal" solutions seems quite reminiscent of the Taylorist approach. See, for example, Davidow and Malone, Virtual Corporation, 186.
48 Scott Morton, Corporation of the 1990s, 14; Rifkin, End of Work, 104. The best account of the historical development of the just-in-time strategy is that of Womack, Jones, and Roos, Machine That Changed the World.
49 Thomas, What Machines Can't Do; Zuboff, Age of the Smart Machine.
50 Davidow and Malone, Virtual Corporation. The span of control can nevertheless be impressive. If General Motors indeed had the fifteen levels of management attributed to it, and eight people reporting to each manager at each level, in principle, more than thirty billion employees could be controlled (p. 171). A more reasonable organization with eight levels and six people reporting directly at each level could still control more than 1.5 million. The point is that even a relatively low-level manager, sitting at level four (from the top), with four levels below, would be passing up to the next level the summed and integrated information from more than a thousand subordinates.
51 Although this is not surprising in the context of the traditional neo-Marxist literature on deskilling, it is true even of the more sophisticated and sympathetic studies such as that of Thomas, What Machines Can't Do; Zuboff, Age of the Smart Machine; Nadis and MacKenzie, Car Trouble.
52 Beniger, Control Revolution; Noble, Forces of Production; Hirschhorn, Beyond Mechanization.
53 Zuboff, Age of the Smart Machine.
54 To make the distinction clear in a complex setting, an "automatic pilot" is a simple servo device that keeps an airplane on a specified heading at a specified speed and altitude, compensating for winds and other factors. An automatic navigation system, on the other hand, is fed a desired course from one point to another, and also controls the aircraft to keep it on the predetermined flight plan. The former device automates flight, the latter "informates" it.
55 Similar results were found in the detailed empirical studies of Thomas, What Machines Can't Do.
56 See, for example, the review by Osterman, "Jobs and Skills."
57 Zuboff, Age of the Smart Machine, 246ff.
58 Ibid., 238ff.
59 Howard, Brave New Workplace.
60 See, for example, Donald Chisholm,
61 See, for example, Danziger and Kraemer, People and Computers.
62 As is beautifully represented in the movie Brazil (see chapter 2, note 13).
63 Zuboff, Age of the Smart Machine; Thomas, "Politics of Growth."
64 Lerman and Salzman, "Deskilling and Declassing"; Wood, ed., Degradation of Work?; Wood, ed., Transformation of Work? Also see the recent effort by Rifkin (End of Work) to extend this debate about the displacement and deskilling of labor to the "white-collar" class in what he calls a "postmarket" era.
65 Hirschhorn, Beyond Mechanization, 71.
66 Dreyfus and Dreyfus, Mind Over Machine.
67 It is quite remarkable that although the author has heard this same description given at various times to mathematicians, performing artists, physicists, and automobile mechanics, it remains a distinction much more familiar in practice than in the analytic literature.
68 Göranzon, "Practice of Use of Computers." The original text was published in Swedish, in Göranzon, Datautveckslingsens Filosofi. The text here is Göranzon's translation of 1988.
69 Most of us possess at least some realm of expertise even in the practice of daily life, perhaps as a teacher or cook, singer or carpenter, or even driving or riding a bicycle. That such skills are in general not highly valued is a trenchant critique of the values of modern technical societies.
70 Janik, "Tacit Knowledge."
71 The notional "jellybean factory" is an artifice developed by my colleague at Berkeley, Karlene Roberts of the Haas School of Business, as the diametrical opposite of reliability-demanding, safety-critical, and socially, environmentally, or politically important operations.
72 That is, in fact, one of the things that went wrong at Bhopal. See, for instance, Shrivastava, Bhopal: Anatomy of a Crisis.
73 Braun, Feuerstein, and von Grote-Janz, "Organ-Technick"; Braun and Joerges, "How to Recombine Large Technical Systems." A similar system now exists in the United States, but has not been so extensively studied.
74 Hughes, Networks of Power.
75 Scott Morton, Corporation of the 1990s.
76 Beniger, Control Revolution; Chandler, Visible Hand; Chandler, Scale and Scope; Gras, Grandeur et Dépendance.
77 Davidow and Malone, Virtual Corporation.
78 Rochlin, "Trapped by the Web." Also see Rifkin, End of Work.
79 "Shrinking of the Big Board."
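The span-of-control arithmetic cited in note 50 can be checked in a few lines. This is an illustrative sketch only; the uniform-span assumption and the helper function are mine, not the author's or Davidow and Malone's.

```python
# Span-of-control arithmetic (illustration of note 50; the uniform-span
# assumption and this helper are mine, not the source's).
def bottom_layer(span: int, levels: int) -> int:
    """People in the lowest layer of a hierarchy in which every manager
    has `span` direct reports, `levels` layers below a single top manager."""
    return span ** levels

# "Eight levels and six people reporting directly at each level could
# still control more than 1.5 million":
assert bottom_layer(6, 8) > 1_500_000
# A manager at level four from the top, with four levels below, integrates
# information from "more than a thousand subordinates":
assert bottom_layer(6, 4) > 1_000
```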
Socio-Technical Tools
Information technology has important general-purpose power to manipulate symbols used in all classes of work, and therefore, as an "information engine," it can do for business what the steam engine did in the days of the Industrial Revolution. It goes beyond this, however, as a technology that permits one to manipulate models of reality, to step back one pace from the physical reality. Such an ability lies at the heart of IT's capacity to alter work fundamentally.38
Computers can gather most information more accurately and cost-effectively than people, they can produce summaries with electronic speeds, and they can transmit the information to decision-makers with the speed of light. Most interesting for our purposes is that, frequently, this information is so good and the analysis so precise that an executive decision is no longer required. . . . Anyone restructuring a company that does not take this new employee empowerment into account is not dealing with the future but is merely streamlining the past.39
The Deskilling Controversy
About 80 percent of those jobs initially taken over by computers were defined as clerical/bookkeeping. One of the most severely affected operations was that of the highly skilled bookkeepers. This job, traditionally the ladder upwards in banks, had already been partially undermined by the use of bookkeeping machines and tabulators. The impact of computers proved devastating. There was a 50 percent reduction in the number of bookkeepers from 1960 to 1965, and the remaining jobs became more routine and less skilled. By [1970] the job had virtually been eliminated.42
The Informated Workplace
As workers are deskilled, their actions are narrowed and they become more integrated into the machine systems. In defunctionalization, on the other hand, the workers give up all execution functions and manual action and are fundamentally displaced from production. Displacement is hidden in semiautomatic systems, as workers lose execution functions but gain in control activity. . . . Mechanization narrows skills and coerces workers to commit their bodies, if not their minds, to the machine process. Post-industrial technology threatens to throw them out of production, making them into dial watchers without function or purpose. The problem of skills is dwarfed by the larger problem of fundamental identity and function.65
Expertise Lost
When faced with a concrete situation that I have to assess, I observe a number of different factors that affect the quality of the light and thus the results of my photography. Is it summer or winter, is it morning or evening? Is the sun breaking through a screen of cloud or am I in semi-shadow under a leafy tree? Are parts of the subject in deep shadow and the rest in bright sunlight? . . . In the same way I gather impressions from other situations and environments. In a new situation, I recall similar situations and environments that I have encountered earlier. They act as comparisons and as association material and my previous perceptions, mistakes and experiences provide the basis for my judgment.
Heterogeneous Systems
Conclusion