Game programming gems 6
Dickheiser M., Charles River Media, Inc., Rockland, MA, 2006. 736 pp. Type: Book
Date Reviewed: May 31 2007
This is the sixth volume of the popular and practical “Game Programming Gems” series. From the first volume, the series has addressed issues as they have emerged; currently, teams are growing larger and developers are increasingly specialized. The series addresses this need by providing state-of-the-art material for the specialist, along with handy resources for areas outside one’s bailiwick. Current machines and player expectations require higher-fidelity models and animations, fancier physics and graphics effects, and more sophisticated artificial intelligence (AI). These rising expectations, and the greater sophistication demanded of programmers, require flexible teams and longer production schedules, especially in light of scripting and data-driven systems. Of course, the biggest issue is cost. The 50-plus articles in this volume address these demands and expectations.
The book also reflects the collaborative reach of game technology experts, who come from various backgrounds and more than 20 countries. They include veterans of the gaming industry as well as experts from outside it. Moreover, this collaboration spans nearly every region of the world, including Eastern Europe, Latin America, North America, Singapore, and Japan.
This volume is not recommended for faint-of-heart newcomers to game programming, since it does not function as a primer. Specialists will likely pick and choose from the topics covered, while the dedicated programmer will learn a great deal by reading more thoroughly. The series is aptly named “Gems,” and there are nuggets galore.
A pragmatic way to find the gems relevant to you is to peruse the seven parts: “General Programming,” “Mathematics and Physics,” “Artificial Intelligence,” “Scripting and Data-Driven Systems,” “Graphics,” “Audio,” and “Network and Multiplayer.” Most programmers will find their particular areas of interest and then look for handy tools in other sections.
Despite what the name may imply, “General Programming” is not for the novice; it covers multiprocessor techniques, unit testing, and security fingerprinting. “Mathematics and Physics” involves all things related to the floating-point unit (FPU), central processing unit (CPU), and graphics processing unit (GPU). “Artificial Intelligence” demonstrates current work in cognitive science and machine intelligence, with a strong representation from academia; the AI techniques shown here can be applied in “other systems in the engine.” “Scripting and Data-Driven Systems” is a worthwhile addition to the series; the most popular and emerging scripting languages provide a flexible backbone on which to build an engine. “Graphics” combines old and new technologies with numerous sharp techniques. “Audio” includes insightful ideas for advanced uses of the audio system. Finally, “Network and Multiplayer” covers another emerging area, as global players plug in to play. As gaming content has grown, so too has the multiplicity of players across networks.
The editor notes that gaming is not just for game developers anymore. Game-based learning, edutainment, commercial and military training simulations, academics, and other “serious games” have all made their mark. The upshot of this newfound attention is that the “noobs” (p. xvi, a slang insult for newbies) are starting to put their feedback into gaming. At this point, the implications of this feedback are not clear, but what is obvious is that gaming will be transforming into new and potentially complex areas.
This volume takes into account the complexity of gaming and focuses on cutting-edge developments that are of interest even to those outside the industry. Another sign of the field’s maturity is the rise of issues related to the size and intricacy of games. The sections on “Scripting and Data-Driven Systems” and “Network and Multiplayer” converge on the two areas of most interest to those outside gaming, and some of the most exciting topics arise where they meet. A related area of convergence is AI, which interests those inside and outside gaming alike. If coded well, AI can make characters behave in ways that seem more intelligent and human-like, yielding a more engaging game.
This volume, although replete with complex topics, is readable, relevant, and just about the best in its field. The enclosed CD has source code illustrating points in the articles. The index is useful as well, and includes information on all six volumes in the series. The illustrations are well done and add desirable visual examples.
Reviewer: G. Mick Smith Review #: CR134340
Evolutionary scheduling: a review
Hart E., Ross P., Corne D. Genetic Programming and Evolvable Machines 6(2): 191-220, 2005. Type: Article
Date Reviewed: Sep 27 2006
For those who need an update on research applying evolutionary computing methods to scheduling problems, this review paper is of substantial value. The last major survey was performed in 1999, when a significant statement emerged from the European Network of Excellence on Evolutionary Computing (EVONET). The three coauthors here provide an admirable overview, report on current trends and achievements, and “[suggest] the way forward.” In particular, this paper will interest a wide audience, since its ideas can be applied to many common scheduling issues, such as job-shop scheduling problems, an area much discussed in the academic literature. The authors point out that algorithms today are capable of tackling enormous and difficult real-world problems, a major advance over earlier surveys, such as the EVONET report.
Reviewer: G. Mick Smith Review #: CR133355 (0707-0716)
IT professionals as organizational citizens
Moore J., Love M. Communications of the ACM 48(6): 88-93, 2005. Type: Article
Date Reviewed: Jun 12 2006
Information technology (IT) workers exhibit significantly less organizational citizenship behavior (OCB) than non-IT workers. This is the major finding of this work. Five types of OCB (altruism, courtesy, sportsmanship, civic virtue, and conscientiousness) are measured in the study. “Investigations of a situational factor--fairness perception--as a predictor of OCB have been ... fruitful.”
People seem to feel that if the exchange between them and their organization is positive, their OCB will be enhanced. Procedural justice within an organization--the perceived fairness of its policies and procedures, how those policies are carried out, and the dignity and respect with which they are communicated--is a critical factor. Alarmingly, IT workers report significantly lower trust in management and less faith in procedural justice than their non-IT colleagues.
Some of the results in this study are not surprising; a fair amount of working life in IT is similar to what you find in other fields. People don’t work particularly hard if they are only working for the money, especially if they do not feel they are being treated fairly. The hazard for the IT field, though, is that the consequences of not helping others are acute and potentially devastating in IT work. IT workers need to be proactive in stymieing malware, viruses, security lapses, and a whole host of threats; if they are not helpful, these threats proliferate. The authors do not report this, though it seems a central implication of their important research.
Reviewer: G. Mick Smith Review #: CR132910 (0704-0418)
Ethical engagement with data collection efforts related to fighting terrorists and terrorism in the context of recent events
Pohlhaus W. Innovation and technology in computer science education (Proceedings of the 10th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education, Caparica, Portugal, Jun 27-29, 2005) 401-401. 2005. Type: Proceedings
Date Reviewed: Jan 3 2006
Pohlhaus discusses data collection problems with regard to civil liberties, and notes three problems deriving from data collection: misinformation, a lack of accountability, and the anonymity of those collecting the data.
Pohlhaus regards data collectors as unethically collecting information, and argues that the “community of computer scientists” should respond. He reviews historical and current narratives to outline problems concerning data collection. The examples that he draws on in particular are the Orion software system for classifying people and groups, and the scandal at ChoicePoint, Inc., where con artists used social engineering techniques to obtain personal information about people. These cases illustrate how easily trust is placed in those responsible for the collection and security of information. Of particular concern is the manner in which the Federal Bureau of Investigation (FBI) gathered and used information in COINTELPRO (an FBI counterintelligence program) during the 1960s and 1970s.
The author does not identify how the worldwide community of scientists could agree to take concerted action. Currently, there is not a vehicle to carry out his program. The issues are sound, but more work needs to be done in this area.
Reviewer: G. Mick Smith Review #: CR132225 (0610-1082)
Prefiguring cyberculture: an intellectual history
Tofts D., Jonson A., Cavallaro A., The MIT Press, Cambridge, MA, 2004. 328 pp. Type: Book
Date Reviewed: Nov 17 2005
The premise of this work is that we are who we are as humans because of the limitless social apparatus of technology, primarily the computer network. The interwoven nature of the human-computer interface has made it impossible to distinguish technology from the social and cultural production of being human. Cyberculture is the broader name given to this process of becoming through technological means. “Prefiguring” is a word that describes and predicts posthuman construction and adaptation to cyberculture. Prefiguring cyberculture, then, is a volume dedicated to tracing the intellectual history of cyberculture.
The surprise in this book, though, is that cyberculture has been arriving for quite some time--centuries, in fact. Media critics and theorists, philosophers, and historians of science track this history, exploring the antecedents of contemporary technological culture in key works and writers that anticipate cybercultural practice and theory, including Plato’s “The simile of the cave”; Descartes’ Meditations; Sir Thomas More’s Utopia; Francis Bacon’s New Atlantis; Mary Shelley’s Frankenstein; Butler’s Erewhon; de Chardin; Alan Turing; Philip K. Dick; William Gibson’s Neuromancer; Ray Bradbury; Arthur C. Clarke; Alvin Toffler; and the film The Matrix. In accompanying texts, artists also explore how cybercultural themes have been envisioned in the visual arts.
The key works frame, but do not inhibit, the stimulating themes addressed by this volume. The 28 essays and artist statements, assembled by the three editors, do not slavishly follow historical texts or simply rehash old arguments; rather, they tease out meanings, reflections, and meditations on what it means to be cybernetic. Tofts suggests that the culture created by the human-computer interface has produced an innovative creature: a posthuman. This new being is an intermingling of humanity and technology, which some have labeled a cyborg, and others, one giant step for evolution.
The book is hindered by a mostly rather weak middle portion, dedicated as it is to the oppressive mood of postmodernism, which bogs down the text. Gregory Ulmer’s “Reality Tables: Virtual Furniture” is unintelligible, spinning around textual tables, diagrams, and language, and throwing in Elvis’ pelvis to boot. The highlight of the second section, though, is Scott McQuire’s definitive consideration of William Gibson’s cyberpunk novel Neuromancer. Although the volume as a whole is well done, too much of this middle portion is stifled by jargon and parochial interests.
Yet the text is more than redeemed by an impressive opening section, and it ends with a flourish in the last section of the book (punctuated by an artistic interlude). Some of the contributions, noted below, are standouts. Erik Davis contributes a masterful opening salvo that rescues Descartes’ Meditations from the “punching bag” of criticism. In his capable hands, virtual reality, cyberspace, and The Matrix demonstrate that humans develop “authentic consciousness” by reawakening to ourselves. Samuel J. Umland and Karl Wessel consider Philip K. Dick, whose work permeates the atmosphere for other writers in the text as well. In their prescient if somewhat disturbing essay, the authors maintain that technoscience is a sort of autistic endeavor, yet it generates an unintelligible message.
The fourth section is provocative as well. Margaret Wertheim illuminatingly finds an ideological discontinuity between More’s Utopia and Bacon’s New Atlantis. In contemporary positions, these two are represented by the utopian and idealistic virtual community (a la Howard Rheingold and Esther Dyson), or the era of the “dot.com barons,” paralleled by Atlantian figures. Bruce Mazlish contributes an impeccable study of Samuel Butler’s Erewhon as seen particularly in relief and in dialogue with Charles Darwin. The coda belongs to Mark Dery, who views Eero Saarinen’s TWA Terminal at New York’s JFK airport as a failed statement of contemporary society--air travel as a nervous endeavor, filled with freewheeling machines, monitored by defective humans.
Russell Blackford offers what is arguably the most important essay in this volume, “Stranger Than You Think: Arthur C. Clarke’s ‘Profiles of the Future.’” This essay is timely and deserves a wider audience. Blackford notes that Clarke’s detailed expositions of the future are not always correct, but he has been aptly prescient and thus deserves a reading. Clarke concludes that humanity may envision a “postbiological enhancement of the brain and body” (page 1,261), which is well worth considering.
This text deserves a broad readership and is replete with fascinating--even vital--ideas. As I progressed through the text, I found myself wondering whether the quality and suggestive ideas could really be that first-rate and be sustained throughout nearly the entire text. They were. This is quite an achievement for a work that is as ambitious and daring as cyberculture itself.
Reviewer: G. Mick Smith Review #: CR132044 (0610-1036)
Establishing and maintaining long-term human-computer relationships
Bickmore T., Picard R. ACM Transactions on Computer-Human Interaction 12(2): 293-327, 2005. Type: Article
Date Reviewed: Nov 8 2005
Bickmore and Picard investigate the meaning of “human-computer relationships,” and present techniques for “constructing, maintaining, and evaluating such relationships.” Their primary conclusion is that they “have motivated the development of relational agents as a new field of research.”
Two particular relational benefits motivate the authors’ research: trust and task outcomes (such as improved learning) known to be associated with relationship quality. The authors are concerned with evaluating whether agents can “establish and maintain long-term social-emotional relationships with their users.” In their experiment, 101 users interacted daily with an exercise adoption system for one month. Compared to an equivalent task-oriented agent, the computer-based relational agent was trusted more.
Placing agents on mobile devices could provide a potent combination of relationship building (an ever-present “buddy”) and behavior change (providing timely and appropriate interventions). Work should be done regarding the nature of the buddy. Examples of conversational systems, such as R2D2 in Star Wars, and the Microsoft Office Assistant, engendered mixed results: the former was cute and helpful, and the latter was intrusive and grating. There are also political and ethical considerations in designing a buddy. Should the buddy be a thing or a neuter object, as in the two examples above, or should it perhaps be a male, or, as in the authors’ study, a female? And, finally, as the authors note, these proactive buddy scenarios, which are monitoring us, raise issues of privacy and security: with whom do you let the buddy share which pieces of relational or personal information, and how does it earn your trust to do so?
Reviewer: G. Mick Smith Review #: CR132005 (0606-0637)
Privacy policies of the largest privately held companies: a review and analysis of the Forbes Private 50
Peslak A. Computer personnel research (Proceedings of the 2005 ACM SIGMIS CPR Conference on Computer Personnel Research, Atlanta, Georgia, USA, Apr 14-16, 2005) 104-111. 2005. Type: Proceedings
Date Reviewed: Sep 29 2005
This study reviews the Internet privacy policies of the 50 largest privately held companies in the US, as identified by Forbes magazine. The Web sites of these companies were examined to see if they complied with practices that the Federal Trade Commission (FTC) issued as guidelines, including posting information about the fair information practices promulgated by the FTC. In addition, the author compared the policies of these private companies to the largest publicly traded companies in the US.
One of the important findings of the study is the discovery that privately held companies are inconsistent in following fair information practices or consumer-centered Internet policies. More troubling, the 50 largest privately held companies are generally more lax in publicly revealing their fair information practices and consumer-centered policies than the 50 largest publicly held companies. To a large extent, this is due to the lack of any privacy policies for many privately held companies.
What this study demonstrates, more than anything else, is that there is cause for concern regarding the lack of a forthcoming nature among private companies. Further study on this topic is needed.
Reviewer: G. Mick Smith Review #: CR131837 (0608-0874)
Oracle insights: tales of the oak table
Kyte T., Ensor D., Gorman T., Lewis J., McDonald C., Millsap C., Morle J., APress, LP, Berkeley, CA, 2004. 419 pp. Type: Book
Date Reviewed: Feb 14 2005
This volume contains works by 11 leading Oracle experts, who share their first-hand, in-the-trenches expertise. The format allows a database administrator to read about expert experiences using features that are “not exactly in the manual” or approved by Oracle, but that arise from practical and possibly unapproved uses of Oracle’s inner workings. In fact, the Oracle kernel has evolved over the years, and some of its most important innovations came in direct response to the projects described in this work. The lessons honed here were first reported at conferences, in coffee shops, restaurants, and bars, and, most especially, with pleasant camaraderie around a particular oak table.
The “oak table” of the subtitle refers to an informal international network of Oracle experts who coalesced at Mogens Norgard’s house near Copenhagen. This informal network and companionship inform the expert knowledge that characterizes the volume’s approach: relating war stories and seasoned problem solving. The decision to release a volume of this type means that we learn such things as who took only one philosophy/ethics class in college, whose ex is a physician, and who relieves themselves where while at Mogens’ house.
All of this is not to suggest that the contributors are less than serious in their Oracle capabilities. It does, however, typify how to read this work: you can relish Oracle war stories, related with both expertise and accessibility, while leisurely thinking through knotty Oracle issues.
So, with a more light-hearted approach than most technical books, and a sense of humor, this work offers quite a bit to glean. The 11 experts consider the patterns in building their respective systems and relate them to Oracle issues. The range of problems discussed is rather broad; some are old, some are new, but all can be related to contemporary issues likely to be encountered in Oracle applications.
The most general, but most accessible, chapter is “Testing and Risk Management.” As long as major software vendors release products as complex, unwieldy, and deficient as they do (and even the otherwise excellent Oracle product qualifies), database administrators will spend an inordinate amount of time managing software development and responding to business imperatives. This chapter, by Ruthven, is a sound, useful, and accessible introduction to overcoming these inevitable constraints as they apply to Oracle applications. Another chapter, “Bad CaRMa,” makes a similar point, lamenting a most spectacular failure: a customer relationship management (CRM) debacle (hence the playful chapter title).
In fact, I found the inability to communicate across business areas to be the connecting theme in this volume, since it also appears in the chapter titled “Why I Invented YAPP” (page 152), as well as in “Extended SQL Trace Data” (page 163). This overarching theme--that information technology (IT) failures are often due to the silo mentality of separate business units--is never made explicit or summarized in an introductory piece, but perhaps it should have been.
The authors claim to “represent a wide spectrum of experience and knowledge...[that] would...allow all the OakTable network to share a lot of our stories” (page xxix). I feel they have achieved their goals. Though this work is often conversational, it is not for the faint of heart, given the detailed and technical nature of the topic. To profit from the text, one needs to be intimately acquainted with Oracle’s inner workings; entire pages and sections consist of nothing but technical code.
This is not a snazzy tips and nifty features type of Oracle book. You can find that in some other volumes on the market. In fact, some of the projects described herein are ten or more years old, and involve earlier versions of Oracle. However, there are certain consistencies in software projects, and these authors rightly derive their lessons from sound scientific principles, observation, prediction, experiments, and proofs. In any case, if you are a database practitioner in need of expert guidance, you can probably benefit from an international cast of Oracle characters to assist you through your biggest challenges, in a conversational and easily accessible volume filled with helpful tips.
This playful bunch never quits. A number of them report on Brushco, a mythical firm, based on real-life experiences, that supposedly rents toilet brushes. I just know I’m missing out on some inside jokes there. Read this work if you want to know more about the inner workings of Oracle, with a healthy dash of puns and humor thrown into the mix.
Reviewer: G. Mick Smith Review #: CR130815 (0511-1217)
Shaping the network society: the new role of civic society in cyberspace
Schuler D., Day P., MIT Press, Cambridge, MA, 2004. Type: Book
Date Reviewed: Nov 29 2004
Is there a technological or a social imperative with regard to the Internet? Does the ’Net have a definite direction, in light of its development? These are some of the most important “big picture” concerns of the editors, Douglas Schuler and Peter Day, former chair of Computer Professionals for Social Responsibility (CPSR), and lecturer at the University of Brighton, UK, respectively.
Technology is predominant in the business world, but this volume examines the activities of progressive community activists, in nongovernmental organizations (NGOs), to meet the challenges of society. The primary concern of the editors, then, is to publish contributions that describe the emergence of civil society in cyberspace.
Their range is far reaching: they describe human rights in the “global billboard society”; public computing in Toledo, Ohio; public digital culture in Amsterdam; “civil networking” in the former Yugoslavia; information technology and the international public sphere; “historical archaeologies” of community networks; “technographical” reflections on the future; libraries as information commons; and globalization and media democracy, as portrayed by Indymedia, a global collective of independent media organizations.
One major limitation of this volume is its belated appearance, and the meager results of civility that it describes. The Internet has been around for some time--long enough for truly progressive activities to have been forthcoming--but the activities described herein appear paltry in historical perspective.
Interesting in itself, one typical example is the self-styled “slow food” movement, originally from Italy, with all of 65,000 members. The contributors make no reference to existing cultural mores, predating the network society, that may be part and parcel of such movements. A European cultural ideal may favor dining slowly, without the mechanism of cyber society; this point should be grounded in possible historical precedents. Perhaps more importantly, these admirable socially conscious movements make no reference to preexisting impulses toward authentic human existence, something their adherents advocate. Such preexisting movements--namely, religious communities in monasteries--have been around for centuries, and additionally offer trenchant critiques of the issues the contributors claim are their central concern. However, the religionists are totally ignored.
On the other hand, a more inclusive focus could have encompassed some aspects of the business world, although the editors seek to remain unsullied by the world of profit. An entire volume without a discussion of instant messaging and sped-up communication seems shortsighted in a work such as this. At the very least, some discussion of what community means via the ’Net seems appropriate, and a consideration of advocacy in these new forms of communication would be well placed.
My point here is that the editors end up being more parochial than I’m sure they intend to be, and unnecessarily so. The shaping of network civil society by cultural critics may be broader and more historically grounded than they suspect.
Also disturbing, as noted above, is the limited scope of the cited examples of progressivism. Holland’s experiment with free ’Net access proved unworkable; in South America, civil society has had limited success; libraries, which might be beacons of free inquiry, use filters even for adults; and the lessons of Blacksburg and Seattle seem to demonstrate that small- to medium-sized cities can muster only relative success on a local basis.
These comments are not meant to disparage the real and fascinating positive accounts provided in the text, however. Veran Matic artfully describes civil networking in the hostile former Yugoslavia: in this antagonistic environment, he outlines the struggles to maintain a reliable flow of news and information, and the heroic, resourceful means of doing so. If only this example were replicated on a global basis, and in repressive areas, I would find the cyberspace civil society more convincing. Likewise, Scott Robinson’s “Rethinking Telecenters” examines the largely positive example of Mexican migrant organizations as a catalyst for change in the microbank rollout. Another of the volume’s strong analyses, of the mid-sized city of Toledo, is instructive, at least for Americans who base their progressivism on a mostly typical American community.
The volume is best viewed as an update of analyses from the renowned Frankfurt School in Germany. Juergen Habermas wrote before the Internet era, but the contributors here work in the same vein; their contribution is to extend his analysis to Internet civil society. The question is whether the School is well served by a volume that treats the Internet as if it is yet to come, when it has in fact arrived, matured, and spun off in distinctly unprogressive directions. This volume will not allay fears that the unprogressive aspects of ’Net culture will be mitigated by progressive movements; they are too small to have much of an impact, and remain on the fringes of networking, if they register as a blip on the radar at all.
Reviewer: G. Mick Smith Review #: CR130468 (0507-0783)
From airline reservations to Sonic the Hedgehog: a history of the software industry
Campbell-Kelly M., MIT Press, Cambridge, MA, 2004. Type: Book
Date Reviewed: Sep 15 2004
In this book, Campbell-Kelly concentrates on the history of software companies, not on the software or the technology itself. Software history is defined specifically as the history of software businesses. The book is clearly in the business history camp; if you are interested in the history of the software itself, or of the technology, you will not find it here, except in passing. The subtitle is telling: “a history of the software industry.”
Given the author’s focus, certain topics are highlighted as illustrative of the software industry, which has grown to become the fourth largest industrial sector of the US economy. There are “three main vectors of explanation” (page 3). The first is chronological, tracing software developments from the mid-1950s to the book’s ending point (about 1995). The second is the industry’s established sectors: software contracting and corporate software products. The last is mass-market software products.
Roughly, though coincidentally, these three sectors of the software business emerged at decade-long intervals. The software contracting sector developed alongside the corporate mainframe computer in the mid-1950s. This solo-contractor period is typically characterized by programmers who wrote one-of-a-kind, expensive programs for corporate clients. By the mid-1960s, with the release of IBM’s System/360 computer family, corporate software products emerged in larger numbers, creating a broader market for lower-cost software than in the first period of software history. With the arrival of the personal computer in the mid-1970s, a market opened for mass-market software: relatively cheap (typically $100 to $500), these commonly available shrink-wrapped packages sold in large volumes.
Campbell-Kelly’s taxonomy of the software industry correctly observes that no one firm is at the center of the software world--specifically, not the one that many people believe is there, namely Microsoft. He describes other lucrative software products, such as IBM’s Customer Information Control System (CICS), a terrific example of the “invisible software infrastructure that runs the modern corporation” (page 149). For more than 30 years, CICS “has been the world’s best-selling computer program” (page 149). Wherever an automated teller machine (ATM) is accessed, a travel reservation is made, or a retail credit card purchase is rung up, chances are CICS is employed. Along these lines, SAP’s R/3 product is dominant as well: SAP’s share of the enterprise resource planning (ERP) software market stood at 33 percent in 1995.
These critical products run unobtrusively in the background of the modern corporation, and are little recognized by the general public; the author hopes to “provide a corrective to the common misconception that Microsoft is the center of the software universe” (page 9). Microsoft’s market share is roughly ten percent, approximating the amount of space that Campbell-Kelly devotes to the firm in this work.
I am, however, uncomfortable with the author’s discussion of Microsoft. He takes pains to diminish Microsoft’s importance in the history of software, though he recognizes its impact, and articulates the company’s accomplishments clearly enough. This is the major limitation of the work. Since, as Campbell-Kelly acknowledges, Microsoft is one of the few companies to survive the tumultuous rise of the personal computer era, more needs to be explained. A corrective is sorely needed, as he points out, since much previous literature on Microsoft is lacking in explanatory power. A business that placed so many of its products on desktops in America requires elucidation.
The author is also weak on programming developments. His discussion of the FORTRAN and COBOL programming languages could be considerably stronger. His account of how COBOL became, with FORTRAN, one of the twin peaks, accounting “for two-thirds of the applications programming activity of the 1960s and the 1970s,” is lacking (page 36); this development occupies less than two pages of the text. The author does not adequately explain how COBOL became so crucial; he mentions only that the US Department of Defense encouraged private companies to adopt it as a standard.
One troublesome aspect of the author’s analysis is his tendency to assume that a statement of raw dollars and financial figures is explanatory. The finances are not adjusted for inflation, or placed in clear enough historical context, for the numbers to be meaningful. The author even states, at one point, that the statistics speak for themselves. No, they don’t. The historian must interpret the numbers and place them within a meaningful framework for the figures to make sense.
By and large, this is a useful book, but with the above-noted limitations. I feel that a person interested solely in company history can read the volume beneficially. Expanding on the areas noted above, namely the historical context and the technology used, would not have deviated from the author’s intended purposes; I think it would have enhanced them.
Last, I love the title, with one important caveat. It does imply that Sonic the Hedgehog somehow equals the truly revolutionary and durable Sabre airline reservation system. But this is, of course, not the case.
Reviewer: G. Mick Smith Review #: CR130128 (0504-0438)
An information systems perspective on ethical trade and self-regulation
Duncombe R., Heeks R. Information Technology for Development 10(2): 123-138, 2003. Type: Article
Date Reviewed: Jul 28 2004
Duncombe and Heeks describe how ethical trade initiatives are increasing due to the concern that globalizing trade does not benefit “producers in developing countries” (page 123). “Ethical trade” is represented by organizations such as the UK’s Ethical Trading Initiative, which is a voluntary code of conduct among large producers, intended to benefit workers’ rights, human rights, and other social and environmental development goals.
This self-regulation is an alternative to more traditional forms of regulation, originating from state control, or from binding national or international agreements. According to the authors, ethical trade allows stakeholders to harmonize their efforts to set voluntary standards governing developing country workplaces enveloped by the global supply chain.
The liberal concerns of the authors are all well and good; however, I am uncomfortable with their working assumptions, which are that ethical trade “moves beyond” (page 123) other forms of regulation, and that it is also “more appropriate to a globalized, liberalized economy” (page 123).
In regard to moral superiority, ethical trade wins hands down over traditional regulation, and with that I might agree; however, it is not clear in what way ethical trade is moving beyond state sanctions, nor am I convinced that ethical trade is more appropriate or, most importantly, advantageous compared with other forms of the liberal, global economy.
I’m not sure developing world programmers, for example, have been harmed by American companies outsourcing their work. They may indeed have benefited from the cruel and inhumane Hobbesian world of traditional economics.
Reviewer: G. Mick Smith Review #: CR129935 (0501-0115)
Computer ethics and professional responsibility : introductory text and readings
Bynum T., Rogerson S., Blackwell Publishers, Inc., Cambridge, MA, 2003. 400 pp. Type: Book
Date Reviewed: Jun 17 2004
This book is a direct response to the need for “social and professional” undergraduate content, called for in a key educational guideline, Computing curricula 1991 (page xvii). The premise of the editors is that the information revolution is not merely technological, but fundamentally social and ethical (page 2). In fact, over the years, the professional associations of computer practitioners have recognized and required “standards of professional responsibility for their members” (page 2).
The editors note the crucial, but still exploratory, connections between computer ethics and human values; in addition, they summarize the historical milestones in computer ethics. Their brief but important introductory essay provides useful background for what follows. Thereafter, the book’s sound organization easily allows an instructor to use the text, in part or in total. The editors state: “the book is divided into four parts, each of which includes (1) an editors’ introduction to provide background and context, (2) relevant essays by computer ethics thinkers, (3) a specific case to consider and analyze, (4) a set of helpful study questions, and (5) a short list of additional readings and Web resources to deepen one’s knowledge of the topic” (page 10).
The text’s organization lends itself to ease of adoption. Supplemental Web materials are available at http://www.computerethics.org and http://www.ccsr.cse.dmu.ac.uk. The four topic areas of the text are “What Is Computer Ethics?” “Professional Responsibility,” “Codes of Ethics,” and “Sample Topics in Computer Ethics.” As with any multi-authored edited volume, the individual chapters are of varying worth and utility, depending on interests or pedagogical needs, but I do want to emphasize the high quality of these selections.
I enthusiastically welcome this much-needed volume. In it, seminal thinkers elucidate the key terms and issues in computer ethics. James Moor, for example, describes computing as a “universal tool,” which is “logically malleable” because the technology is “shaped and molded to perform nearly any task” (page 2).
My enthusiasm for this volume is tempered, however, by a caveat or two. Many people are so dazzled by technology that they optimistically view things like computing as liberating. Moor states: “the Gulf War was about information and the lack of it” (page 24) and “better that data die, than people” (page 25). Although General Schwarzkopf remarked that the enemy capitulated because of a lack of information, and Moor speculates that computing may allow fewer physical combatants, physical death is still just as brutally real as ever, nor is there any evidence to suggest that technology humanizes atrocities. The recent beheading of Nicholas Berg in Iraq springs immediately to mind.
Nonetheless, Moor views computers as special and unique, and thus the ethical issues associated with them are largely unprecedented historically. “Computer ethics is a special field” is something he states repeatedly (page 26). However, with this optimistic premise, Moor overstates the case for the special civilizing qualities of computing. Heidegger provides a useful caution: modern technology also is a means to an end.
Bynum also associates ethical knowledge too sharply with formal training in describing his program for a method of case analysis. In contrast, Jack Rogers and Forrest Baird wrote a fine introductory philosophy textbook that uses case analysis while assuming very little background on the part of readers [1].
Bynum intellectualizes ethics unnecessarily. His method includes an admonition to “call upon your own ethical knowledge and skills” (page 68), but he then also states that readers should “take advantage of one or more systematic analysis techniques” (page 69). He maintains that people usually do not have recourse to professional philosophy, “or attempt to use broad philosophical principles” derived from, say, Kant or Bentham (page 62).
This is surely wrong, though. An analogy to the field of health care makes the point. Although most of us are not educated as physicians, almost all of us practice medical knowledge culled from schools, training, reading, hearsay, family stories, and so on. Likewise, commonly held ethical views are derived from these same sources, as well as from somewhat more formal instruction in churches, synagogues, and temples. After teaching ethics to undergraduates for years, I am no longer surprised to scratch the surface and find pseudo-Kantians and Benthamites abounding.
When all is said and done, however, this volume deserves a wide reading. Although it addresses the specific need for undergraduate ethical content, many more computer practitioners should read it. These handily collected essays should be required reading for most computer professionals; those not already familiar with the volume’s contents would be well served by a sound consideration of the issues it raises.
Reviewer: G. Mick Smith Review #: CR129776 (0412-1466)
1) Rogers, J.; Baird, F. Introduction to philosophy: a case study approach. Harper and Row, New York, NY, 1981.
The government machine : a revolutionary history of the computer
Agar J., MIT Press, Cambridge, MA, 2003. 576 pp. Type: Book
Date Reviewed: May 7 2004
In this book, the author describes how the British government exemplifies the metaphor of organization as machine, and consequently adopts systematic procedures, statistical methods, and ultimately, electronic computers.
Agar examines philosophical metaphors associated with two centuries of Western thought. Central to this examination are philosophical warhorses ranging from Machiavelli, through Bodin, Hobbes, Rousseau, and Mill, to Marx and Bagehot. The author is on illuminating ground here, interestingly demonstrating the interplay between these various mechanistic metaphors, the practical politics of early nineteenth-century figures such as Charles Babbage, and the scholarly understanding of the period exemplified in the work of Otto Mayr. This is all well and good.
Agar's real concern, however, is the "relationship of humans and machines" as it relates "to a peculiarly important machine: the general-purpose computer" (page 3). Furthermore, he argues "that the apotheosis of the civil servant can be found," albeit surprisingly, in the computer (page 3). Agar's argument is that the government machine par excellence is the permanent civil service. Following Habermas' argument regarding the "scientization of politics," nineteenth-century British governance relies less on gentlemanly codes of conduct, and more on rational and professional routines of specialist expertise (page 7).
The standard historiography [1,2] maintains that experts degenerated to mere specialists, demoted to servants of the generalist civil service. Agar takes issue with this perspective, demonstrating that overlapping and succeeding generations of experts continued to arise.
He is convincing here, and I do not doubt that the civil service continued to be characterized by specialized technocrats, the expert statisticians of the nineteenth and early twentieth centuries. The bureaucratization of Britain takes place most significantly in the Treasury. Most important for the history of computing, though, is that warfare alters informational techniques.
Expertise informs "the culture of the wartime command economy" (page 12). It is at this moment that the first stored-program electronic computer enters the story. Wartime organizations combine their expertise, concerned as they are with military prowess, and at Manchester University, a computer is built.
The main course of Agar's argument is that government administration and office mechanization are inextricably connected, and thus, from 1945 through the 1970s, the Treasury's Organization and Methods movement held sway. Thereafter, though, computerization is more controversial, since it failed to deliver (circa the 1980s through 1990s). More recently, computers have been embroiled in debates concerning big government (page 12).
I do take issue with certain aspects of this book. Agar is on good ground with the philosophical metaphor of mechanization. I find more troubling, though, his analysis of the relationship between humans and machines, and a point he never addresses regarding his own use of evidence.
A series of illustrations is used to supplement his points regarding the civil servant as a computing machine. Although an arranged Victorian bureaucrat's desk set is displayed in an idealized fashion, as is typical for museum viewing, the same conclusion should not be drawn for human subjects. Thus, the circa 1920 Egyptians pictured in Figure 5.3, the woman operating a 1930s machine in Figure 5.9, a "typical government office of the early twentieth century" (Figure 5.13), "more mechanized government" (Figure 5.13), and the six Royal Air Force pictures (Figure 6.9) do not support Agar's contention. The machines, the people, and thus the portrayals are completely staged and sanitized for formal pictures. These offices look like no real offices that people inhabit. Not a hair on their heads is mussed, and not a snippet of paper is out of place. Servants are machines? Not in genuine offices inhabited by human beings.
More troubling, perhaps, is the title of this volume. A review is necessary to elucidate the actual contents of this text. People interested in computing history will find less that interests them here than they might expect in a volume subtitled "A revolutionary history of the computer." Along these lines, this is, indeed, not a general history of the computer at all, but, more strictly speaking, an examination of how the British governmental bureaucracy mechanized its operations, and unwittingly paved the way for the ultimate efficiency, the mechanical tool of the computer, and the drones who run them. People not interested in this more peculiarly British development should pass on this volume.
Finally, the volume needs minor editing. Sampling at random, I noted a missing article on page 2, and a missing closing parenthesis on page 501. I am a bit taken aback that the publisher did not catch these errors before publication. Also, the text would be clearer with a list of illustrations (there is none), and, most importantly, a bibliography would have been very helpful.
Reviewer: G. Mick Smith Review #: CR129566 (0411-1332)
1) MacLeod, R. Government and expertise. Cambridge University Press, New York, NY, 1988.
2) MacDonagh, O. The nineteenth century revolution in government: a reappraisal. Historical Journal 1, (1958), 52 & 67.
Technology in the social studies classroom
Agostino V. In Challenges of teaching with technology across the curriculum. Hershey, PA: Idea Group Publishing, 2003. Type: Book Chapter
Date Reviewed: Jan 6 2004
Agostino’s essay surveys technology in the K-12 social studies classroom. Claude Shannon, the digital visionary at MIT, is his starting point, as is the seminal thinker Marshall McLuhan (1911-1980), of “the medium is the message” fame. From this historical basis, the significance of the medium rather than the content, the substance of Shannon’s and McLuhan’s research justifies Agostino’s leap to examining social studies-based technologies.
McLuhan’s key insight, the distinction between hot and cold media, discriminates between effective and useless mediums, and is thus key to using technology in the social sciences. How instruction is packaged electronically--whether TV, radio, or Internet--affects the perceiver’s understanding of the information.
A hot medium enflames the mind and imagination of the user; the user participates and reacts emotionally and intellectually. Cold media, by contrast, negates involvement, sending information regardless of receivership. Who has not seen a TV blaring away, privately or publicly, without human connection?
Agostino promisingly notes how the technologies associated with social studies matured from cold to hot media. However, other than a summary of the National Council for the Social Studies (NCSS) standards, an evaluation of criteria for social studies software and Web sites, and references to key sources, there is little here that is original, not easily found elsewhere, or all that helpful. Agostino’s suggestion that social studies teachers might integrate technology into their classes--central to his alleged interest in the classroom--evokes a sound chapter topic that would have been a genuine contribution, but that is not the chapter we have here.
Reviewer: G. Mick Smith Review #: CR128850 (0405-0615)
Who invented the computer? : The Legal Battle That Changed Computing History
Burks A., Hofstadter D., Prometheus Books, 2002. 415 pp. Type: Book
Date Reviewed: Oct 10 2003
Burks’ history attempts to uncover the critical relationship between, and the divergent accounts of invention by, two computing pioneers. In 1941, University of Pennsylvania physicist John Mauchly visited physics professor John Atanasoff at Iowa State University to discuss Atanasoff’s current project, thereafter known as the Atanasoff-Berry Computer (ABC). Interestingly, not five years later, Mauchly was acclaimed as the inventor of the Electronic Numerical Integrator and Computer (ENIAC). What ideas germinated between Mauchly and Atanasoff became grist for an exhaustive patent trial, and a source of historical controversy. In 1973, Judge Earl L. Larson named Atanasoff as the computer’s inventor. Despite Larson’s decision, credit for inventing the computer often favors Mauchly.
In light of the lengthy Larson trial, Burks asserts two critical points: first, that Atanasoff rightly deserves recognition among scholars for inventing the computer, and, second, that Larson’s verdict goes largely unacknowledged by the public (p. 17). It should be noted, however, that Burks is a principal in published debates over computer origins, and her husband, Arthur, was an associate of Mauchly’s on the ENIAC project. Despite a suspicion of favoritism, I admire her even tone; she remains dispassionate throughout the text, despite contentious debate over her work in collaboration with, and as an associate of, her husband. On the other hand, I found Burks’ potential bias troubling, but not for the obvious reason; the Burks possibly favor Atanasoff precisely because Arthur did know Mauchly well. This point is not addressed.
In any case, Burks adopts a trial-like presentation, first placing the competing inventors at odds, then presenting testimony, and finally offering a closing argument. The trial arrangement is clever, but distracting; academic voices are not well incorporated throughout the work.
Unorthodox distractions mar the work. I am sidetracked by Burks, busy with her trial presentation, arranging countervailing voices (Herman Goldstine, for example, and nemesis Nancy Stern et al., introduced late in her text). Burks is aware, according to a note in her bibliography, that Goldstine published a computer history with Princeton University Press. Goldstine’s history, however, is not noted in her index.
Moreover, Stern’s various works, in particular her From ENIAC to UNIVAC, are still well regarded by historians, as Burks acknowledges. Nonetheless, the historical profession can and does make mistakes, so a revisionist history would be welcome. Burks, however, does not provide enough evidence to convince historians to disagree with Stern’s conclusions regarding innovation versus invention. Mauchly adapted ideas and rendered them practical; thus, for historians, Mauchly is best viewed as the computer’s key innovator.
Two other academic fields are represented in Burks’ work. Gerald U. Brock and David J. Kuck, an economist and a computer scientist, respectively, both recognize Atanasoff’s priority in computer history. Chapter 7 (of 13 chapters), “Other Voices,” addresses other works on this question. This chapter’s topic should have been discussed earlier and incorporated throughout the book.
Burks also has a troubling, non-academic way of attributing credit, quoting sources, and handling academic convention. She refers at one point (p. 31), in a statement with no footnote or attributed source, to a discussion between Arthur and Mauchly. This is hearsay evidence quoted as fact. At another point (p. 405), she states that a friend told “us” (presumably herself and Arthur) about a Philadelphia presentation favoring Mauchly, a statement with which Burks obviously does not agree. Nonetheless, this third-hand information is assumed to be reliable, and is included. In short, the work suffers from the lack of a meticulous academic editor, who would have pruned these shortcomings to strengthen the book.
If this is not a scholarly history, is it valueless? No. The text correctly identifies Atanasoff as the genuine inventor of the computer, and Burks’ engaging writing style is accessible to a popular audience: those most likely to view Mauchly as the inventor of the computer. The text includes a clever, well-done “As It Happened” section, which reads more like a novel, and is a fresh, clarifying approach to a complex history. I also appreciated the extensive quotes from Larson’s decision (pp. 146-148), which is generally not given serious weight in popular presentations of computer history, a fact the author documents well throughout her work.
Thanks to Burks, the public will read of a scientific community far more divided, disorganized, and contentious than they suspect. The intended audience for this book is the well-versed reader; such readers will benefit greatly from this attractively designed, well-illustrated, and accurate account of computer history.
Reviewer: G. Mick Smith Review #: CR128354 (0401-0013)