Tineke M. Egyedi
Standards for infrastructure
new directions for matching compatibility with flexibility
Introduction
Today’s Internet gives us a glimpse of the future ict infrastructure. It has the look and feel of a seamless web. Creating such a seamless web will be a main future challenge in areas such as semantic exchange and software integration.
The current Internet already displays a high diversity of innovative ict uses and a heterogeneity of players involved in its development, which hints at the future. Users, ranging from individual citizens to local government authorities, are increasingly acting as co-producers. Not only are they co-producers of content and innovative in their technology use; they will also compose their own services, supported by ready-made service components and tools. The number of architectural service innovations, i.e. the re-use and re-combination of existing components, will grow exponentially. Supported by an active open data policy of government authorities and the high quality of public data, new services will mushroom, for example in areas in which mashing government and commercial sector data offers new service opportunities. These services will develop partly as inverse infrastructures, that is, driven and self-organized by citizen-volunteers and user groups, both complementing and competing with commercially developed service offerings.
Because innovation cycles will continue to shorten, the pressure will rise to update information systems. Concerns about the longevity of digital data and future accessibility of public archives will become acute, possibly following a series of lawsuits against government authorities about inaccessible public data. Moreover, partial infrastructure resources and systems will be of a temporary or ad hoc nature forging instant and dynamic connectivity, e.g. ad hoc communication between ‘intelligent’ devices in the Internet of Things.
A new market reality will arise with unanticipated forms of infrastructure ownership, e.g. lack of ownership, public-citizen or private-citizen partnerships. Large providers will interwork with small providers, commercial providers co-exist and collaborate with non-commercial providers, stable systems interwork with ad hoc systems, et cetera. These temporary, ad hoc structures require a radical break with past ideas about infrastructures (responsibilities and duration) and their provision.
The challenge of creating a seamless public ict infrastructure from a heterogeneous and changeable patchwork of interworking systems is a daunting and never-ending one. Interoperability and compatibility play a key role. Infrastructure components need to be compatible and integrated while the overall infrastructure needs to be highly flexible, that is, sufficiently adaptive to meet the uncertainty of new needs and integrate new offerings.
To understand and address the tension between compatibility and infrastructure flexibility, this chapter treats a public infrastructure as a particular instance of a large technical system (LTS).1 This system-theoretical angle helps to clarify the apparently paradoxical role of standards2 and to appreciate how standards, as one of the most noteworthy compatibility strategies, should, and actually can, contribute to flexibility.3 It discusses standards in the wider context of creating local compatibility without the overall system losing the ability to evolve and innovate. That is, this chapter goes beyond standardization by emphasizing compatibility. It reframes the main challenge of ict infrastructure evolution as a compatibility problem rather than a standardization solution. In doing so, it opens new directions for the future public ict infrastructure.
First, the reasons to seek system flexibility are discussed, both in general and for information and communication technology (ict) networks in particular. Next, the chapter turns to issues of compatibility. Different sources of compatibility and relevant compatibility dimensions are identified, and a three-dimensional compatibility space is defined to structure the search for appropriate solutions in different circumstances. The need to carefully distinguish between the means and aims of system flexibility is argued. The chapter concludes by discussing the difficulty of designing flexible systems, readdressing compatibility issues in the light of ict infrastructure evolution and innovation.
Problems of entrenchment
The ideally flexible infrastructure would be one that was designed to evolve, itself, with emerging technologies.4
From environmental, economic and social viewpoints, innovations to large technical systems (LTSs), such as transport and energy systems, are often desirable. However, many LTSs seem impervious to change, which can partly be ascribed to the complexity of such systems. They consist of many interdependent socio-technical components and subsystems, and comprise technical artefacts as well as, for example, the institutional and regulatory contexts of artefact use and production. Organizations and companies develop and sustain the system. Technical add-ons and complementary products are developed.
As LTSs expand, the number of and interdependencies between actors and artefacts grows. Over time, these interdependencies crystallize, solidify, and make manifest a process of socio-technical entrenchment.5 Changes become possible only at the cost of readjusting the technologies and socio-technical arrangements that surround them, to paraphrase Collingridge. The larger the vested interests, the higher the costs of change. To the onlooker, such large technical systems appear to have their own technological momentum,6 which
pushes the system along a path-dependent process of technological change.7
Apparently, unless something radical happens, no noteworthy deviations from the set path will occur. Theoretical concepts such as “technological momentum” and “path dependency” suggest that significant changes cannot be brought about. They reflect a deterministic view of LTS evolution,8 a view echoed by the “Collingridge dilemma”: entrenchment problems are difficult to foresee at an early stage of technology development and difficult to address at a later stage. Such concepts provide few clues for policy intervention.
Other concepts are more promising in this respect. For example, under the heading of de-entrenchment strategies, Mulder and Knot9 propose means to recreate the critical space necessary for system change. Their strategies target the system’s actor network by negotiating about and redefining aspects of the critical problem, e.g. assigning a new problem owner, or giving in to demands with regard to one part of the LTS in order to safeguard another.
This chapter focuses on ways to enhance the flexibility of LTSs, and infrastructures10 in particular. Paraphrasing Feitelson and Salomon,11 flexibility refers to the ease with which an LTS can adjust to changing circumstances and demands. It entails openness to change. A flexible design makes a system less susceptible to unwelcome, premature entrenchment. As stated above, I look towards compatibility-creating ‘devices’ to enhance system flexibility. While some authors note that these devices play a crucial role in the evolution of LTSs,12 few discuss how they actually relate. The following section starts by examining standards, which at face value might seem a most unlikely candidate for enhancing system flexibility. Other compatibility strategies will follow later on.
Paradox of standards
There is an intuitive tension between standardization and flexibility.13 Standards would foremost seem to work as catalysts of entrenchment. There are two related reasons to think so. First, standards typically codify existing knowledge and practices. As Reddy states:
[S]tandardization […] is an attempt to establish what is known, consolidate what is common, and formalize what is agreed upon.14
In short, codification is a main source of entrenchment. Second, the interrelatedness of multiple components, which characterizes an LTS, is a source of entrenchment as well. These components are complementarities.15 Often, the relationship between complementarities is defined by standards. An example is the A4 paper format. It specifies the interface between all kinds of paper processing machines (such as copying machines, telefaxes and printers) and office requisites (for example, folders, computer software). It also eases the entry of new players in the market. Standards facilitate specialization in work and products, and therefore increase interdependencies between actors and artefacts.16 They thus stabilize technology development. Entrenchment eventually befalls all useful standards.
However, standards can also be a means to postpone, or even mitigate, system entrenchment since standardizing one part of the system can create flexibility in another.17 Formulated differently,
interdependence among the development of complementary technologies may require the coordination provided by standardization in one domain so as to foster the generation of diversity in another.18
For example, the international standard for freight container dimensions (ISO/R 668) lies at the basis of intermodal transport between, in particular, sea, rail and road,19 and illustrates that standards can even support the more radical infrastructure transition20 from road to rail and inland shipping. Standards can thus fulfil a flexibility-enhancing role in LTS and infrastructure design.
Objectives of flexibility
Flexibility is a means and not an end in itself. Therefore, we need to know for what reason system flexibility is sought. What are its objectives?
Many areas of technology, diverse as they may be, share the same objectives.21 For example, in physical transportation, the automobile industry and information management, producers seek a kind of flexibility that allows system development while preserving earlier investments, reducing engineering efforts and facilitating system maintenance. In the automobile industry, flexibility also serves the purpose of creating a wider variety of personalized products. Some general, partly overlapping flexibility objectives are:
Case: ict entrenchment
The ict system of a large government agency illustrates the entrenchment problem.22 The case is from 2002 but still very recognizable. Its ict resources evolved in a piecemeal fashion: bit by bit, stand-alone local provisions have been coupled and integrated with networked functionalities.
Of the 350 software systems, 150 are generic and used throughout the organisation, e.g. word processors. The other 200 serve a specific, special purpose and are used only by certain people or locally. Those involved identified a number of serious problems with respect to system maintenance and evolution.23 In particular,
The case illustrates that where
information systems are updated, […] frequently, the resulting system grows increasingly complex, as does the maintenance process itself.24
The complexity and further development of the ict resources become difficult to manage. The ict system lacks the necessary flexibility.
Objectives of flexibility, continued
To explore the above objectives in more detail, we return to ict and its flexibility requirements. In this field, reusability of information system components eases system innovation, reengineering, and managing the rapid change of technological generations. Independent and reusable data and application components simplify “processes of development, maintenance or reengineering of direct-purpose systems,”25 and reduce their costs. Reusability is an overarching goal. It comes in different shapes and is an important element in many of the following, more specific flexibility objectives in ict:
Reviewing the above list of flexibility objectives, standards will intuitively be judged important, for example for exchangeability and longevity. For certain other system objectives, other kinds of compatibility may be a more obvious choice.
Compatibility
Insight into the different characteristics and sources of compatibility helps us understand the manner in which standards and other compatibility strategies may contribute towards achieving system objectives; see the two sections below, Generic and dedicated gateway solutions and Sources of de facto compatibility. This background facilitates the identification of well-known compatibility strategies such as standardization (section Compatibility strategies and dimensions) and of as yet unexplored strategies (section Compatibility space).
Generic and dedicated gateway solutions
For the argument of this chapter, the term compatibility essentially refers to the “suitability of products, processes or services for use together.”33 The term interoperability is used here as synonymous.
Compatibility can be achieved by different means, e.g. by a gateway technology. The latter refers to
a means (a device or convention) for effectuating whatever technical connections between distinct production sub-systems are required in order for them to be utilised in conjunction, within a larger integrated . . . system.34
[Gateways] make it technically feasible to utilise two or more components/subsystems as compatible complements or compatible substitutes in an integrated system of production.35
Gateways differ in the scope of compatibility they achieve.36 Some gateways are dedicated. They link an exclusive and specified number of subsystems. For example, gateways that link specific proprietary computer networks belong to this category. Other gateways have generic properties.
Committee standards constitute a main group of generic gateways. The example of the A4 paper format was mentioned earlier. Its generic features apply to, for example, paper storage and processing devices. An even more generic category of standards is the reference models that guide interdependent, complementary standards activities. A textbook example in the field of ict is still the Open Systems Interconnection (OSI) Reference Model.37 Gateway technologies can thus be categorized as dedicated, or (meta-)generic, depending on the scope of compatibility concerned.
The degree of standardization determines the scope of the gateway solution. Where no standardization has occurred, the connection between subsystems is, as it were, improvised. This corresponds to a dedicated gateway. Standardized gateway solutions, which aim at connecting an unspecified number of subsystems, correspond to generic gateways. Gateways based on modelled (standardized) solutions, that is, standardization at the level of reference frameworks, embody meta-generic properties. Table 1 summarizes the relationships.
Table 1: Relationship between the level of standardization and the scope of the gateway solution.38

Level of standardization            Scope of gateway solution
none (improvised connection)        dedicated
standardized                        generic
modelled (reference framework)      meta-generic
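The economic intuition behind the scope of gateways can be made concrete with a small, purely illustrative sketch (my addition, not from the chapter): with dedicated gateways, every pair of subsystems needs its own converter, whereas a generic, standardized gateway format requires only one converter to and one from the shared standard per subsystem.

```python
# Illustrative converter counts (hypothetical figures, not from the chapter):
# dedicated gateways link specified pairs of subsystems, so the number of
# converters grows quadratically; a shared standard keeps growth linear.

def dedicated_gateway_count(n_subsystems: int) -> int:
    """One dedicated converter per ordered pair of distinct subsystems."""
    return n_subsystems * (n_subsystems - 1)

def generic_gateway_count(n_subsystems: int) -> int:
    """One converter to and one from the shared standard per subsystem."""
    return 2 * n_subsystems

for n in (4, 10, 50):
    print(n, dedicated_gateway_count(n), generic_gateway_count(n))
```

For ten subsystems this yields 90 dedicated converters against 20 standard-based ones, one way to see why improvised pairwise interconnections soon become too numerous and cumbersome to sustain.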
Sources of de facto compatibility
Committee standardization is a means to coordinate the activities of parties (otherwise) competing in the market.39 Ideally, the resulting committee standards become the shared basis for compatible implementations and lead to de facto compatibility. However, standards as such of course do not guarantee compatibility. Whether this is actually achieved depends on the scale and manner in which standards are implemented.40
To take the argument one step further, ultimately, the public interest is not served by the activity of standardization per se, but by compatibility or, to use the synonym, interoperability. Compatibility may be achieved by committee standards but by other means as well. For example, compatibility can result as a by-product of market dominance. A prominent example is the PDF format. Such de facto standards may sometimes more forcibly induce widespread compatibility than committee standards do, with costs often occurring elsewhere, for example regarding information independence (inaccessible data) and market competition (lock-in).
The origin of de facto standards may differ.41 Some result from in-company R&D efforts, others from cooperation between several parties, e.g. open source communities or consortia. The type of specification process need have no bearing on how ownership of the specification is handled. Whatever Intellectual Property Right (IPR) license strategy is used, whether proprietary or open source, a sizeable market share may result. When a software specification acquires market dominance, standards literature will usually retrospectively speak of de facto standardization, referring to the software specification process. It would be more to the point to speak of de facto compatibility, in order to emphasize the relevance of the outcome. De facto compatibility, then, is the outcome of a commercially successful product development trajectory.42
Compatibility strategies and dimensions
Although there are several ways to address problems of interoperability and create compatibility, to my knowledge no comparative literature exists on the matter. In the following, the relevance of some such compatibility strategies is argued. For purposes of reference, the discussion starts with dedicated gateways, the default compatibility strategy in most situations.
Dedicated gateways. As defined earlier, this chapter reserves the term dedicated gateway for a device or convention that, in contrast with a generic gateway, allows for a limited number of subsystems to be used together. The AC/DC rotary converter is an example of a dedicated gateway device. In the early years of electricity, it coupled the subnetworks of direct and alternating current.43 An example of a dedicated gateway convention is the Nordunet Plug.44 This protocol provided access from different subnetworks, i.e. OSI/X.25, EARN, DECnet, and ARPANET/IP, to a shared backbone. Both these gateways were designed to link specified subsystems.
Different views exist on the degree of flexibility which dedicated gateways provide. Hanseth emphasizes the flexibility they create for experimentation at subsystem level and their importance in the phase of system building.45 Thus, the Nordunet Plug played an important role in the building-up of Norwegian data communication.
However, these gateways work as an ad hoc solution, often worsening the subsystem’s entrenchment. Although such gateways may initially provide the required flexibility, they may turn out to be another instance of a temporary solution to the consequences of inflexibility. […] If gateways are [not standardized or modular], they may add the sort of complexity to the infrastructure that obstructs flexibility.
Standardization. Committee standards have a generic character. First, they create complements and facilitate substitution between standardized artefacts. For example, widespread use of the ISO standard for freight container dimensions facilitates exchange between and the use of different complementary combinations of transport modes (intermodal shift). It created a technology-neutral system environment regarding the transport mode.
Second, committee standards such as the container standard also create a supplier-neutral system environment, and are thus generic in the economic sense. They specify how the standardized artefact must interface and thereby create a level playing field for different system vendors. In the information and communications technology (ict) sector, standards are an important weapon against supplier dependence. Indeed, since the early days of the computer, customers were tied to the products of their initial platform provider and unable to switch systems without incurring heavy costs. Dedicated interconnections between proprietary systems only partly alleviated the interoperability problem. Although technically feasible, such interconnections were too costly, numerous and cumbersome to create and sustain. In the 1980s this resulted in standards activities which focused on open systems. Open systems are
computer environments that are based on de facto or international standards, which are publicly available and supplier independent.46
Figure 1 projects the degree of standardization as one dimension. The degree of technical compatibility achieved by adopting an ad hoc, improvised solution is portrayed to the left, i.e. no standardization. Dedicated gateways and proprietary de facto standards are classified according to this dimension as improvised solutions. Highly standardized solutions such as reference models would be projected to the extreme right.47
Modularity. Modularity is understood as follows:
A system is modular if it consists of distinct (autonomous) components, which are loosely coupled with each other, with a clear relationship between each component and its function(s) and well-defined48 […] interfaces connecting the components, which require low levels of coordination.49
In ict systems, modularity plays a role at different system levels. For example, software modules may be used in what Reitwiesner and Volkert50 call componentware (component-based software) or, at a higher level, in pick-and-mix configurations. Modularity constitutes the second compatibility dimension displayed in Figure 1. At one end of this dimension, the modular approach is not applied; improvised solutions would be projected here (low degree of modularity). At the other end, approaches are projected that have a highly modular structure, for example reference models that include components and indicate how they are interrelated.
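As a minimal sketch of the modularity definition above, with invented component names, a well-defined interface lets loosely coupled modules be exchanged without touching the code that depends on them:

```python
# Sketch of modularity: distinct components, loosely coupled through a
# well-defined interface, each with a clear function. All names
# (Storage, InMemoryStorage, ArchiveService) are hypothetical illustrations.
from typing import Protocol

class Storage(Protocol):
    """The well-defined interface: what any storage module must offer."""
    def save(self, key: str, value: str) -> None: ...
    def load(self, key: str) -> str: ...

class InMemoryStorage:
    """One interchangeable module implementing the interface."""
    def __init__(self) -> None:
        self._data: dict[str, str] = {}
    def save(self, key: str, value: str) -> None:
        self._data[key] = value
    def load(self, key: str) -> str:
        return self._data[key]

class ArchiveService:
    """Depends only on the interface, not on a concrete module, so
    storage components can be swapped without changing this code."""
    def __init__(self, storage: Storage) -> None:
        self._storage = storage
    def archive(self, doc_id: str, text: str) -> None:
        self._storage.save(doc_id, text)

store = InMemoryStorage()
service = ArchiveService(store)
service.archive("doc-1", "public record")
```

A database- or file-backed module satisfying the same interface could replace InMemoryStorage independently, which is precisely the low-coordination coupling the definition describes.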
Interactive compatibility. The term compatibility artefact refers to categories of technical devices and conventions that create compatibility between ict components and (sub)systems. For example, by definition an interface creates compatibility and is in this sense a compatibility artefact. Other artefacts that potentially belong to this category are middleware, gateways and software agents. The term middleware refers to generic building blocks that support different applications, e.g. web services that interface applications. Gateways usually create compatibility between protocols and the like in a fixed, static way.
However, they sometimes also negotiate compatibility in a more dynamic manner. For example, Krechmer and Baskin51 use the term adaptability standards to capture the phenomenon of negotiation among standardized telecommunication services.52
Adaptability standards specify a negotiation process between systems which include two or more compatibility standards or variations and are used to establish communications. These standards negotiate the channel coding and/or source coding. This is an emerging form of standard. Examples include: T.30 (used with G3 facsimile), V.8, V.8bis (used with telephone modems), G.994.1 (used with DSL transceivers), and discovery protocols.53
The possible relevance of negotiating compatibility also applies to non-standardized settings. Increasingly, agent technologies will play a role in creating compatibility. Specific attributes of software agents are that they are autonomous, goal-driven, and can negotiate and interact with their environment, i.e. can communicate, act towards and react to the environment.54 These features are all essential to intelligent gateways. Although the technology is still largely in the research phase, one could imagine that in the future such agents will be designed to self-organize compatibility and manage the complexity of conversion for the sake of interoperability.
In Figure 1, the dimension of interactive compatibility distinguishes artefacts as being more passive or more active in forging compatibility. At the high end of this dimension, artefacts are projected that have the capacity to negotiate and interact in an intelligent and autonomous way, e.g. agent technologies. At the low end, artefacts are projected that create compatibility in a passive, that is, static and fixed, manner (passive interfaces).
Transparency. Openness is a quintessential requirement for public ict infrastructures of the future. Nowadays, the term openness refers to publicly available, shared platforms; to collaborative efforts and collective software development; and to easy access to and use of source code. That is, it is also associated with notions of availability, accessibility, collective development, public ownership, and transparency.
Many of these are also features of the open source approach. The element of transparency in open source code is a distinctive feature. It has a bearing on compatibility, albeit indirectly, and additional research is required on the subject. In and by itself, transparency (open source code) need not increase compatibility, yet it contributes to the longevity of public data.55
Compatibility space
The three most relevant compatibility dimensions discussed in the previous sections are depicted as orthogonal, i.e. independent, dimensions in Figure 1. The figure illustrates that compatibility-creating artefacts, as projected onto the X-axis, whether more passive or more active, can be standardized to different degrees (as depicted on the Y-axis) and designed in a modular way (as shown on the Z-axis).
Figure 1: Three-dimensional space of compatibility solutions.
Figure 1 draws attention to as yet unexplored compatibility strategies. While at present most solutions are dedicated and improvised, i.e. non-standardized and non-modular, the three-dimensional compatibility space reveals the wide range of alternatives available. Different situations will need different solutions. Awareness of the compatibility space helps prioritise compatibility solutions according to circumstances, and helps situate more common compatibility solutions. For example, Hanseth argues for the modular approach and the use of dedicated gateways for innovative web development.
The Web technology is developing rapidly. How to integrate corporate networks and the global Internet is yet an unsettled issue. How to do this is a matter of experiment for a long time. Such experiments require modular and flexible solutions—one must be able to change the modules independently. This requires interfaces between the modules, and what is going on inside the interface is irrelevant for outsiders. Gateways are exactly the interfaces needed to make larger networks flexible.56
But Figure 1 also points to the possibility of devising more unusual compatibility strategies, such as standardized negotiating agents.
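One way to make the compatibility space tangible is to treat each strategy as a point with three coordinates. The scores below are invented, purely illustrative placements on a made-up 0–2 scale; they are not taken from Figure 1:

```python
# Sketch of the three-dimensional compatibility space: each strategy is
# a point (standardization, modularity, interactivity). The 0-2 scale
# and the placements are hypothetical illustrations by the editor.
from typing import NamedTuple

class Strategy(NamedTuple):
    name: str
    standardization: int   # 0 improvised, 1 standardized, 2 modelled
    modularity: int        # 0 low .. 2 high
    interactivity: int     # 0 passive/static .. 2 negotiating/autonomous

space = [
    Strategy("dedicated gateway", 0, 0, 0),
    Strategy("committee standard", 1, 1, 0),
    Strategy("reference model", 2, 2, 0),
    Strategy("adaptability standard", 1, 1, 1),
    Strategy("standardized negotiating agent", 1, 1, 2),
]

# Picking out one largely unexplored region: the interactive strategies.
interactive = [s.name for s in space if s.interactivity >= 1]
print(interactive)
```

Scanning the space this way makes it easy to see which corners are densely populated by current practice and which, like the interactive ones, remain largely empty.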
Matching compatibility and flexibility
Recapitulating, there are several ways to create system flexibility. Figure 2 stakes out the most relevant elements to take into account when matching compatibility with flexibility. Although flexibility might be achieved by different means, the focus here is on what compatibility strategies may contribute. Experience tells us that specific flexibility objectives are better achieved in some ways than in others. For example,
Figure 2: Matching compatibility with flexibility.
To illustrate the possible presence of multiple solutions, I call to mind a classic problem of information technology, i.e. that of customers tied to a specific proprietary computer platform (lack of portability). In the 1990s, the Java community developed a middleware solution to bring about system flexibility, the Java platform. Later on, an additional compatibility strategy was tried: standardization of the Java platform was attempted twice, but failed both times.57
In large technical systems, matching compatibility strategies and flexibility objectives may become a complex matter because the choice of strategy partly depends on factors such as
However, focused research is needed to examine these assumptions.
Conclusion
Compatibility plays a key role in the development of an open and innovative future public ict infrastructure. In an ever-changing environment, designing (optimal) flexibility into such large and composite socio-technical systems is difficult. This difficulty resides in the infrastructure’s heterogeneity, the vast number and diversity of co-producers, and its high-paced dynamics. Adding to the difficulty is technical as well as socio-institutional entrenchment, e.g. regulation, market interests embedded in the status quo, and aligned education curricula, which hinders any significant change.
In this setting, standards and other means to create compatibility are important to look into, as they may simultaneously affect the flexibility needed for system innovation or any of the other flexibility objectives. There are innumerable compatibility strategies, each of which can be plotted as a coordinate in a three-dimensional conceptual space (see figure 1). The dimensions of this compatibility space are defined by the degree of standardization, of modularity, and of interactive compatibility. Matching compatibility strategies and types of system flexibility cannot be done in a uniform way. Although some strategies generally seem better placed than others to increase the system’s responsiveness to future demands, e.g. highly modularised and modelled consensus standards, the ideal match depends on the circumstances, for example whether the pressure for change is expected to persist.
Some very fundamental research questions remain to be answered, however. Are the three compatibility dimensions proposed the most relevant ones? What new compatibility strategies do they point to, and could their development significantly contribute to the seamless web of a future public ict infrastructure? Does the flexibility required of private and public infrastructures differ? And if so, what implications does this have for the choice of compatibility strategy?
Tineke M. Egyedi, PhD, works as a Senior researcher on Standardization at the ICT department of the faculty of Technology, Policy and Management, Delft University of Technology, and leads a research project in the Next Generation Infrastructures program. She is vice-president of the European Academy for Standardization (EURAS) and the former Chair of the International Committee for Education about Standardization (ICES).
1 Thomas P. Hughes, The Evolution of Large Technological Systems, in: The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, Wiebe E. Bijker, Thomas P. Hughes, and Trevor J. Pinch (eds.), MIT Press, 1987, pp. 51-82.
2 Unless explicitly mentioned otherwise, e.g. as de facto standards, in this chapter I focus on committee standards, i.e. on standards developed in committees of formal standards bodies such as ISO, in professional organizations and other multi-party fora (IEEE, IETF), or in standards consortia such as W3C, that is, in multi-party industry standards fora. Such standards are “provisions for common and repeated use, aimed at the achievement of the optimum degree of order in a given context.” See ISO/IEC Guide 2: General Terms and Their Definitions Concerning Standardization and Related Activities, International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), 1991.
3 This chapter extends the author’s views as presented in Standards and Sustainable Infrastructures: Matching Compatibility Strategies with System Flexibility Objectives, in: Unifier or Divider?, Sherrie Bolin (ed.), Canton: Bolin Communications, Standards Edge Series, 2010, pp. 223-234. For allowing its development over the years, I gratefully acknowledge funding from the Dutch Ministry of Transport, Public Works and Water Management, Sun Microsystems, and the Next Generation Infrastructures Foundation.
4 Nancy Bogucki Duncan, Capturing Flexibility of Information Technology Infrastructure: A Study of Resource Characteristics and their Measure, in: Journal of Management Information Systems, Vol. 12, no. 2, Fall 1995, pp. 37–57.
5 David Collingridge, The Social Control of Technology, The Open University Press, 1981, p. 47.
6 Hughes, The Evolution of Large Technological Systems. See Bijker et al.
7 Andrew Davies, Innovation in Large Technical Systems: The Case of Telecommunications, in: Industrial and Corporate Change, Vol. 5, no. 4, 1996, p. 1148.
8 E. van der Vleuten, Twee decennia van onderzoek naar grote technische systemen: thema’s, afbakening, en kritiek [Two decades of research into large technical systems: themes, demarcation, and criticism], in: NEHA-Jaarboek voor Economische, Bedrijfs- en Techniekgeschiedenis, Vol. 63, 2000, pp. 328-364.
9 Karel F. Mulder and J. Marjolijn Knot, PVC plastic: A history of systems development and entrenchment, in: Technology in Society, no. 23, 2001, pp. 265-286.
10 Flexible Infrastructures is one of the main themes in the Next Generation Infrastructures research program: http://www.nextgenerationinfrastructures.eu/.
11 Eran Feitelson and Ilan Salomon, The Implications of Differential Network Flexibility for Spatial Structures, in: Transportation Research Part A: Policy and Practice, Vol. 34, 2000, p. 463.
12 Bernard Joerges, Large Technical Systems: Concepts and Issues, in: The Development of Large Technical Systems, Renate Mayntz and Thomas P. Hughes (eds.), Campus Verlag, 1988, p. 30.
13 Ole Hanseth, Eric Monteiro, and Morten Hatling, Developing Information Infrastructure: The Tension between Standardization and Flexibility, in: Science, Technology & Human Values, Vol. 21, no. 4, 1996, pp. 407-426.
14 N.M. Reddy, Product of Self-Regulation: A Paradox of Technology Policy, in: Technological Forecasting and Social Change, Vol. 38, Issue 1, 1990, p. 59.
15 Paul A. David and Shane Greenstein, The Economics of Compatibility Standards: An Introduction to Recent Research, in: Economics of Innovation and New Technologies, Vol. 1, Issue 1-2, 1990, p. 7.
16 Carl Cargill, Information Technology Standardization: Theory, Process and Organizations, Digital Press, 1989; Reddy, Product of Self-Regulation, p. 56.
17 Geoff J. Mulgan, Communication and Control: Networks and the New Economies of Communication, Guilford Press, 1990, p. 202.
18 Paul A. David, Standardization Policies for Network Technologies: The flux between Freedom and Order Revisited, ENCIP Working Paper Series, EEIG/ENCIP, October 1994, p. 25.
19 Tineke M. Egyedi, The Standardized Container: Gateway Technologies in Cargo Transport, in: EURAS Yearbook of Standardization, Vol. 3 Homo Oeconomicus XVII(3), Manfred Holler and Esko Niskanen (eds.), Accedo, 2000, pp. 231-262.
20 Tineke M. Egyedi and Jaroslav Spirco, Standards in Transitions: Catalyzing Infrastructure Change, in: Futures, forthcoming.
21 Duncan, Capturing Flexibility of Information Technology Infrastructure; Takahiro Fujimoto and Daniel Raff, Conclusion, in: Coping with Variety: Flexible Productive Systems for Product Variety in the Auto Industry, Yannick Lung, J. J. Chanaron, Takahiro Fujimoto, and Daniel Raff (eds.), Ashgate, 1999, pp. 393-406; Feitelson and Salomon, The Implications of Differential Network Flexibility; Terry Anthony Byrd and Douglas E. Turner, An Exploratory Examination of the Relationship between Flexible IT Infrastructure and Competitive Advantage, in: Information and Management, Vol. 39, no. 1, November 2001, pp. 41-52.
22 Tineke M. Egyedi, Trendrapport Standaardisatie: Oplossingsrichtingen voor Problemen van IT-Interoperabiliteit [Trend Report on Standardization: Solution Directions for Problems of IT Interoperability], Ministerie van Verkeer en Waterstaat, Rijkswaterstaat/Meetkundige Dienst, September 25, 2002, pp. 1-52.
23 Ibid.
24 Duncan, Capturing Flexibility of Information Technology Infrastructure, p. 43.
25 Ibid.
26 J.A. Dinklo, Open Systemen [Open Systems], in: Informatie en Informatiebeleid, Vol. 7, no. 2, 1989, pp. 29-36.
27 Ibid.
28 Ibid.
29 Duncan, Capturing Flexibility of Information Technology Infrastructure.
30 Note that reuse of part of a system for the purpose of integration with another system (part) is a transient form of flexibility: once integrated into a higher-level system, flexibility is lost at the lower level. An example of integration can be found in Philipp Genschel, Institutioneller Wandel in der Standardisierung von Informationstechnik [Institutional Change in the Standardization of Information Technology] (doctoral dissertation, University of Cologne, Germany, 1993).
31 Genschel, Institutioneller Wandel.
32 Kees van der Meer, The sustainability of digital data: Tension between the dynamics and longevity of standards, in: The dynamics of standards, Tineke M. Egyedi and Knut Blind (eds.), Edward Elgar, 2008, pp. 15-27.
33 ISO/IEC, ISO/IEC Guide 2.
34 Paul A. David and Julie Ann Bunn, The Economics of Gateway Technologies and Network Evolution: Lessons from Electricity Supply History, in: Information Economics and Policy, Vol. 3, no. 2, 1988, p. 170.
35 Ibid., p. 172.
36 Egyedi, The Standardized Container: Gateway Technologies in Cargo Transport.
37 The OSI reference model (ISO 7498 and CCITT X.200) identifies logically separate generic functions in data communication. It depicts these as a set of hierarchically ordered layers, which address areas of standardization.
38 Egyedi, The Standardized Container: Gateway Technologies in Cargo Transport.
39 Susanne K. Schmidt and Raymund Werle, Coordinating Technology: Studies in the International Standardization of Telecommunication, MIT Press, 1998; Martin B. H. Weiss and Marvin Sirbu, Technological Choice in Voluntary Standards Committees: An Empirical Analysis, in: Economics of Innovation and New Technology, Vol. 1, 1990, pp. 111-133.
40 Tineke M. Egyedi, An implementation perspective on sources of incompatibility and standards’ dynamics, in: The dynamics of standards, pp. 181-189.
41 Tineke M. Egyedi, Strategies for de facto Compatibility: Standardization, Proprietary and Open Source Approaches to Java, in: Knowledge, Technology, and Policy, vol. 14, no. 2, 2001, pp. 113-128.
42 Peter Grindley, Standards, Strategy and Policy: Cases and Stories, Oxford University Press, 1995, p. 140ff.
43 Hughes, The Evolution of Large Technological Systems.
44 Ole Hanseth, Gateways—Just as Important as Standards: How the Internet Won the ‘Religious War’ about Standards in Scandinavia, in: Knowledge, Technology, and Policy, Vol. 14, no. 3, 2001, pp. 71-90.
45 Ibid.
46 Dinklo, Open Systemen, pp. 29-30.
47 The question whether certain standards are more effective at achieving system flexibility than others is addressed elsewhere. See Tineke M. Egyedi and Zofia Verwater-Lukszo, Which standards’ characteristics increase system flexibility? Comparing ICT and batch processing infrastructures, in: Technology in Society, Vol. 27, 2005, pp. 347-362.
48 Wolters includes “standardized interface” as a property of modularity. However, as I will argue, its two elements should be distinguished.
49 Matthijs J. Wolters, The Business of Modularity and the Modularity of Business, doctoral dissertation, Erasmus University of Rotterdam, ERIM Ph.D. Series in Management, no. 11, 2002.
50 Bernd Reitwiesner and Stefan Volkert, On the Impact of Standardization on the Provision of ERP-Systems, in: EURAS Yearbook of Standardization, Vol. 4, Manfred Holler (ed.), Accedo Verlag, 2003, pp. 79-102.
51 Ken Krechmer and E. Baskin, Standards, Information and Communications: A Conceptual Basis for a Mathematical Understanding of Technical Standards, in: Proceedings of the 2nd IEEE Conference on Standardization and Innovation in Information Technology, SIIT 2001, pp. 106-114.
52 Ibid. Krechmer and Baskin’s notion of adaptability standards inspired the dimension of interactive compatibility. Adaptability standards are defined by the dimensions of interactive compatibility as well as standardization.
53 Ibid.
54 Marijn Janssen, Designing Electronic Intermediaries: An Agent-Based Approach for Designing Interorganizational Coordination Mechanisms, doctoral dissertation, Delft University of Technology, 2001, p. 11.
55 Ruben van Wendel de Joode and Tineke M. Egyedi, Standardization and Other Coordination Mechanisms in Open Source Software, in: Advanced Topics in Information Technology Standards and Standardization Research, Vol. 1, Kai Jakobs (ed.), Idea Group, 2006, pp. 67-85.
56 Hanseth, Gateways—Just as Important as Standards, p. 88.
57 Tineke M. Egyedi, Why Java™ was – not – standardised twice, in: Computer Standards & Interfaces, Vol. 23, no. 4, 2001, pp. 253-265.