January 27, 2005

Airlines: To Survive, Must Some Die for Good?

"Knowledge at Wharton" includes a concise article on the "death spiral" of the U.S. airline industry, likely to be aggravated by recent price wars. Sources of the problem date back to the legacies of the days when a line could make money at 50% capacity (gone since deregulation) and the excess capacity in the industry. The suggested solution includes not letting bankrupt airlines rise again to compete, and making sure that in the future, the dead stay dead.

Sounds to me a lot like other industries with similar woes and capacity surpluses. Maybe some of the same solutions might work there.

The article identifies the companies least likely to survive, in Wharton's opinion, and suggests three solutions: 1) reform bankruptcy laws, 2) revamp industry work rules and 3) limit new investment in airlines.

Few Survivors Predicted: Why Most Airlines Are Caught in a Tailspin - Knowledge@Wharton

DougSimpson.com/blog

Posted by dougsimpson at 08:59 AM | Comments (0) | TrackBack

January 21, 2005

ProfBlogs Have a Hub

Blogs by law professors now have a hub with an RSS feed. At "Welcome to Law Professor Blogs" you'll find links to about a dozen blogs, each focused on a particular subject, from Antitrust to White Collar Crime. The site avoids personal ruminations in favor of resources and links to assist law professors in research and teaching.

DougSimpson.com/blog

Posted by dougsimpson at 09:01 AM | Comments (0) | TrackBack

January 15, 2005

A.G. Spitzer eyes offshore captives, may need feds to help

On January 7, New York Attorney General Eliot Spitzer told his state's legislature that in his expanding investigation into the insurance industry, "[w]e have also begun to look at other troubling areas of the insurance industry beyond steering and bid rigging, such as conflicts of interest that arise between brokers and captive insurance and reinsurance companies that are operated or owned by brokers."

Attorney Spitzer attributed to a 2002 Swiss Re study the finding that a majority of the market was concentrated in two or three insurance brokerages, saying that "the threat of collusion has become a reality. We found that a small group of brokers and insurance companies have created a network of interlocking connections and secret payments which ensure that the bulk of business goes to certain insurers and that profits remain high."


Although he did not provide a precise citation for the Swiss Re study, a search of Swiss Re's website turned up a 2004 research paper with the indicated data. From the executive summary of "Commercial insurance and reinsurance brokerage -- love thy middleman," Sigma No. 2/2004 (Swiss Re, March 2004):

"In 2002, global commercial brokerage revenues were estimated at about USD 27 billion. The broker industry is highly concentrated, with Marsh and Aon accounting for 54% of revenues. As a subset of that market, the 2002 global revenues from reinsurance brokerage are estimated at USD 3 billion. This market segment is highly concentrated, with the four companies accounting for 78% of the total market."

"The growth of offshore markets -- particularly the Bermuda market -- has increased the share of brokered business in commercial lines and reinsurance, and is responsible for the recent decline or [sic] brokered commercial business in US insurance market statistics. Brokers have played a very active role in setting up some of the new Bermuda players and serve as the sole distribution channel for offshore reinsurance carriers." Swiss Re, Sigma No. 2/2004 (March 2004).

Attorney Spitzer acknowledged the challenges of investigating operations such as captive insurers that are located outside the jurisdiction of the United States, and called for greater federal involvement in insurance industry accountability.

His remarks came during testimony to the State Assembly Standing Committee on Insurance on January 7, 2005: insurance_assembly_testimony.pdf

The Attorney General's statements open up the issues of potential antitrust exposures of brokers and reinsurers operating outside of the United States but impacting trade and commerce within the United States. As the ultimate risk bearers for the global insurance network, reinsurers have a significant impact on the pricing and availability of insurance at all levels of distribution.

In the 1980s, decisions by international reinsurers had a substantial impact on the availability of commercial liability insurance in the U.S. and became the subject of a major antitrust action by state attorneys general. In its decision, the Supreme Court reaffirmed that "the Sherman Act applies to foreign conduct that was meant to produce, and did in fact produce, some substantial effect in the United States." Hartford Fire Ins. Co. v. California, 509 U.S. 764 (1993).

DougSimpson.com/blog

Posted by dougsimpson at 07:34 AM | Comments (0) | TrackBack

January 14, 2005

GLBA trumps Mass. laws on bank insurance sales: USDC

Massachusetts Bankers Assoc. Reports Favorable Federal Court Decision on Bank Insurance Sales

The U.S. District Court's memorandum of decision in Massachusetts Bankers Association, Inc. v. Bowler, Civ. Action No. 03-11522-RWZ (January 10, 2005), found that certain provisions of the Mass. Consumer Protection Act (Mass. Gen. Laws ch. 167F, Sec. 2A) dealing with sales of insurance by banks were preempted by the Gramm-Leach-Bliley Act. The contested provisions included the Referral Prohibition, the Referral Fee Prohibition, the Waiting Period Prohibition and the Separation Provision.

Last year, the First Circuit Court of Appeals dismissed an attempt by the Massachusetts Commissioner of Insurance and Commissioner of Banks to contest a 2002 opinion to similar effect that had been issued by the Office of the Comptroller of the Currency of the United States (OCC). Bowler v. Hawke, 320 F.3d 59 (1st Cir. 2003).

DougSimpson.com/blog

Posted by dougsimpson at 05:14 AM | Comments (0) | TrackBack

January 11, 2005

Credit Scores Correlate with Claims Experience: Texas Department of Insurance

The Texas Department of Insurance released the results of a study correlating claims experience, ethnicity and credit scores of individual insureds under some 2 million personal lines insurance policies.

While the report indicates a "strong relationship between credit scores and claims experience on an aggregate basis," that conclusion was based on a univariate analysis. The Department has undertaken a multivariate analysis and expects results by January 31, 2005.
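
A minimal numerical sketch of why the follow-up matters (synthetic data and variable names of my own invention, not the Department's methodology): a univariate fit of claims against credit score can be biased, even sign-flipped, by a confounding risk factor, while a multivariate fit that controls for the confounder recovers the direct effect.

    # Univariate vs. multivariate: a toy confounding demonstration.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical unobserved risk factor that drives both credit score
    # and claim frequency.
    risk = rng.normal(size=n)
    credit_score = -0.8 * risk + rng.normal(size=n)
    claims = 0.5 * risk + 0.1 * credit_score + rng.normal(size=n)

    # Univariate: regress claims on credit score alone.
    slope_uni = np.polyfit(credit_score, claims, 1)[0]

    # Multivariate: regress claims on credit score and the risk factor together.
    X = np.column_stack([credit_score, risk, np.ones(n)])
    slope_multi = np.linalg.lstsq(X, claims, rcond=None)[0][0]

    print(f"univariate slope:   {slope_uni:+.3f}")   # biased by the confounder
    print(f"multivariate slope: {slope_multi:+.3f}")  # near the true +0.1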

While the study found that a majority of insurers utilized credit scores in rating and underwriting, the impact on rates varies significantly among insurers.

The full report is available at: Report to 79th Legislature: Use of Credit Information by Insurers in Texas

DougSimpson.com/blog

Posted by dougsimpson at 04:54 PM | Comments (0) | TrackBack

January 10, 2005

"Web of Law": Not just a figure of speech

The "Web of Law" is more than a figure of speech, according to a draft paper by Prof. Thomas A.C. Smith of Univ. of San Diego School of Law now available on SSRN-The Web of Law by Thomas Smith

As Prof. Smith told me in a recent email, "it presents the preliminary results of a citation study of all US cases, which shows that the legal network is organized as a scale-free network, as the WWW is. Many interesting consequences follow from the scale free structure of the web of law. It implies that law will have self-organizing properties analogous to those of the Web, that advanced searching technologies being used on the WWW (google, cluster.com, teoma) will also work in the Web of Law, that quality or 'fitness' of legal opinions and scholarship can be measured much more rigorously, that the evolution of legal systems can be studied and that it will parallel in many respects the evolution of other scale free networks, and that legal opinions age in predictable ways, putting doctrines of precedent in a different light."

Tom's paper passes my first litmus test for a paper on network theory by citing Duncan J. Watts and Albert-Laszlo Barabasi on the first page. He includes a highly readable backgrounder on network science and gracefully links it to the world of legal scholarship. His data come from LexisNexis, which performed a "full network scan" of the Web of Law. An important and valuable piece, available on SSRN as "The Web of Law" by Thomas Smith.
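
A small sketch of the kind of measurement behind such a study (my illustration with invented case names, not Smith's or LexisNexis's actual method): tally each case's in-degree in a citation graph. In a scale-free network, the in-degree distribution follows a power law, with a few "hub" cases drawing an outsized share of citations.

    # In-degree distribution of a tiny, hypothetical citation graph.
    from collections import Counter

    citations = [  # (citing_case, cited_case) pairs -- invented examples
        ("Case D", "Case A"), ("Case E", "Case A"), ("Case F", "Case A"),
        ("Case F", "Case B"), ("Case G", "Case A"), ("Case G", "Case C"),
    ]

    in_degree = Counter(cited for _, cited in citations)

    # How many cases receive each citation count? In a scale-free network,
    # this frequency falls off roughly as degree**(-gamma) for gamma > 1.
    degree_freq = Counter(in_degree.values())
    for degree, freq in sorted(degree_freq.items()):
        print(f"cited {degree} time(s): {freq} case(s)")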

DougSimpson.com/blog

Posted by dougsimpson at 11:08 PM | Comments (0)

January 06, 2005

PLoS Adds Journals To Creative Commons

Public Library of Science has announced three new open-access journals--PLoS Genetics, PLoS Computational Biology, and PLoS Pathogens.

PLoS Computational Biology is accepting submissions at www.ploscompbiol.org and is scheduled to launch in June 2005 with the involvement of the International Society for Computational Biology (ISCB).

PLoS Genetics is expected to launch in July 2005 and is accepting submissions at www.plosgenetics.org.

PLoS Pathogens begins accepting submissions in March 2005 and is expected to start publishing in the fall. PLoS has not yet identified a submissions website, but a check of www.plospathogens.org showed that the domain was recently registered.

Public Library of Science applies the Creative Commons Attribution License to all works it publishes.

DougSimpson.com/blog

Posted by dougsimpson at 02:39 PM | Comments (0) | TrackBack

January 05, 2005

Reading: Weber, "Success of Open Source"

Traditional property rights protect the owner's right to exclude; property rights in "open source" material protect the right to distribute, according to Berkeley Professor Steven Weber's analysis in "The Success of Open Source," (Harvard University Press 2004). Traditional thinking holds that exclusion rights are necessary to motivate invention by reducing the effects of "free riding" and the "tragedy of the commons." Weber analyzes the progress of open source software projects such as Linux and proposes the elements that may be essential to success of open source models generally. He speculates about possible extensions of the open source development model beyond software, into fields such as medicine and genomics.

Success of a true open source model depends upon the presence of particular characteristics in the tasks to be accomplished and particular motivations and capabilities of the agents performing those tasks. Weber's analysis dissects those characteristics out of case studies of software development and presents them in terms transferable to other fields. He also suggests general organizational principles for distributed innovation and compares aspects of alternative legal structures including the General Public License ("GPL").

Some notes on Weber's book follow.


Property and the Problem of Software

Under the principles of open source as applied to software, the instruction set necessary to reproduce and modify the package ("source code") is set "free" (as in “free speech,” not “free beer”). Weber summarizes the "Open Source Definition" as providing that:

  • Source code must be distributed with the software or otherwise made available for no more than the cost of distribution.
  • Anyone may redistribute the software for free, without royalties or licensing fees to the author.
  • Anyone may modify the software or derive other software from it, and then distribute the modified software under the same terms.
    Weber p. 5. The full Open Source Definition is available at: http://www.opensource.org/docs/definition.php

Under traditional economic analysis, this model lacks incentives for creators and capitalists, yet under certain conditions it motivates a swarm of self-directed volunteers to cooperate, create and assemble working computer code. Weber attempts to identify and analyze those conditions using the principles of political economy, with frequent reference to the writings of Eric Raymond. See, e.g., Eric S. Raymond, “The Cathedral and the Bazaar” (O’Reilly, 2001) and the original essay of the same name at http://www.firstmonday.org/issues/issue3_3/raymond/ .

Weber focuses on three component questions posed by open source models:

  • Why do talented individuals contribute to a non-rival, non-excludable public good?
  • How do they coordinate cooperation without hierarchy or market structures?
  • What enables open source models to deal with complexity and avoid “Brooks’ Law” (adding manpower to a late software project makes it later)? See: Frederick P. Brooks, “The Mythical Man-Month: Essays on Software Engineering” (Addison-Wesley, 1975).

He also addresses four broader subjects illuminated by the open source models:

  • the process of organization of networks;
  • new principles of relations between communities, culture and commerce;
  • economic behavior around a distribution model of property rights;
  • potential limits to an extension of open source methods into other knowledge domains.

Early History of Open Source

A joint project at MIT, Bell Labs and GE to develop an improved time-sharing OS (Multics) led to frustration and Bell’s withdrawal. A Bell researcher, Ken Thompson, working alone over a period of four weeks, built the central core of Unix and its philosophy:

  • Write programs that do one thing and do it well.
  • Write programs that work together.
  • Write programs that handle text streams (a universal interface).
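
A toy illustration of that philosophy (my own example, not from Weber): a filter that does one thing, reads a text stream on stdin and writes one to stdout, so it composes with other tools in a pipeline.

    # wordcount.py -- prefix each input line with its word count.
    # Composes in a pipeline, e.g.: cat notes.txt | python3 wordcount.py | sort -n
    import sys

    for line in sys.stdin:
        print(len(line.split()), line.rstrip("\n"))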

AT&T feared that it would breach the 1956 antitrust consent decree if it sold Unix, so it licensed Unix free, “as is,” and included the source code. Unix first went to research centers, where it was used as a research and learning tool. Lacking AT&T support, a community of early users made and freely shared their own bug fixes, enhancements and extensions, with Berkeley taking a leading role. In 1983, at the urging of DARPA (which feared “lock-in” from dependency on a proprietary system), Berkeley’s Bill Joy integrated TCP/IP into Berkeley Unix, fostering its role as a foundation of the Internet.

In the 1970s, Microsoft pressed for broader recognition of proprietary rights in software; at about the same time, AT&T was broken up on antitrust grounds. Freed of the 1956 consent decree that had limited its role in commercial software sales, AT&T jacked up license fees for the popular Unix system. Berkeley organized a network that developed an alternative free of code owned by AT&T and distributed it under liberal terms. Tensions grew between those supporting the proprietary versions of Unix and those behind the “free” version, eventually leading to years of disruptive litigation as AT&T fought to protect its interests.

A philosophical backlash was led by Richard Stallman, founder of the Free Software Foundation and developer of GNU (“GNU’s Not Unix”) as an alternative to Unix. Stallman saw software as an expression of human creativity, not just a tool to run computers, and saw traditional property rights as constraints on a cooperative community. The FSF supported four freedoms essential to Stallman’s vision, including the freedom to charge a fee for distributing free software. Stallman’s views and the meaning of “copyleft” are explained and supported at the GNU/FSF website, www.gnu.org.

Stallman developed the General Public License (“GPL”) to protect what he called “copyleft,” an inversion of copyright concepts that ensures free software and its derivatives remain free. A “viral” provision in the GPL bars use of GPL code in another program unless the combination is also GPL-licensed. These self-imposed philosophical restrictions limited the potential applications of GNU and other GPL-licensed products as alternatives to the proprietary systems defended by AT&T.

What Is Open Source and How Does It Work?

Frederick P. Brooks, a scholar of the software development process, separated two kinds of problems in software engineering: essence (what is inherent in the structure) and accident (what happens in the process). Weber writes that in “The Mythical Man-Month” (Addison-Wesley Professional, 1995), Brooks takes the position that the complexity of software is an essential property rather than an accidental one. Resolving this complexity calls for the creative work of an individual or a small, close-knit group. Adding programmers, according to “Brooks’ Law,” multiplies the vulnerability to bugs faster than it multiplies the work done, because the number of pathways between workers and operations (and the necessary coordination effort) increases geometrically. As software systems and their desired uses become more complex, Brooks’ Law presents escalating challenges for proprietary and open source software alike.
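
A back-of-the-envelope illustration of that geometric growth (my arithmetic, not Weber's): the number of pairwise communication paths among n workers is n(n-1)/2, so coordination overhead grows roughly with the square of head count while added labor grows only linearly.

    # Pairwise coordination paths grow quadratically with team size.
    def communication_paths(n: int) -> int:
        return n * (n - 1) // 2

    for n in (2, 5, 10, 20, 50):
        print(f"{n:3d} workers -> {communication_paths(n):5d} coordination paths")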

The ideal form of open source process, according to Weber, relies upon voluntary participation and voluntary selection of tasks. The process relies upon suggestions, requests and contributions of solutions from ordinary users as well as developers, but remains ordered and methodical, with vigorous debate over alternatives. Despite sometimes heated differences, the resulting system is stable, even though an essential freedom of all users is to take the source code and a sub-set of followers in another direction (“fork the code base”).

Weber writes of open source labor as being distributed, but not divided in the industrial sense of that term. As an example, statistical studies of Linux indicate hundreds of central members do most of the coding, with thousands of others making some contribution. All are volunteers, with no formal distinction between the role of user or consumer and that of producer or developer.

Weber suggests eight principles to capture the essence of the open source process:

  • “Make it interesting and make sure it happens.”
    -- Combinations of attraction and persuasion point participants toward problems.
  • “Scratch an itch.”
    -- Address an immediate problem or opportunity.
  • “Minimize how many times you have to reinvent the wheel.”
    -- Freed from worry about “lock-in,” adopt others’ foundation solutions.
  • “Solve problems through parallel work processes whenever possible.”
    -- Traditional development process relies upon an engineering archetype, in which an authority decides the path to be followed in development. Open source process follows an evolutionary archetype, in which the community allows multiple parallel paths to generate multiple alternative solutions, from which successful solutions are later selected.
  • “Leverage the law of large numbers.”
    -- Open source projects engage many bug-hunters and bug-fixers.
  • “Document what you do.”
    -- The larger, dispersed community is more reliant on documentation.
  • “Release early and release often.”
    -- The evolutionary model rewards frequent iterations (generations), but puts a strain on those that must select among many submitted solutions.
  • “Talk a lot.”
    -- Direct communication within the community, with lots of open conflict, typifies open source processes.

The Internet has facilitated collaboration among open source participants. The formal licensing schemes, such as the GPL, explicitly shift property rights away from protection of the author to protection of users, expanding the commons. These schemes manifest the social structure underlying the process. The social structure of the open source community influences the structure of the code it writes, resulting in modular code adaptable to voluntary parallel processing. Different coding projects resolve conflict differently, but none have a central authority to enforce rules and all are free to “fork,” or leave the system with the code.

A Maturing Model of Production

Disagreements over paths for development of proprietary Unix extensions led to competing forks with higher costs, incompatibilities and fragmented development efforts. Linus Torvalds created Linux as an alternative to Unix, which many developers believed to be “dying.”

Torvalds elected to create a large, “monolithic” kernel, despite recent trends toward less complex “microkernels.” Weber attributes Torvalds’ decision to a belief that the open source process would allow management of a monolithic kernel’s complexity. Asked to allow distributors to charge fees for distribution, Torvalds adopted the GPL as the standard license for Linux, with its “viral” clause requiring derivative code to be similarly licensed. Both decisions were responses to open debate over the best path for the process. As the Linux code writers publicly battled over the best technical solutions, AT&T and BSD battled over ownership of the Unix code, mostly in private and without involvement of the code writers.

As an alternative to the basic command prompt, Linux users wanted a GUI. Orest Zborowski ported X, a public domain GUI application, to Linux. Torvalds encouraged the initiative by reworking the Linux kernel to work better with X. The “lieutenant” organization of Linux development evolved as Torvalds chose between the products of rival code projects and thereafter routed new code submissions to the “winning” leader.

Linux evolved further as it was ported to architectures other than Intel’s x86 line, was distributed as part of packages with business goals, and as its release numbering matured to distinguish between “stable” (even-numbered) and “experimental” (odd-numbered) releases.

As Linux 2.0 was released and Torvalds took a position with Transmeta in 1996, participants in the process began to articulate a self-consciousness, of which Eric Raymond’s “The Cathedral and the Bazaar” (1997, 2001) was an example. With that self-awareness came the realization that the label “free software” and the rigid moral stance of Richard Stallman’s Free Software Foundation were limiting and problematic. The label “open source” and its accompanying Open Source Definition clarified the agnostic goal of meeting the needs of users rather than serving any particular moral ideal.

As Linux and its community grew, the load on Torvalds increased. By the summer of 1998, the system was stretched to its limit and heated disputes broke out on the open discussion boards, threatening a fork. Eric Raymond framed the problem for the community in technical terms and proposed a new code management system. The principals agreed to a pyramidal structure to better manage work flow and the crisis passed, the dispute resolved through open public debate in archived Linux email lists. See “Linus tries to make himself scale,” Linux World, February 11, 2002, http://www.linuxworld.com/story/32722.htm

Linux demonstrated its value to commercial enterprise platforms in 2000 as developers ported high-end database systems to work on it. Netscape attempted to counter Microsoft’s free browser tactic by contributing its browser code as open source, but Mozilla never caught on with the community due to problems with the code and the licensing scheme. IBM was more successful in opening its hardware to Linux and announcing a corporate bet on open source as a core IBM strategy.

Microsoft’s Vinod Valloppillil analyzed the threat of open source to commercial business models in an August 1998 “Halloween Memo” that was later leaked. It presented for Microsoft’s management the challenge of competing with the open source process. “The intrinsic parallelism and free idea exchange in OSS has benefits that are not replicable with our current licensing model and therefore present a long term developer mindshare threat,” wrote Valloppillil, and “the ability of the OSS process to collect and harness the collective IQ of thousands of individuals across the Internet is simply amazing. More importantly, OSS evangelization scales with the size of the Internet much faster than our own evangelization efforts appear to scale.” The memo addressed the serious challenges for a hierarchical organization that must competitively target a process rather than a company.

Explaining Open Source: Microfoundations

Weber opens his examination of why and how individuals contribute to open source projects by dispelling “myths” that open source is the product of like-minded individuals acting out of altruism, and that it can be explained merely by labeling it “self-organized.” He finds a variety of individual motivations that do not rely upon traditional monetary rewards. For the economic logic, he suggests that software is not just “nonrival,” but actually “anti-rival” and subject to positive network externalities, whereby the value of a system increases with the number of users, even benefiting from free riders, as long as some fraction of the users make a contribution. The highly diverse population of the Internet, combined with low connectivity costs, increases the impact of the small percentage of “outlier” users who actually contribute code solutions.

Weber recognizes the view of Mancur Olson that the larger the group, the lower the benefit to any marginal contributor, and the less the likelihood that a social good will be produced. See Mancur Olson, “The Logic of Collective Action: Public Goods and the Theory of Groups” (Harvard Univ. Press, 1971). As an alternative, Weber proposes that: “Under conditions of antirivalness, as the size of the Internet-connected group increases, and there is a heterogeneous distribution of motivations with people who have a high level of interest and some resources to invest, then the large group is more likely, all things being equal, to provide the good than is a small group.” Weber, p. 155.
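
A toy numerical reading of Weber's claim (my model, not his): if each of n connected users independently contributes with some small probability p, the chance that the good is provisioned at least once is 1 - (1 - p)^n, which climbs toward certainty as the group grows.

    # Probability that at least one of n users contributes, at rate p each.
    def p_at_least_one(n: int, p: float = 0.001) -> float:
        return 1 - (1 - p) ** n

    for n in (100, 1_000, 10_000, 100_000):
        print(f"n = {n:>7,d}: P(some contribution) = {p_at_least_one(n):.3f}")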

Explaining Open Source: Macro-Organization

The coordination of an open source effort can be explained by several mechanisms, suggests Weber. Individual disincentives to fork are inherent in a system with positive network externalities, because segmenting the user population reduces the value of the forked product. A variety of cultural norms common to open source participants govern control of distribution and successful decision-making. Leadership practices reflect the system’s dependency upon participants, emphasizing responsiveness to followers, reliance on rational explanation, and a focus on technical justification. Weber recognizes the potential for such an emergent system to get stuck in a local maximum (as ecological systems do), and notes that these systems are testing the balance between the power to fork (which supports creativity) and the disincentives to fork (which support stability).

Open source efforts manage the complexity of software development in part through modular design of the code, which reduces organizational demands on the social and political structures. Sanctions upon violators of community norms take the form of public expressions of disapproval (“flaming”) and denial of cooperation and access to the community’s support (“shunning”). The variety of open source licenses seen in the environment also demonstrates the norms and standards of the particular process and reflects the common realization that participants expect to be on both sides (licensor and licensee) of the bargain at different times. Early examples of formal governance structures reflect the need to accommodate asynchronous, “bursty” communication and the lack of a hierarchy for division of labor (although a hierarchy for decision making may exist).

Business Models and the Law

An open source process not only releases control of source code but also assures that no one can control the source code, complicating the search for business models that accommodate the increased customer control. Weber suggests approaching the problem by focusing on protectible rights other than the source code (e.g., brands and trademarks) and on accumulating the valuable knowledge necessary to implement the source code. He describes several early experiments with such business models attempted in the initial years of open source.

Open source licenses rely upon copyright law to enforce requirements for control of source code, but few “copyleft” controversies have been resolved by the courts. Weber poses the question whether the Digital Millennium Copyright Act’s bar on anti-circumvention tools could be used to prevent creation of systems that provide the functionality of proprietary systems using open source code. Open source uses non-negotiated, “click-wrap” licenses with untested provisions that purport to disclaim warranties and that bind downstream users much like a restrictive land covenant.

Recent decisions allowing patenting of algorithms and business methods have intensified an ongoing controversy over the patentability of software. See State Street Bank & Trust v. Signature Financial Group, 149 F.3d 1368 (Fed. Cir. 1998). Weber asks what the impact might be of a threat of patent litigation against users of open source software when no license warranty binds a “deep pocket” to defend against such challenges.

The Code That Changed the World?

Open source is a process with elements that can work in fields other than software development, suggests Weber. Traditional notions of intellectual property focus on rewarding creativity by limiting distribution, making the producer more powerful than consumers. Open source systems protect distribution, shifting power to the consumer. Yet under the proper conditions, creativity is sufficiently rewarded in an open source environment that it continues. The open source process shifts innovation to the edges of the network, taking away the role of a central authority that architects the design and makes labor assignments, and in its place leaves a system of distributed innovation.

Weber suggests four organizational principles that enable distributed innovation:

  • Empower people to experiment.
  • Enable bits of information to find each other.
  • Structure information so it can recombine with other pieces of information.
  • Create a governance system that sustains this process.

Weber also cautions that without a central authority, distributed innovation systems can get stuck in a sub-optimal equilibrium, falling into the trap explained by Clayton Christensen in “The Innovator’s Dilemma” (Harvard Business School Press, 1997). Distributed innovation may also fail to respond equally to all segments of a diverse international community. It is unclear what effects open source processes will have on standards setting. Open source may also accelerate the rate of performance improvement for projects such as cluster and grid computing.

Open source processes transcend national boundaries, may shift decision-making toward people in the developing world, and may enable users there to drive technology development to serve their needs, suggests Weber. Open source may serve nationalist desires to avoid lock-in and dependency on a particular vendor or on the products of a particular nation’s industry. The changes in power relationships and property rights may have more impact than the lowering of transaction costs, and may “destabilize the foundations of existing cooperative arrangements and institutions … .” Weber p. 257.

Yet to be explored are the political and economic phenomena that result when open source communities interact with traditional hierarchies. Weber suggests some examples in the interaction between Microsoft and the open source software movement and between the United States and terror networks as described by John Arquilla and David Ronfeldt.

Weber closes by suggesting avenues for possible generalization of the open source process beyond software development, such as developing engineering techniques common to players in an industry, sharing knowledge in the medical field, and studying genomics. Weber suggests characteristics of tasks that may be suited to open source solutions:

  • “Disaggregated contributions can be derived from knowledge that is accessible under clear, nondiscriminatory conditions, not proprietary or locked up.
  • The product is perceived as important and valuable to a critical mass of users.
  • The product benefits from widespread peer attention and review, and can improve through creative challenge and error correction.
  • There are strong positive network effects to use of the product.
  • An individual or a small group can take the lead and generate a substantive core that promises to evolve into something truly useful.
  • A voluntary community of iterated interaction can develop around the process of building the product."
    Weber, p. 271.

He also suggests the most suitable motives and capabilities of the involved agents:

  • "Potential contributors can judge with relative ease the viability of the evolving product.
  • The agents have the information they need to make an informed bet that contributed efforts will actually generate a joint good, not simply be dissipated.
  • The agents are driven by motives beyond simple economic gain and have a “shadow of the future” for rewards (symbolic and otherwise) that is not extremely short.
  • The agents learn by doing and gain personal valuable knowledge in the process.
  • Agents hold a positive normative or ethical valence toward the process."
    Weber, p. 272.

Weber’s “The Success of Open Source” (Harvard, 2004) includes numerous footnotes and a bibliography. It provides a thoughtful perspective on an emergent phenomenon of creative development that tests the frontiers of intellectual property law and policy.

DougSimpson.com/blog

Posted by dougsimpson at 10:29 AM | Comments (0) | TrackBack