In 2001, artists and publishers filed suit against Grokster and two other file-sharing software makers, alleging contributory and vicarious copyright infringement. In April 2003, the District Court entered summary judgment for the defendants, which became final and appealable in June, and MGM et al. appealed. The Electronic Frontier Foundation has taken an amicus position in the case, favoring the defendants, and has collected online links to the complaints, briefs and decisions in the case, which is now on appeal before the U.S. Court of Appeals for the Ninth Circuit. See EFF: MGM v. Grokster
Thanks to beSpacific for the pointer to this resource.
"P2P United," a trade association of P2P software companies, announced a code of conduct to encourage users to share files responsibly. See Peer-To-Peer Networks Unveil Code of Conduct. The group intends to demonstrate that a responsible industry can be built around peer-to-peer file sharing, without running afoul of copyright laws or the recent accusations regarding pornography and "spyware," said Reuters. In July, the New York Post reported that Adam Eisgrau, of Washington, DC-based Flanagan Consulting, LLC, is representing P2P United and has also represented the American Library Association and the Digital Future Coalition.
P2P United is also the group that offered to pay the $1,000 settlement of the 12-year-old who was the first defendant to settle with the RIAA. In a press release, P2P United identified its charter members as Free Peers, Inc.; Grokster, Ltd.; Lime Wire, LLC; MetaMachine, Inc.; Piolet Networks, S.L.; and StreamCast Networks, Inc. P2P United's website lists its members as BearShare, Blubster, eDonkey, Grokster, Lime Wire and Morpheus. Notably absent is Kazaa. (Read more ... )
Organizations such as the Secure Digital Music Initiative (SDMI) have attempted to develop open technology specifications that would allow sharing of digital music while preserving copyright. SDMI's efforts have been on hiatus for over two years due to a lack of consensus.
Efforts to create a copyright-friendly file-sharing environment have broken down in the past. One example is Bertelsmann's financial support of Napster while it attempted to develop industry support for such a system. The effort eventually failed and led to lawsuits against Bertelsmann by other industry leaders, who alleged that Bertelsmann was contributing to Napster's copyright violations. In June, the LA Times reported that Bertelsmann hoped to develop evidence about the reasons for the 2001 breakdown in negotiations for licenses that would have allowed Napster to operate legally.
Will initiatives such as P2P United's, combined with the recent lawsuits, revive the move toward open standards and codes of conduct that address all reasonable interests? The question has already piqued the interest of some who see "unclean hands" in the file-sharing software industry.
Comments and Trackback, please.
Jerry Lawson, in eLawyerBlog, points to Blogs As A "Disruptive Technology", based on Clayton Christensen's description in the classic "The Innovator's Dilemma." I agree completely; the current application of blogs fits the pattern well. (Read more ... )
Christensen distinguishes "sustaining technologies," which let businesses run their existing business models better, from "disruptive technologies," which upend established business models by offering a cheaper, less functional alternative that improves rapidly without a substantial increase in cost. He points out that at first, disruptive technologies appeal only to those operating with low or zero profit margins ... hobbies and free services. The businesses with which the nascent technology will ultimately compete see it as no threat, because it does not offer the advanced features sought by their high-margin customers. As a result, it is disregarded, even scorned, by established operations.
We see exactly this happening today, as blogs are used primarily for zero-margin publishing of free opinions and are denigrated by some. At the same time, we see for-profit organizations (like About.com) adopting blog tools to supplement or enhance their low-margin operations.
Among the characteristics of a disruptive technology, as Christensen describes it, is that it improves much faster and is far more efficient than the established technology it replaces. So, even though it starts at the bottom of the economic food chain, it steadily improves and moves "up market" faster than do customers' requirements. As it does, it continually becomes competitive with higher and higher margin business, as companies using established technology (with higher fixed costs and margin expectations) abandon the lower margin business to the upstart's inroads.
Managers behave quite rationally in shrugging off losses of low-margin bits of business seen as unimportant compared to their high-margin "best business." Adopting the new technology would mean radical change in their business model, marketing relationships and profit expectations, all usually unacceptable to existing managers and shareholders.
This process continues, the disruptive tech moving steadily up the margin curve as it rapidly improves, taking better and better business, the established competitors now fighting a defensive battle. Eventually, the established technology companies are left with insufficient high-margin business to sustain their business models, marketing channels and compensation structures. They are forced to restructure or fail, usually the latter, all while making decisions that are perfectly rational at the time ... under business analysis appropriate to other technologies.
Christensen, a Professor at Harvard Business School, lays all of this out with a hundred years of historical examples in "The Innovator's Dilemma." He has recently released "The Innovator's Solution," which offers guidance to business managers on how to avoid being obsoleted by this phenomenon.
Copyright holders that separately lack market power but act in concert to restrain competition may be subject to private suits by injured consumers. Successful consumers can recover treble damages and attorneys' fees under a Sherman Act suit. These principles may come into play as RIAA members file suits against individuals alleged to share copyrighted music, some of whom may choose to file countersuits for Sherman Act violations. Similar liabilities were discussed in the case of Microsoft, which had sufficient market power to be subject to such liabilities even when acting alone. See Unintended Consequences: Copyright Limits in Microsoft 2001. (Read more ... )
Judge Patel, in her February 2002 order in the Napster case, addressed Napster's allegations of antitrust violations as evidence of copyright abuse by plaintiff copyright holders. Napster's charges were that the RIAA plaintiffs acted in concert to enter the digital distribution market and agreed on license terms that enabled them to control prices and availability of music through the digital distribution channel. Judge Patel noted that the U.S. Justice Department was investigating similar allegations and allowed further discovery of potential evidence relevant to the misuse charges. She also ordered Napster to take certain steps to limit the use of its service for exchange of copyrighted files.
The discovery was never concluded. Within a few months Napster was shut down for inability to fully comply with the district court's orders protecting the copyright holders. Ultimately, over the objections of the other plaintiffs, its assets were sold to one of them, Bertelsmann, from which Napster had received funding and support over the years, and Napster went into bankruptcy.
Findlaw maintains Patel's opinion and other primary legal sources in a special page on the Napster Suit.
Shortly after Judge Patel's decision, Robert G. Badal wrote "Be Careful What You Ask For." It reviews the antitrust exposures that emerge with the enforcement of intellectual property rights. He reviews cases about potential liabilities for misuse and antitrust violation, specifically discussing patent pools, the "essential services" cases such as Aspen Skiing, plus the Microsoft and Napster decisions noted above. Badal raised the question whether intellectual property rights can constitute an "essential service," leaving the question open after noting its treatment in the 1999 decision in Intergraph Corp. v. Intel Corp.
IP Watchdog maintains a comprehensive suite of links to primary and secondary sources of study regarding antitrust law in the IP context.
Sharman Networks continues bringing antitrust charges against RIAA, despite early defeats, and RIAA has brought suits against other P2P networks such as iMesh. Neither side appears ready to back down, suggesting a fertile field for development of these issues in the future.
A posting at ethicalEsq? suggests that online legal service delivery is constrained by a "guild mentality" in the legal profession. Quoting: "my experience looking at learned professions from the competition-consumer perspective tells me that the real culprit is the historic 'guild' mentality, which fears and opposes virtually every type of innovation in services or marketing. This is especially true if most guild members see themselves as threatened with the loss of business and income, the need to become more efficient, or the pressure to engage in price or quality competition." The author also points to his personal experience with professional peer pressure against his offering "affordable" alternatives to traditional legal services.
(Read more ... )
In a comment on the note "What Led to the Demise of So Many Online Legal Websites?" at eLawyerBlog.org, I noted that some progress in using online resources to deliver legal services and support is being made by nonprofits in the area of pro bono work. One with which I happen to be familiar is an international group based here in Hartford, Lawyers Without Borders (LWOB). This nonprofit uses the Internet to connect lawyers in the developed world with non-governmental organizations ("NGOs") and overworked lawyers in the developing world and in war-torn areas, to provide support for human rights and rule-of-law efforts. Their example could be a model for other pro bono and public service organizations seeking to tap a variety of legal resources from widespread areas.
Removing the profit motive and fear of competition from more efficient delivery channels might make a big difference in the profession's attitude toward online delivery of legal services.
(Read more ... )
Attempts by governments in China and Myanmar to censor communications on the Internet have led to legislative initiatives in the U.S., including the proposed creation of a Federal Office of Internet Freedom.
The Attorney General of Pennsylvania recently backed off a program, authorized by a state statute, to order the blockage of potentially illegal websites, according to another CNET story. This followed a federal lawsuit by the ACLU and the Center for Democracy and Technology alleging violation of the First Amendment.
Two years ago, the government of France ordered that Yahoo block access by French citizens to its auction site featuring Nazi memorabilia. Yahoo later obtained a U.S. court declaration that enforcement of the French order in the United States would be unconstitutional. For links to the French order, the U.S. Judge's decision and further discussion and reading on the French Yahoo case, see Kanoho, "A Victory For Yahoo!—The United States Cannot Enforce French Censorship Of Auctions" (Internet Law Journal, 2/2/02)
Organizations concerned about government or private-industry interference with their communications have also turned to private networks that use encryption to conceal the content of their speech. Such networks are generically called "darknets."
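The core idea behind concealing content with encryption can be sketched in a few lines. The toy below is a one-time pad (XOR with a random key as long as the message) and is purely illustrative; the message and variable names are hypothetical, and real darknets rely on vetted protocols such as TLS rather than hand-rolled ciphers.

```python
# Toy illustration of the encryption idea behind a "darknet":
# peers who share a secret key can exchange messages that are
# opaque to anyone monitoring the wire. Illustrative only --
# real systems use vetted protocols, never hand-rolled ciphers.
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at the usual node"
key = secrets.token_bytes(len(message))  # random pad, as long as the message

ciphertext = encrypt(key, message)
assert decrypt(key, ciphertext) == message  # recoverable by the key holder
```

Only a party holding the shared key can recover the plaintext, which is why monitoring efforts of the kind described above push speakers toward such networks.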
Thanks to OnlineJournalism.com's newsletter for the heads up on this story.
"Grid computing" is the shared use of multiple distributed computing resources through broadband connections to tackle computing tasks beyond the capacity of ordinary servers. By combining its various servers' power, an organization or alliance of companies can get the power of a supercomputer with minimal additional investment. Or at least, that is the goal of those participating in the Global Grid Forum, a non-profit organization formed to develop and promulgate global open standards for such applications. According to basic network theory, such grids will allow companies to respond faster to changing markets and demands.
(Read more ... )
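The grid idea is easy to sketch in miniature: one large job is split into independent chunks, and available compute resources each take a chunk. In this toy sketch, local worker processes stand in for distributed servers, and the prime-counting task and chunking scheme are hypothetical; real grid middleware adds scheduling, resource discovery and security on top of this pattern.

```python
# Miniature sketch of the grid-computing pattern: split one big
# job into independent chunks and farm them out to workers.
# Local processes stand in for distributed grid nodes here.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in the half-open range [lo, hi) -- a stand-in
    for any CPU-heavy, independently computable chunk of work."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one job (counting primes below 1,000) into ten chunks,
    # one per "grid node."
    chunks = [(i * 100, (i + 1) * 100) for i in range(10)]
    with Pool() as pool:          # one worker process per CPU core
        partial = pool.map(count_primes, chunks)
    print(sum(partial))           # 168 primes below 1,000
```

Because each chunk is independent, adding nodes scales the computation with little coordination overhead, which is the economic appeal described above.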
Global Grid Forum maintains a series of persistent documents analogous to the Request for Comments (RFC) series associated with the Internet Standards Process and the Internet Engineering Task Force (IETF). Those documents, and the process for developing them, are available at GGF and include:
* Drafts Currently Available for Public Comment
* Published (final) documents
* Grid Working Drafts in process
* Drafts submitted for GGF meetings
For Datamation, Willy Chui wrote "Grid Computing: Fulfilling the Promise of the Internet" in July. He noted that the initial use of grid computing in scientific and technical applications has spread to more business uses as common standards have emerged.
In April, Wired ran a piece about the use of distributed computing to tackle the computational needs of SARS research. See Grid Computing Spreads to SARS.
In addition to large corporations such as IBM, Oracle and Hewlett-Packard, companies providing grid computing software include:
Platform Computing, Inc.
United Devices, Inc.
The Seattle Post-Intelligencer reports that an ex-Tennessee lawyer was sentenced to five years in the Frankel case by a Mississippi court. He will serve the sentence concurrently with the 5.5 years he received from a Connecticut court in September on a related charge.
In 2002, after some time as an international fugitive, Martin Frankel was caught and pleaded guilty to defrauding an insurance company under his control of over $200 million. His elaborate Ponzi scheme involved an international cast and triggered Congressional hearings, inter-governmental accusations and GAO studies. Not to mention the indictments. Frankel is awaiting sentencing as his alleged cohorts are prosecuted.
(Read more ...)
The GAO Report: Scandal Highlights Need for Strengthened Regulatory Oversight.
From the "Results in Brief":
"Throughout the 1990s, Martin Frankel, with assistance from others, allegedly obtained secret control of entities in both the insurance and securities industries. He is alleged to have anonymously acquired and controlled insurance companies in several states and, despite being barred from the securities industry, to have exercised secret control over a small securities firm. Using the name of this securities firm, Mr. Frankel allegedly took custody of insurance company assets and provided false documents on investment activity to disguise his actual purpose. Instead of managing these assets in a prudent manner, he allegedly diverted them to other accounts he controlled and used them to support the ongoing scam and his lifestyle. The scam was finally exposed after insurance regulators in Mississippi took enforcement action against three of the Frankel-connected insurers by placing them under regulatory supervision. At the time this report was being written, a federal criminal probe against Mr. Frankel was still ongoing."
"This report includes recommendations to help prevent or detect similar investment scams in insurance companies by proposing the adoption of appropriate asset custody arrangements, improved asset verification procedures, and the sharing of confidential regulatory information across industries and agencies. In addition to the above recommendations emanating from the Frankel matter, this report contains a recommendation designed to broaden and help sustain cooperation among regulators of different financial services sectors."
A September 18, 2000 letter from U.S. Congressman Dingell to NAIC Commissioner Nichols
included: "I am greatly concerned by the U.S. General Accounting Office’s (GAO) report to me about the insurance investment scam of Martin Frankel ("INSURANCE REGULATION: Scandal Highlights Need for Strengthened Regulatory Oversight" GAO/GGD-00-198). The GAO’s report shows this travesty occurred because state insurance regulators were either too blind to see, or too unwilling to acknowledge, the scam Mr. Frankel perpetrated, openly and fearlessly, over a period of eight years. This fraud went on far too long, not because Mr. Frankel was clever and deceptive, but because he was operating in an environment where the regulators lacked the skill, authority, access to basic information, resources, and "healthy skepticism" needed to protect insurance consumers."
Court TV's Crime Library Article on the Frankel Fraud (10 segments)
Ellen Pollack, The Pretender: How Martin Frankel Fooled the Financial World and Led the Feds on One of the Most Publicized Manhunts in History. A reviewer at Amazon.com says: "Ellen Joan Pollock's The Pretender is a biography of Martin Frankel, an unsavory financial savant whose vast illicit empire reached into very high places on two continents before collapsing with thundering suddenness. By the time of his arrest in 1999, Frankel had bilked various insurance companies out of $200 million via an elaborate (and oddly haphazard) Ponzi scheme. Pollock chronicles not only Frankel's phantom stock trades, fictional portfolios, asset skimming, and money laundering, but his mind-boggling personal extravagances--both financial and sexual. (His Greenwich, Connecticut, headquarters served both as business office and home to a shifting harem devoted to Frankel's sadomasochistic interests.)"
Testimony of the National Association of Insurance Commissioners
Before the Subcommittee on Oversight and Investigations And the Subcommittee on Financial Institutions and Consumer Credit Committee on Financial Services United States House of Representatives Regarding: Information Sharing Among State and Federal Financial Regulators. Quoting: "In particular, we want to move very quickly on closing the information gaps that prevented state regulators from checking on securities violations committed by Martin Frankel before he got involved in the insurance industry."
The Chronicle of Higher Education reports: "Computer Program Helps Business-School Students Expand Their Networking Opportunities". Leading B-schools such as Dartmouth College's Tuck School of Business and others provide an online system by which students, faculty and alumni can enter their personal business contacts into a database searchable by other participants in the system.
Privacy International has made available the report on a twelve-month study involving over fifty experts and advocates from across the world, made possible by a grant from the Open Society Institute. The study, in PDF format, is available at: Silenced: An International Report on Censorship and Control of the Internet. (Read More ... )
From the executive summary: "This study has found that censorship of the Internet is commonplace in most regions of the world. It is clear that in most countries over the past two years there has been an acceleration of efforts to either close down or inhibit the Internet. In some countries, for example in China and Burma, the level of control is such that the Internet has relatively little value as a medium for organised free speech, and its use could well create additional dangers at a personal level for activists. The September 11, 2001 attacks have given numerous governments the opportunity to promulgate restrictive policies that their citizens had previously opposed. There has been an acceleration of legal authority for additional snooping of all kinds, particularly involving the Internet, from increased email monitoring to the retention of Web logs and communications data. Simultaneously, governments have become more secretive about their own activities, reducing information that was previously available and refusing to adhere to policies on freedom of information."
Concerns over such monitoring and hazards are one motivation for the rising interest in "darknets," mentioned in a recent note "Darknets Offer Privacy" in Unintended Consequences.
Thanks to OnlineJournalism.com for the heads up on this study.
In 1974, when Fred Carr took the helm of First Executive Life, the life insurance industry was at the threshold of drastic change, and Carr was one of those who took it over that threshold. Reading Schulte, The Fall of First Executive: The House That Fred Carr Built provides an insider's view into how a financial services company can fall victim to the actions of a few individuals acting either from greed, hubris or both. Gary Schulte's inside account provides important lessons in regulation of financial services. (Read more ... )
Fred Carr took over the management of First Executive Life in 1974, at a time when it was near bankruptcy due to lack of capital. The life insurance industry was then a wealth-creation machine that few outside the industry understood, and it faced little competition despite the hundreds of companies in the market. It was shaken up in 1978 by an FTC report claiming that the savings component of life insurance produced a rate of return far below that of other safe investments available to the public. The press picked up on the report, shaking public confidence in life insurance as a savings vehicle.
Into this situation, ripe with opportunity for change, stepped Fred Carr, a stock broker who had earned a reputation as a "gunslinger" in the explosion of the mutual funds markets during the "Go-Go Sixties." His was the opening story in the 1972 book "The Young Millionaires" by Lawrence A. Armour.
Author Benjamin J. Stein included Carr among the ranks of what he called a network of junk bond financiers in his "License to Steal" (Simon & Schuster 1992). Besides Michael Milken and others, the list also included Saul Steinberg of Reliance, which would also eventually fail, though not until after the disaster of 9/11/01.
First Executive pulled itself from the brink of bankruptcy through the sales of interest-sensitive life insurance products. Agents showed prospects projections of investment returns dramatically superior to competing products. Carr made the returns plausible by heavy investment in "high yield" or "junk bonds" bought from Michael Milken, and erroneous assumptions that high returns would continue indefinitely. First Executive provided Milken a ready market for his bonds, propelling both businesses into rapid growth, in a reciprocity strategy commented on by Forbes in a 1984 cover story.
Schulte criticizes Executive Life (and those that followed it into the hot money trance of the 1980's) for failing to consider the unknowns of the future, and for taking risks that were not its to take ... risking the long-term promise to the policyholder for short-term performance. Without old portfolios or existing policyholders to consider, Executive Life was able to offer products based upon an investment portfolio with yields as much as double those of the established companies with established portfolios invested in conservative bonds and mortgages.
He also saw Executive Life succumb to what he calls "psychomedia risk" -- "the risk that even though you do everything right, something unexpected will happen, which creates the perception of failure although the facts don't support it." Schulte, page 56.
One element of Carr's eventual downfall was his self-imposed isolation of First Executive from both the insurance establishment and the press. First Executive and its managers avoided participation in industry associations and the network of industry support. He operated the expanding company like a sole proprietorship, making all the decisions and sharing little information with his own executive staff.
His agents suffered similar isolation from the established industry network. By joining Executive Life and aggressively converting old-line policies, they rejected the rest of the industry and put other agents on the defensive about their products. Other agents looked upon Executive Life agents as pariahs, associated with junk bonds, junk policies and junk agency practices.
What made Executive Life grow so fast was the competitive advantage from the exceptionally high yield of its portfolio of junk bonds. According to Schulte, by 1987, First Executive Life had the largest portfolio of junk bonds in the world. In 1990, Carr was called before Congressman Dingell's subcommittee investigating problems in the insurance industry and defended his portfolio strategy. Those hearings resulted in a report, "Failed Promises: Insurance Company Insolvencies," which was critical of state insurance regulation. But as junk bonds and the Milken activities began to break down and came under increasing criticism from the media and from commentators such as Benjamin Stein and Joseph Belth's Insurance Forum, Carr declined to respond.
What undid the company was its eventual inability to raise sufficient capital to support the rapidly expanding volume of business. The nature of the life insurance business is that, due to heavy up-front expenses (mostly agents' commissions), policies do not become profitable until after several years of renewals. In the early years, every dollar of new premium is an actual drain on capital. Fast-growing companies need more and more capital, and Carr took greater and greater risks to raise it, while projecting the continuation of high yields into the indefinite future.
Of course, the future turned out to be quite different than the projections.
When the junk bond market collapsed and policyholders became frightened for their investments, redemptions of his policies began and the company was forced to announce a dramatic writedown in 1990, increasing the policyholder panic.
His competitors and the press had been predicting the downfall of Carr's practices for some time, and in light of the new revelations, fed the collapse of confidence in the company. Carr found himself without allies or friends in the industry or media. Agents converted the company's policies to other writers, and in a very short time, Executive Life went from a fast growing business to a company in liquidation, a victim of the "psychomedia risk" and its lack of support and allies within the industry and media.
As the crisis deepened, Carr's pattern of centralized, autocratic management resulted in paralysis as he was buried in defensive issues. As organizational scholars such as Columbia Professor Duncan Watts would later predict, without a network of informed executives within the company and sympathetic allies outside the company, he was unable to effectively respond to the rapid change and challenges.
Although the base company's reserves were sufficient to cover the remaining liabilities, the insurance companies were seized by the insurance departments and placed into liquidation, and the holding company was forced into bankruptcy on May 13, 1991. The policyholders were eventually covered, but the shareholders who had not cashed out early (as Peter Lynch did) lost their investments.
Stories like this are invaluable studies, because cases like that of First Executive continue to play out today, such as those told in the February 2002 issue of The Actuary: "Deja Vu All Over Again - a roundtable discussion of insurance solvency and insurance fraud."
In an earlier note, I posed a question whether RIAA may have crossed a line in antitrust law by its recent combination of lawsuits and "Clean Slate" program regarding users of P2P file sharing networks. In a later posting, I sketched the outlines of the concept of copyright misuse, including reference to a thought-provoking article by Professor Gifford in which he opined that the courts will eventually agree that "exercise of intellectual property rights cannot violate the antitrust laws."
One case in which that statement may be too broad is United States v. Microsoft, 253 F.3d 34 (D.C. Cir. 2001) (en banc). Earlier decisions against Microsoft dated back to United States v. Microsoft, 56 F.3d 1448 (D.C. Cir. 1995) ("Microsoft I") and United States v. Microsoft, 147 F.3d 935 (D.C. Cir. 1998) ("Microsoft II"). Microsoft 2001 makes clear that antitrust law still constrains the exercise of intellectual property rights when the rights holder has dominant market power.
In June 2001, the U.S. Court of Appeals for the D.C. Circuit upheld the decision that Microsoft possessed monopoly power in operating systems for Intel-compatible personal computers. Monopoly power is not by itself illegal under U.S. antitrust law as long as it was obtained through competition or product excellence. The United States charged that Microsoft had violated the Sherman Antitrust Act §2 by going farther, to the offense of monopolization. Through restrictive provisions in its licenses for Windows, Microsoft worked to maintain its monopoly position, using methods other than competition on the merits of its products and services.
A violation of Section 2 of the Sherman Act requires that one with monopoly power engage in "exclusionary conduct." Mere growth or development resulting from a better product, business skill or market accident doesn't qualify. To qualify as "exclusionary," acts must have "anticompetitive effect" -- harm to the process of competition and thereby to consumers. Harm to competitors is not sufficient.
The Court found that Microsoft negotiated provisions in its licenses of Windows that interfered with the process of competition in the market for Internet browsers. In particular, Microsoft used its rights to withhold licenses of its Windows operating system in order to prevent Netscape's Internet browser from gaining enough market share to be a competitive threat to Microsoft's Internet Explorer ("IE"). The Court found that those restrictions were "exclusionary," and when combined with Microsoft's market power, a violation of Section 2 of the Sherman Act.
Microsoft argued that as a holder of a valid copyright, it had the right to impose even those restrictions found to be "exclusionary." That argument "borders on the frivolous," said the Court of Appeals. "Intellectual property rights do not confer a privilege to violate the antitrust laws," said the Court, citing In re Indep. Serv. Orgs. Antitrust Litig., 203 F.3d 1322, 1325 (Fed. Cir. 2000) (the "Xerox Case").
The Court rejected other attempts by Microsoft to justify its actions, and found that Microsoft used its market power to protect its monopoly, without legitimate justification, violating the Sherman Act.
The Court also found to be exclusionary certain "exclusive dealing" clauses in Microsoft license agreements. Such clauses can be used by a dominant firm to harm the process of competition. See generally Dennis W. Carlton, "A General Analysis of Exclusionary Conduct and Refusal to Deal -- Why Aspen and Kodak are misguided," 68 Antitrust L.J. 659 (2001). Microsoft struck deals with major potential distributors that they would distribute only Microsoft's Internet Explorer or else make it the default browser. These restrictions had the effect of keeping the usage level of the competing Netscape browser below the critical threshold at which it would be a viable threat to Microsoft's extension of its operating system monopoly into the market for Internet browsers.
The Microsoft 2001 outcome was quite different from those in the cases (such as those cited by Gifford) in which a copyright holder was allowed to impose similar restrictions on the licenses of its copyrighted products. The principal difference lies in the existence of market power, a monopoly in the Sherman Act sense. Cases allowing copyright holders to exclude competitors from access to their product (such as the Kodak case and others cited by Gifford) involved sellers that had valuable, desirable products but lacked a "monopoly" in the true, Sherman Act sense of the word.
What does Microsoft tell us about the current controversy over peer-to-peer (P2P) sharing of copyrighted recordings? It appears that no one artist or publishing house commands sufficient market share to constitute a monopoly under the meaning in the Sherman Act. Yet the Sherman Act has a Section 1 that is potentially applicable: the prohibition against "combinations and conspiracies" in restraint of competition. If two or more separate parties act in concert to interfere with the competitive process, they may violate Section 1, even if none of the parties separately possesses enough market power to constitute a monopoly.
Cases like Microsoft, Kodak and Xerox each address a situation in which a single copyright holder imposed license restrictions that negatively affected competition. In Microsoft, "exclusionary" restrictions violated the Sherman Act because Microsoft had monopoly market power. In Xerox and Kodak, the copyright holders had market power over their particular products, but less than a monopoly in the relevant market in which their products were but one competitor, making the Sherman Act inapplicable. Their imposition of conditions on licensure of their products was lawful because it fell short of "copyright misuse."
A different outcome might result if facts reveal concerted action among competing copyright holders, in restraint of trade or commerce, which action interfered with the process of competition. Concerted action among competitors is not by itself illegal. For example, competitors can compare notes on best practices, can agree on industry standards, can form joint ventures to undertake projects not practical for individual firms and cooperate in other ways, all within the antitrust laws.
Some concerted actions are prohibited by the Sherman Antitrust Act §1 and can be the subject of civil or criminal suits and penalties. The best-known is the operation of a cartel, the fixing of prices by express or tacit agreement among competitors. Mergers that result in the combined firm having more than a threshold level of market power can also lead to liability. Yet another is a concerted refusal to deal, or boycott, about which the Supreme Court has written in recent years.
It is to such forms of concerted action that I'll turn in the next of this series of notes.
Comments and TrackBack, please ...
In 3rd Circuit Breaks New Ground on Copyright Misuse, 8/26/03, Tech Law Journal summarizes and discusses the August 2003 decision in Video Pipeline v. Buena Vista Home Entertainment. This extended note includes a review of the history of copyright misuse through the Lasercomb and subsequent decisions. (Read more ... )
The perceived offense in Video Pipeline was suppression of criticism, rather than unfair competition or violation of antitrust law. Such use by the rights holder undermines the purpose of copyrights derived from the United States Constitution, said the Court, which in the end found the defense inapplicable in the facts of Video Pipeline's case.
The commentator also briefs a decision by Judge Posner in Ty v. Publications International (7th Cir. 2002), involving the manufacturer of "Beanie Babies" and a publisher of books about the collection of such toys. Ty sued to prevent the publisher's unlicensed use of images of the toys, which are copyrighted by Ty. In his opinion, Judge Posner discussed the potential for using a copyright monopoly in one market (the toys) to take over and monopolize a second market (the publication of critical guides). He wrote that "ownership of a copyright does not confer a legal right to control public evaluation of the copyrighted work."
Judge Posner noted some of Ty's actions to suppress criticism by licensees. Ty reserved rights to veto any text in the publisher's guides. It forbade licensees to reveal that they were licensees of Ty. Despite Ty's control over content, it required licensees to expressly disclaim sponsorship or endorsement by or affiliation with Ty. But the facts of the case did not require determination of the issue, so Judge Posner left it with the sentence "We need not consider whether such a misleading statement might constitute copyright misuse, endangering Ty's copyrights."
The TechLawJournal commentator continues with a discussion of several other possible factual contexts in which the theory in the Video Pipeline and the Ty cases might be applied in 3rd Circuit Breaks New Ground on Copyright Misuse, 8/26/03.
At its website, iMesh (with a lower case "i") claims 50 million members and says: "As a member of the leading peer-to-peer network, you can download, search, share and even publish just about any digital media file."
At Slyck.com, a site featuring information and conversation about file sharing, iMesh is described as "the last of the original file-sharing networks. iMesh made its mark when CuteMX, Scour and Napster were the big names in file-sharing. Despite the downfall of these networks, the Israeli-based company has managed to prosper during file-sharing's darkest hours. iMesh's strength comes from its impressive userbase of approximately 100,000 users and simultaneous download capability (among the first to support this feature). Although this Napster-like network has been requested to block copyrighted material, search queries are usually very successful."
Popular magazine PCWorld has a short description and download link to iMesh on its site.
The file sharing network has no apparent connection to IMesh (with a capital "I"), which describes itself as an "International Collaboration on Internet Subject Gateways -- For discussion of international collaboration on subject-based resource discovery services."
A half-day training symposium organized by Lawyers Without Borders (LWOB) conveyed a lot of information about the organization and its current projects. LWOB is a US-based international non-profit that connects lawyers seeking to provide pro bono services with non-governmental organizations (NGOs) in need of legal research, counsel and representation in support of the Rule of Law. Providers and clients are connected across political borders, providing US lawyers with opportunities to assist economically or politically challenged non-profit organizations in developing or war-torn regions. (Read More ... )
Christ'l Dullaert, a Dutch lawyer whose expertise is in mediation and negotiation, is in the U.S. for three years. She spoke about the various types of pro bono work supported by LWOB and how in-house lawyers as well as those in private practice can apply "generic skill sets" to provide support to individuals, NGO's and in "Rule of Law" roles.
Professor Peter W. Schroth, of Rensselaer at Hartford, addressed the rewards and nuances of work in an international setting in which substantive law, procedure and the importance (or unimportance) of prior decisional law vary greatly from the U.S. model. He also addressed the fact that in many African countries, few if any lawyers exist in private practice, making access to legal representation problematic for individuals. Also, in certain war-torn countries, certain factions have systematically targeted educated persons, such as physicians, lawyers and educators, for elimination, further reducing the local availability of legal support.
Robert Lally, a CPA with the Hartford area firm Federman, Lally & Remis, explained the U.S. tax implications to individuals and firms of providing pro bono services. Many folks do not realize that under U.S. tax law, the value of services is never a charitable deduction, although many out of pocket expenses may be.
Honorable Robert E. Beach, Jr., an administrative judge in the Hartford Judicial District, offered the perspective that in some countries, the relevant legal comparison is not between "civil law" and "common law" structures but between "law" and "no law." He outlined basic legal needs for a Rule of Law to exist, including some form of workable commercial law and human rights law. He discussed the needs of local courts for pro bono assistance, particularly in a time of state budget cutbacks.
Professor Cindy Slane of Quinnipiac University School of Law addressed ethical considerations pertinent to the provision of pro bono services. Conflicts and appearances of conflicts are the biggest source of ethics challenges in this area. From a professional responsibility standpoint, the rules make clear that lawyers "should" do pro bono, and that firms have obligations to provide their lawyers with opportunities to discharge that responsibility. She discussed the conceptual challenges of dealing with "voluntary obligations" in a world of economic challenges.
Lt. Governor Jodi Rell spoke of the fundamental value of people who take the initiative to organize good works because they recognize that they are the best person to do the things that they do. She presented LWOB with a certificate recognizing its good work.
A panel discussion on means of creating innovative delivery systems for pro bono work followed, with representatives of Hartford-based non-profits, four major law firms, three professional associations and local corporations participating. Topics of discussion included the non-profits' needs, the steps the associations and law firms are taking to enhance opportunities for their members and associates to serve, and the financial support needed for continued development of the sorts of services LWOB provides.
Additional information about Lawyers Without Borders is available from the Lawyers Without Borders website or from its director, Atty. Christina Storm.
Berkman Center for Internet & Society at Harvard Law School continues building the schedule for Day One of their first conference about weblogs in journalism, education, science, business and politics: The BloggerCon 2003 Weblog: Day 1 Schedule (In Progress)
Scheduled presenters include: Jenny Levine, A.K.M. Adam, Jon Udell, Patrick Delaney, Mathew Gross, Joshua Marshall, Jim Moore, Susan Mernit, Kaye Trammell, Elizabeth Spiers, Scott Heiferman, Chris Locke, Doc Searls, Adam Curry, Halley Suitt, Scott Rosenberg, Glenn Reynolds, Brian Weatherson, Joe Jones.
I plan to drive up from Hartford for Day One, October 4, 2003, and join the conclave. More information about the conference is at their BloggerCon in a Nutshell page.
File-sharers are turning to "darknets" to stay away from prying eyes, says Business Week Online in "The Underground Internet" (September 15, 2003). Sources of technology include Freenet, Waste, BadBlue and Groove.(Read more ... )
"Darknet" software is made to enable small groups of trusted individuals to quickly set up and take down secure networks on the infrastructure of the public Internet. The article says large corporations are using darknets to communicate and share information with partners in a channel more secure than their corporate intranets. Another potential use is for swapping of content, including unauthorized copies of copyrighted materials. There are a variety of "flavors" of darknet technology.
Freenet uses a ring of trusted persons to search for and exchange information. It has been used in various sectors politically threatened by systematic denial of free speech and privacy. For a technical and practical introduction to Freenet, see a 2002 IEEE paper "Protecting Freedom of Information Online with Freenet". It is also referenced in numerous scientific articles accessible through Citeseer. Freenet's Ian Clarke has declared that Freenet will not enforce copyrights.
Direct Connect (DC) is another, but BusinessWeek Online says it secures its net with passwords, making it easy to penetrate.
Waste is said to be more secure than DC, says BWOnline, because it requires participants to exchange public keys and then encrypts data traveling between network participants. It was quietly made available as open source software in May by Justin Frankel, at the time head of a unit of AOL, then quickly withdrawn. Not quickly enough. It was promptly picked up and mirrored on SourceForge, which also hosts Freenet. Frankel also developed WinAmp, Shoutcast and Gnutella, according to an article in MIT Enterprise Technology Review.
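The exchange-keys-then-encrypt pattern the article attributes to Waste can be sketched in miniature. What follows is strictly a toy illustration, not Waste's actual protocol: a classroom Diffie-Hellman exchange over a small prime with an XOR keystream, insecure for any real use.

```python
# Toy sketch of the darknet pattern: peers exchange only public values,
# each derives the same session key, and traffic is encrypted in transit.
# NOT Waste's real protocol; prime size and cipher are illustrative only.
import hashlib
import secrets

P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a public prime (toy size)
G = 5                   # public generator

def keypair():
    """Return (private, public) for one peer."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both peers compute the same secret from the exchanged publics."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    """Symmetric XOR keystream: the same call encrypts and decrypts."""
    stream = hashlib.sha256(key).digest()
    out = bytearray()
    for i, b in enumerate(data):
        if i and i % len(stream) == 0:
            stream = hashlib.sha256(stream).digest()  # extend keystream
        out.append(b ^ stream[i % len(stream)])
    return bytes(out)

# Only the public halves cross the open network.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
key_a = shared_key(a_priv, b_pub)
key_b = shared_key(b_priv, a_pub)
assert key_a == key_b  # both ends now hold the same session key

ciphertext = xor_stream(key_a, b"meet on the darknet at nine")
print(xor_stream(key_b, ciphertext))  # b'meet on the darknet at nine'
```

The point is structural: an eavesdropper sees the public values and the ciphertext, but never the private keys from which the session key is derived.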
BadBlue offers two white papers about their technology: "A Standards-based, P2P Approach to Marketplaces and Exchanges" and "BadBlue Platform Approach: A Web Server in every device."
Groove is the company founded and built up by Ray Ozzie, using his share of the proceeds from the 1995 sale of Lotus to IBM for $3.5 billion. Ozzie was the principal developer of Lotus Notes, the "jewel in the crown" that IBM was after. In 2001, Groove announced a strategic relationship with Microsoft. See also Ozzie's weblog.
The entertainment industry is not worried about darknets yet, according to Randy Saaf of MediaDefender, Inc., who told Business Week: "If they are using private networks, there is very little risk of being caught, but there is very little risk of them really doing much harm to the entertainment companies."
One need not turn to antitrust law to address the questions posed in RIAA v. P2P Net: Notes in the Key of Antitrust. One may also examine RIAA's assaults on P2P networks through the lens of "misuse of copyright." (Read more ...)
Professor Gifford's analysis of the interface between intellectual property and antitrust law concludes that courts will migrate to the Federal Circuit's view, that "exercise of intellectual property rights cannot violate the antitrust laws." Id. p. 414. He admonishes, however, that the intellectual property laws themselves contain strictures against rights holders' misuse of such rights.
Copyright misuse was found to be a valid affirmative defense in Lasercomb Am., Inc. v. Reynolds, 911 F.2d 970 (4th Cir. 1990). Lasercomb held copyrights on certain CAD/CAM software, which it licensed to others. Its usual license agreement purported to preclude licensees from creating their own competing CAD/CAM software. Lasercomb sued Reynolds for copyright infringement, and Reynolds claimed that Lasercomb was attempting to use its copyright to control competition in an area outside the copyright: the use of CAD/CAM in Reynolds' industry.
The Lasercomb Court reviewed the history of patent and copyright law and noted the recognition of the equitable defense of "misuse of patent" in Morton Salt v. G.S. Suppiger, 314 U.S. 488 (1942). While the Lasercomb Court found no comparable Supreme Court decision relating to misuse of copyright, "since copyright and patent law serve parallel public interests, a 'misuse' defense should apply to infringement actions brought to vindicate either right." 911 F.2d at 976.
The Court of Appeals rejected the application of the antitrust law's "rule of reason" to Lasercomb's behavior. "So while it is true that the attempted use of a copyright to violate antitrust law probably would give rise to a misuse of copyright defense, the converse is not necessarily true -- a misuse need not be a violation of antitrust law in order to comprise an equitable defense to an infringement action. The question is not whether the copyright is being used in a manner violative of antitrust law (such as whether the licensing agreement is 'reasonable'), but whether the copyright is being used in a manner violative of the public policy embodied in the grant of a copyright." 911 F.2d at 978.
Further, the Lasercomb Court found that the challenger need not have agreed to the restrictive license terms; "the defense of copyright misuse is available even if the defendants themselves have not been injured by the misuse." 911 F.2d at 978.
Since 1990, the doctrine of copyright misuse has been adopted in several circuits in addition to the Fourth. See Alcatel USA, Inc. v. DGI Techs., Inc., 166 F.3d 772 (5th Cir. 1999); Practice Management Info. Co. v. AMA, 121 F.3d 516 (9th Cir. 1997). See also Data Gen. Corp. v. Grumman Sys. Support Corp., 36 F.3d 1147 (1st Cir. 1994). The Federal Circuit has also acknowledged the availability of the doctrine in some circumstances. In re Independent Service Organizations Antitrust Litigation, 203 F.3d 1322 (Fed. Cir. 2000), cert. denied, 531 U.S. 1143 (2001).
When sued by A&M Records for copyright infringement, Napster, Inc. alleged that plaintiffs colluded to extend their copyright monopoly to include online distribution. The Ninth Circuit concluded that the bundle of copyrights includes the right, within broad limits, to curb the development of such a derivative market. A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
The Court also declined Napster's invitation that it impose compulsory royalties: "Plaintiffs would lose the power to control their intellectual property: they could not make a business decision not to license their property to Napster, and, in the event they planned to do business with Napster, compulsory royalties would take away the copyright holders' ability to negotiate the terms of any contractual arrangement." 239 F.3d 1004, ____.
Comments and TrackBack, please ...
Overseers Missed Big Picture as Failures Led to Blackout: The New York Times' analysis, based on its review of phone transcripts, interviews and timelines from the August 14 blackout, offers what the Times called "a far deeper appreciation not only of what crucial elements went wrong that day, but also of the fundamental weaknesses in the way the nation's electricity grid is overseen and policed, especially in the Midwest." The Times does not point to error by any one person or organization, but to a combination of systemic failures that allowed small faults to cascade into a collapse, including:
* the existence of two agencies monitoring interfacing grids, neither of which had real-time "big picture" information about what the other was experiencing or doing;
* temporary failures of computer monitoring systems; and
* lack of authority for controllers to direct corrective action.
Economist Stan Liebowitz sees P2P file sharing as a significant problem for music publishers, and advocates use of Digital Rights Management (DRM) tools to allow both P2P file sharing and protection of copyrights. He takes issue with proposals for compulsory licensing, a method used in the past in similar situations. He maintains an online page of notes and links to his published and to-be-published papers and studies. (Read More ...)
Stan Liebowitz is an economist at the University of Texas at Dallas. He has studied the challenge of MP3 file sharing for several years and assisted with critiquing economists' amicus briefs filed in the Eldred case. His studies had led him to believe (and write) that file sharing technology would have a significant negative impact on the recording industry, and that DRM technologies would provide protection without sacrificing fair use. "Policing Pirates in the Networked Age" (Cato Institute 2002).
Further study and experience led him away from, then back to, that conclusion, expressed in the August 2003 note "The Day the Music Died". It predated RIAA's recent lawsuits against users, but anticipated them and the resulting controversy, and repeats his support of experimentation with DRM as a solution. A few days ago, he released "Alternative Copyright Systems: The Problems with a Compulsory License", in which he concludes that compulsory licensing is not the solution in this instance, as it was with rights in broadcast music.
He also maintains an informal but more current page of links and notes on the subject of "Copyright Issues, Copying and MP3 Downloading". It contains useful links to his yet-to-be published studies and papers as well as the recent court decision involving RIAA v. Kazaa et al and the continuing controversy over the grant of subpoena powers and privacy.
Liebowitz' logic and data are not without critics, among them Miriam Rainsford, a pro-file-sharing musician.
Comments and TrackBack, please.
Peer-to-peer (P2P) file sharing network hubs came under simultaneous assault by the Recording Industry Association of America (RIAA). Some industry figures have sharply criticized the RIAA's tactics as wrong-headed and counterproductive. Has the RIAA's latest assault on users of a potentially legal competing distribution channel carried it into the antitrust minefield? (Read more...)
RIAA presaged its main attack with reconnaissance strikes on selected targets that partially shaped the legal and tactical battlefield. In September, a force majeure push included several hundred lawsuits against network hubs, described as the more egregious participants in the offering of large quantities of recordings copyrighted by RIAA members. Within days, some information about defendants emerged, and RIAA's first tactical victory was disclosed: a $2,000 settlement paid by the mother of a 12-year-old defendant.
At the same time, RIAA's "Clean Slate Program" offered limited amnesty to those who used various P2P file sharing systems to download or share copyrighted works, if they submitted a potentially self-incriminating "Clean Slate Program Affidavit". No amnesty was offered to those not using P2P networks to copy or share copyrighted works.
On behalf of P2P network users, a lawsuit has been filed against RIAA, alleging unfair and deceptive practices in connection with the Clean Slate Program, and at least one United States Senator, Norman Coleman (R-Minn.), has indicated that an investigation of the program is appropriate.
Some recording artists and labels have expressed support of P2P file sharing, citing increased exposure they do not get through the existing system. Further factual research may show what proportion of RIAA's membership such a group represents, and what percentage of industry revenues accrues to it.
Some have suggested that RIAA is missing the opportunity to negotiate for the conversion of the "outlaw" peer-to-peer network (let's call it P2P Net) into a licensed, low-cost channel for distribution of recordings and payment of artists. Some suggest that such a system would result in increased opportunities for new, "independent" artists and labels and would increase the supply and reduce the price of recordings in the market. For example, Tim O'Reilly suggests that the RIAA is more concerned about the dominant publishers losing their control of the market than in the interests of copyright holders.
One of those is John Snyder, President of Artist House Records, who in February presented a formal proposal titled "Embrace file-sharing, or die" to the New York chapter of the National Association of Recording Arts and Sciences (NARAS). His extended remarks review the state of the recorded music industry, suggest recent drops in CD sales are not due to piracy, and are highly critical of RIAA, about which he said: "They overstate their position, misinterpret their own data, and make dubious claims for artists' rights when the biggest abusers of artists' rights are their benefactors, the record companies themselves."
In a note at LawMeme, Ernest Miller speculates that the controversial litigation against file sharers "just might be part of an extremely clever plan of the RIAA's to get the law changed to outlaw Kazaa."
Has RIAA strayed into the antitrust minefield?
Although "all is fair in love and war," this is just interstate commerce. Even conceding the lawful monopoly rights of copyright holders, the RIAA's tactics raise some interesting issues of antitrust law on which legal scholars can chew. Let the chewing begin with these debatable issues:
Query: Does RIAA's combined campaign go beyond legitimate enforcement of copyrights in the market for recordings and extend to an attempt to restrain competition in a separate but related market: the marketplace for distribution services?
Query: Does RIAA's Clean Slate Program constitute an attempt to intimidate consumers into a boycott of a potentially lawful competing distribution channel (P2P Net), in order to maintain its members' power in the market for distribution services?
Query: Does the RIAA's combined campaign result in economic harm to certain recording copyright holders and to recording consumers by restraining competition from P2P Net in the market for distribution services?
Query: In the event of any affirmative answers to the above queries, is RIAA's combined campaign included within qualified immunity doctrines such as (but not limited to) that described in Noerr Motor Freight?
The issue of copyright misuse was addressed briefly in the 2001 Napster decision. The court acknowledged it as a potential defense, but found a lack of evidence supporting it in that case. A scholarly paper by Daniel J. Gifford explores the developing law in this area. Will new evidence emerge in the discovery processes of the new lawsuits filed this week?
Comments and TrackBack, please.
In Declarations and Exclusions, California Atty. George Wallace is tracking two new lawsuits arising out of the 9/11 tragedy. One is by some of the major insurers and reinsurers who paid over $2.7 billion in losses, naming some 500 individual and corporate defendants, including some government entities. The other lawsuit is by survivors against the airlines, the Port Authority and Boeing for negligence or product liability contributing to their loss of loved ones. September 11 -- The Litigation Expands
He has noted articles in Business Insurance and the New Republic regarding these cases, which will expand the discovery of facts and may redistribute the losses.
The science of networks illuminated the analysis of business organizations as Walter W. Powell of Stanford took issue with the classic division of business organizations into "market" or "hierarchy." In a 1990 paper, he challenged views that described a continuum of transaction forms from "market" at one end to "hierarchy" at the other, with some hybrid forms like joint ventures in between. "Neither Market Nor Hierarchy: Network Forms of Organization." Research in Organizational Behavior, Vol. 12. Pp. 295-336 (1990).
Powell set out "to identify a coherent set of factors that make it meaningful to talk about networks as a distinctive form of coordinating economic activity." Powell, pp. 300, 301. "When the items exchanged between buyers and sellers possess qualities that are not easily measured, and the relations are so long-term and recurrent that it is difficult to speak of the parties as separate entities ... When the entangling of obligation and reputation reaches a point that the actions of the parties are interdependent, but there is no common ownership or legal framework ... such an arrangement is neither a market transaction nor a hierarchical governance structure, but a separate, different mode of exchange, one with its own logic, a network." Id. p. 301.
Discussing years of research in this paper, he noted findings that participants move toward network forms to reduce uncertainty, get fast information access, and provide a reliable, responsive system. Network forms are largely self-regulating, without formal enforcement processes, because of mutual trust and mutual desire of each participant to continue with the network over the long term. In later papers (see below), he applied these findings to particular networks within the pharmaceutical industry. He found that a high level of activity in the industry networks, or "centrality," was a prime determinant of financial success for the firms, and that this centrality came from the experience and diversity of contacts resulting from participation in cooperative R&D efforts.
(More ... )
In the 1990 paper, reviewing past published research, he laid out what he saw as key differences among markets, hierarchies and networks. For example, unlike market forms, networks involve indefinite, sequential transactions within the context of a general pattern of interaction. In networks, participants avoid using legal sanctions against each other. They prefer to create indebtedness and reliance over a long-term basis, and often involve the participants at a personal level.
Networks are better than markets or hierarchies at facilitating learning and the transfer of technological know-how. Perhaps as a result, they are better than hierarchies at responding to sharp fluctuations in demand and unanticipated changes. In networks, the parties gain by pooling resources, and such sharing requires the parties to adapt and accommodate other network participants, sometimes to their own short-term disadvantage.
"Networks, then, are especially useful for the exchange of commodities whose value is not easily measured. * * * The open-ended, relational features of networks, with their relative absence of explicit quid pro quo behavior, greatly enhance the ability to transmit and learn new knowledge and skills." Id. p 304.
Powell notes findings that a key element of network operation is reciprocity, the benefits of which are more apparent to those with a long-term viewpoint. Expectations of future benefit, "the shadow of the future," broadens one's view of one's own self-interest and encourages reciprocal cooperation.
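The payoff logic behind "the shadow of the future" can be made concrete with an Axelrod-style iterated prisoner's dilemma. The payoffs below are the standard textbook ones, not drawn from Powell's paper: defection wins a single encounter, but a reciprocal strategy sustains the higher cooperative payoff over a long association.

```python
# Iterated prisoner's dilemma sketch: reciprocity pays when the
# relationship is expected to continue. Payoff values are the
# conventional illustrative ones (T=5, R=3, P=1, S=0).
PAYOFF = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strat_a, strat_b, rounds=100):
    """Total payoffs for two strategies over repeated rounds."""
    last_a, last_b = "C", "C"  # assume both open cooperatively
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(last_b)  # each reacts to the other's last move
        move_b = strat_b(last_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda their_last: their_last  # reciprocate in kind
always_defect = lambda their_last: "D"       # short-term self-interest

print(play(tit_for_tat, tit_for_tat))      # (300, 300): sustained cooperation
print(play(always_defect, always_defect))  # (100, 100): mutual defection
print(play(tit_for_tat, always_defect))    # (99, 104): defection wins once, then stalls
```

Mutual reciprocators end far ahead of mutual defectors, which is the arithmetic behind Powell's observation that a long-term viewpoint broadens one's sense of self-interest.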
Construction, publishing and the film and recording industries have been shown to exhibit network forms, he notes. Such industries display strong craft industry characteristics, with a special kind of skilled labor pool that has production experience and is capable of generating new products in response to changing customer demands. Their participants tend to have strong work ties within teams and with members of other organizations.
In the film industry, for example, he cites observations that "distinct networks crystallize out of a persistent pattern of contracting when particular buyers of expertise and talent * * * with given schedules of resources and alternatives, settle into self-reproducing business transactions with distinct (and small) sets of sellers." Id. at p. 308.
Powell noted studies of similar behavior among craft style industries that cluster in geographical areas to take advantage of the pool of skilled labor and expertise, and to engage in mutual assistance. Swedish scholars Hagg and Johanson found such characteristics among Swedish firms that shared R&D efforts. "Instead of a competitive environment, there is a sharing of risks and resources and a pooling of information. * * * [T]hese arrangements eliminate costly safeguards and defensive measures and are better adapted to uncertainty. Competition in intermediate producer markets is not eliminated, rather coalitions of firms compete with other coalitions, not on the basis of price, but in terms of product development and knowledge accumulation." Id. p. 313.
In conclusion, Powell highlights three factors as critical components of networks:
* Know-how. Where skills aren't of the type that can be kept exclusively with the high bidder, and where shared information builds common values, networks flourish.
* Demand for speed. Networks favor fields requiring responsiveness to innovation and rapid translation into action.
* Trust. Networks exhibit the development of "generalized reciprocity" and an expectation of continued future association. Mutual desire for continued participation results in self-regulation without formal oversight processes. Participants' reputation for reliability is very important.
Powell also noted the importance of particular contexts of legal, political and economic factors to the development of networks, including relaxed antitrust standards and national policies promoting research and development and linkages between industry and academia. He closed with an agenda for future research, including the need for studies of the relevance of state policies, the durability of networks, the performance liabilities of networks and whether network participation affects the likelihood of future collaboration.
"Neither Market Nor Hierarchy" won the American Sociological Association's Max Weber Prize and has been translated into German and Italian.
Powell's later studies of the pharmaceutical industry.
Statistical studies were done with industry data available for several hundred firms over a period of ten years, testing a hypothesis drawn from the published literature. Powell and his colleagues hypothesized that centrality, or activity in the network of cooperation within the pharmaceutical industry, would be positively correlated with financial success. The data showed that it was; the more active firms got more patents and more nonoperating income, grew faster and had larger sales revenue.
Analyzing deeper for the criteria that led to centrality, Powell and his colleagues found that cooperative R&D developed experience and diversity (multiple linkages with a range of partners), which together were the best predictors of centrality. They concluded that "collaborative R&D drives the interorganizational network in biotechnology." "Network Position and Firm Performance: Organizational Returns to Collaboration in the Biotechnology Industry," Powell, Koput, Smith-Doerr and Owen-Smith. Networks In and Around Organizations. 1999. p. 24.
In "Brawn & Brains" (Forbes, 9/15/03) Peter Huber and Mark Mills look at the 2003 Power Blackout. Although investment in the grid is important, they point out that it was not particularly stressed on August 14. The failure developed over an hour-long period and the final crisis unfolded in nine seconds ... a long time in the power management business. They see this as good news, because such problems can be addressed by short-term investment in better "Scada" networks.
In a white paper referenced in the article, the authors also argue that the electrical network, designed to be robust in the face of random failures, is vulnerable to targeted sabotage on key elements. They have a list of recommended steps. (More ... )
The Scada ("supervisory control and data acquisition") networks are the brains that steer the waves of power and "reactive" power that wash through the electrical grid, especially in times of high trading volume and rogue failure states. Recent advances in Scada technology allow a much "smarter" grid than was possible even 5 or 10 years ago, say Huber and Mills. Replacing older electro-mechanical switches with higher-power silicon switches can provide faster, more reliable and more precise power control, as they now do at military installations. They leave us with the public policy question: will power suppliers and transmitters get the economic incentives to invest in this new technology?
In a white paper published by their consulting firm, the Digital Power Group, they discuss detailed suggestions for improvement, outlined in eight major areas for coordinated action by policy makers, industry associations and end users:
1. Assess Vulnerabilities
2. Establish Critical-Power Standards for Facilities Used to Support Key Government Functions
3. Share Safety- and Performance-Related Information, Best Practices, and Standards
4. Interconnect Public and Private Supervisory Control and Data Acquisition Networks
5. Secure Automated Control Systems
6. Share Assets
7. Enhance Interfaces Between On-Site Generating Capacity and The Public Grid
8. Remove Obstacles
Peter Huber (a Manhattan Institute senior fellow) and Mark Mills are partners in the Digital Power Group and authors of the August 2003 white paper "Critical Power." There, they suggest that the grid was designed for power quality and short-term reliability. Its design makes it resistant to random failures scattered throughout the network. It was not designed for targeted attacks on critical nodes that may result in extended periods of downtime for significant network assets.
Since 9/11/01, such possibilities seem less remote. In the paper, Huber and Mills indicate that the grid needs to be improved to assure continuity during extended outages, including those that may result from deliberate sabotage, and that doing so requires different approaches to technology and investment.
Joint hearings will be scheduled to consider a bill yet to be introduced that would protect compilers of facts into databases. Mere factual compilations are presently not protected by copyright law. The bill is supported by the Software and Information Industry Association, says CNET News.com in Databases--the next copyright battle? Opponents include Public Knowledge, which describes its mission as advocating a "fair and balanced" approach to copyright and technology policy.
Thanks to the e-newsletter from Online Journalism for the heads up on this story.
George Wallace's Decs and Excs follows California's turmoil in workers comp insurance: Insurance Commissioner Seeks Reduction in Workers' Compensation Cost Factors. In a press release linked by Decs and Excs, Department of Insurance (DOI) Commissioner Garamendi anticipates a legislative attempt to force lower premiums. The Commissioner asserts that the better solution is to control claim costs, and that capping premiums below cost will only drive for-profit insurers out of the marketplace.
That sort of exodus would increase the stress on the California State Fund, the controversial writer of "last resort." State Fund has been increasing its market share and premium/surplus ratio dramatically in recent years. The DOI Commissioner has questioned the adequacy of State Fund's rates and capital base, but State Fund has bristled at DOI's attempts at control. This past May, State Fund filed suit against the Commissioner, arguing that it was not subject to the Insurance Department's control or the Risk-Based Capital (RBC) statutes, and was not near insolvency.
In a recent press release, State Fund disclosed and dismissed advice from its own accountants that it is underreserved by over $1 billion. As of last month, the lawsuit continued, with State Fund disclosing that it has requested approval of a reinsurance transaction to improve its financial position by moving $4 billion in premiums off of its books.
The situation sounds sadly like those in past years in other states, such as Texas in the 1980s, where a state-controlled "market of last resort" sold workers comp insurance below cost, running large deficits until it was shut down and put into run-off. Is that happening now in California? Time will tell. (More ... )
About 18 months ago, California Commission on Health and Safety and Workers’ Compensation released a comprehensive white paper "State of the Workers’ Compensation Insurance Industry in California" that is worthwhile reading on this subject.
The well-documented experience of Texas may be informative to those considering the best moves for California. The Texas Research and Oversight Council on Workers Compensation maintains a comprehensive library of studies of the Texas workers compensation market and the effects of legislative changes there over the years, including the impact of the residual market programs upon prices, competitiveness and solvency. The articles are offered by mail for free, and many of the more recent reports are available online for immediate download, including the December 2002 Biennial Report, which covers the current effects upon insurance prices of past legislative changes, medical costs and insurance industry insolvencies, including that of Reliance.
GrepLaw is a blog at Harvard Law School's Berkman Center for Internet & Society. Ernest Miller is at Yale Law, and has been an editor at LawMeme, a law and technology blog there.
Miller explains for GrepLaw readers the Information Society Project at Yale Law School, and opines that blogs "are great places for law students to begin to find their voice and practice writing in this new medium. They will also be the center of more and more legal debate and analysis."
About the key issues of cyberlaw for the coming year, Miller tells GrepLaw: "The intersection of copyright law and the First Amendment is perhaps the key modern issue in this field. Until the theories of copyright and First Amendment can be reconciled, the law will continue to be confusing and come up with strange results. I am optimistic, though not overly so, that some movement on this front has already begun."
He has a lot to say about DRM and fair use, privacy and many other issues. An extended interview well worth reading.
Columbia Professor Duncan J. Watts builds on the work of mathematicians, physicists, biologists, sociologists, economists and others to advance the new science of networks. "Six Degrees: The Science of a Connected Age" brings a sociologist's perspective to a field relevant to those dealing with complex systems and their robustness and fragility under stress. The science of networks has significance for those wrestling with current issues of law and public policy in a wide spectrum of applications, including electric power grids, insurance markets and anti-terror measures.
Chapters one and two introduce the basics of the study of how individual behavior aggregates to collective behavior. In chapters three through five, Watts addresses the elements of "small world" networks, the role of scale-free networks and hubs, and the mechanics of search on the network. In chapters six through nine, he deals with epidemics, "the madness of crowds" and the dynamics of adaptation. He closes with reflections on recent crises, including the terror attacks of September 11, and with five "lessons for a connected age."
In addition to my notes below, see Watts' February piece for The Chronicle of Higher Education "Unraveling the Mysteries of the Connected Age. "
Watts, D.J. Six Degrees: The Science of a Connected Age (W.W. Norton & Co., 2002) (More ... )
Chapters One and Two
"The Connected Age."
Watts starts with a description of the 1996 power blackout in the western United States as an example of "cascading failure." In that case, the devices that protected individual elements of the system made the whole more likely to collapse. He discusses "emergence" of new phenomena when individual behavior aggregates at a level of groups, systems and populations.
The emergence of a new science.
He reviews the history of the science of random graphs developed by Erdos and Renyi, who found phase transitions at a critical point of increasing connectivity, resulting in the formation of giant components. But random graph theory lacked a way to deal with dynamics on the networks, particularly with how globally coherent activity emerges without central control.
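The Erdos-Renyi phase transition is easy to see in a few lines of simulation. The sketch below is my own illustration, not taken from Watts's book: it builds a random graph with a given average degree and measures the largest connected component, which jumps from a sliver of the population to a "giant component" once the average degree passes 1.

```python
# My illustration (not from the book) of the Erdos-Renyi phase transition:
# below an average degree of 1 the largest connected component stays tiny;
# above it, a "giant component" spanning most nodes emerges.
import random

def largest_component_fraction(n, avg_degree, seed=0):
    """Build a random graph G(n, p) with p = avg_degree/n and return the
    fraction of nodes in its largest connected component (union-find)."""
    rng = random.Random(seed)
    p = avg_degree / n
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)  # union the two components

    sizes = {}
    for v in range(n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n

print(largest_component_fraction(500, 0.5))  # subcritical: only small islands
print(largest_component_fraction(500, 3.0))  # supercritical: a giant component
```

Sweeping the average degree from 0.5 to 3 makes the jump near 1.0 stand out sharply, which is the "critical point" the text describes.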
When physicists entered the study of networks, they brought with them earlier studies of phase transitions that show the same behavior as those observed in evolving social networks. This revealed elements of universality in the emerging science. Universality is the quality by which physical or biological systems (e.g. chemical reactions or metabolic processes) can exhibit fundamental similarities with social systems (e.g. complex group behavior) and with other apparently unrelated systems.
Watts says that "this tells us that at least some of the properties of extremely complicated systems can be understood without knowing anything about their detailed structure or governing rules. * * * This is a tremendously hopeful message for anyone interested in understanding the emergent behavior of complex social and economic systems like friendship networks, firms, financial markets, and even societies." Watts, Six Degrees, p. 65.
Chapters Three Through Five
Clusters and random short cuts make for small worlds.
"Small world" effects describe the ability of individuals to find connecting paths to strangers in far-away locations in relatively few jumps. Studies found that the keys to finding short paths within networks are the existence of local clusters of related nodes or individuals (cliques) plus "short cuts" within the network. In mathematical models, only a few random short cuts caused the average path length to drop like a stone. Because the short cuts needed only to be random to have this effect, it did not matter how they were formed. This discovery enabled the transfer of discoveries and formulae from other sciences to the study of social networks.
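The "drop like a stone" effect can be reproduced with a toy model. The sketch below is my own, not the exact construction from the literature: it starts from a ring lattice, adds a handful of random shortcuts, and measures the average shortest-path length by breadth-first search.

```python
# A toy small-world sketch (mine, not Watts's actual model): a ring lattice
# plus a few random shortcuts. The average shortest-path length collapses
# even though only a tiny fraction of the edges were rewired.
import random
from collections import deque

def avg_path_length(n, shortcuts, seed=0):
    rng = random.Random(seed)
    # ring lattice: each node linked to its 2 nearest neighbors on each side
    adj = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n}
           for i in range(n)}
    for _ in range(shortcuts):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    # average BFS distance from a sample of source nodes
    total, count = 0, 0
    for src in range(0, n, max(1, n // 20)):
        dist = {src: 0}
        q = deque([src])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

print(avg_path_length(400, 0))   # pure lattice: long average paths
print(avg_path_length(400, 20))  # a few random shortcuts: much shorter
```

Twenty shortcuts here amount to about 2.5% of the lattice edges, yet they cut the average path substantially, which is the point of the passage above.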
Scale-free networks and hubs.
In 1999, Barabasi and Albert published a groundbreaking paper, "Emergence of Scaling in Random Networks." Science, 286, 509-512 (1999). This paper showed that certain connections in real-world networks don't have a normal ("bell curve") distribution but rather follow a power law distribution, which means an increased likelihood of extreme events in such "scale-free" networks. As a result, as scale-free networks evolve, a few nodes become "hubs" with an extraordinary number of connections. Barabasi and Albert also found that the evolution of these hubs depended on the combination of network growth and "preferential attachment" - the tendency for new nodes to connect to those already well connected (the "rich get richer" effect).
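The preferential attachment mechanism can be caricatured in a few lines. This is my own minimal illustration of the "rich get richer" rule, not code from the paper: each new node attaches to an existing node chosen with probability proportional to its current degree, and a hub emerges while the typical node stays poorly connected.

```python
# My minimal sketch of preferential attachment (not from the paper):
# sampling uniformly from a list that holds one entry per edge endpoint
# is the same as sampling proportional to degree.
import random

def preferential_attachment(n, seed=0):
    rng = random.Random(seed)
    targets = [0, 1]          # one entry per edge endpoint
    degree = {0: 1, 1: 1}
    for new in range(2, n):
        old = rng.choice(targets)   # degree-proportional choice
        degree[new] = 1
        degree[old] += 1
        targets.extend([new, old])
    return degree

deg = preferential_attachment(2000)
hub = max(deg.values())
typical = sorted(deg.values())[len(deg) // 2]  # median degree
print(hub, typical)  # the hub dwarfs the typical node
```

The skewed outcome, a median degree of one or two against a hub with dozens of links, is the power-law signature the text describes, in contrast to a bell curve where extremes are vanishingly rare.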
Watts applied the physicists' formulae (with some modifications) to social networks. He demonstrated that in social networks, hubs were not necessary in order for small world effects to appear. He found those effects in "networks of overlapping cliques, locked together via the co-membership of individuals in multiple groups. Because this feature is a property of the representation of the network, and not of any particular matching procedure, it is true regardless of how individuals and groups are matched. Even * * * random affiliation networks will always be small-world networks." Watts, p. 128.
Searching the network.
In times of crisis and rapid change, the ability to find short paths to the right information becomes particularly important. Network search can follow two modes, broadcast or directed. A broadcast search is impractical on large networks (unless you are a virus). Directed searches have their own issues. How to direct a search in a large network of which one knows only the local portion?
Mathematicians had demonstrated that to solve the problem of directed search, one need only forward the request to the individual node that seems closest to the desired information or destination, then let that node do the same until the search is successful. In a social network, people measure "closeness" to others on multiple dimensions (e.g. common geography, occupation, college attendance).
Watts's research found that if individuals were allowed to use 2 or 3 dimensions of social "closeness," they could easily find randomly chosen targets, even in networks characterized by close-knit clusters or cliques. All without using a hub. "Searchability is, therefore, a generic property of social networks." Watts, p. 156.
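The directed-search rule described above (forward the request to whichever neighbor seems closest to the target, and repeat) can be illustrated on a toy network. The sketch is entirely mine, using a single "dimension" of closeness, distance around a ring, rather than the multiple social dimensions Watts studied.

```python
# My toy sketch of directed "greedy" search: each node knows only its own
# neighbors and forwards the query to the neighbor nearest the target.
import random

def greedy_search(n, start, target, shortcuts=50, seed=0):
    rng = random.Random(seed)
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}  # ring lattice
    for _ in range(shortcuts):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)

    def closeness(a, b):
        # the only "dimension" in this toy model: distance along the ring
        d = abs(a - b)
        return min(d, n - d)

    current, hops = start, 0
    while current != target:
        # forward to the neighbor that seems closest to the target
        current = min(adj[current], key=lambda v: closeness(v, target))
        hops += 1
    return hops

# Never more than the 500 lattice hops, and shortcuts can cut the path.
print(greedy_search(1000, 0, 500))
```

Each hop strictly reduces the ring distance to the target, so the search always succeeds with only local knowledge, which is the essence of the forwarding procedure in the paragraph above.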
Chapters Six Through Nine
Epidemics and Failures.
Epidemiology focuses on reducing the rate of growth of a disease. A disease can be thought of as doing a network search for an uninfected ("susceptible") host, and so epidemics can be analyzed using network science. If the disease can find a short cut, such as an infectious person flying from the isolated source to a highly connected hub city, or having intimate contact with many susceptibles, or both, an otherwise slow-to-spread disease can explode, as did AIDS.
Physicists contribute "percolation theory" to epidemiology. Various physical processes develop toward a change, then suddenly percolate from one state to another. The formation of crystals in a super-saturated solution is an example familiar to high school chemistry students. Percolation depends upon the development of a "percolating cluster" -- a single cluster of susceptibles that is connected with the entire population. When it is triggered, the entire system changes.
In 2000, physicists Barabasi and Albert looked at the "robustness" of scale-free networks during epidemics and similar challenges. They found that such networks are more resistant to random failures, because the small minority of hubs were unlikely to be hit. However, for the same reason, they were more vulnerable to attacks targeted at the highly connected hubs. "Error and Attack Tolerance of Complex Networks." Nature, 406, 378-382 (2000). Using percolation mathematics, Watts' team found similar effects in networks that were not scale free.
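The robustness asymmetry is simple to reproduce. In this sketch of my own (not the paper's experiment), a network grown by preferential attachment loses the same number of nodes either at random or in order of degree, and the surviving largest component tells the story.

```python
# My rough comparison of random failures vs. targeted hub attacks on a
# network grown by preferential attachment (not the paper's experiment).
import random

def grow_network(n, seed=0):
    rng = random.Random(seed)
    edges = [(0, 1)]
    targets = [0, 1]            # degree-proportional sampling pool
    for new in range(2, n):
        old = rng.choice(targets)
        edges.append((new, old))
        targets.extend([new, old])
    return edges

def largest_component(n, edges, removed):
    adj = {}
    for a, b in edges:
        if a in removed or b in removed:
            continue
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, best = set(), 0
    for v in adj:
        if v in seen:
            continue
        stack, size = [v], 0
        seen.add(v)
        while stack:                 # depth-first component walk
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best

n = 1000
edges = grow_network(n)
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

k = 30
hubs = set(sorted(degree, key=degree.get, reverse=True)[:k])
randoms = set(random.Random(1).sample(range(n), k))

print(largest_component(n, edges, randoms))  # random failures: largely intact
print(largest_component(n, edges, hubs))     # hub attack: badly fragmented
```

Removing thirty random nodes mostly hits poorly connected leaves; removing the thirty busiest hubs shatters the network, which is the vulnerability the text attributes to targeted attacks.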
Decisions, Delusions and the Madness of Crowds.
In 1841, Mackay wrote his classic about panics and group manias, "Extraordinary Popular Delusions and the Madness of Crowds." Documented examples date to Roman times, and include the "Dot.com" bubble at the end of the 20th century. The real mystery of financial markets is that they are both rational and irrational at times. The evolution of cooperative behavior is similarly paradoxical. The "Diner's Dilemma" and the "Tragedy of the Commons" illustrate how outcomes unfortunate for the whole group can be the result of each participant acting rationally.
An "information cascade" is an event during which individuals spontaneously stop behaving like individuals and start to cooperate, acting like a coherent mass. Information cascades, such as financial panics, riots or revolutions, can be triggered by a small initial shock, then propagate throughout an entire networked system.
Watts looks at studies of "information externalities" (information issues outside of a transaction that affect the transaction decision, like other consumers' opinions), "market externalities" (value components derived from the presence of something else, like a fax machine), and "complementarities" (separate products that increase each other's value). The combination of market externalities and complementarities can generate the positive feedback effect of increasing returns. Watts also suggests the presence of "coordination externalities" (the likelihood that others will follow your action) affects the decision to go along with a budding information cascade.
Thresholds, Cascades and Predictability.
Transposing epidemiology to network science, Watts calls information cascades "social contagion." He contends that unlike biological contagion, social contagion is a process highly contingent upon coordination externalities. The greater the percentage of one's social "neighbors" making a choice, the greater the probability one will make the same choice. The probability jumps once the percentage hits a critical "threshold." "Highly connected" nodes with many neighbors tend to be "stable," because the tipping of any one or two neighbors will not reach their threshold percentage.
Individuals have different thresholds. "Early adopters" may tip to an innovation based on the influence of just one neighbor. As they tip, they influence their more stable neighbors who require the example of a larger percentage of neighbors. Early adopters with few neighbors are more likely to constitute the threshold percentage and trigger a cascade within their small clique than will those with many neighbors in a large clique. Under Watts' theory, highly-connected clusters (large cliques) tend to be stable rather than vulnerable, thus not the likely origin of a global cascade.
After a survey of the mathematics used to study global cascades, Watts proposes that global cascades can be started either by 1) lowering the average threshold of the population (e.g., by increasing the appeal of the innovation) or by 2) reducing the average density of the network (size of connected cliques). When both thresholds and density are high, the system tends to be stable, except in the small fraction of all nodes where a small clique includes an early adopter that could tip to the innovation.
Cascades, therefore, will tend to be rare, unless an innovation is directly targeted at one of those vulnerable clusters. If converted to the innovation, that vulnerable cluster exposes its neighboring stable nodes to the phenomenon of multiple neighbors converting, leading to a landslide of conversion to the innovation. In the context of marketing new technology, Geoffrey Moore calls this phenomenon "crossing the chasm."
Watts takes from this the insight that the nature of the trigger matters less than the connectivity of the target. To prevent (or precipitate) a global cascade, the trick is "to focus not on the stimulus itself but on the structure of the network the stimulus hits." Watts, p. 249.
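The threshold mechanics discussed above can be caricatured in a short simulation. The sketch below is my own simplification, not Watts's exact model: each node adopts once the fraction of its adopting neighbors reaches its threshold, and the same thresholds that permit a global cascade on a sparse network hold firm on a dense one.

```python
# My simplified threshold-cascade sketch (not Watts's exact model): a node
# adopts once the fraction of its adopting neighbors reaches its threshold.
import random

def cascade_size(n, avg_degree, threshold, seed=0):
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    m = int(n * avg_degree / 2)          # number of random edges to place
    while m > 0:
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b and b not in adj[a]:
            adj[a].add(b)
            adj[b].add(a)
            m -= 1
    adopted = {0, 1, 2}                  # a few early adopters
    changed = True
    while changed:                       # relax until no node tips
        changed = False
        for v in range(n):
            if v in adopted or not adj[v]:
                continue
            frac = sum(1 for w in adj[v] if w in adopted) / len(adj[v])
            if frac >= threshold:
                adopted.add(v)
                changed = True
    return len(adopted) / n              # final adopting fraction

print(cascade_size(500, 3, 0.2))   # sparse network: a global cascade
print(cascade_size(500, 12, 0.2))  # dense network: the cascade stalls
```

With a 20% threshold, a node with five or fewer neighbors can be tipped by a single adopting neighbor; on the sparse network most nodes qualify and the cascade goes global, while on the dense network highly connected nodes stay stable, just as the text argues.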
Innovation, Adaptation and Recovery.
Watts reviews the story of the Toyota-Aisin crisis and the "self-healing" response. On February 1, 1997, the only factory for the sole supplier of a brake component essential to Toyota's entire production system burned down with all of the specialized equipment needed for its production. Within 24 hours, all of Toyota's "just in time" manufacturing operation halted. The halt ended Toyota's demand for the other components supplied by some 200 companies whose businesses were dependent upon Toyota. Without central control or supervision, the 200 cooperated so as to find replacement equipment and re-establish production of the component within 3 days, averting a disaster for the entire Japanese industrial system.
Watts uses the Aisin crisis as a case study to sketch the history of industrial organizational theory from Adam Smith's The Wealth of Nations through Ronald Coase's The Nature of the Firm and The Second Industrial Divide by Michael Piore and Charles F. Sabel. The latter work identifies "flexible specialization" as the key to adaptation in an ambiguous environment of rapid change. Flexible specialization is a characteristic of the craft model of organization that was almost wiped out by the "economies of scale" methods introduced during the Industrial Revolution. Working together, Sabel and Watts proposed that when solving complex problems in ambiguous situations, individuals compensate for their individual limitations by using the network search methods Watts had been studying, to find those individuals with the knowledge or assets the searchers lack.
The hierarchical nature of the typical post-Industrial Revolution firm performs poorly at the task of flexible redistribution of information needed in a crisis. Yet the encouragement of local "multiscale" teams at multiple levels of the hierarchy dramatically improved the capacity for such search and redistribution. Because such "multiscale networks" not only minimized the likelihood of failures but also optimized recovery from failures, Watts calls them "ultra robust." He sees such crisis recovery skills developing naturally by network participants dealing with everyday ambiguity and change.
Chapter Ten: "The End of the Beginning."
The terror attacks of September 11, 2001 tested the robustness of the complex network that is Manhattan. Besides the physical destruction, public and private organizations suffered an unprecedented organizational crisis. The sudden loss of communications, transportation and information infrastructures was compounded by the loss of so many firefighters, police officers, business managers and staff. Entire firms were in peril, and some did not recover. Within 24 hours, a response self-organized and Manhattan's complex network rewired itself and marched on, though the global effects continue today.
As with the Toyota Aisin crisis, "the capability to recover from the catastrophe could not have been consciously designed. * * * So whatever it was about the system that enabled it to recover so rapidly had to have been there beforehand and had to have evolved principally for other purposes." Watts, p. 295.
Before providing a difficulty-rated guide for further reading, Watts closes with the admission that his book leaves unanswered questions, and the expectation that scientists will keep exploring the new science of networks until they get to the bottom of it ... and then keep going. Just like Manhattan.
Insurance Defense Blog editor Dave Stratton points us to online white papers relating to toxic tort defense and other material of interest to insurance people in his post: Independent Insurance Brokers of America Virtual University
For the Washington Post, Robert Samuelson writes Fixing Fannie and Freddie. Combined, the debt of government-sponsored enterprises Freddie Mac and Fannie Mae totalled $1.5 trillion at the close of 2002, writes Samuelson. Highly profitable, they continue a history of expansion, enabling the American Dream of home ownership. They buy money cheap (by selling bonds) and sell it dear (by buying bundles of mortgage notes). They make money on the "spread," and protect themselves from fluctuations with derivatives.
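The "spread" arithmetic is worth making concrete. The rates below are purely hypothetical, my own illustration rather than Samuelson's figures; only the $1.5 trillion combined portfolio size comes from the article.

```python
# Back-of-the-envelope illustration of the "spread" business model: borrow
# cheap by selling bonds, earn more by holding mortgage notes, keep the
# difference. The rates are hypothetical; only the portfolio size is from
# the article.
portfolio = 1_500_000_000_000   # $1.5 trillion combined debt, per Samuelson
bond_rate = 0.045               # hypothetical cost of funds (bonds sold)
mortgage_rate = 0.055           # hypothetical yield on mortgage bundles

spread = mortgage_rate - bond_rate
annual_spread_income = portfolio * spread
print(f"${annual_spread_income:,.0f}")  # a 1% spread on $1.5T: $15 billion a year
```

The same arithmetic shows the exposure: a one-point adverse move in rates on a portfolio this size swings income by billions, which is why the enterprises hedge with derivatives.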
Samuelson asks "What if they failed?" This year, discoveries of misleading profit reports shook up Freddie, suggesting that errors, even big ones, can escape the notice of regulators. The failure of either is unlikely, Samuelson admits, but the consequences could be catastrophic. In his editorial, he suggests that a blue-ribbon commission be formed to take an unbiased look at the structure and practices of these financial service giants. (More ... )
Samuelson's editorial brings to mind the failure of Long Term Capital Management (LTCM), a highly leveraged "hedge fund," that owed huge amounts to major players in the U.S. financial system. In 1998, LTCM was caught in an unexpected shift of the financial markets triggered by the Russian government's default on its debt. In the resulting disarray in the global debt markets, LTCM found its spread "inside out," and suddenly went from growing profits to sudden and massive losses. In order to prevent the failure of LTCM (which might trigger further cascades of failure in the U.S. financial system), the Federal Reserve organized a private sector bailout that stopped the landslide, but is controversial still today. In 1999, the General Accounting Office (GAO) reported on its study of the LTCM failure. From the opening summary:
"Between January and September 1998, LTCM, one of the largest U.S. hedge funds, lost almost 90 percent of its capital. In September 1998, the Federal Reserve determined that rapid liquidation of LTCM’s trading positions and related positions of other market participants might pose a significant threat to already unsettled global financial markets. Thus, the Federal Reserve facilitated a private sector recapitalization to prevent LTCM’s collapse. Although the crisis involved a hedge fund, the circumstances surrounding LTCM’s near-collapse and recapitalization raised questions that go beyond the activities of LTCM and hedge funds to how federal financial regulators fulfill their supervisory responsibilities and whether all regulators have the necessary tools to identify and address potential threats to the financial system."
* * *
"Federal financial regulators did not identify the extent of weaknesses in banks’ and securities and futures firms’ risk management practices until after LTCM’s near-collapse. * * * [E]xaminations done after LTCM’s near-collapse revealed weaknesses in credit risk management by banks and broker-dealers that allowed LTCM to become too large and leveraged."
* * *
"Regulators for each industry have generally continued to focus on individual firms and markets, the risks they face, and the soundness of their practices, but they have failed to address interrelationships across each industry. The risks posed by LTCM crossed traditional regulatory and industry boundaries, and the regulators would have needed to coordinate their activities to have had a chance of identifying these risks. Although regulators have recommended improvements to information reporting requirements, they have not recommended ways to better identify risks across markets and industries. We are recommending that federal financial regulators develop ways to better coordinate oversight activities that cross traditional regulatory and industry boundaries."
Other studies of the LTCM collapse include Roger Lowenstein's "When Genius Failed: The Rise and Fall of Long-Term Capital Management" and Nicholas Dunbar's "Inventing Money: The Story of Long Term Capital Management and the Legends Behind It."
See also a 1999 speech by FRB Governor Laurence H. Meyer "Lessons from Recent Global Financial Crises"
In 1881, Oliver Wendell Holmes wrote "The life of the law has not been logic: it has been experience," and in 1921, “a page of history is worth a volume of logic.”
For centuries, jurists have decided cases based on the human experience of behavior among individuals, families, firms, associations, and nations. Those actors are influenced by multiple, often conflicting environmental and social factors of great complexity. Drawing from this history, jurists weigh opinions and arguments of the likely effect of their decisions on the social network. Advocates present them with the best input of physicians, sociologists, economists and other scientists as well as the teachings of historical experience.
Until recently, there were few scientific structures for explaining why complex social networks behaved as they did during times of ambiguity and change. Without understanding why the history is as it is, how do jurists decide how to shape the future? How will the economy be affected by the recognition of new torts? What will happen if we prohibit or encourage certain forms of business contracts or combinations? What will be the global effects of forcible regime change in a complex, multi-ethnic nation that bristles with weapons? Jurists, lawmakers and executives at all levels consider the input of what science has to offer, then make their best judgment on what to do. Often, the judgment results in unintended consequences, some good and some bad.
The new science of networks now brings together the results of quantitative research into systems as diverse as subatomic particles, neural nets, ecologic food webs, electrical grids and business organizations. Research has documented fundamental similarities in how each of these very different complexities behave, especially in times of crisis and rapid change. These universalities among systems provide us with a way to transpose the learning in one field of science to another, to apply the history and experience they offer to the social science of jurisprudence.
As articulated by an expanding universe of scholars as diverse as the physicist Barabasi and the sociologist Watts, this new science brings us to the threshold of a dramatically expanded perspective on the issues of jurisprudence. Exploring that science and its meaning for jurisprudence is a basic goal of this journal.
First Monday has e-published "Giving E-mail back to the users". As an alternative to proposals for stricter legislation and "bounties" on spammers, the authors propose a code-based solution to the spam problem.
From the abstract: "This paper argues that current legislative and private attempts to stop spam are either ineffective, or involve unacceptable tradeoffs. The key to solving the spam problem is recognizing the importance of e-mail authentication and the granting of permissions. Properly used, digital signatures can easily authenticate e-mail for effective spam control. The ability to manage public keys for verifying digital signatures provides each e-mail user the individual power to control who communicates with her and can therefore completely eliminate the practice of spamming. Finally, we recommend that software developers build the requisite capabilities for managing public keys into their e-mail programs. We argue for a technological solution as opposed to government legislation." (More ... )
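The authentication-and-permission idea can be sketched in miniature. The paper proposes public-key digital signatures; Python's standard library has no public-key primitives, so this stand-in (entirely my own, with hypothetical addresses and keys) uses an HMAC over the message to make the same point: mail that does not verify against a key the recipient has granted is simply dropped.

```python
# A toy sketch of permission-based e-mail filtering. Real deployments would
# use public-key digital signatures; this stand-in uses HMAC with a
# per-sender key the recipient granted earlier. Addresses and keys are
# hypothetical.
import hmac
import hashlib

granted_keys = {"alice@example.com": b"key-alice-granted-earlier"}

def sign(sender, body, key):
    """Compute a signature over the sender and message body."""
    return hmac.new(key, f"{sender}:{body}".encode(), hashlib.sha256).hexdigest()

def accept(sender, body, signature):
    """Accept mail only if it verifies against a key we granted."""
    key = granted_keys.get(sender)
    if key is None:
        return False  # no permission granted: treat as spam
    return hmac.compare_digest(signature, sign(sender, body, key))

good = sign("alice@example.com", "lunch?", granted_keys["alice@example.com"])
print(accept("alice@example.com", "lunch?", good))         # verified: delivered
print(accept("spammer@example.net", "BUY NOW", "forged"))  # unverified: dropped
```

The design choice mirrors the abstract: the recipient's key list is the permission mechanism, so spam control becomes a matter of key management rather than content filtering.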
The article, by grad students Trevor Tompkins and Dan Handley, is one of the peer-reviewed articles on First Monday, an online journal of academic papers dedicated to the Internet. First Monday is a Great Cities Initiative of the University of Illinois at Chicago Library, which (according to its website) has published 466 papers in 87 issues; these papers were written by 567 different authors. First Monday reports that it is indexed in INSPEC, LISA, PAIS and other services and that in the year 2002, users from 642,954 distinct hosts around the world downloaded 4,036,340 contributions published in First Monday.
First Monday is free, online, and digitally searchable. It invites paper submissions for possible publication, and provides an excellent style guide to writing for Internet publication.
Past contributors include Phil Agre, Virgilio Almeida, Aleksander Berentsen, John Seely Brown, Steve Cisler, Paul Duguid, Esther Dyson, Simson L. Garfinkel, Rishab Aiyer Ghosh, Michael H. Goldhaber, Andreas Harsono, Bernardo A. Huberman, David R. Johnson, Brian Kahin, Jessica Litman, Clifford Lynch, Miranda Mowbray, Bonnie Nardi, David F. Noble, Andrew M. Odlyzko, Ilya Prigogine, David Post, Eric S. Raymond, David Ronfeldt, Pamela Samuelson, Abigail Sellen, Linus Torvalds, Hal R. Varian, and Richard Wiggins.
Thanks to beSpacific for the pointer to this article.
The U.S. Chamber of Commerce published two papers on the law and science of toxic mold lawsuits, collected under the title The Growing Hazard of Mold Litigation, questioning the adequacy of the science behind much of the litigation. (More ... )
The first, by Cliff Hutchinson and Robert Powell, is “Mold Litigation: How Hysteria and Junk Science Built a Cottage Industry.”
From the executive summary:
"Hutchinson and Powell lay out the development of mold litigation, including some significant cases with large verdicts * * * The authors examine mold litigation through the Daubert microscope and argue that the serious health claims that pervade mold litigation * * * cannot withstand scrutiny under the “reliable science” standard of Daubert."
The second, by Bryan D. Hardin, Ph.D., Andrew Saxon, M.D., Coreen Robbins, Ph.D., CIH, and Bruce J. Kelman, Ph.D., DABT, is “A Scientific View of the Health Effects of Mold.”
From the executive summary:
"The paper examines in depth each type of health complaint associated with mold and offers an extensive survey of the scientific literature on the topic. It determines that mold can cause allergies for those who are “atopic” or prone to allergic reactions. * * * The paper concludes that infections caused by mold are rare, except for those individuals who are “immune-compromised.” Finally, it asserts that “there is no sound scientific evidence that mold causes ‘toxicity’ in doses found in home environments.”
Thanks to Insurance Defense Blog for the pointer to this resource.
BloggerCon 2003 will be held on the Harvard Law School campus, on October 4 and 5, 2003.
Presenters include: Glenn Reynolds, Joshua Marshall, Doc Searls, Scott Rosenberg, Adam Curry, Elizabeth Spiers, Jim Moore, Susan Mernit.
Moderators: Lance Knobel, Ed Cone, Christopher Lydon.
Host: Dave Winer. (More ...)
Questions to be addressed:
"Weblogs. The unedited voice of a person! Will easy and inexpensive publishing technology change the face of politics, business, journalism, the law, medicine, engineering and education? Is a revolution underway, or are weblogs just the latest Internet craze? We'll show how artists create new experiences and inspire with weblogs. New technology will be showcased at BloggerCon 2003. Educators are using blogs to help students express themselves and learn from each other.
Meanwhile questions linger. Are today's bloggers the modern-day Emersons and Thoreaus or Charlie Chaplin, PT Barnum or Erma Bombeck? Is blogspace a Second Superpower, a ride on the Cluetrain, the venue for the next election or is it even worse than it appears, just good enough to make a difference, or the revolution so many say it is? "
New York Times columnist Paul Krugman calls the settlement between FERC and the energy companies in the California energy crisis a "joke" in "Another Friday Outrage" (9/2/03). According to Krugman, most experts agree that during 2000-2001, energy companies manipulated prices through "economic withholding": keeping capacity offline to drive prices above competitive levels. Yet the settlement is far lower than the cost to ratepayers, he says, for two reasons. First, it is easier to prove an industry-wide pattern than intentional withholding by a particular player; an "accidental" shutdown looks much like an "accidentally on purpose" shutdown. Second, because withholding drives up prices generally, all companies, not just those responsible, profited from others' manipulation.
Krugman's solutions? One, make the continental electrical transmission system (for which no one seems responsible) robust. Two, assign a watchdog with more power than FERC has today. Three, make sure the watchdog is independent of the energy companies. What qualifications has Krugman (Professor of Economics and International Affairs at Princeton University) to advise us? More than a few: Paul Krugman Biography