SIMs Policy Manual
Table of Contents
1. Software and Business Method Patents for the Internet
2. Consumer Information Privacy on the Internet
3. Copyright and Data Protection in E-Business
4. Critical Infrastructure Security
5. Encryption/Cryptography
6. Trademarks, Domain Names and Cybersquatting on the Internet
7. Taxation of Internet Commerce
8. Internet Content Restrictions
9. (DRAFT) E-business Strategies: Open Versus Closed Customer and Competitor Environments
Software and Business Method Patents for the Internet
Issue: Whether software and business method patents relating to the Internet will create
undesirable monopolies in E-Commerce or, instead, are legitimate ways to protect innovation.
Background: Traditionally, mathematical algorithms, as might be contained in software,
and “business methods” were considered unpatentable. They were considered too
abstract and not novel enough to grant anyone a monopoly upon their use. The U.S.
Patent and Trademark Office, however, has recently been granting patents for software
and business methods -- in particular as they relate to the Internet. Examples of Internet-
related software patents include:
• Unisys: method of data compression called LZW used in a graphic format called GIF,
which many web sites use in order to be compatible with older web browsers
• Bruce Dickens: computer software windowing method that created a Y2K fix
• Geoworks: software on Wireless Application Protocol that allows server computers to
rearrange pages of information to fit on the screens of phones and mobile devices.
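Because the Unisys patent turns on the LZW algorithm itself rather than on any particular product, it is worth seeing how compact the patented idea is. The following is a minimal illustrative sketch of LZW dictionary coding, not Unisys's claimed implementation:

```python
def lzw_compress(text):
    # The dictionary starts with every single-character string (ASCII here).
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    current = ""
    output = []
    for symbol in text:
        candidate = current + symbol
        if candidate in dictionary:
            # Keep extending the match while the dictionary still knows it.
            current = candidate
        else:
            # Emit the code for the longest known prefix, then learn the new string.
            output.append(dictionary[current])
            dictionary[candidate] = next_code
            next_code += 1
            current = symbol
    if current:
        output.append(dictionary[current])
    return output

def lzw_decompress(codes):
    # Rebuild the same dictionary on the fly; no table is transmitted.
    dictionary = {i: chr(i) for i in range(256)}
    next_code = 256
    previous = dictionary[codes[0]]
    result = [previous]
    for code in codes[1:]:
        if code in dictionary:
            entry = dictionary[code]
        else:
            # Special case: the code refers to the string currently being built.
            entry = previous + previous[0]
        result.append(entry)
        dictionary[next_code] = previous + entry[0]
        next_code += 1
        previous = entry
    return "".join(result)
```

Repeated substrings are replaced by short dictionary codes, which is why the scheme works well on the repetitive pixel data of GIF images.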
Examples of Internet business method patents include:
• Amazon.com (2 patents): 1-Click ordering (storing a customer's billing information
so that they do not have to enter it every time they make a purchase); and Web
Affiliate Program, including the process used to apply to become an affiliate, the
technology used to link Amazon's databases to the affiliate site, and the billing system
used to make sure the affiliate gets its share of the profits.
• Priceline.com: reverse auctioning or “name your own price” on the Internet
• Sightsound.com: selling of audio or video recordings in download fashion over the Internet
• Home Gambling Network: remote, live wagering over the Internet
• CyberGold: rewards to customers who receive and view online advertisements
Note that the business method patents are on the business method idea, not the
technology to accomplish the business method. The Internet software patents, by
comparison, are patents on the specific software technology accomplishing the result.
Thus, the business method patents are much broader: several technologies
(software and otherwise) could be created to accomplish the business method, yet all
would be blocked from usage because they would infringe on the business method patent.
However, most E-business method patents are implemented through software, which
itself may be patentable.
These patents create 20-year monopolies over the software technology or the business
methods identified in the patent claims. And these Internet-related software and business
method patents are proliferating. Between October 1998 and September 1999, 2,600
applications for computer-related business methods were filed. During that same time
period, 583 computer-related business method patents were issued. Businesses with these
patents can prevent other businesses from using the software technology or business
methods, or they can license them out for a fee.
While the U.S. Supreme Court endorsed software patents some time ago, recent court
cases have brought the business method and Internet related patents into sharper focus.
The first case, State Street, did not involve the Internet, but rather a "hub and spoke"
software program for managing an investment structure for mutual funds. The software
facilitated the administration of mutual funds (the "spokes") by pooling their investments
into a single portfolio organized as a partnership (the "hub"). The software determined
changes in hub investment assets and allocated the assets among the spokes. The Federal
Circuit Court of Appeals (the highest, and most specialized, court on patent matters
besides the Supreme Court) held that software algorithms that lead to business methods,
like the one at issue in State Street, were patentable. This case reversed a long history of
judicial opinions suggesting otherwise.
Another important case now under way involves Amazon.com’s effort to stop
barnesandnoble.com from using “Express Lane,” a one-click check out mechanism
similar to Amazon’s patented 1-Click checkout. The trial court issued an injunction
against barnesandnoble.com (although it was later stayed) and the case is pending a final
ruling. It is considered a critical case regarding the general validity of Internet business
method patents. But other cases are also working through the courts, including a suit by
Priceline.com against Microsoft’s Expedia for replicating Priceline.com’s “name your
own price” business model for selling hotel rooms, airline tickets and other consumer
goods and services.
Conflict: There is considerable debate over the granting and use of these patents. Many
argue that these patents will stifle the open nature of the Internet and discourage
innovation. They argue that the open nature of software development is why the Internet
has advanced as far as it has today, and to allow proprietary ownership over code will
seriously undermine continued innovation. Influential legal scholar Lawrence Lessig
states, for example, that "[t]he idea that [Amazon’s] 1-Click is so amazing that it deserves
a government-granted monopoly is ridiculous.... These patents are going to change what
the Internet is right now, which is a place for a broad number of people to play in the
innovation game."1 Critics complain that these patent applications are generally overly
broad and ignore “prior art” – that is, prior ideas that are known, which should defeat a
patent claim that the idea is novel or non-obvious and thus patentable. Some attacks on
Amazon’s business method patent have been direct in this regard arguing that Amazon’s
1-Click is a simple, logical and obvious use of the cookie system pioneered by Netscape
and others and, thus, not deserving a patent by the very terms of patent law. These
arguments have been generalized to the broader set of Internet software and business
method patents.
1 Thomas E. Weber, “Patents feuds may damp Web's spirit,” Wall Street Journal, B1, November 8, 1999.
With respect to stifling the open nature of the Internet, not everyone is in agreement.
Many argue that patents have a role to play in even an open system. Q. Todd Dickinson,
the Director of the U.S. Patent and Trademark Office, defends business method patents as
spurring innovation and preventing rip-offs of inventors’ ideas. Jeff Bezos of
Amazon.com has also defended his 1-Click patent, arguing that Amazon took risks and
committed substantial time to the effort to create the ordering system. Moreover, many
software patent holders say they have software patents "for defensive purposes", to press
for cross-licensing, or to argue they were first to invent in case they are threatened with
patent lawsuits by others.
There is also debate over whether the USPTO is properly reviewing these patents for
prior art. The critics claim that a major reason so many bad software and business
method patents issue is that patent examiners do not have enough time and library
resources to adequately consider the prior art. Critics have said the agency approves such
patents too readily because its examiners do not understand current technology and
Internet practices well enough. This hampers competition and innovation, they argue, by
allowing commonplace business practices to be rendered private property, and by
restricting innovation by entrepreneurs wary of infringement lawsuits.
There are enforcement concerns with respect to these E-Patents. [insert international …]
Some Key Players and Resources:
• Jeff Bezos, CEO of Amazon.com. Champions the 1-Click patent but proposes
reducing length of Internet patents to 3-5 years.
• Jay Walker, founder of Walker Digital. Walker Digital is in the business of patenting
new business method patents, including Priceline.com.
• Kevin Rivette, a patent attorney and author of "Rembrandts in the Attic," a book on
how to make the most aggressive use of patents.
• Greg Aharonian, outspoken critic of software and business method patents. Operates
the Internet Patent News Service.
• Richard M. Stallman, software developer and founder of the GNU Project, launched
in 1984 to develop the free operating system GNU. Outspoken critic of Amazon.com
and software patents (www.gnu.org/people/rms.html)
• Lawrence Lessig, Harvard Law Professor and leading scholar on Internet and
intellectual property rights (http://cyber.law.harvard.edu/lessig.html).
• Harvard Berkman Center for Internet and Society. Promotes open code approaches to
the Internet (http://cyber.law.harvard.edu/).
• Q. Todd Dickinson, Director of the U.S. Patent and Trademark Office
(www.uspto.gov/web/offices/com/admin/). Defends USPTO practice in granting
software and business method patents.
• U.S. Patent and Trademark Office (www.uspto.gov)
• Protest Site against Amazon.com: www.NoAmazon.com
• Protest Site against Unisys: www.burnallgifs.com
• Patent Guidelines: US Patent Office (1998) Artificial Intelligence, Business and
Mathematics Patent Examination Guidelines
(http://www.uspto.gov/web/offices/pac/compexam/comguide.htm); US Patent Office
(1996) Computer-Related Invention Guidelines
(http://www.uspto.gov/web/offices/pac/dapp/oppd/patoc.htm); US Patent Office
(1989) Patentability of Math Algorithms and Computer Programs
(http://www.bustpatents.com/og1989.htm); Japan Patent, Implementing Guidelines
for Computer Software Related Inventions at JPO Office (http://www.jpo-
miti.go.jp/infoe/txt/soft-e.txt); UK Patent Office, Claims to Programs for Computers
• State Street Bank and Trust v. Signature Financial Group, 149 F.3d 1368 (Fed. Cir.
1998), cert. denied, 119 S. Ct. 851 (1999) (held that business methods are patentable).
• AT&T Corp. v. Excel Communications, 172 F.3d 1352 (Fed. Cir. 1999), cert. denied, 120 S.
Ct. 368 (1999) (applied rule of State Street decision in case dealing with business
method patent on long distance telephone message handling).
• Amazon.com v. Barnesandnoble.com, 73 F. Supp.2d 1228 (W.D. Wash, Dec. 1,
1999) (granted preliminary injunction against barnesandnoble.com for likely
infringement against Amazon.com’s 1-Click ordering patent).
Amazon.com 1-Click patent claim: “Method and system for placing a purchase order via a
communications network.” Issued/Filed Dates: Sept. 28, 1999 / Sept. 12, 1997
“1. A method of placing an order for an item comprising: under control of a
client system, displaying information identifying the item; and in response to only
a single action being performed, sending a request to order the item along with an
identifier of a purchaser of the item to a server system; under control of a single-
action ordering component of the server system, receiving the request; retrieving
additional information previously stored for the purchaser identified by the
identifier in the received request; and generating an order to purchase the
requested item for the purchaser identified by the identifier in the received request
using the retrieved additional information; and fulfilling the generated order to
complete purchase of the item whereby the item is ordered without using a
shopping cart ordering model.”
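The claim's steps correspond to a very small amount of server-side logic. As a hedged sketch (the names and data below are hypothetical; in practice the purchaser identifier would be a browser cookie), claim 1 amounts to:

```python
# Hypothetical server-side store of previously captured customer data,
# keyed by the purchaser identifier the claim describes.
CUSTOMER_RECORDS = {
    "cookie-7f3a": {"name": "A. Customer", "card": "****1111", "ship_to": "123 Main St"},
}

def single_action_order(identifier, item_id):
    """One request carries only the item and the purchaser identifier;
    billing and shipping details are retrieved from prior storage, so no
    shopping-cart model is needed."""
    record = CUSTOMER_RECORDS.get(identifier)
    if record is None:
        # Unknown purchaser: a real site would fall back to a normal checkout form.
        raise KeyError("unknown purchaser identifier")
    # Generate the order from the stored billing/shipping information.
    return {"item": item_id, "bill_to": record["card"], "ship_to": record["ship_to"]}
```

Critics' point about obviousness is visible here: the mechanism is a straightforward keyed lookup of stored customer data.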
Jeff Bezos quotes:
"We spent thousands of hours to develop our 1-Click process, and the reasons we have a
patent system in this country is to encourage people to take these kinds of risks. (quoted
in Thomas E Weber , “Patents feuds may damp Web's spirit,” Wall Street Journal, B1,
November 8, 1999).
“I now believe it's possible that the current rules governing business method and software
patents could end up harming all of us -- including Amazon.com and its many
shareholders, the folks to whom I have a strong responsibility, not only ethical, but legal
and fiduciary as well.” – Jeff Bezos, in suggesting a 3-5 year length for business method
patents (AN OPEN LETTER FROM JEFF BEZOS ON THE SUBJECT OF PATENTS)
Consumer Information Privacy on the Internet
Issue: Whether self-regulation or governmental regulation of privacy better builds the
confidence of consumers in Internet business.
Relevance to E-Business Managers: Consumer information privacy on the Internet deals
with the use of personal data, which is critical for the success of an Internet business. It
allows a merchant to know its customers’ identities, interests, and needs, and thereby
tailor the relationship process and the offerings to increase customer satisfaction and
customer convenience. The availability and sale of personal information has been one of
the engines of growth in Internet business.
The growth in the number of Internet users has increased the concern over the ability of
an individual to control the terms under which personal information is acquired and used
on the Internet. The concern about privacy comes from customers, who are wary of
vendors using the data or supplied information in an exploitative manner. Several high
profile cases have occurred where information about customers has been gathered
without their knowledge or without full disclosure of the purpose of data collection,
prompting an outcry from customers (e.g., Real Networks). DoubleClick found
that the mere announcement of targeting and profiling led to customer hysteria.
Consumer confidence in the Internet is critical for the development of electronic
commerce. The majority of people not online say that they stay off because of privacy
concerns. Some reports suggest that 55% of U.S. web users mistrust the present handling
of privacy. Interest groups are playing a watchdog role. The Federal Trade Commission
(FTC) has released a report that suggests that only 20% of the websites manage privacy
adequately. Moreover, online worries are being extended to offline concerns.
Privacy is another area of law still in development. There are some 300 privacy proposals at
the federal level and a plethora of others on the state levels. E-business managers must
stay abreast with these developments to avoid a “Privacy Valdez.”
Background: Personal information is information identifiable to an individual. E-
businesses have access to a wealth of information about online customers. To access a
web site or services, customers may complete online registration forms, where they
reveal contact information, financial data, and personal interests. To purchase goods or
services online, customers may send credit card numbers and shipping addresses over the
Internet. As customers click on advertisements or link to Web pages, e-business may use
cookies to record and store their surfing habits. Much of the data collected contains
personally identifiable information.
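The cookie mechanism behind this tracking is simple: the server tags the browser with an identifier, and the browser echoes it back on every later request. A sketch using Python's standard http.cookies module (the visitor_id name and value are illustrative, not any particular site's scheme):

```python
from http.cookies import SimpleCookie

# Server side: issue a cookie that tags the browser with a tracking identifier.
outgoing = SimpleCookie()
outgoing["visitor_id"] = "v-1842"
outgoing["visitor_id"]["path"] = "/"
header = outgoing["visitor_id"].OutputString()  # becomes a Set-Cookie header value

# On later requests the browser echoes the cookie back in a Cookie header,
# letting the site link each page view to the same visitor profile.
incoming = SimpleCookie("visitor_id=v-1842")
visitor = incoming["visitor_id"].value
```

Each page view carrying the same visitor_id can then be appended to that visitor's profile, which is how surfing habits accumulate into personally identifiable records.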
E-businesses have incentives to collect personal information. First, they may use the
information for their own marketing purposes. For example, an e-business may
personalize its web site for each individual customer to ensure that the customer’s
attention is focused on goods or services that he is most likely to want given his past
surfing habits. Second, e-businesses may sell customer information to other companies,
who use the information to market directly to those customers. Finally, e-businesses may
collect personal information because the nature of their business requires the information.
For example, medical web sites require customers’ personal medical history to deliver their services.
With existing technology, Internet merchants can collect vast amount of data, most of it
invisibly, and put together a complete profile of a person. Detailed tracking of a user’s
movements coupled with personally identifiable information has led to concerns over the
rise of identity theft. Some predict that within the next 6 to 8 months, most web users
will see their identities come under siege.
The online collection of personal information gained widespread attention in 1998 when
the Federal Trade Commission (FTC) published its first study of online privacy practices.
The study analyzed the presence of privacy statement on commercial web sites. The
study found that although customers ranked the lack of privacy protection as the top
reason for not using the Internet, a substantial number of e-businesses collected personal
FTC’s sample posted any type of privacy disclosure. A 1999 Georgetown University
study (sponsored partly by the FTC) revealed an improvement from the prior year: 67 %
of the sites posted a privacy statement. However, the content analysis of these statements
suggested inadequate protection. Some companies posted statements that give the
company the right to do anything with the personal information Only 13.6 percent
followed the FTC’s “fair information practices” that would likely become law if the U.S.
government regulated privacy. Other studies suggest that companies fail to comply with
their own policies. In 1999, the FTC handled more than 11,000 complaints against online
auction sites alone.
The FTC’s “fair information practices” are reflected in the Privacy Act of 1974, which
focused on government use of personal information. Although the U.S. Government has
endorsed the standards, it has never passed legislation on them. The Organization for
Economic Cooperation and Development (OECD) passed guidelines governing privacy
in 1980, and those guidelines are based on fair information practices. Fair information practices include:
o Notice/Awareness: web sites would be required to provide consumers notice
of their information practices, such as what information they collect and
how they use it
o Choice/Consent: web sites would be required to offer consumers choices
as to how that information is used beyond the use for which the
information was provided (for example to consummate a transaction)
o Access/Participation: web sites would be required to offer consumers
reasonable access to that information and an opportunity to correct it
o Security/Integrity: websites would be required to take reasonable steps to
protect the security and integrity of that information.
In summer 1999, the FTC informed Congress that new Internet privacy laws were not
needed at this time and endorsed a policy of self-regulation. It warned that it did not
“foreclose [the] possibility of legislative or regulatory action” in the future. Privacy
advocates disagreed with the FTC’s decision, calling for a comprehensive privacy law.
Partially because of the concerning results from the 1998 FTC WebSurf, Congress passed the
Children’s Online Privacy Protection Act (COPPA). But other than COPPA, the Clinton
administration has avoided governmental regulation of online privacy practices except on
the sectorial level (health and financial services). Instead, the administration has
encouraged e-businesses to adopt self-regulatory approaches to privacy protection in
order to protect the free growth of the Internet. Although the administration has assumed
a hands-off approach for now, it has charged both the FTC and the National
Telecommunications and Information Administration (NTIA) with monitoring online
privacy protection to ensure the effectiveness of self-regulation. If self-regulation is
ineffective, the administration says it will turn to governmental regulation of online
privacy. A recent Business Week/Harris poll reported that 57% of Americans believe
that it is time for the government to step in and regulate privacy; only 15%
believe that self-regulation is the way to go.
In May 2000, the FTC released the results of the 2000 WebSurf. The study found that
only 20% of the sites provided adequate consumer protection. Whereas in 1999 the FTC
had given a green light to over 60% of sites, the figure dropped to 20% because the FTC
changed its criteria. While in the past the FTC had largely checked for the existence of a
privacy statement, in 2000 the study analyzed the content of the statement and the
extent to which it met the four requirements of the Fair Information Practices. The FTC
2000 WebSurf suggested that the Federal Trade Commission has taken a more active role
in enforcing fair information practices online.
Business Self-Regulation Approaches
E-businesses have taken self-regulation seriously because they want to avoid
governmental regulation and because they recognize that privacy protection is simply
good business. Since 1998, the percentage of web sites providing privacy notices has
grown from 14%2 to 24%3. Several organizations, including TRUSTe and BBBOnline,
have launched privacy seal programs that provide third party monitoring of an enrolled
web site’s privacy practices. Finally, e-businesses themselves have changed their privacy
practices in response to consumer pressure. For example, DoubleClick abandoned plans
to merge data relating to online surfing habits with offline personal data when consumers
protested.
Three approaches to self-regulation have emerged:
o First, e-businesses may police their privacy practices by holding themselves to
restricted privacy policies. American Express employs this police approach.
As found in a 1998 FTC study published at http://www.ftc.gov.
As found in a 2000 enonymous.com survey published at http://www.privacyratings.org.
o Second, e-businesses may seek to create a market in privacy by compensating
consumers for personal information and then using that information as they see
fit. Cybergold employs a market approach.
o Third, consumers, instead of e-businesses, may control their own information by
using software that allows them to block access or designate the types of
information that will be revealed when they visit web sites.
Many hold hope that future privacy enhancing technologies coupled with consumer
education will elevate privacy protection to new levels within the self-regulation
framework. Some privacy enhancing technologies include:
1. Intermute: a Java application to block undesired access to your computer when
you are online
2. PGP 5.0: a powerful encryption program to guarantee the confidentiality of your
messages to trusted recipients
3. PGP Cookie Cutter: a Windows 95 utility to delete selected cookies
4. Lucent Personalized Web Assistant: an application to be used for identifying
yourself at a web site that shields your true identity
5. Anonymous technology: Anonymizer.com is a web site to be visited before you
visit other web sites that provides you with an anonymous identity. File sharing
programs such as Gnutella mask the identity of those using the system.
6. Platform for Privacy Preferences (P3P): P3P is an automated system that gives
users more control over the information they disclose about themselves as they
surf the Web. Under the proposal, site designers would post their privacy
practices in a format the user's browser would understand. Web surfers could, in
turn, set browser preferences to control how much information they want to
release to web sites they visit.
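The P3P negotiation can be pictured as a simple policy comparison. The real P3P proposal defines an XML vocabulary; the following is only a toy stand-in (the dictionary keys and field names are invented for illustration) showing the kind of check a browser would perform before releasing information:

```python
# Hypothetical, much-simplified stand-in for a P3P exchange: the site
# declares what it collects, the user declares what they will allow, and
# the browser compares the two before releasing any information.
site_policy = {"collects": {"email", "clickstream"}, "shares_with_third_parties": True}
user_prefs = {"allow": {"clickstream"}, "permit_third_party_sharing": False}

def policy_acceptable(policy, prefs):
    """Accept the site only if it collects nothing beyond what the user
    allows and respects the user's third-party-sharing preference."""
    if not policy["collects"] <= prefs["allow"]:
        return False  # site wants data the user has not agreed to release
    if policy["shares_with_third_parties"] and not prefs["permit_third_party_sharing"]:
        return False  # site's sharing practice conflicts with user preference
    return True
```

Under this model, the site above would be rejected: it collects email addresses the user has not agreed to release and shares data with third parties.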
The criticism against self-regulation has grown in the last year. The press has featured
prominently a number of online privacy gaffes. DoubleClick, Amazon.com, Microsoft,
and Real Networks are just a few. Real Networks had a TRUSTe privacy seal on their
site while they violated their own privacy statement by transmitting personal information
from twelve million people. And TRUSTe has yet to discipline Real Networks.
TRUSTe has responded that privacy problems happen not because of malicious
intent by the corporation but because “the left hand of a company doesn’t know what the
right one is doing.”
Businesses themselves are split on the balance between governmental and self-regulation.
Some businesses fear that unless the federal government acts, states and local
jurisdictions will pass their own privacy laws, leading to a mishmash of laws. Others
insist that the Internet businesses can self-regulate.
Consumer Ownership of Personal Information
Customers give away their personal information in anticipation of some future value from
that exchange (e.g., convenience, tailored products). While it is easy to see how
merchants benefit from personal customer information, it is less clear what Internet
customers have received in return for their personal information. The promises of greater
convenience (one-click shopping), personalization, and tailoring have often fallen short.
Perhaps because of failed promises, customers have begun to claim ownership of their
personal information and place economic value on the information that they share with
merchants while transacting, communicating, and collaborating with them. Customers are
willing to release this information if they can profit by doing so (e.g., compensation,
gifts, coupons, rebates, special offers). Some merchants have begun to pay customers a flat
sum for completing online surveys ($5-10), to provide a discount on the
first purchase, or to pay the customer a few cents when the information is sold to a third
party. Firms whose main business is to sell personal information have begun to
pay surfers for the time they surf (e.g., 50 cents per hour), the number of advertisements they
view, and the amount of information they share. Others argue that it is not possible to
put a value on a piece of data on the customer (name, browsing pattern) as it depends on
the context of the data.
Japan has closely followed the U.S. lead and has advanced ethical practices similar to the
Fair Information Practices. The European Union has taken a governmental approach; its
privacy directive took effect in the member countries in 1998.
Conflict:
1. The most obvious conflict concerns self-regulation versus governmental regulation.
The Clinton administration and e-businesses favor self-regulation since they believe
that governmental regulation will stifle the growth of the Internet. Specifically,
governmental regulation will erode consumer confidence and trust in e-businesses
and will offer an inflexible approach to a rapidly changing online environment.
Public interest groups note that self-regulation does not work, however. If e-
businesses find an economically beneficial use for online data, they are unlikely to
police themselves at an economic loss. Third-party private sector auditors are
ineffective since those organizations survive on funding that audited businesses
provide. Proponents of governmental regulation argue that without effective privacy
protection, consumers will not purchase goods and services on the Internet and the
Internet will not reach its full growth potential.
Among self-regulatory efforts, a conflict exists over the most effective approaches.
For example, many web sites are giving consumers the option to opt-out of
information sharing. Many public interest groups believe that web sites should use an
opt-in policy instead, however. Besides the issues of choice, there are issues of being
informed. How does the privacy statement constrain the firm from changing their
business model and their information uses in the future? The privacy statement
covers what information the business collects, how it collects that information, and
how it uses that information. The privacy advocates argue that if the data is collected
under Version 1 of the privacy statement, then it can only be treated under Version 1
without approval from everyone who provided data. Other issues rally around who is
responsible for the integrity of the data.
2. A conflict also exists between the European and American approaches to privacy
protection. In 1998, the E.U. implemented a privacy protection law that allows
companies to collect personal data only when individuals consent to the collection,
know how the data will be used, and have access to databases to correct or erase their
information. The law does not allow the transfer of data from E.U. countries to
countries with less stringent privacy policies. Since the U.S. has adopted a self-
regulatory approach, its privacy policies are less stringent, and the E.U. law prohibits
data transfer to the U.S. In March 2000, the U.S. and E.U. reached a safe harbor
agreement that has not yet been ratified. Europe agreed it would not try to force the
U.S. to impose an intrusive E.U. data-privacy law on all U.S. companies. In return,
the U.S. agreed to set up a “safe harbor” – a list, to be maintained by the Department
of Commerce, of companies that voluntarily adopt E.U.-style safeguards of their
customers' private information. Companies that do not participate would risk a halt
of data flows from Europe. The Europe-U.S. agreement has particularly been slow in
resolving the issues of onward transfer of data and enforcement.
3. There is a conflict between privacy and anonymity. The privacy advocates argue that
users have a right to stay anonymous. However, anonymous file sharing programs
such as Napster and Gnutella are associated with rampant copyright violations.
Because the users are anonymous, rights holders have no one to sue. Industry
leaders, whose businesses are dependent on copyright protection, have called for the
elimination of anonymity for people who want to use services such as Napster.
Some have even said that the issue of anonymity might become the most significant
policy issue in the coming years.
Legislation in the U.S.:
1. Fair Credit Reporting Act (1970): Governs the collection and disclosure of personal
information in the credit reporting industry.
2. Privacy Act of 1974: Regulates government conduct pertaining to the collection, use,
and disclosure of personal identifiable information (including electronic information).
3. Freedom of Information Act: Regulates government conduct pertaining to the
disclosure of personal identifiable information (including electronic information).
4. Cable Communications Policy Act (1984): Requires cable companies to provide their
customers with annual notice as to how their personal identifiable information is used
(perhaps applicable to cable providers who provide Internet access).
5. Electronic Communications Privacy Act (1986): Protects private electronic
communications from unauthorized access, interception, or disclosure by the
government, individuals, or third parties.
6. Video Privacy Protection Act (1988): Regulates disclosure of videotape rental
information (application of the law to the Internet is unclear).
7. COPPA (Children’s Online Privacy Protection Act) (1998): Prohibits unfair or
deceptive acts or practices in connection with the collection, use, or disclosure of
personally identifiable information from and about children younger than 13 on the Internet.
8. Gramm-Leach-Bliley Financial Services Bill (1999): The bill itself codifies the rights
of financial consumers. The Clinton administration is currently drafting rules to
implement privacy protections required by the bill. The proposed rules include a
mandatory privacy notice and opt-out policy.
Ethical Standards for Privacy
The U.S. Constitution does not contain an explicit right to privacy, and no comprehensive
privacy legislation exists in the U.S. However, there are ethical standards that firms may follow:
o The National Telecommunications and Information Administration (NTIA)
articulated the following fair information practices and enforcement mechanisms:
1. Principles of Fair Information Practices
Fair information practices form the basis for the Privacy Act of 1974, the
legislation that protects personal information collected and maintained by the
United States government. In 1980, these principles were adopted by the
international community in the Organization for Economic Cooperation and
Development's Guidelines for the Protection of Personal Data and Transborder Data Flows.
a. Awareness. At a minimum, consumers need to know the identity of the
collector of their personal information, the intended uses of the information,
and the means by which they may limit its disclosure. Companies are
responsible for raising consumer awareness and can do so through the
following means:
1) Privacy policies. Privacy policies articulate the manner in which a
company collects, uses, and protects data, and the choices it offers
consumers to exercise rights in their personal information.
2) Notification. Notification should be written in language that is clear and
easily understood, should be displayed prominently, and should be made
available before consumers are asked to provide personal information to the
company.
3) Consumer education. Companies should teach individuals to ask for
relevant knowledge about why personal information is being collected,
what the information will be used for, how it will be protected, the
consequences of providing or withholding information, and any recourse
they may have.
b. Choice. Consumers should be given the opportunity to exercise choice with
respect to whether and how their personal information is used, either by
businesses with whom they have direct contact or by third parties.
c. Data Security. Companies creating, maintaining, using or disseminating
records of identifiable personal information must take reasonable measures to
assure its reliability for its intended use and must take reasonable precautions
to protect it from loss, misuse, alteration or destruction. Companies should
also strive to assure that the level of protection extended by third parties to
whom they transfer personal information is at a level comparable to its own.
d. Data Integrity. Companies should keep only personal data relevant for the
purposes for which it has been gathered, consistent with the principles of
awareness and choice. To the extent necessary for those purposes, the data
should be accurate, complete, and current.
e. Consumer Access. Consumers should have the opportunity for reasonable,
appropriate access to information about them that a company holds, and be
able to correct or amend that information when necessary. The extent of
access may vary from industry to industry.
f. Accountability. Companies should be held accountable for complying with
their privacy policies.
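The "Choice" and "Consumer Access" principles above can be made concrete as a data structure. The following is a minimal illustrative sketch, not taken from any real system; the names (PreferenceStore, opt_out, may_use) are hypothetical, and the default-permissive (opt-out) policy mirrors the self-regulatory regime the NTIA principles describe.

```python
class PreferenceStore:
    """Tracks, per consumer, which uses of personal data have been refused."""

    def __init__(self):
        # consumer_id -> set of uses the consumer has opted out of
        self._prefs = {}

    def opt_out(self, consumer_id, use):
        """Record a consumer's choice to bar a particular use of their data."""
        self._prefs.setdefault(consumer_id, set()).add(use)

    def may_use(self, consumer_id, use):
        """Default is permissive (opt-out); an opt-in regime would invert this."""
        return use not in self._prefs.get(consumer_id, set())


store = PreferenceStore()
store.opt_out("alice", "third_party_marketing")
print(store.may_use("alice", "third_party_marketing"))  # False
print(store.may_use("alice", "order_fulfillment"))      # True
```

The design choice worth noticing is the default: under opt-out, silence means consent, which is precisely what the E.U. Directive's "unambiguous consent" requirement (discussed below) rejects.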
2. Enforcement Mechanisms
The discussion of enforcement tools below is in no way intended to be limiting.
The private sector may design the means to provide enforcement that best suit its
needs and the needs of consumers.
a. Consumer recourse. Companies that collect and use personally identifiable
information should offer consumers mechanisms by which their complaints
and disputes can be resolved. Such mechanisms should be readily available.
b. Verification. Verification provides attestation that the assertions businesses
make about their privacy practices are true and that privacy practices have
been implemented as represented.
c. Consequences. For self-regulation to be effective, failure to comply with fair
information practices should have consequences. Examples of such
consequences include cancellation of the right to use a certifying seal or logo,
posting the name of the non-complier on a "bad-actor" list, or disqualification
from membership in an industry trade association. Non-compliers could be
required to pay the costs of determining their non-compliance. Ultimately,
sanctions should be stiff enough to be meaningful and swift enough to assure
consumers that their concerns are addressed in a timely fashion. When
companies make assertions that they are abiding by certain privacy practices
and then fail to do so, they may be liable for deceptive practices and subject to
action by the Federal Trade Commission or appropriate bank or financial
regulatory agency.
o In June 1998, the Electronic Privacy Information Center recommended the following
guidelines:
1. A site's privacy policy should be accessible from the home page by looking for the word
"Privacy."
2. Privacy policies should state clearly how and when personal information is
3. Web sites should make it possible for individuals to get access to their own data.
4. Cookie transactions should be more apparent.
European Union Data Privacy Directive
1. Key provisions of the Directive
a. Collectors of personal information must provide the data subject with notice
of their collection practices
b. A gatherer of personal information can only collect such information for
“specified, explicit and legitimate purposes”
c. Information must be adequate and relevant for the stated purpose, accurate
and current, and maintained in personal identifiable form for only the amount
of time needed to accomplish the stated purpose for collection
d. Personal identifiable information can be processed only if the subject of the
information gives unambiguous consent
e. The data subject must be given a right of access and a right to object to the
processing of his information
f. The data collector must provide for the confidentiality and security of the
data
g. Personal identifiable information may be transferred outside of the E.U. only
to countries with "adequate" privacy protection
2. Recent events surrounding the Directive
a. The European Commission said on January 11 that it would take five
European Union member states to court for failing to implement rules
designed to protect individuals' privacy on the Internet and other electronic
networks. The E.U. executive said it had decided to take France, Luxembourg,
the Netherlands, Germany and Ireland to the Luxembourg-based European
Court of Justice for failing to fully implement the E.U.'s Data Protection
Directive (Reuters, 11 January 2000)
b. Tentative agreement on the Safe Harbor was reached in March. Details are
still being finalized.
Enforcement Activity:
1. The FTC has launched investigations into a handful of web sites' business practices.
Yahoo! is the target of a current FTC investigation to determine whether it disclosed
user data to third parties in violation of federal regulations. Earlier this year,
DoubleClick was the target of an FTC investigation when it announced that it would
combine online surfing habits collected by its ad network with personal information
gleaned from transaction records. The FTC has also settled cases with sites such as KidsCom.
2. Private parties have filed six lawsuits against DoubleClick alleging deceptive privacy
practices.
Public Interest Groups:
1. Center for Democracy and Technology (http://www.cdt.org)
2. Electronic Privacy Information Center (http://epic.org)
3. Internet Privacy Coalition (http://www.privacy.org/ipc)
Government Agencies:
1. Federal Trade Commission (http://www.ftc.gov): Monitors deceptive business
practices, which include privacy practices.
2. National Telecommunications and Information Administration
(http://www.ntia.doc.gov): This agency of the U.S. Department of Commerce is
charged with studying and monitoring the status of electronic privacy protection.
Private Sector Auditors:
1. The Personalization Consortium is a group of 26 companies that police members’
privacy policies while educating consumers about personalization issues. Members
must tell consumers what data is being collected in the personalization process and let
them opt out of collection.
2. TRUSTe provides a fee-based service that promises to audit a site and issue a seal of
approval.
3. BBBOnline is developing a privacy seal program that includes verification and
consumer dispute resolution.
Industry Groups:
1. Online Privacy Alliance (http://www.privacyalliance.org): The Alliance and all its
members are strongly committed to meeting the Administration’s challenge to
develop a strong, effective program for self-regulation in the online marketplace. The
Alliance has adopted a set of guidelines for online privacy practices and a very strong
set of principles for children’s online activities.
Academic Experts:
1. Alan Westin, a professor at Columbia University, is involved in the development of a
self-regulatory privacy program for BBBOnline.
2. Mary Culnan, a professor at Georgetown University, is the author of the 1999
Georgetown Internet Privacy Policy Survey and a member of the FTC's
Advisory Committee on Access and Security.
Copyright and Data Protection in E-Business
Issues: To what extent should law protect both copyrighted information in cyberspace
and the technological means used to protect it?
Managerial Questions: Does moving operations online create new content or software
for which copyright protection should be sought? Would copyright protection be
sufficient? If not, how should e-businesses protect their data or software from
unauthorized use and distribution? How should e-businesses avoid being sued for
infringing the rights of others?
Background: U.S. copyright law has been used to protect the content of web sites, data,
and Internet software programs from unauthorized copying and distribution. Businesses
have found, however, that laws have limits in their effectiveness in cyberspace where
content can be copied quickly by anonymous users. Many companies have decided to
rely not only on traditional copyright law, but also on technical protections (anti-piracy
measures) built into web sites or software to protect against unauthorized copying and
distribution. The music industry, for example, is betting that secure copy-protection
technologies, developed under the auspices of the Secure Digital Music Initiative
(SDMI), will stop the unauthorized spread of new music through file sharing systems like
Napster or Gnutella. Cyber Patrol, a screening software to protect children from
pornographic sites, is another product with an anti-piracy feature. And CSS is the
encryption program designed to prevent unauthorized copying of DVDs.
Many of these anti-piracy measures have been “cracked” by hackers. This has created a
growing consensus that copyright law should protect not only the content, but also the
technical measures designed by firms to secure the content. In 1998, Congress passed the
Digital Millennium Copyright Act (DMCA), which makes it illegal to break through
passwords, encryption and other technological defenses that companies erect around their
Internet content. The Act was designed to implement international treaties that the U.S.
had signed at the World Intellectual Property Organization (WIPO) in 1996. The bill was
originally supported by the software and entertainment industries, and opposed by
scientists, librarians, and academics. At the last minute, certain controversial provisions
were deleted, including a provision that would have provided copyright protection for
databases even when the material in the databases was in the public domain. Some “fair
use” protections were inserted for non-profit archives, libraries and educational
institutions, and the bill was passed.
Among other things, the DMCA:
• Makes it a crime to circumvent anti-piracy measures built into most
commercial software.
• Outlaws the manufacture, sale, or distribution of code-cracking devices used
to illegally copy software.
• Allows the U.S. Copyright Office to make exemptions to the antihacking
provisions.
• Limits the liability of Internet Service Providers (ISPs) for copyright infringement
when they simply transmit information over the Internet.
• Requires “webcasters” to pay licensing fees to record companies.
Several lawsuits have been brought under the anti-circumvention provisions of the
DMCA:
• Real Networks obtained an injunction against a portion of software created by
Streambox that allowed users to capture or record “streamed” media sent via
Real Networks’ copy-encoded format.
• The Motion Picture Association of America (MPAA) filed lawsuits against
web sites that posted software, or links to it, created by a 16-year-old
Norwegian student that allowed DVDs to be played on Linux-based
computers.
• The Recording Industry Association of America (RIAA) filed a lawsuit
against start-up company Napster, which allows music fans to trade music
files directly from one another’s machine without posting them on a web site.
• Universal City Studios brought suit against sites that posted a de-encryption
program known as DeCSS. DeCSS de-encrypts CSS, a proprietary program
that precludes copying of movies stored on DVD.
A European legislative proposal bears broad resemblance to the Digital Millennium
Copyright Act.
File Sharing: A major conflict has arisen over recent software programs that allow
Internet users to share files over the Internet without paying for their use or distribution.
Currently, most of the attention is focused on file sharing programs such as Napster that
allow free exchange of music. But there are other file sharing programs that allow the
sharing of any software file on a user’s computer. Wrapster, for example, allows any
kind of file to be listed and traded over the Napster network. iMesh allows people to
swap music, video and other multimedia files. That provides a broader range of options
than Napster itself, which only supports MP3 files, but falls short of the capabilities of
the new Wrapster technique. These programs and others like them are likely to pit
software copyright owners against unauthorized users and those who assist in file
sharing.
Some technological solutions have emerged: NetPD and Media Enforcer, which allow
artists to monitor who is swapping their songs online and gather the Web addresses and
usernames of traders. But new services such as Freenet and ZeroKnowledge are being
developed that will make this job much more difficult, masking individual traces online
and distributing content more widely around the Net.
Information Aggregators and Data Base Protection: Whether databases -- collections of
facts like telephone directories, weather reports, stock tables and real estate listings,
airline schedules, medical advice, city maps, basketball scores and other information --
can be copied, repackaged and distributed by competitors and other information
aggregators. In a recent case involving Internet auction site eBay and information
aggregator Bidder’s Edge, Judge Ronald Whyte proclaimed that the “bots” launched by
Bidder's Edge were a "violation of eBay's fundamental property right to exclude others
from its computer system." The judge issued a preliminary injunction barring Bidder’s
Edge, which indexes online auctions so users can find the best deal, from automatically
harvesting information from eBay. The court said Bidder’s Edge was “trespassing” by
using the resources of eBay's computer systems without permission. According to the
judge, the law recognizes no such right to use another's property. This ruling could
effectively outlaw "deep linking." Deep links take Internet users directly to a
relevant item on another web site. They are the bread and butter of search engines,
content aggregators and comparison-shopping sites. In an earlier precedent-setting case
(Ticketmaster versus tickets.com), a U.S. court found that deep linking did not violate
copyright protection, apparently resolving controversy surrounding the practice. The
eBay case rests on the notion of “trespass” rather than “copyright.”
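The "bots" at issue in the eBay dispute are automated crawlers. The Robots Exclusion Protocol (robots.txt) is the long-standing technical convention by which a site signals which paths crawlers may harvest; whether honoring it has legal significance is exactly what such cases test. The sketch below uses Python's standard library to check a hypothetical rule set; the path and user-agent names are illustrative, not drawn from the actual litigation.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules a site might publish to fence off its listings.
rules = [
    "User-agent: *",
    "Disallow: /auction-data/",
]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved aggregator bot would consult these rules before harvesting.
print(parser.can_fetch("aggregator-bot", "https://example.com/auction-data/item1"))  # False
print(parser.can_fetch("aggregator-bot", "https://example.com/help"))                # True
```

Note that robots.txt is purely advisory: nothing in the protocol prevents a crawler from ignoring it, which is why eBay turned to the courts rather than to technology alone.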
The debate about Internet links is also being played out in the US Congress, where eBay
is alone among leading US web publishers in supporting a bill that would prohibit the
systematic extraction of information from databases compiled by companies. Yahoo!,
Amazon.com, America Online and other Internet heavyweights oppose it. Last year the
House Judiciary Committee approved a bill sponsored by Rep. Howard Coble, R-N.C.,
that would establish criminal penalties for the unauthorized use of material in databases.
Opponents say the bill would allow companies with databases to control access to facts.
The issue is just as vital to older businesses as they adapt to the Internet. For example,
real-estate agents complain that online home-sale listings have been pilfered and reused.
Publishers worry about pirating of their databases. Newspapers are nervous about
classified advertising being copied.
Some Key Players and Resources
• Rep. Howard Coble, R-N.C. Introduced bill that would establish criminal penalties
for unauthorized use of material in databases.
• Senate Judiciary Committee Chmn. Orrin Hatch (R-Utah). Has promised "a series of
hearings" on copyright problems created by new technologies.
• Lawrence Lessig, Harvard Law Professor and leading scholar on Internet and
intellectual property rights (http://cyber.law.harvard.edu/lessig.html).
• James Billington, Librarian of Congress. Will make final decisions regarding
exceptions to the DMCA’s anti-circumvention provisions.
• Robin Gross, attorney who filed comments for the Electronic Frontier Foundation
regarding Cyber Patrol litigation. Supported CPHack’s position.
• Rapper Chuck D. Wrote in a recent New York Times op-ed article that, 'Music on
the Internet is just a promotional device that helps to sell records.' Favors Napster.
• Metallica, heavy metal band. Suing Napster.com for contributory copyright
infringement.
• Michael Eisner, Disney CEO. Outspoken advocate on the need for stronger copyright
protection.
• eBay v. Bidder’s Edge, Inc., No. C-99-21200 (Northern District of California, May
24, 2000) (used the principle of “trespass to a computer system” to prevent content
aggregation by “bots”).
• Universal City Studios v. Reimerdes (January 20, 2000) (injunction against DeCSS
de-encryption software for DVDs).
Critical Infrastructure Security
Issue: New security weaknesses caused by vulnerabilities in the Internet, as well as in
web browsers and servers, have created a variety of new security risks. The types of risks
include system-modifying attacks (viruses or “hostile” applets) and Denial-of-Service
(DoS) attacks that consume a machine’s resources or make them unavailable. Attack
technology is being developed in an open source environment where a community of
interest develops this technology at a rapid pace. Several significant new forms of attack
have appeared in just the past year, such as the Melissa virus and DoS attacks. As attack
technology evolves, it can be acquired by users with significant resources to hone and
advance the technology, making it a much more serious threat to national security and the
effective operation of government and business.
Industry is acutely interested in protecting the critical infrastructure since almost 90% of
the world’s information infrastructure, including the Internet, is run by industry.
Government is also interested in protecting critical infrastructure security, as such
protection runs into national security concerns. Business and government have disagreed
on how, and by whom, critical infrastructure security should be maintained.
Background: The threats to critical infrastructure come in a variety of forms:
• Viruses. A virus is a program designed to perform some malicious action, triggered
without the user's knowledge by an innocuous event (such as a user action, a certain date
being reached, etc.). The defining characteristic of viruses is that they are self-
replicating. With the ease of passing information between users greatly enhanced by
the Internet, so too is the ease of a user unknowingly transmitting a virus. Also, the
number of new viruses appearing is escalating at an alarming rate. According to PC
Magazine, new viruses appear at the rate of more than 200 per month.
• Hostile Applets and ActiveX Controls. Hostile applets are designed to take advantage
of an applet’s capabilities. Because they are designed to execute on a user’s
computer, if they contain malicious features, they can perform hostile acts such as
damaging files or exposing them for unauthorized users to read, all without the user's
knowledge.
• Denial-of-Service Attacks (DoS). DoS attacks are among the biggest threats to reliable
computing environments. The development of the Internet with distributed systems
based on the client/server model has made many computer systems much more
vulnerable to these types of attacks. DoS attacks include several different methods of
making system resources unavailable and shutting down service.
E-mail “bombs” – Consist of hundreds of duplicate messages and large files, thus
potentially filling file systems or overloading mail servers and making them
unavailable for valid use.
“SYN flooding” – Inundates a server with requests to open new connections that
carry invalid IP addresses, tying up the server as it tries to acknowledge
unknown or nonexistent addresses.
“Ping of Death” attacks – Crash network servers or firmware by overloading them
with illegally large ping packets. (“Ping,” short for Packet Internet Groper, is
an Internet utility used to determine whether a particular IP address is online.
It is used to test and debug a network by sending out a packet and waiting for
a response.)
IP fragment attacks – The so-called “Teardrop” attack targets a weakness in the
reassembly of IP packet fragments on the destination host. When an IP packet
is sent across the Internet, it often is broken up into smaller packets. These
smaller packets indicate which data bytes of the original packet they hold (for
example, bytes 128 through 255 of packet XYZ). The Teardrop attack will
change these numbers, making them incorrect. When some destination hosts
are unable to reconstruct the original packet because of these invalid numbers,
they hang or crash.
“False alarm” attacks – Trigger automatic firewall alarms designed to close down
connections when attacked or cause other system shutdowns. In other words,
this method uses the network’s or server’s own security tools to deny service.
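The IP fragment (Teardrop) weakness described above comes down to a host trusting the offset and length fields it receives. A defensive reassembler can sanity-check fragment ranges before reconstruction. This is a simplified, hypothetical sketch of that check, not the logic of any particular operating system:

```python
def check_fragments(fragments):
    """fragments: list of (offset, length) byte ranges claimed by IP fragments.
    Returns True if the ranges are consistent, False if they overlap or are
    malformed (the condition a Teardrop-style packet deliberately creates)."""
    expected = 0
    for offset, length in sorted(fragments):
        if offset < expected or length <= 0:
            return False  # overlapping or malformed fragment: drop the packet
        expected = offset + length
    return True


# A clean packet split into three consecutive fragments:
print(check_fragments([(0, 128), (128, 128), (256, 64)]))  # True
# Teardrop-style fragments whose ranges overlap:
print(check_fragments([(0, 128), (100, 128)]))             # False
```

A host that validates ranges this way and discards inconsistent packets simply loses one datagram; a host that trusts the numbers and tries to reassemble may hang or crash, as the text notes.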
The Center for Education and Research in Information Assurance and Security at Purdue
University (CERIAS) has identified the following key trends and factors facilitating
cyber attacks on critical infrastructures:
1. Attack technology is developing in an open-source environment and is evolving
rapidly. Technology producers, system administrators, and users are improving
their ability to react to emerging problems, but they are behind and significant
damage to systems and infrastructure can occur before effective defenses can be
implemented. As long as defensive strategies are reactionary, this situation will
worsen.
2. Currently, there are tens of thousands, perhaps even millions, of systems with
weak security connected to the Internet. Attackers are compromising
these machines and building attack networks, and will continue to do so.
Attack technology takes advantage
of the power of the Internet to exploit its own weaknesses and overcome defenses.
3. Increasingly complex software is being written by programmers who have no
training in writing secure code and are working in organizations that sacrifice the
safety of their clients for speed to market. This complex software is then being
deployed in security-critical environments and applications, to the detriment of
their users.
4. User demand for new software features over security ones, coupled with industry
response to that demand, has resulted in software that is increasingly supportive
of subversion, computer viruses, data theft, and other malicious acts.
5. Because of the scope and variety of the Internet, changing any particular piece of
technology usually cannot eliminate newly emerging problems; broad community
action is required. While point solutions can help dampen the effects of attacks,
robust solutions will come only with concentrated effort over several years.
6. The explosion in use of the Internet is straining our scarce technical talent. The
average level of system administrator technical competence has decreased
dramatically in the last 5 years as non-technical people are pressed into service as
system administrators. Additionally, there has been little organized support of
higher education programs that can train and produce new scientists and educators
with meaningful experience and expertise in this emerging discipline.
7. The evolution of attack technology and the deployment of attack tools transcend
geography and national boundaries. Solutions must be international in scope.
8. The difficulty of criminal investigation of cybercrime coupled with the
complexity of international law mean that successful apprehension and
prosecution of computer crime is unlikely, and thus little deterrent value is
realized.
9. The number of directly connected homes, schools, libraries and other venues
without trained system administration and security staff is rapidly increasing.
These "always-on, rarely-protected" systems allow attackers to continue to add
new systems to their arsenal of captured weapons.
Network firewalls are commonly used to enforce a site’s security policy by controlling
the flow of traffic between two or more networks. Firewalls often are placed between the
corporate network and an external network such as the Internet or a partnering company’s
network. However, firewalls are also used to segment parts of corporate networks. A
firewall system provides both a perimeter defense and a control point for monitoring
access to and from specific networks.
Conflict: To improve critical infrastructure security, the U.S. government has suggested
interoperability of products and systems through standard-setting efforts. Many
businesses, however, endorse adopting best practices for tackling critical infrastructure
issues rather than setting standards. They believe that the marketplace and not the federal
government should dictate preferred technologies (which would become de facto
standards). Many in the IT industry view standards as a snapshot of technology at a
given moment, creating the risks that technology becomes frozen in place, or that
participants coalesce around the "wrong" standards. Many IT professionals favor an open
source model for developing best practices, a model that is not constrained by technical
rules or regulations.
There also exists a debate regarding whether to consolidate activities regarding collection
and analysis of cyber attacks. FBI director Louis Freeh and the Critical Infrastructure
Assurance Office favor a single location for the collection, analysis, and dissemination of
information regarding security threats. Industry prefers the more diffuse approach that
currently exists, whereby multiple organizations work to evaluate vulnerabilities
and threats and develop technical solutions. The challenge from this perspective
is not to pull all data together, but to push it out to meet the varying needs of the various
constituencies.
Richard Pethia of CERT stresses information sharing as the fundamental component to
preventing cyber attacks. He maintains that IT professionals understand they can never
hope to eliminate every vulnerability in their system. Therefore, they need data to help
them determine which vulnerabilities are most critical and therefore likely to be
exploited. Pethia states: “Our law enforcement and intelligence organizations must find
ways to release threat data to the operational managers of information infrastructures to
motivate these managers to take action and to help them understand how to set their
priorities."
Information sharing about cyberattacks, however, is problematic. Companies are
currently reluctant to share sensitive information about security practices and network
breaches with either government agencies or their competitors. Companies worry that
trade secrets or other proprietary information could be compromised in the exchange.
Additionally, they worry that the information on intrusions could be used against them in
shareholder lawsuits, jeopardize their customer base, or even prove beneficial to the
hacker community. Companies also fear sharing this information with government
because of the possibility it may lead to increased regulation of the industry or e-
commerce generally. Moreover, companies are concerned with protecting individual
customers' privacy and fear that privacy breaches may occur inadvertently during
information infrastructure investigations.
Currently, corporations often have more to lose from damaged reputations than from the
network attacks themselves. These organizations will not share security incident or loss
information unless they have a high degree of confidence that this information will be
protected from public disclosure. Industry professionals are urging the federal
government to take steps to protect sensitive information, including creating exemptions
from Freedom of Information Act (FOIA) requests. Many in industry believe that
freedom from FOIA concerns is the most formidable obstacle, and that an exemption for
this type of information sharing is the only option. Opponents of proposals to relax FOIA
provisions believe industry might use the relaxed standards to protect itself from
disclosing damaging information that should be released to the public.
FBI Director Louis Freeh believes safeguards are currently in place to protect sensitive
information. In his testimony before the Senate Judiciary Subcommittee on Technology,
Terrorism and Government, he stated that under the Economic Espionage Act, passed in
1996, there are specific provisions for maintaining the confidentiality of information
obtained during the process of a criminal prosecution. Therefore, any proprietary
information is under specific and court-ordered protection to ensure it is not
compromised in the course of the prosecution.
Additional Proposals to Improve Critical Infrastructure Security: In addition to the
debated solutions above, the Information Technology Association of America (ITAA)
and CERT have suggested additional approaches to improving the current mechanisms
for combating threats and responding to attacks on the nation’s critical infrastructure.
1. Building Awareness. The ITAA and its member companies are raising awareness of
the issue within the IT industry and through partnership relationships with other vertical
industries, including finance, telecommunications, energy, transportation, and health
services. An awareness-raising campaign targeting the IT industry and vertical industries
dependent on informationsuch the financial sector, insurance, electricity, transportation
and telecommunicationsis being overlaid with a community effort directed at CEOs,
end users and independent auditors. The goal of the awareness campaign is to educate the
audiences on the importance of protecting a company's infrastructure, and to instruct
them in the steps they can take to accomplish this. The message is that information security must
become a top tier priority for businesses and individuals.
2. Educating Computer Users. In an effort to take a longer-range approach to the
development of appropriate conduct on the Internet, the Department of Justice and the
ITAA have formed the Cybercitizen Partnership. The Partnership is a public/private
sector venture formed to create awareness, in children, of appropriate on-line conduct.
The effort focuses on developing an understanding of the ethical behavior and
responsibilities that accompany use of the Internet. The Partnership will develop focused
messages, curriculum guides and parental-information materials aimed at instilling a
knowledge and understanding of appropriate behavior online. The ITAA believes that a
long-range, ongoing effort to ensure proper behavior is the best defense against the
growing number of reported incidents of computer crime.
3. Expanding Research and Development. ITAA believes that between industry's
market-driven R&D and government's defense-oriented R&D projects, gaps may be
emerging that no market forces or government mandates will address. ITAA and its
member companies actively support President Clinton's call for an Institute for
Information Infrastructure Protection. This institute, under consideration by the
President's Committee of Advisors on Science and Technology, will focus limited
government funding on targeted R&D projects conducted through consortia of industry,
academia and government.
Key Groups and Organizations
• The Information Technology Association of America (ITAA) provides global public
policy, business networking, and national leadership to promote the continued rapid
growth of the IT industry. ITAA consists of 400 direct and 26,000 affiliate corporate
members throughout the U.S., and a global network of 41 countries' IT associations.
ITAA members range from the smallest IT start-ups to industry leaders in the
Internet, software, IT services, ASP, digital content, systems integration,
telecommunications, and enterprise solution fields. (www.itaa.org).
• The National Infrastructure Protection Center (NIPC) is a multi-agency organization
whose mission is to detect, warn of, respond to, and investigate computer intrusions
and other unlawful acts that threaten or target our Nation's critical infrastructures.
Located in the FBI's headquarters building in Washington, D.C., the NIPC brings
together representatives from the FBI, other U.S. government agencies, state and
local governments, and the private sector in a partnership to protect our Nation's
critical infrastructures. (www.nipc.gov).
• The President’s Commission on Critical Infrastructure Protection (PCCIP) was
formed to advise and assist the President of the United States by recommending a
national strategy for protecting and assuring critical infrastructures from physical and
cyber threats. (www.pccip.ncr.gov)
• The Critical Infrastructure Assurance Office (CIAO) is a government agency charged
with plotting a federal plan for protecting the nation's critical infrastructures from
disruption or attack. (www.ciao.gov)
• The Institute of Internal Auditors will be holding a series of briefings and meetings
around the country, in conjunction with the CIAO and ITAA, to discuss critical
infrastructure issues as they relate to internal company audits by accounting firms.
• Americans for Computer Privacy (ACP) is a broad-based coalition representing
financial services, manufacturing, telecommunications, high-tech and transportation,
as well as law enforcement, civil liberty, pro-family and taxpayer groups. ACP
supports policies that promote industry-led, market-driven solutions to critical
information infrastructure protection and that oppose government efforts to impose
mandates or design standards, or increase widespread monitoring or surveillance.
• The Center for Education and Research in Information Assurance and Security at
Purdue University (CERIAS) is a center for multidisciplinary research and education
in areas of information security. (www.cerias.purdue.edu).
• The CERT Analysis Center was recently established to address the threat posed by
rapidly evolving, technologically advanced forms of cyberattacks. Working with
sponsors and associates, the CERT Analysis Center collects and analyzes information
assurance data to develop detection and mitigation strategies that provide high-
leverage solutions to information assurance problems, including countermeasures for
new vulnerabilities and emerging threats. The CERT Analysis Center builds upon the
work of the CERT Coordination Center. The CERT Analysis Center extends current
incident response capabilities by developing and transitioning protective measures
and mitigation strategies to defend against advanced forms of attack before they are
launched. Additionally, it provides the public and private sectors with opportunities
for much-needed collaboration and information sharing to improve cyber attack response.
• International Centre for Security Analysis (ICSA). Based at King's College London,
ICSA is an international center of excellence that conducts research on the policy and
technological implications of information assurance. ICSA addresses both the
economic and defense aspects of the threats posed by electronic attack. ICSA is
hosting the IAAC in order to enhance its research base and to strengthen links
between academia and private and public sector end-users. (www.icsa.ac.uk)
• World Information Technology and Services Alliance (WITSA). WITSA consists of
the national information industry representative bodies from around the world. Its role
is to develop public policy positions on issues of concern to the information industry
and present these positions to governments and international organizations.
• Richard D. Pethia, Director of the CERT Centers, Software Engineering Institute
(SEI), Carnegie Mellon University
• Harris Miller, President of the Information Technology Association of America
(ITAA) and President of the World Information Technology and Services Alliance
• John S. Tritak, Director of the Critical Infrastructure Assurance Office (CIAO). As
Director, Mr. Tritak is responsible for supporting the National Coordinator for
Security, Infrastructure Protection, and Counter-Terrorism in the development of an
integrated National Infrastructure Assurance Plan to address threats to the nation's
critical infrastructures, including communications and electronic systems,
transportation, energy, banking and finance, health and medical services, water
supply, and key government services. As Director, he will also coordinate a national
education and awareness program, as well as legislative and public affairs initiatives.
• Louis J. Freeh, Director of the Federal Bureau of Investigation, U.S. Department of Justice
• Sen. Jon Kyl (R-AZ), Chairman of the Terrorism, Technology and Government
Information Subcommittee of the Senate Judiciary Committee
Major Legislation. Emerging federal computer crime legislation can be divided into
three broad categories:
1. enhanced law enforcement of cybercrime suspects
2. technical solutions to breaches of network security
3. improved information sharing
Enhanced Law Enforcement. Senate Bill 2092, the Schumer-Kyl High-Tech Crime Bill,
seeks to modify Title 18 of the United States Code relating to the use of pen registers and
trap-and-trace devices. The bill provides law enforcement with nationwide trap-and-trace
authority. Under current law, investigators who are trying to track a hacker must obtain a
trap-and-trace order in each jurisdiction through which an electronic communication is
made. S. 9092 amends current law to authorize the issuance of a single order to
completely trace online communications to its source, regardless of how many
intermediary sites through which it passes. Industry has expressed some concern that the
bill would create undue administrative and financial burdens on the part of ISPs and other
telecommunications companies to comply with the trap-and-trace provisions, not to
mention the possibility of breaching privacy policies they have established with their
customers. Another industry representative doubts the bill will be enacted in the
immediate future, if at all, due to a controversial provision that would treat some juvenile
offenders as adults in a criminal proceeding.
Technical Solutions. HR 2413, the Computer Security Enhancement Act of 1999,
outlines a fellowship program to increase the number of skilled IT workers. There is
currently a critical shortage of IT professionals and more specifically, an acute shortage
of information security specialists. Expanding workforce development is a key
prerequisite for protecting the nation’s critical infrastructure.
Encouraging Information Sharing. Bi-partisan information sharing legislation is expected
to be introduced in the House of Representatives by Congressmen Tom Davis and James
Moran, both of Virginia, within the next few weeks. The bill will seek to promote the
formation of Information Sharing and Analysis Centers (ISACs) to facilitate the
collection, analysis and dissemination of security data to government and industry. The
bill will also create exemptions from Freedom of Information Act (FOIA) requests for
information on network attacks on certain firms. The hope is that industry will feel more
inclined to share information knowing that it will not be subject to a FOIA request. The
bill also contains provisions that encourage information sharing without creating liability.
Encryption/Cryptography
Issue: One of the principal aims of information security is data integrity, that is, ensuring
that data in a file remains unchanged or that any received data matches what was sent.
Encryption (the conversion of data into an unreadable form via an encryption algorithm)
enables information to be sent across communication networks, which are assumed to be
insecure, without losing confidentiality or integrity. Encryption can also be used for user
authentication. For example, Lotus Notes uses encryption both for message
confidentiality and to verify the sender’s identity to the recipient. Encryption provides
assurances when the computer system or network cannot be trusted.
Encryption is gaining popularity as more companies begin to rely on shared public
networks such as the Internet rather than private leased lines for e-mail and electronic
commerce. Encryption helps protect transmission of payment data, such as credit card
information, and addresses problems of authentication and message integrity.
Authentication refers to the ability of each party to know that the other parties are who
they claim to be. Message integrity is the ability to be certain that the message that is
sent is not altered or copied before reaching the recipient.
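The message-integrity idea described above can be sketched with a cryptographic hash from Python's standard library (the messages and amounts here are hypothetical examples): the sender transmits a digest alongside the message, and any alteration in transit changes the digest.

```python
# Minimal integrity-check sketch using SHA-256 from Python's hashlib.
# If even one bit of the message changes in transit, the digest the
# recipient computes no longer matches the one that was sent.
import hashlib

def digest(message: bytes) -> str:
    return hashlib.sha256(message).hexdigest()

sent = b"Pay $100 to Alice"
fingerprint = digest(sent)               # transmitted with the message

tampered = b"Pay $900 to Alice"          # altered in transit
print(digest(sent) == fingerprint)       # True: message unchanged
print(digest(tampered) == fingerprint)   # False: tampering detected
```

A bare hash only detects changes when the digest itself travels safely; in practice the digest is combined with a key or a signature, as discussed under digital signatures below.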
Background: An encryption algorithm transforms plain text into a coded equivalent
(known as cipher text) for transmission or storage. The cipher text is decrypted at the
receiving end and restored to plain text. The algorithm uses a key, a binary number
typically from 40 to 128 bits in length for single-key systems or 512 to 2,048 bits or more
for public-key systems. The data is “locked” for sending by using bits in the key to
transform the data bits mathematically. At the receiving end, the key is used to
unscramble the data, restoring it to its original binary form.
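The lock/unlock symmetry just described can be sketched with a toy XOR transform (an illustration only, not a real cipher such as DES; the key and plaintext are made up): the same key bits that scramble the data also restore it.

```python
# Toy single-key "cipher": XOR each data byte with a repeating key.
# Applying the same key a second time restores the original bits,
# which is the symmetry the text describes.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"\x5a\xc3\x19"                    # shared secret (toy 24-bit key)
plaintext = b"wire $500 to account 42"
ciphertext = xor_cipher(plaintext, key)  # "locked" for sending
restored = xor_cipher(ciphertext, key)   # same key unscrambles it

print(ciphertext != plaintext)  # True: data is scrambled in transit
print(restored == plaintext)    # True: recipient recovers the original
```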
The effort required to decode the unusable scrambled bits into meaningful data without
knowledge of the key – known as breaking or cracking the encryption – typically is a
function of the complexity of the algorithm and the length of the keys. In most effective
encryption schemes, the longer the key, the harder it is to decode the encrypted message.
Two types of algorithms are in use today: (1) shared single key (known as secret key or
symmetric key) and (2) public key (or asymmetric key).
1. Single Key Encryption. In single-key algorithms, the same binary number is required
to encrypt and decrypt the data. This single key must be kept secret for the information
to remain secure. Therefore, a different shared key is required for each pair of users. The
system is symmetric in that the same key and the same algorithm are used for both
encryption and decryption.
The Data Encryption Standard (DES), which officially became a U.S. government
standard in 1977, is the leading single-key algorithm, with the standard specifying a 56-
bit key. Many experts consider longer key lengths of at least 90 bits necessary for the
future. U.S. military-strength encryption requires key lengths of 1,024 bits or more.
In 1998, RSA Data Security conducted a contest to see how quickly a 56-bit DES key
could be broken. In July 1998, a team from the Electronic Frontier Foundation cracked a
56-bit key in 56 hours.
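A back-of-the-envelope extrapolation from the EFF result (assuming, hypothetically, a constant search rate, which advancing hardware would of course outpace) shows why each added key bit matters: every extra bit doubles the keyspace.

```python
# Extrapolate exhaustive-search time from the EFF result: a 56-bit key
# fell in 56 hours, so at the same rate a 57-bit key would take ~112
# hours, and each further bit doubles the work again.
keys_per_hour = 2**56 / 56   # assumed constant search rate

for bits in (56, 64, 90, 128):
    hours = 2**bits / keys_per_hour
    print(f"{bits}-bit key: about {hours / (24 * 365):.3g} years")
```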
Businesses are beginning to explore encryption methods other than those based solely on
56-bit DES keys, including:
1) Triple-DES – Encrypts information three times using two different 56-bit keys,
thus increasing the effective key size of DES so that it is computationally more
secure and, therefore, more difficult to break. Triple-DES has an effective key
length of 112 bits.
The benefits of triple-DES include the fact that no known attacks have succeeded
in breaking two 56-bit keys, it is incorporated easily into existing systems, and it
is a standards-based algorithm. Drawbacks include the computing power required
(three times that of normal DES) and the difficulty of managing and distributing
keys associated with any secret-key algorithm.
2) International Data Encryption Algorithm (IDEA) – Encrypts information using
a 128-bit key and 8 rounds. IDEA is recognized as a fast, Triple-DES-equivalent
cipher. IDEA is considered secure, with no algebraic weaknesses that might
make it susceptible to being broken. IDEA can be implemented in software or
hardware and has similar performance characteristics to DES.
2. Public-Key Encryption. The other major type of algorithm in popular use is public-key
encryption, which is based on two keys: one to encrypt the message and another to
decrypt the message. The algorithm is not symmetric, so knowing the public encryption
key is no help in being able to decrypt a message. Users wanting to receive encrypted
information can announce their public key, which then is used by the sender to encrypt
data to be sent to them. Public keys are typically stored in a public directory. Only the
holder of the private key can decrypt the data.
Public keys are attached to a digital certificate, which ties the user’s identity to the public
key. The problem of managing a large number of public keys and making them widely
available (yet easily revoked by their owners) is the primary challenge that should be
addressed. Public-key encryption is gaining in popularity with the growth of e-commerce
over the Internet, in particular because it does not require the exchange of private keys
before sending encrypted messages, unlike single-key encryption.
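The asymmetry described above can be sketched with textbook toy-RSA numbers (tiny primes chosen purely for illustration; real public keys are 512 to 2,048+ bits): anyone can encrypt with the announced public pair, but only the private exponent decrypts.

```python
# Hypothetical toy RSA with tiny textbook primes, to illustrate that
# knowing the public key (e, n) does not help decrypt; only the
# private exponent d does. Not usable for real security.
p, q = 61, 53
n = p * q                      # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                         # public exponent (announced openly)
d = pow(e, -1, phi)            # private exponent (kept secret)

message = 42                   # plaintext encoded as a number < n
ciphertext = pow(message, e, n)     # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)   # only the holder of d can decrypt
print(recovered == message)         # True
```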
The most commonly used public key algorithm is RSA, created by RSA Data Security.
RSA Data Security’s recommended key sizes are now 768 bits for personal use, 1,024
bits for corporate use, and 2,048 bits for valuable keys such as the key of a certificate
authority. RSA Data Security expects a 768-bit key to be secure until 2004.
Recommended key length schedules are published on RSA Data Security’s web site.
Digital Signatures. One application of public key encryption is evident in the
development of digital signatures. A digital signature is an encrypted alphanumeric code
attached to an electronic message that is both unique to the message and unique to the
person sending it. The digital signature is assigned to the document by a digital signature
software program. The sender then encrypts the alphanumeric code using his private key.
The recipient verifies authenticity of the digital signature by using the sender’s public key
to decrypt the message.
If the verification process confirms the digital signature, the recipient has reasonable
assurance that the message is authentic and has not been altered. While in theory only
the sender can access his private key, there is a potential for the private key to be
compromised if it is not protected. Certification Authorities (CAs) or other trusted third-
parties can provide some assurance that available public keys correspond to the signer’s
private key. CAs can also revoke or suspend public keys, rendering the associated
private key useless.
Standardizing digital signatures using a public key infrastructure (PKI) is preferable
because it ensures a high degree of data integrity and authentication while enabling users
to conduct business transactions with multiple business partners, suppliers and customers
without mandating a technology choice. In sum, digital signatures accomplish four goals:
1. Ensure data integrity – The recipient can determine if the data has been altered.
2. Ensure confidentiality – The sender can encrypt data such that only certain
recipients can decrypt that data.
3. Ensure non-repudiation – The sender cannot later deny having sent the message,
because only the sender’s private key could have produced the signature.
4. Provide authentication – The digital signature allows the recipient to identify who
signed the message.
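The sign/verify flow just described can be sketched using textbook toy-RSA numbers (n = 3233, e = 17, d = 2753, chosen purely for illustration; real systems use full-size keys and standardized hash padding):

```python
# Sketch of digital signing: the sender "encrypts" a hash of the message
# with the private key; anyone can verify with the public key. The
# digest is truncated so it fits below the toy modulus n.
import hashlib

n, e, d = 3233, 17, 2753          # toy public (e, n) / private (d, n)

def toy_hash(message: bytes) -> int:
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes) -> int:
    return pow(toy_hash(message), d, n)       # sender's private key

def verify(message: bytes, signature: int) -> bool:
    return pow(signature, e, n) == toy_hash(message)  # sender's public key

msg = b"ship 10 units"
sig = sign(msg)
print(verify(msg, sig))              # True: authentic and unaltered
print(verify(msg, (sig + 1) % n))    # False: a tampered signature fails
```

Because modular exponentiation with a valid RSA key pair is a bijection, any change to the signature (or, in practice, to the message and hence its hash) causes verification to fail.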
• PKI as the preferred encryption technology. PKIs, including the PKI-based digital
certificates and signatures, are becoming the authentication system of choice for
conducting e-business on the Internet. Reasons include a price decline in PKI
products stemming from a fierce battle among suppliers to gain market share, as well as
fundamental improvements in the system making it more flexible and easier to use.
The primary application for PKI is b-to-b e-commerce with enterprise customers,
business partners and suppliers. Applications driving the adoption of PKI include
Internet-based financial transactions and customer service. IS managers are also
deploying PKI for use with Internet-based b-to-c e-commerce, electronic funds
transfer and sales applications.
The principal reason IS managers are selecting PKI-based systems is to manage
enterprise risk arising from the use of Internet channels to conduct business.
Compared with alternate authentication systems, only PKI-based digital certificates
and signatures can be relied on to mitigate the financial risks associated with e-commerce.
IS managers have identified the following criteria for authentication systems:
1. It must provide validity and integrity for invoicing and revenue-recognition purposes.
2. It must provide widespread and ubiquitous interoperability.
3. It must meet financial, auditing, legal and uniform commercial standards.
4. It must be economically practical to deploy and maintain.
5. It must be difficult or economically impractical to steal or duplicate.
Traditional access security systems – passwords, hardware tokens, and biometric
systems - fail to meet these requirements. Although passwords are the most common
form of authentication, they do not provide sufficient proof of who an Internet user
claims to be. Hardware tokens are impractical because of deployment barriers. To be
deployed effectively, IS managers would have to force customers, suppliers, and
business partners to adopt the enterprise’s specific technology choice. Rather than
making it easier to conduct business with the enterprise, token-based systems may
merely route customers to competitors. Biometric signatures such as retinal scans,
fingerprinting and voice signatures provide the best proof of identity. However,
biometric systems are prohibitively costly to implement. Moreover, they are difficult
to deploy because they require customers to submit biometric signatures.
However, the PKI-digital certificate system is not without security weaknesses.
Security analysts maintain that an astute hacker can access the private key over the
Internet. Therefore, sealing entry to the private key with only a user name and
password is not acceptable. Security developers have developed a variety of
solutions to this problem, collectively known as “extended user authentication.”
Essentially, these technologies, which can be hardware or software-based, require the
user to enter some form of secured identification to access the password or the private key.
There has been some debate as well regarding authorization-linked digital signatures
versus identity-linked digital signatures. While governments have invested
significant effort in developing the latter, IT professionals believe authorization
linked signatures will be more important in protecting digital transactions.
• Government bans on strong encryption exports. The government’s concern with
cryptography centers on its ability to ensure the continuing viability of intelligence
operations. With the advent of strong encryption techniques, intelligence gathering
organizations throughout the world are justifiably concerned that intelligence
gathering measures will be rendered obsolete. Therefore, extensive deployment of
strong cryptography poses a serious security threat. However, providing government
access to keys undermines confidence in cryptography, thereby slowing its deployment.
The U.S. government announced in September 1999 its revised approach to
encryption. The Clinton Administration’s policy seeks to balance a competing range
of national interests, including promoting e-commerce, supporting law enforcement
and national security, and protecting privacy. In short, under the
new policy, any encryption commodity or software of any key length may be
exported under license exception, after a technical review, to individuals, commercial
firms, and other non-government end users in any country except for the seven state
supporters of terrorism (Iran, Iraq, Libya, Syria, Sudan, North Korea and Cuba). Any
retail encryption commodities and software of any key length may be exported under
license exception, after a technical review, to any end user in any country, except for
the seven state supporters of terrorism. Streamlined post-export reporting will
provide government with an understanding of where strong encryption is being
exported, while also reflecting industry business models and distribution channels.
On April 3, the Electronic Privacy Information Center (EPIC) released a study on
encryption policies in 135 countries. Cryptography and Liberty 2000 finds that
the trend toward relaxation of export controls is continuing, but also that law
enforcement agencies are seeking new authority and new funding to gain access to
private keys and personal communications.
• How to manage the key network? Key recovery can be thought of as an encryption
system (with a backup decryption capability) that allows authorized individuals, such
as company officers or government officials, to decrypt encrypted text with the help
of information supplied by one or more trusted parties who hold special data recovery
keys. These data recovery keys are not the same as keys used to encrypt and decrypt
the data, but rather provide a means of determining the data encryption/decryption
keys. The term key escrow refers to the safeguarding of these data recovery keys
with a government entity or government-licensed escrow agent.
Key recovery mechanisms differ from key escrow in that the former provides a means
of recovering the session key of a message so that in an emergency or for law
enforcement requirements, the session key that encrypted a file can be recovered and
that file (and only that file) can be decrypted. Typically, key recovery schemes use a
random session key encrypted with the public key of the recipient as well as being
encrypted with the public key of the key recovery center. The key recovery center
then can unlock the random key used to encrypt that particular message or data file.
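The session-key arrangement just described can be sketched with two toy-RSA keypairs standing in for the recipient and the key recovery center (all numbers are illustrative, not a real protocol): the sender wraps the same random session key under both public keys, so either private-key holder can recover it independently.

```python
# Hypothetical key-recovery sketch: one session key, wrapped under two
# public keys. The recipient decrypts normally; the recovery center can
# unlock the session key for that one message if lawfully required.
recipient = {"n": 3233, "e": 17, "d": 2753}        # toy keypair (61*53)
recovery_center = {"n": 2773, "e": 17, "d": 157}   # toy keypair (47*59)

session_key = 99   # random per-message key that actually encrypts the file

# The sender wraps the same session key under both public keys.
wrapped_for_recipient = pow(session_key, recipient["e"], recipient["n"])
wrapped_for_center = pow(session_key, recovery_center["e"], recovery_center["n"])

# Either private-key holder recovers the session key independently.
print(pow(wrapped_for_recipient, recipient["d"], recipient["n"]))          # 99
print(pow(wrapped_for_center, recovery_center["d"], recovery_center["n"]))  # 99
```

Note that only the session key for this particular message is exposed to the recovery center, which is the property distinguishing key recovery from escrowing the long-term private key itself.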
The Key Recovery Alliance (KRA), a consortium of over 60 companies
dedicated to strong encryption and to helping define a policy framework for
businesses and institutions, believes that many of the more recent forms of
key recovery offer stronger protection against unlawful search and seizure.
Nevertheless, the KRA makes the following recommendations:
1. Establish legal access standards for government to Key Recovery
information under conditions of due process, including procedures clearly
stating the government's accountability and auditability.
2. Establish standards for the retention and destruction of Key Recovery
information once it is acquired by government under lawful means. Once
government acquires recovery information through duly authorized means
(e.g., under court order), it must operate under clearly defined standards
established by law governing the use and destruction of such information.
3. Establish procedures guaranteeing that government agencies, once key
recovery information has been acquired and managed according to the two
preceding items, will not use the information to modify the treatment of
content in any form.
Key Groups and Organizations
1. The Center for Democracy and Technology (www.cdt.org) works to promote
democratic values and constitutional liberties in the digital age. With expertise in law,
technology, and policy, CDT seeks practical solutions to enhance free expression and
privacy in global communications technologies. CDT is dedicated to building
consensus among all parties interested in the future of the Internet and other new communications media.
2. The Key Recovery Alliance (www.kra.org) is a group of more than 60 international
companies (including IBM and TIS) that is dedicated to strong encryption and to
helping define a policy framework for businesses and institutions. The Alliance
focuses on the interoperability of key recovery technologies while supporting a wide
range of existing industry solutions.
3. The Identrus Pilot Project (www.identrus.com) is a global trust organization created
to provide authentication for digital certificates. Founding members include Bank of
America, ABN AMRO, Bankers Trust, Barclays Bank, Chase Manhattan Bank,
Citigroup, Deutsche Bank, and Hypo Vereinsbank. Using PKI technology, Identrus
aims to establish a secure, global business-to-business e-commerce network by
providing global CA services for b-to-b transactions. Initial users will be the
corporate customers of the founding banks.
4. Business Software Alliance (www.bsa.org) is a trade organization representing the
world's leading software developers before governments and with consumers in the
international marketplace. BSA educates computer users on software copyrights;
advocates public policy that fosters innovation and expands trade opportunities; and
fights software piracy. BSA worldwide members include Adobe, Autodesk, Bentley
Systems, Corel, Lotus Development, Macromedia, Microsoft, Network Associates,
Novell, Symantec and Visio.
5. Americans for Computer Privacy (www.computerprivacy.org) is a broad-based
coalition that brings together more than 100 companies and 40 associations
representing financial services, manufacturing, telecommunications, high-tech and
transportation, as well as law enforcement, civil-liberty, pro-family and taxpayer
groups. ACP supports policies that advance the rights of American citizens to encode
information without fear of government intrusion, and advocates the lifting of export
restrictions on U.S.-made encryption products.
6. The Electronic Frontier Foundation (www.eff.org) has been established to help
civilize the electronic frontier; to make it truly useful and beneficial not just to a
technical elite, but to everyone; and to do this in a way which is in keeping with
society's highest traditions of the free and open flow of information and communication.
7. The Electronic Privacy Information Center (www.epic.org) is a public interest
research center in Washington, D.C. It was established in 1994 to focus public
attention on emerging civil liberties issues and to protect privacy, the First
Amendment, and constitutional values.
8. The Global Liberty Internet Campaign (www.gilc.org) advocates prohibiting prior
censorship of on-line communication, requiring that laws restricting the content of
on-line speech distinguish between the liability of content providers and the liability
of data carriers, insisting that on-line free expression not be restricted by indirect
means such as excessively restrictive governmental or private controls over computer
hardware or software, telecommunications infrastructure, or other essential
components of the Internet.
9. Privacy International (www.privacyinternational.org) is a human-rights group formed
in 1990 as a watchdog on surveillance by governments and corporations. PI is based
in London, England, and has an office in Washington, D.C. PI has conducted
campaigns throughout the world on issues ranging from wiretapping and national
security activities, to ID cards, video surveillance, data matching, police information
systems, and medical privacy.
Major Legislation/Regulation/Court Decisions:
1. The “Security and Freedom Through Encryption” (SAFE) Act would lift current
export restrictions. On February 25, 1999, Reps. Bob Goodlatte, R-Va., and Zoe
Lofgren, D-Calif., introduced H.R. 850, the Security and Freedom through
Encryption (SAFE) Act, which seeks to protect the rights of
Americans to use the strongest possible encryption while lifting export
restrictions on U.S. encryption products. The SAFE Act has bipartisan support.
To date, more than 250 members of Congress, including House Majority Leader
Dick Armey, House Minority Leader Richard Gephardt and House Minority
Whip David Bonior have cosponsored the legislation. The SAFE Act would
accomplish the following objectives: (1) affirm Americans' freedom to use the
strongest possible encryption; (2) defeat attempts to force Americans to provide
the government with some government-approved "third party" with "keys" to
their encrypted information; and (3) allow the U.S. to compete in the rapidly
growing market for strong encryption products.
2. The "Promote Reliable Online Transactions to Encourage Commerce and Trade
(PROTECT) Act of 1999 was introduced on April 14, 1999 by Senator McCain
(R-AZ), Senator Burns (R-MT), Senator Wyden (D-OR), Senator Leahy (D-VT),
Senator Abraham (R-MI) and Senator Kerry (D-MA). The PROTECT Act would
accomplish the following objectives: (1) Immediately decontrol 64-bit encryption
products; (2) Direct NIST to complete development of the Advanced Encryption
Standard (AES), a strong new global standard based on encryption of 128 bits and
higher, and decontrol export of AES and equivalent products by 2002; (3) allow
export of strong encryption products to certain trusted end-users, export of
recoverable products, and export of "crypto-ready" products; (4) allow export of
generally available products over 64 bits; (5) prohibit domestic controls on
encryption of any strength; and (6) prohibit any federal or state agency from
requiring, setting standards for, or providing incentives requiring key recovery "or
any other plaintext access capability."
The PROTECT Act does not go as far as the SAFE Act in that it does not
contain criminal provisions. The SAFE Act contains provisions that penalize the
use of encryption in the furtherance of a crime. These provisions have long
been a concern for privacy advocates because, while narrowly drafted, they
represent the first domestic restrictions that threaten to chill the use of
encryption. The PROTECT Act does not contain any of these criminal provisions.
3. The U.S. Court of Appeals for the Sixth Circuit ruled on April 4, 2000 that
"because computer source code is an expressive means for the exchange of
information and ideas about computer programming, . . . it is protected by the
First Amendment." The court decision was issued in Junger v. Daley, a legal
challenge to U.S. export controls on encryption software. The case, in which
EPIC filed an amicus brief, will now return to a lower court for consideration of
the impact of new U.S. export regulations issued in January.
4. The major initiative regarding the enforceability of electronic contracts is
enactment of the Uniform Electronic Transactions Act (known as UETA). UETA
states that if a signature is required for a contract to be enforceable, any electronic
signature satisfies that requirement. Unfortunately, there are several problems
with UETA. First, many states may be reluctant to enact UETA because of
concern that certain types of electronic signatures aren't appropriate for all
contracts and transactions involving a signature. Second, even if UETA is
enacted, it doesn't address the evidentiary issues relating to electronic signatures.
The federal government is also considering legislation similar to UETA. Both the
House and Senate have passed bills (H.R. 1714 and S. 761) that would preempt
any state law that doesn't recognize the validity of electronic signatures.
Although these bills have been referred to a conference committee and appear
likely to be enacted, significant opposition has emerged. Consumer protection
advocates believe this legislation will erode consumer protection statutes by
letting companies use e-mail or other unreliable electronic records to satisfy
notice requirements. Many states are opposed to the preemption of state contract
law, and question whether preemption of all state contract and signature laws is warranted.
Some states have enacted legislation specifically relating to digital signatures
(and a few address biometric signatures). These laws can be particularly
helpful because they not only provide that a digital signature satisfies any
signature requirement, they also create presumptions that the evidentiary
burdens are satisfied, eliminating evidentiary barriers to the enforceability of
an electronic contract. However, it may be some time before all states enact
digital signature legislation due to concerns about potential risks placed on
consumers and political lobbying for "technology neutral" statutes.
Finally, the Uniform Computer Information Transaction Act (UCITA), like
UETA, has been adopted by a body of state commissioners that proposes
uniform state laws. If enacted by individual states, UCITA will govern the
sale and licensing of computer software and access contracts (such as access
to Web sites). UCITA provides that any contract requiring payment of a fee of
US$5,000 or more, and which can't be performed within one year, must have an
electronic or manual signature. All other contracts don't need an electronic
signature, but the party to be charged must have manifested assent to the
contract terms (such as clicking on a clickwrap button). UCITA has a slightly
different approach than UETA by distinguishing between an electronic
signature and manifesting assent.
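As a rough illustration only (the function and parameter names are invented, and this sketch is not a full statement of the statute), UCITA's formality rule described above can be expressed as a simple decision function:

```python
def ucita_formality_satisfied(fee_usd: float,
                              performable_within_one_year: bool,
                              has_signature: bool,
                              manifested_assent: bool) -> bool:
    """Sketch of the UCITA rule described above.

    A contract for a fee of US$5,000 or more that cannot be performed
    within one year needs an electronic or manual signature; all other
    contracts only need manifested assent, such as clicking a
    clickwrap button.
    """
    if fee_usd >= 5000 and not performable_within_one_year:
        return has_signature
    return manifested_assent
```

The sketch highlights UCITA's distinction between a signature and manifesting assent: they are separate inputs, and which one matters depends on the contract.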
Trademarks, Domain Names and Cybersquatting
on the Internet
Issue: To what extent should an e-business be entitled to a domain name when someone
else (including cybersquatters) has registered it?
Managerial Issue: How do you maximize use of the domain name system for the sale of
products and services? How do you protect brand image in cyberspace?
Background: Companies use domain names as both Internet addresses and as branding
devices for their products and services. The registration of domain names is fairly
simple. Anyone can go online to one of the numerous domain name registrars (such as
Network Solutions, Inc, or Register.com) and register a domain name (such as
www.yourcompanyname.com). If the name has not already been taken, then the domain
name will be assigned to the registrant, regardless of any trademark rights that the
registrant may or may not have. Because of the numerous variations on names (such as
www.your-company-name.com), it is difficult for any one company to capture all
possible names that might identify the company or its product in cyberspace. This, along
with the fact that some individuals "beat" companies to their own names in registering with
a domain name registrar, has brought companies that own trademarks on their names
into conflict with domain name speculators (often referred to as "cybersquatters"). Many
companies were forced to pay tens of thousands, if not millions, to the cybersquatters for
the domain names. Many trademark holders took the cybersquatters to court under the
claim of trademark infringement, but the judicial process was often slow and uncertain as
the trademark laws were not perfectly tailored to the complexities of the domain name
system.

In late 1999 and early 2000, the balance in the struggle over domain names shifted
dramatically to companies holding trademarks. First, Congress passed the
Anticybersquatting Consumer Protection Act (“ACPA”), which outlawed bad faith
registration and use of domain names. For example, NASDAQ was able to use the
ACPA to get a court injunction against Deltacross Limited which had registered the
domain names “nasdaqeurope.com” and “nasdaqeurope.net.” The court ordered the
domains transferred to NASDAQ. Second, the Internet Corporation for Assigned Names
and Numbers (ICANN) set up a dispute resolution system that ensured that any dispute
over domain names would be resolved within sixty days from when the trademark holder
brought the case to ICANN. This process would also be much less expensive than going
to court. From January 2000 through May 31, 2000, nearly 1000 cases were filed under
this process, with trademark owners capturing the domain name in more than three-
quarters of the cases. The ICANN decisions are based on an ICANN policy – the
Uniform Domain Name Dispute Resolution Policy ("UDRP") – that individuals and
companies agree to in their registration agreements upon registering a domain name.
The UDRP requires that the registrant submit to ICANN
arbitration if someone or some company brings a complaint to ICANN about the
ownership rights to the name. The UDRP requires that a domain name be transferred to
the trademark holder or cancelled altogether if:
• registrant has registered a domain name that is confusingly similar to a trademark,
• registrant has no legitimate interest in the domain name, and
• registrant has acted in bad faith in registering and using the domain name.
All three conditions are required. There are obviously many disputes about what
constitutes “confusingly similar,” “legitimate interest”, and “bad faith.” For the most
part, the ICANN arbitration panels have interpreted these provisions in favor of
company trademark holders.
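The conjunctive nature of the UDRP test can be made concrete with a small sketch (the function name and boolean inputs are invented for illustration; real panels weigh evidence, not booleans):

```python
def udrp_transfer_or_cancel(confusingly_similar: bool,
                            no_legitimate_interest: bool,
                            bad_faith: bool) -> bool:
    """A complaint succeeds only if ALL three UDRP elements are shown."""
    return confusingly_similar and no_legitimate_interest and bad_faith
```

Any one missing element defeats the complaint, which is why most disputes turn on how panels interpret the individual elements.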
Conflicts: One of the major conflicts in domain name disputes involves the registration
of “generic” terms. Neither the ACPA nor the UDRP outlaws mere speculation in
domain names. Many people have registered names such as “fork.com” or
“airlines.com” in hopes of selling them to companies. For example, “business.com” was
recently sold for $7 million. Since trademarks on generic terms are weak, if allowed at
all, registrants of generic terms generally have a free hand in selling these valuable names.

Recently, a few courts and ICANN panels have taken away arguably generic names from
domain name registrants and given them to companies that have a limited trademark on
the generic term. For example, the company J.Crew has a limited trademark on the
generic term “crew” as it relates to clothing. J.Crew had registered the “crew” trademark
with the Patent and Trademark Office some years ago. When a person registered the
name "crew.com", J.Crew brought an ICANN proceeding and was able to wrest the
name away. Many legal scholars, commentators, and businesses were surprised at the
move because “crew.com” was not being used to sell clothing. Many thought that “crew”
was a generic term with multiple uses in cyberspace and that J.Crew had engaged in
“reverse domain name hijacking.”
The addition of an “e” to a generic term has also been litigated. For the most part,
ICANN arbitrators have found that adding an “e” to a generic term does not create a
trademarkable term deserving protection under the UDRP. However, a recent court case
in California used the concept of “unfair competition” to address whether the registration
and use of “ecards.com” by a Canadian company violated rights of a U.S. company
named E-Cards. A jury decided there was a violation of E-Cards' rights and awarded E-
Cards $4 million in damages.
Other domain name issues being litigated include whether the inclusion of a descriptive
term along with the trademark is sufficient to defeat a trademark holder’s claim. For
example, if someone registers “Ford-parts”, would Ford be able to get a transfer or
cancellation of the domain name under the UDRP? Some sites using descriptive terms
with trademarks have been protected against transfer and cancellation based on First
Amendment grounds. For example, “Ballysucks.com” is an expression of free speech
and would not violate trademark rights.
Key Players and Resources
• Professor Michael Geist (http://www.lawbytes.com). Leading expert on domain
name disputes and Internet Law.
• Internet Law and Business (http://www.lawreporters.com). Edited by Professor
Emerson Tiller, this publication specializes in reporting on domain names disputes.
• Internet Corporation for Assigned Names and Numbers (ICANN) (www.icann.org)
Sets policy regarding domain names and has set up a resolution system for domain name disputes.
• ICANNWatch (www.icannwatch.org). Reports on developments on domain name
policy by ICANN.
• Uniform Domain Name Dispute Resolution Policy (“UDRP”)
• World Intellectual Property Organization(http://arbiter.wipo.int/domains/index.html),
National Arbitration Forum (http://www.arbforum.com/domains/), eResolution
(http://www.eresolution.ca/services/dnd/arb.htm) and CPR Institute for Dispute
Resolution (http://www.cpradr.org/home1.htm). These organizations currently
provide alternative dispute resolution for ICANN.
• U.S. Patent and Trademark Office Guidelines on registering a domain name as a trademark.
• Anticybersquatting Consumer Protection Act
Taxation of Internet Commerce
Issue: Whether Internet commerce should be taxed, what commercial and personal
property transactions and services should be taxed, and who is responsible for
compliance. We concentrate on sales and use taxes on the Internet.
Relevance to E-Business Managers: Foreign governments, the U.S. Congress, as well
as many U.S. states have taken positions on the taxation of Internet commerce. The future
is still unfolding in many areas: (1) what will be the sales and use tax obligations of
sellers with minimal physical presence in the tax jurisdiction, (2) how will the nexus of
“online” transactions be determined, (3) what Internet services will be subject to sales
and use taxes, (4) how will e-commerce transactions avoid double taxation, and (5) what
are the implications of online taxation on privacy? E-business managers need to keep
abreast of the latest developments to take advantage of new opportunities and reduce the
overall tax burden on their electronic commerce activities.
Background: In 1998, the Internet Tax Freedom Act (ITFA) placed a 3-year
moratorium on special taxation of the Internet, ending October 21, 2001. The act
rendered the Internet a "tax-free" zone in many respects. The act
o Prohibited taxation of information and information services on the Internet
o Prohibited multiple and discriminatory taxes on electronic commerce
o Prohibited federal taxes on electronic commerce
o Established the Advisory Commission on Electronic Commerce (ACEC) to study Internet taxation and report to Congress
o Urged the Clinton Administration to work with the European Union and
World Trade Organization to keep the Internet tariff- and discriminatory-tax-free
The act barred taxation by more than one state on purchases of
commercial or personal tangible property over the Internet. The moratorium prohibited
certain types of taxes. The act barred taxes on goods sold exclusively over the Internet
without comparable offline equivalent. The act contained a grandfather clause allowing
the states that were already taxing Internet access and Internet-related services to
continue to do so. (For example, the state of Texas imposes a tax on Internet access, web
services and other information and data processing services.) For most states, the
Internet Tax Freedom Act put Internet commerce on the same playing field as catalog
retailers.

The act made no changes to the nexus requirement. The act simply continued the nexus
laws used to determine if catalog retailers could be taxed. The rule is that a physical
presence is necessary in a jurisdiction in order to allow that jurisdiction to require the
business to collect and pay sales taxes. When a person buys a product from an
out-of-state business with no nexus in the customer's jurisdiction, the burden is on the
customer to pay the equivalent use tax. Business customers are likely to pay the use tax,
but no consumer really abides by this rule, and no effort is made to enforce it because,
historically, the returns from enforcement would not equal the costs. However, as purchases over the Internet from
nexus-less businesses increase, the potential rewards of enforcement greatly increase.
Additionally, there are credible arguments to be made that e-commerce costs states
substantial sums of tax revenue because it is a substitute for traditional retail purchases
which are easily taxed. These concerns were not, however, very large in 1998, when only
a small number of people were online. The act passed easily on the main premise
that the Internet needed time to develop: small Internet entrepreneurs who sell to
customers beyond their state should not be burdened by having to collect taxes from
buyers who reside in thousands of different state and local tax jurisdictions.
The act established the Advisory Commission on Electronic Commerce (ACEC). Its
19 members include federal officials, business and consumer leaders, and
representatives from state and local governments. It was commissioned to submit its
recommendations by April 2000. The commission failed to reach conclusions on many
fronts. The commission was split into two main groups: the business caucus which
proposed limits to taxes, and the government caucus which pushed initiatives to protect
sales tax revenues.
When the Advisory Commission presented its conclusions to the U.S. Congress in the
spring of 2000, the two factions had reached agreements (by two-thirds majority) in three areas:
o The taxation issue is a digital divide issue
o Privacy must be protected
o No international tariffs and taxes should be imposed
The Commission advocated tax incentives and federal matching funds to states to help
provide needy citizens access to computers. It also argued for respect and protection of
consumer privacy in imposing any tax collection processes on the Internet. The commission
advocated the continuation of a moratorium on any international tariffs on the Internet.
In May 2000, without holding a single hearing, the U.S. Congress voted on the
conclusions and extended the ban on Internet taxes for another five years till October
2006. Congress defeated the White House proposal of a two-year extension and defeated
the proposal of a permanent ban. It also repealed the grandfather clause that permitted taxation of Internet
access, thus prohibiting the ten states currently collecting taxes on Internet access from
doing so after the current ban expires in October 2001. The bill approved by the U.S.
Congress does not solve the outstanding issues of fairness between Internet sellers and
brick and mortar retailers, and it does not address how states and localities can collect
applicable taxes on Internet sales. There are also concerns that a five-year moratorium
postpones any serious discussion of taxation too far into the future. A long
moratorium also gives states a disincentive to streamline and simplify their
tax rules for greater federal uniformity. The Advisory Commission's conclusions will
next be discussed and voted on by the U.S. Senate.
No agreement was reached on a number of proposals advanced by one of the caucuses.
First, the business caucus wanted to extend the moratorium to cover physical
products for which a digital equivalent existed. If the sale of a digital good, such as a
music download, was not taxed, then the physical CD should not be taxed either,
even when purchased in a brick and mortar store. The taxing authorities resisted
this proposal because it threatened to erode their tax base.
Second, the National Governors' Association (NGA) proposed that the seller collect tax
even if it does not have any other connection with the state. The National Governors'
Association proposal assumes that Internet transactions will be taxed the same as brick
and mortar retailers. The proposal advocated a zero-burden system where all Internet
transactions would be taxed by using a centralized, third party, private-sector entity that
collects taxes based on the consumer’s location and then distributes the revenue to the
states. The system is zero-burden because neither the buyer nor the seller would bear
the compliance burden. The taxing jurisdiction would pay the third parties for their
services.

Another proposal dealt with the removal of the 3% federal excise tax on
telecommunications that was instituted as a luxury tax on people who owned phones
during the time of the Spanish-American War. Internet businesses use this tax
to illustrate how it could take 100 years before taxes imposed on the Internet are
repealed. The proposal also recommended the simplification of state and local tax systems.
A voluntary multi-state tax commission, in place since 1967, was proposed to be in
charge of simplifying definitions and rules of sales and use taxes across states. The rules
vary across states as well as within state. For example, in one taxing jurisdiction a bottle
of fruit juice might be taxed, whereas in another it is not if the bottle is of a certain size.
The simplification effort proposes the use of uniform legislation and multi-state
cooperation.

The discussions within ACEC have stressed that any U.S. solution to Internet taxation
must be internationally viable. The Internet Tax Freedom Act urged the Clinton
administration to work with the European Union and World Trade Organization (WTO)
to keep the Internet tariff- and discriminatory-tax-free. The Clinton administration has
made mixed progress internationally. The Clinton administration has pushed towards
achieving a consensus to extend the moratorium on taxes for items delivered digitally
over the Web. In May 1998, the World Trade Organization’s (WTO) Ministerial
Conference adopted a declaration committing all WTO member governments to refrain
from imposing customs duties on electronic transactions. However, Ministers at the
December 1999 WTO meeting in Seattle failed to reach an agreement over the global
implementation of Internet taxation.
The Clinton administration faced a setback in Europe in the spring of 2000.
The European Commission has proposed a new value-added tax regime for
electronically delivered consumer services, such as music and software, charged
on services supplied from outside the EU. Companies selling digital services for
consumption in Europe would have to collect value-added tax, while services
consumed outside Europe would be exempt. The EU justified the proposal by saying
that it corrected the current discrimination against EU-based suppliers of electronic
services and brought the sale of tangible goods such as CDs and books in line with
digital goods. The proposal does not provide details on how the EU will force non-EU
suppliers of online software, music, and services to pay value-added tax. Two-thirds
of U.S. companies surveyed conceded that they would have difficulties implementing
and administering taxes on the Internet. These developments suggest a chasm between
Europe and the U.S. on Internet taxation.
Conflicts: In their spring 2000 report, the Advisory Commission on Electronic
Commerce concluded that the debate over Internet taxation is about the ‘digital divide.’
Internet tax policies create a gap between those who are taxed and those who are not.
These discriminatory tax policies are at the heart of the conflicts. The conflicts center
on fairness and simplicity. (There is also a major conflict between federal and state
governments regarding the federal government’s ability to dictate to states what can or
cannot be taxed).
What should be taxed to ensure a level playing field? The current differences in what is
taxed create a disparity between Internet businesses and brick and mortar businesses.
Brick and mortar businesses argue that the current tax policies are creating a cost
advantage to the Internet businesses. The omission of tax on some categories also reduces
the potential sales tax revenues that tax authorities receive. As more and more consumers
buy goods over the Internet, taxing authorities are worried about erosion of the tax base.
Currently purchases of tangible property are taxed on the Internet but Internet access and
Internet services are not. Many of the goods formerly bought in tangible form from brick
and mortar stores can now be downloaded while bypassing the tangible part of the
product. Examples are music, and (more in the future than the present) books and videos.
There is also the issue of bundling together Internet-services that are not taxable in most
states and physical property that is taxable. Will the bundled package be taxable and how
will the value be determined? Responses to the issues will likely require new definitions
of taxable purchases. The business caucus of ACEC already proposed that if a digital
product is not taxed by the state then the corresponding physical product should not be
taxed either. The proposal was not accepted, however.
Should the sales tax collection obligation be imposed on remote sellers to promote an
even playing field? The U.S. Constitution limits a state’s ability to tax businesses doing
business in interstate commerce. Nexus is the single most important issue in Internet
taxation. Nexus issues have been controversial in the context of catalog business for
many years because they affect who the state requires to pay the tax: the seller or buyer?
In addition, what burden of tax administration is fair, given that many Internet businesses
are small businesses that sell remotely? The issue might also have an impact on customers.
One study found that online purchases would decrease by 25 percent if taxed like brick
and mortar sales.
Sales tax is imposed on a sale in a particular state. On the Internet, it is difficult to
determine the place where the transaction occurs. An Internet transaction may involve a
buyer in State A, a retailer incorporated in State B with a website hosted in State C, and
goods delivered as a gift to a recipient in State D. Enforcement can become a major issue because if nexus
cannot be established, a state cannot require the vendor to report and pay the taxes. It is
then the duty of the consumer to pay the taxes, which is generally ignored and rarely
enforced. Wal-Mart has tried to stay within a loophole in this moratorium by prohibiting
pickup or return of online purchases at its retail stores. As long as customers cannot pick up
or return goods at a retail store, the company reasons, the goods are sold exclusively online.
Nexus also divides the Internet businesses from brick and mortar businesses by giving an
advantage to Internet merchants. Traditional retailers, even those who have substantial
Internet sales, argue that the new Web-based businesses are in a privileged position.
What collection processes are needed to ensure fairness? Now, the seller is burdened to
collect the sales tax. If the remote sellers are burdened to collect the tax, what relief do
they need? It is impractical for a small web business to master the rules of thousands of
tax jurisdictions in the U.S. alone.
There is a push from the federal level for the states to participate in the simplification
process that would create a uniform system of definitions and rules so that purchasers (or
at least vendors) will be able to easily determine whether, at what rate, and by whom the
items will be taxed. Uniformity in tax rules will also help avoid double taxation. If the
tax is too difficult to administer, administrative costs, compounded by the sales tax, will
prevent retailers from being economically able to continue doing business on the Internet.

The simplification process is voluntary and the states' involvement is likely to vary by
incentives offered and alternatives made available. Some states heavily depend on sales
tax and will likely resist reducing the overall sales tax burden. Others may want to shift
sales tax burden to income taxes. Some states have already striven to become Internet-
friendly by reducing as many taxes as possible for Internet businesses. Other states are
at a comparative disadvantage and might see the simplification process as a way to level the playing field.
The National Governors’ Association proposal intends to overhaul and simplify sales and
use taxation. Although the current nexus rules would apply, over the long term, the
proposal advocates a uniform system across states and for brick and mortar and Internet
sales. The same taxes would apply to Internet sales as brick and mortar sales, states
would simplify their sales and use tax systems, and the tax administration burden would be
shifted to trusted third parties instead of sellers. Trusted third parties might be linked
with credit card companies. The trusted third parties would be responsible for calculating,
collecting, reporting, and paying the tax.
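A minimal sketch of the zero-burden flow might look as follows (the rates and names are invented for illustration; the actual proposal left the technical design to the private sector):

```python
# Hypothetical sketch of the NGA "zero-burden" model: a trusted third
# party computes the tax from the consumer's location and remits it to
# the state, so neither buyer nor seller administers the tax.

# Illustrative rates only -- not actual state tax rates for any year.
STATE_RATES = {"TX": 0.0625, "CA": 0.0725, "OR": 0.0}

def settle_transaction(amount: float, consumer_state: str) -> dict:
    """Compute the tax owed and identify the state to remit it to."""
    rate = STATE_RATES.get(consumer_state, 0.0)
    tax = round(amount * rate, 2)
    return {"sale_amount": amount,
            "tax_collected": tax,
            "remit_to": consumer_state}
```

For example, a $100 sale to a Texas consumer would yield $6.25 for Texas under the illustrative rate, with the third party, not the seller, doing the calculation and remittance.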
How will the buyer's privacy be protected? There are concerns about purchasers'
ability to withhold personal information regarding both identity and location. There is
an issue of what types of records companies need to maintain for federal and state tax
purposes on their Internet transactions. The National Governors' Association proposes
to shift the
collection responsibility to a third party, yet there is a concern as to what extent this
entity would monitor transactional information.
How will tax fraud be minimized? Regardless of what tax laws may be passed, there will
be a major problem of enforcement. Also, the Internet has opened up a wide array of
new possibilities for money-laundering schemes. Businesses operating exclusively over
the Internet may be able to avoid paying US income taxes on profits and incomes
received from the business. Currently, some of this may be done legally due to
loopholes. However, much of the illegal tax avoidance is rendered virtually undetectable
by Internet laundering.
Key Players and Resources
o News clips on taxation policy developments around the world:
o The Internet Tax Freedom Act (http://www.house.gov/cox/nettax/) as enacted into law in
1998. A plain-English summary is at www.house.gov/cox/nettax/lawsums.html.
o Advisory Commission on Electronic Commerce report, spring 2000:
http://www.ecommercecommission.org
o The National Governors' Association proposal to overhaul and simplify sales and
use taxation: http://www.nga.org. The proposal recommends changes to the current
nexus laws and recommends the use of technological solutions to facilitate remote tax collection.
o Multi-state Tax Commission Sales Tax Simplification Project: http://www.mtc.gov.
The project encourages states to adopt uniform tax laws and regulations.
Law Review Articles:
51 Fed. Comm. L.J. 245
22 Seattle Univ. L.R. 1187
52 Tax L. Rev. 571
14 Akron Tax J. 1
38 Colum. J. Transnat’l L. 191
Quill Corp. v. North Dakota, 504 U.S. 298 (1992)
National Bellas Hess v. Department of Revenue of Illinois, 386 U.S. 753 (1967)
Internet Content Restrictions
Issue: What combination of governmental and individual regulation best balances
prohibition of harmful online content and protection of free speech?
Relevance to E-business Managers: Countries have restrictions on what type of
materials can be published on a Web site, and these restrictions come in many forms:
licensing and regulation laws, applying existing print and broadcast restrictions to the
Internet, filtering content, and direct censoring of content. These restrictions impose
liabilities on Internet businesses either as content or conduit providers. This issue is a
very active and rapidly developing area of law. Internet businesses need to be active in
monitoring these laws, especially if they do business outside the U.S.
The restrictions vary from country to country and reflect the cultural, political, and
religious norms. The U.S., various European countries, and many other countries have
laws particularly pertaining to pornographic and harmful content for children. Many
countries have held Web site owners and their ISPs liable for objectionable content.
Web sites in both Germany and Denmark have been held liable for racist content
violating Germany's hate speech law. A French court has sought measures from Yahoo!
to stop web users in France from gaining access to Nazi memorabilia which appear on
one of the web sites Yahoo! hosts.
Background: The Internet is a unique medium that enhances the freedom of expression,
but it also allows harmful content to be disseminated faster than ever. The Internet
allows immediate information delivery and accessibility in a decentralized environment;
as a result, the content of information appearing on the Internet is largely unregulated.
In general, the failure to censor content before it reaches online audiences poses no
problems since the information many users access is appropriate for them. But the
Internet also allows immediate, easy access to content that is inappropriate for some or all audiences.
There are several categories of objectionable online content:
1. Content that has been prohibited because of its threat to public safety or welfare is
inappropriate for all audiences. This category includes child pornography, how-to
information for terrorists, and language promoting racial violence.
2. Traditionally, adult-oriented content is inappropriate for children. This category
includes violent images, graphic language, and sexually explicit materials.
3. Morality and local values. In Asia and the Middle East, governments frequently cite
protection of morality and local values as reasons for censorship.
4. State secrets
The Internet community has embraced two approaches to the regulation of objectionable
content on the Internet: governmental regulation and individual regulation.
Many governments in the U.S. – at both the state and national level – have passed laws
prohibiting certain content altogether (child pornography) or regulating the delivery of
certain content to audiences (children under 18 cannot access pornographic Web sites).
A majority of state laws have been invalidated under the Commerce Clause. Two United
States federal laws have been invalidated on First Amendment grounds. The Supreme
Court declared the Communications Decency Act unconstitutional, and a federal judge
temporarily blocked enforcement of the Child Online Protection Act.
Two categories of individual regulations exist: regulations that occur at the business
level and regulations that occur at the user level.
An e-business may implement its own internal policies to regulate online content.
Business-based regulatory solutions include:
1. Internet service providers adopting Acceptable Use Policies, outlined in their service
agreements, to ensure the confidence and safety of their subscribers. These policies
typically include guidelines concerning libelous, defamatory, obscene, pornographic,
threatening, abusive, or illegal behavior online, as well as the use of the service to
spread unsolicited e-mail (spam).
2. Netscape and Microsoft incorporating the Platform for Internet Content Selection
(PICS) into their browsers.
A consumer may use technological solutions to regulate online content transmitted into
the home. User-based regulatory solutions include:
1. Filtering tools. Filtering tools are software programs that filter content based upon
the Web site address, human review of a Web site, key words, and/or context
sensitive key words. These programs either run off the Internet Service Provider’s
server (server-based filters) or off the user’s home computer (client-based filters).
FamilyConnect is an example of an ISP that performs its own filtering; Cyber
Sentinel is an example of a client-based filter. Reviews of Internet Access Filtering
software can be found at http://www.superkids.com/
2. “Kids Only” Internet browsers. These browsers are easier to use than general
audience browsers and filter objectionable content as the child surfs the Web. A
browser would be designed to send a signal to Web sites that a child is doing the browsing.
3. Search engines for children. These search engines perform searches only within a
group of pre-approved sites, or they search the entire Web but only display results
from approved sites. Education World is an example of a search engine for children.
4. Monitoring tools. These tools track the Web addresses of sites children visit and
report those addresses to parents. Prudence is an example of a monitoring tool.
5. Scanning physical traits. Biometrics and digital age certificates have been proposed
as ways to verify Internet users' identities. A person's age might be determined by
scanning fingerprints or facial features.
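A naive keyword filter, of the kind described in item 1 above, can be sketched in a few lines (the word list and logic are invented for illustration; commercial filters use larger proprietary lists plus URL and context rules):

```python
# Naive client-side keyword filter: blocks a page if any listed word
# appears, with no understanding of context.  Blocklist is invented
# for illustration only.
BLOCKED_KEYWORDS = {"porn", "breast"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocked keyword appears in the page text."""
    words = page_text.lower().split()
    return any(word in BLOCKED_KEYWORDS for word in words)
```

Because the filter has no notion of context, a page about breast cancer research is blocked just as readily as pornography, which is the over-blocking problem raised by free speech advocates.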
Conflicts:
1. What content do we restrict?
Anytime a community restricts content, it also restricts expression. Laws
restricting content face First Amendment challenges since they hinder access
to constitutionally protected speech. For example, a law that requires a
pornography Web site to verify credit card numbers prior to allowing users to
access the site certainly prevents most children from accessing the site, but it
also prevents adults without credit cards from accessing the site.
Even individual regulatory approaches hinder valuable free expression. When
parents restrict content for their children, the child may lose access to valuable
information. For example, since some filters look only for key words,
medical Web sites about breast cancer research are blocked at the same time
as pornography Web sites. In addition, companies that publish filtering
software identify objectionable sites according to subjective criteria and do
not reveal the sites or criteria to consumers. These companies may filter sites
based on their own political or social agendas, and parents who use the
filtering software unknowingly prevent their children from accessing
valuable information and alternative perspectives.
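The overblocking problem described above can be made concrete. The following is a hypothetical sketch of a naive keyword blocklist; the word list and sample pages are invented for illustration and do not reflect any actual filtering product:

```python
# Minimal sketch of a naive keyword blocklist filter. A filter that matches
# bare keywords cannot distinguish medical research from pornography.

BLOCKLIST = {"breast", "xxx", "porn"}  # hypothetical blocklist

def is_blocked(page_text: str) -> bool:
    """Block a page if any blocklisted keyword appears anywhere in it."""
    words = page_text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

medical = "New findings in breast cancer research were published today."
recipe = "A simple recipe for chicken soup with fresh vegetables."

print(is_blocked(medical))  # True: medical content is overblocked
print(is_blocked(recipe))   # False
```

Because the filter sees only words, not meaning, the breast cancer page is blocked exactly as the text above describes.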
In addition, different communities find different content objectionable. Since the
Internet accommodates content from around the world, content restrictions in one
country may affect content providers in another country. If the two countries have
different views about objectionable content, the country imposing content restrictions
also imposes its culture upon the other.
To address the problem of cultural imperialism, the Internet Content Rating
Association has instituted an effort to create an international rating and filtering
system. Free speech advocates present the following seven objections to such a
standardized system:
a. A standardized system provides the needed technical tool to allow governments to
mandate the use of the system.
b. A standardized rating system allows the criminalization of misrating Internet content.
c. Governments could use the system to block the exchange of information that is
controversial or unpopular.
d. A mandatory self-rating system imposes burdensome compliance costs on small content providers.
e. A single ratings system destroys diversity on the Internet.
f. A standardized system enables invisible upstream filtering by ISPs.
g. A standardized system leads to a homogenized Internet dominated by large commercial interests.
2. How do we most effectively restrict harmful content?
The Internet community is wary of governmental regulation and therefore
embraces an individual regulatory approach to content restriction. An
individual regulatory approach mirrors the private, decentralized nature of the
Internet. It allows users to customize content restrictions to their needs and values.
An individual regulatory approach is not completely effective, however, since it lacks
an enforcement mechanism. Legal regulation, especially for content that is
inappropriate for all audiences, is necessary to ensure that Web sites delivering
objectionable content stop their practices. Many Web sites are beyond the
jurisdiction of legal regulations, however.
Organizational Policy Makers
American Civil Liberties Union (www.aclu.org): Fights for free speech on the Internet.
Center for Democracy and Technology (www.cdt.org): Advocates free speech on the Internet.
Electronic Frontier Foundation (www.eff.org): EFF is a non-profit, non-partisan
organization working in the public interest to protect fundamental civil liberties,
including privacy and freedom of expression in the arena of computers and the Internet.
Global Internet Liberty Campaign (www.gilc.org): Advocates free speech on the Internet worldwide.
Internet Content Coalition (www.netcontent.org): The Internet Content Coalition
is a non-profit organization composed of Web content producers. The ICC’s
mission is to represent and promote the interests of its membership in the
establishment of standards and practices for content, publishing, technology, and
commerce on the Internet; to educate the membership on Internet issues; and to
help the Internet remain a self-governing medium with the highest standards of quality.
Freedom House (www.freedomhouse.org): Advocates democracy and human
rights worldwide. The organization publishes the Press Freedom Survey on its Web site.
Platform for Internet Content Selection (www.w3.org/PICS/): The Platform for
Internet Content Selection (PICS) is a rating standard that establishes a consistent
way to rate and block online content. PICS was created by a large consortium of
Internet industry leaders and became operational last year. In theory, PICS does
not incorporate or endorse any particular rating system – the technology is an
empty vessel into which different rating systems can be poured. In reality, only
three third-party rating systems have been developed for PICS: SafeSurf, Net
Shepherd, and the de facto industry standard RSACi.
Internet Content Rating Association (www.icra.org): The Internet Content Rating
Association is an international, independent, non-profit organization that
empowers the public, especially parents, to make informed decisions about
electronic media by means of an open, objective, content advisory system. The
RSACi system managed by ICRA provides consumers with information about the
level of sex, nudity, violence, or offensive language (vulgar or hate-motivated) on a
rated Web site.
Internet Corporation for Assigned Names and Numbers (www.icann.org): A
U.S. Senator has proposed that ICANN should create a new top-level domain
such as ".sex" or ".xxx" to shield children from sexually explicit material.
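To make the PICS/RSACi rating systems described above concrete, the sketch below parses the four RSACi category levels (n = nudity, s = sex, v = violence, l = language) out of a PICS-style label and checks them against a parent's limits. The label text is hand-written for illustration and the parsing is deliberately simplified; it is not a full PICS implementation.

```python
import re

# Hand-written sample label using the RSACi vocabulary; levels run 0-4.
sample_label = (
    '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
    'l gen true for "http://example.com" r (n 0 s 0 v 2 l 1))'
)

def rsaci_levels(label: str) -> dict:
    """Extract the category levels from the r (...) clause of a label."""
    ratings = re.search(r"r \(([^)]*)\)", label).group(1)
    tokens = ratings.split()  # tokens alternate: category letter, level
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

def passes(label: str, limits: dict) -> bool:
    """True if every rated category is at or below the user's limit."""
    levels = rsaci_levels(label)
    return all(levels.get(cat, 0) <= limit for cat, limit in limits.items())

print(rsaci_levels(sample_label))              # {'n': 0, 's': 0, 'v': 2, 'l': 1}
print(passes(sample_label, {"v": 1, "l": 2}))  # False: violence exceeds the limit
```

This is the "empty vessel" idea in miniature: the same parsing and comparison machinery works no matter which rating vocabulary a service pours into the label.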
Background reports on Internet content issues:
News on content regulation: http://www.qlinks.net/quicklinks/content.htm
Liability, jurisdiction and applicable law: http://www.qlinks.net/liabil.htm
News on rating and filtering: http://www.qlinks.net/quicklinks/rating.htm
Communications Decency Act (1996): Prohibits Web sites from making patently
offensive material available to minors (individuals under 18).
Child Online Protection Act (1998): Requires commercial operators of pornographic
Web sites to keep “harmful to minors” material from children. The law requires
viewers to supply a credit card number or personal identification number before any
images are displayed.
Reno v. ACLU, 521 U.S. 844 (1997) (Communications Decency Act provisions,
which prohibit knowing transmission to minors of "indecent" or certain "patently
offensive" communication, held to abridge free speech protected by the First Amendment).
ACLU v. Reno, 31 F. Supp. 2d 473 (E.D. Pa. 1999) (court issued a preliminary
injunction against the enforcement of the Child Online Protection Act).
Loudoun v. Board of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552
(E.D. Va. 1998) (court held that a policy prohibiting the access of library patrons to
certain content-based categories of Internet publications violated the First Amendment).
• Global Regulatory Efforts
The Internet Content Rating Association hosted an Internet Content Summit in Munich
during September 1999. The summit advanced the creation of an international rating and
filtering system. Formed in May 1999, ICRA was created to develop an internationally
acceptable online content labeling system. Founding members include AOL, Bell
Canada, Bertelsmann Foundation, British Telecom (BT), Cable & Wireless, Demon
Internet (UK), Deutsche Telekom Online Service, Electronic Network Consortium,
EuroISPA, IBM, Internet Watch Foundation, Microsoft, Novell, Software & Information
Industry Association, and UUNet.
• Regional Regulatory Efforts
1. ASEAN: The members of the Association of Southeast Asian Nations (Brunei,
Malaysia, Singapore, Indonesia, Philippines, Thailand and Vietnam) have agreed to
police the Internet and block access to those sites that run counter to “Asian values”.
2. European Union: The EU established the European plan, called the Action Plan for
the Safe Use of the Internet, in 1998 to censor content that is harmful, unlawful or
undesirable. This plan is reliant upon the voluntary cooperation of all areas of the
communications media and grew out of a plan in 1996 to limit the spread of child
pornography on the Internet.
• Country Regulatory Efforts
1. Australia: Australia has already enacted legislation in its Broadcasting Services
Amendment (Online Services) Bill, which places sweeping restrictions on adults
providing or gaining access to material deemed unsuitable for minors as determined
by Australian film and video classification standards. In the first three months of the
bill (January-March, 2000), the Broadcasting Authority issued final take-down
notices for 31 items of Australian-hosted content, referred 45 items of content to the
makers of filtering software products and referred 7 items of content to law
enforcement agencies.
2. China: China has made it a crime to access or spread anti-government material.
The state secret law applies to the web including chat rooms and personal e-mails.
The regulation states, “Any information provided to or issued on Internet web sites
must obtain the inspection and approval of secrecy censorship.” Newspapers report
on sentencing of web operators regularly.
3. Germany: A German court convicted the local manager of an Internet Service
Provider (ISP) because a subscriber used the service to transmit pornographic material.
4. Middle East: In Bahrain, Iran, Saudi Arabia, the United Arab Emirates, and Yemen,
ISPs (either under government orders or pressure) block Web sites on the basis of
their content. At least in the first four of these countries, blocking extends to cultural
and/or political content. Proxy servers in these countries can be used by authorities to
track which computer terminals are accessing which Web sites and for how long.
5. Singapore: Singapore requires ISPs to block designated Web sites.
6. United Kingdom: UK law holds an ISP responsible for content published by third
parties or by its customers, which has serious implications. Several ISPs have come
under pressure under the 1996 Defamation Act to take down Internet content that
allegedly carried defamatory material. For example, a Web site that reported on
miscarriages of justice was taken down by its ISP after the ISP was notified by
lawyers for the Police Federation.
Worldwide, Internet content restrictions are on the rise. According to the human
rights group Freedom House, only 69 of the countries studied have a completely free
media, while 51 have a partly free media and 66 countries suffer heavy government
censorship. Censorship methods include implementing licensing and regulation laws,
applying existing print and broadcast restrictions to the Internet, filtering content and
direct censoring after dissemination. Countries where Internet access is mostly or totally
controlled by the authorities include Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq,
Kazakhstan, Kyrgyzstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria,
Tajikistan, Tunisia, Turkmenistan, Uzbekistan and Vietnam.
Future Trends: The government currently regulates cable television more stringently
than Internet communications since cable television is seen as “invading” the home.
Cable television and Internet communications will merge into one medium, however, and
the government will have a justification for applying cable regulations to Internet content.
E-business Strategies: Open Versus Closed Customer
and Competitor Environments
Sirkka Jarvenpaa and Emerson Tiller
Center for Business, Technology, and Law
University of Texas at Austin
June 19, 2000
The Linux open source operating system, unlike the Microsoft Windows proprietary
product, is made available free of charge to anyone who wants it. IBM has announced a
strategic shift to Linux, enabling all of its hardware to run Linux and moving its own
software to support Linux development.4
A digital stampede occurred as Stephen King released his first electronic novel, entitled
"Riding the Bullet," distributing some 400,000 copies on the first day, many of them given
away for free. This raised eyebrows, as first-day sales of the most popular hardback books
run at most between 30,000 and 70,000 copies.5
Napster software allows for the free circulation of music over the Internet. The
technology allows the creation of a virtual global living room where “millions of ordinary
listeners [who] have converted portions of their purchased music collections into the MP3
format and copied them onto their hard drives … can play music directly from one
another’s PC, rather as they [college students] might play one of their roommate’s CDs
on the stereo in their dorm room.”6 Napster holds no copyrighted material on its servers.
Nearly half a million musical pieces have been made available using its technology.
AOL closed its 40-million-customer Instant Messenger base to Microsoft's MSN
Messenger users, on the defense that it had the responsibility of protecting the privacy
of AOL users. Yet as AOL closed its customer base to competitors, it was
lobbying the Justice Department to get cable companies (including AT&T) to open up
their cable-based Internet services to other ISPs.7
Amazon.com, the world’s largest online book seller, received a business method patent
on its “1-Click” checkout and brought an infringement suit against
Barnesandnoble.com’s use of Express Lane Checkout (a similar one-click method of
checkout), just prior to the Christmas shopping season. The Free Software Foundation led
a boycott against Amazon.com, and a protest site was established. Many academics joined
the outrage of computer professionals who object to commercial enterprises monopolizing
the Internet. Lawrence Lessig, a professor at Harvard Law School, writes: "The idea
that 1-click is so amazing that it deserves a government-granted monopoly is ridiculous…
These patents are going to change what the Internet is right now, which is a place for a
broad number of people to play in the innovation game."8
Seidel and Stewart (2000).
New York Times (March 16, 2000).
Computer Reseller News (1999).
In granting eBay an injunction against Bidder’s Edge’s use of a “spider” to search eBay’s
databases, a U.S. federal court stated: "The parties submit a variety of declarations
asserting that the Internet will cease to function if, according to eBay, personal and
intellectual property rights are not respected, or according to … [Bidder’s Edge], if
information published on the Internet cannot be universally accessed and used. Although
the court suspects that the Internet will not only survive, but continue to grow and
develop regardless of the outcome of this litigation, the court also recognizes that it is
poorly suited to determine what balance between encouraging the exchange of
information, and preserving economic incentives to create, will maximize the public good."
The opposing notions of the free exchange of information and the preservation of the
economic incentives to create are not new. This tug of war has played itself out in a long
history of debates over the role and reach of intellectual property rights in a capitalist
economic system. But these notions have taken on new vigor with respect to the Internet
and e-commerce as ideas and knowledge have become universally the most important
asset of the e-business enterprise. This debate has been given the banner “open v.
closed” Internet, with “open” generally associated with free exchange and “closed”
associated with ownership and preservation of economic incentives. The Internet
technology injects itself into this debate in a unique way. In the old economy, the use
and ownership of technology was often governed by market forces and legal regimes. In
the new economy, Internet technology often has its own governance features separate
from market forces and legal regimes. Lawrence Lessig phrases it “code (software) is
law.” Internet technology as a governance mechanism has the ability to limit or expand
social and economic opportunities much like market and legal forces.
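A small, everyday example of technology acting as a governance mechanism is the robots-exclusion convention at issue in spider disputes such as eBay v. Bidder's Edge: a site publishes rules in a robots.txt file, and a well-behaved crawler checks them before fetching pages. The sketch below uses Python's standard library; the rules and URLs are hypothetical, not eBay's actual configuration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules an auction site might publish to keep
# third-party spiders out of its listings while leaving other pages open.
rules = """\
User-agent: *
Disallow: /listings/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler consults the rules before fetching each URL.
print(parser.can_fetch("price-bot", "https://auction.example.com/listings/42"))  # False
print(parser.can_fetch("price-bot", "https://auction.example.com/about"))        # True
```

Notably, the convention is voluntary code: nothing in the technology forces a crawler to obey it, which is one reason disputes like eBay's end up before courts and in technical blocking measures rather than being settled by the protocol itself.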
The three forces of market, law, and technology define the opportunities available to the
e-business enterprise. Sometimes these forces are substitutes and at other times
complements. The imperative for e-business is to recognize whether an “open” or
“closed” strategy is optimal and then which force or forces to use to execute the strategy.
Of course, these decisions are not always independent, as the choice to pursue a particular
strategy is often influenced by the market, legal and technology landscape facing the
The open v. closed strategy dilemma must also be considered on two levels, one relating
to the firm’s customers, the other to the firm’s competitors. In terms of customers, the
open v. closed strategy dilemma is recast for the manager as “to what extent do we limit
customer access to our product or service and how do we do it.” In some environments,
eBay v. Bidder's Edge, No. C-99-21200 RMW (U.S. District Court for the Northern District of
California, May 24, 2000).
the firm does not wish to limit access; in fact, the firm may even give the product away
for free. The purpose may be to gain quick market share, to sell complementary features,
or to profit from advertising. Free Yahoo! e-mail and free online newspapers are two
examples. In other environments, the firm may want to limit customer access. For
example, suppose the customer can easily become a competitor – a common risk in the
Internet environment where start-up costs are low and the transmission of information
quick. The recent debate over downloadable music and Napster (a software program that
allows Internet users to search each other’s computers for digitized music) illustrates the
dilemma. Record companies have become concerned that the trading of digitized music
among customers has, or will, cost them sales. Customers become competitors as they
freely distribute the music good. And there is some evidence that the presence of Napster
has cut into record sales in college communities.10 Record companies want to limit
access to their good (that is, a "closed" Internet) to preserve their economic rents. Because
market pricing systems do not offer much protection, record companies have looked to
policy and technology. In terms of policy, they have brought copyright infringement
suits against companies such as MP3.com and Napster to reduce the ease of free
distribution. In terms of technology, record companies have developed software
mechanisms to reduce copying (such as watermarking digital music files and the
Secure Digital Music Initiative).
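Watermarking schemes of the kind mentioned above typically hide an identifier in the parts of the signal listeners are least likely to notice. The toy sketch below illustrates only the basic least-significant-bit idea on made-up sample values; real systems (and SDMI in particular) are far more elaborate and robust than this.

```python
# Toy least-significant-bit (LSB) watermark: hide one bit of an identifier
# in the lowest bit of each audio sample, changing each sample by at most 1.

def embed(samples, bits):
    """Return a copy of samples with `bits` hidden in the leading LSBs."""
    out = list(samples)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b  # clear the low bit, then set it to b
    return out

def extract(samples, n):
    """Read back the first n hidden bits."""
    return [s & 1 for s in samples[:n]]

# Made-up 8-bit "audio" samples and a 4-bit identifier to hide.
audio = [100, 101, 102, 103, 104, 105]
mark = [1, 0, 1, 1]

marked = embed(audio, mark)
print(extract(marked, 4))  # [1, 0, 1, 1]: the identifier is recovered
```

The appeal for record companies is that the mark travels invisibly with every copy; the weakness, as the SDMI effort found, is that a determined copier can often strip or distort such marks.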
In terms of competitors, the open v. closed strategy dilemma for the manager reduces to
the question, “to what extent do we want to limit our competitor’s access to customers
and how do we do it.” AOL’s protection of its Instant Messaging system is illustrative.
AOL has tens of millions of registered users on its America Online Instant Messaging
(“AIM”) system, far more than any competitor. Microsoft Corporation wished to allow
its MSN Messaging users access to the AIM customer base. Other companies with
instant messaging systems wished to do the same. AOL, however, wanted to protect its
current and potential user base from the competition. To protect its current and future
user base, AOL (1) made licensing agreements with some competitors for use or access
to the AIM system, (2) brought legal actions against other firms (AT&T) for trademark
infringement for using terms like “IM” or “buddy lists” – terms associated with AOL’s
instant messaging system, and (3) changed AIM protocols to bump off MSN messaging
users who were accessing the AIM system. In this sense, AOL was using market, policy
and technology alternatives to protect against (or “close”) competitor access to AOL’s
current or potential customer base.
In sum, the “closed Internet” versus “open Internet” discourse is not just a philosophical
debate, but one that has profound ramifications to the viability of different business plans
on the Internet. The open and closed business models reflect themselves in the market,
technology and policy strategies of E-business. The strategies play out in many areas
including customer information (to share with competitors or not), pricing (to charge
customers or not), patents (to monopolize Internet business methods or not) and domain
names (to lay claim to the maximum amount of Internet real estate or not). Choices in
these and other areas affect the degree of openness in customer and competitor
environments and ultimately the viability of a given business model. In this article, we
provide a way for managers to think about their business strategies in light of this open
and closed Internet debate.
The Open Character of the Internet
According to the small band of dedicated researchers who founded the Internet, the
Internet was created as an open architecture in order to promote “worldwide broadcasting
capability, a mechanism for rapid and inexpensive information dissemination, and a
medium for collaboration and interaction between individuals and their computers
without regard for geographic location” (Leiner et al, 1997). The open architecture
ensured no global control and no single point of vulnerability. The Internet was created in
partnership with government, industry and academia whereby all key ideas,
developments, and results were made public. Importantly, the open Internet architecture
was created not for the sake of interoperable technology, but for the goal of achieving a
collection of global communities (Leiner et al, 1997).
In 1991, an Internet Society was formed to promote the use of the Internet for
commercial purposes. By the middle of 1994, more than 50% of Internet traffic was
from commercial firms and on a profit-making basis, and the World Wide Web had
become the fastest growing Internet information service. Thanks to the arrival of Mosaic in
1993, a graphical point-and-click software browser and the predecessor of Netscape,
users of the Internet could access, retrieve, display, store, and forward documents and
information from any authorized Web server in the world, delivered to the user in the
same familiar format each time regardless of the underlying computing hardware and
operating system technology of either the Web “server” (where the information to be
accessed resided) or the client workstation. By the summer of 1994, Mosaic had been
enhanced to facilitate electronic commerce including forms capability, encryption,
graphical mapping, and signature authentication.
It was not, however, until the beginning of 1995, that the business community took
notice. The second-generation browser, Netscape, was released in December 1994. This
product, like many other Internet products that faced the challenge of creating a market
where none existed, was given away for free to the public. Netscape Communications
made money on its servers that companies needed in order to publish documents on the
web and to set up a commercial site. According to Yoffie and Cusumano (1999), “With
the market just emerging, Netscape saw an opportunity to “come in below the radar
screen” and take control of consumers’ surfing habits by making its browser the most
popular way to navigate the web. With any luck, by the time the online malls were built,
consumers would be hooked on driving around in Netscape’s car” (p. 13). Netscape
illustrates one of the first open e-business models. By giving the browser away for free,
this open model helped the company create a browser market.
Price is the most common way to control customer access to the product. At least
theoretically, a free product removes the economic barriers between the customer and the
product. The free-product strategy has been exercised since the days of Netscape. In 1996,
Esther Dyson, the current head of ICANN -- the governing board of the Internet -- stated
that because of ease of copying on the Internet, “most of what is now considered content
– including software – will soon be distributed on the Internet for free. Payment will be
made instead for ads and other services such as support and training" (Information Week, 1996).
Four years later, businesses across industries provide their products and services free on
the Internet. EFax has about 14 million customers; it gives away incoming fax-to-email
service and charges for outgoing and other services. Newspapers give away headlines and, now,
content on their web sites. Pornographic sites give away free sample pictures. MIT Press
gives away its monographs online and attributes an increase in actual book sales to the
monographs, which allow people to get a better feel for what they are buying. In
Europe, many companies are giving away free ISP service. Throttlebox.com provides
downloadable multimedia entertainment (independent bands, comedy sketches, cartoons,
movies, and sports) for free. Live365.com offers free broadcasting to users.
Clubcastlive.com offers live music club entertainment performances free of charge to
consumers around the world.11 Others distribute older or downgraded versions of the
product for free. Eudora has long distributed the low-end version of its email software for free.
In some instances, free products serve as promotional means to create a market where
none existed. Once the market is established, then charges might be instituted.
Alternatively, free products and services are sometimes supported by advertising revenue.
Some e-businesses give away products or services (e.g., Apple giving away free disk
storage) to increase the switching costs of their customer base for other, non-free products
and services that have positive network externalities with the free products. Free
downloadable music might be tied to the sales of portable CD players or Palm Pilots that
consumers use to listen to music when not at their computers.
Open strategy also characterized how some of the early Internet companies dealt with
competitors. Netscape like Mosaic was based on open standards – an approach that
invited Netscape’s rivals to compete. Open standards are often managed by industry
consortia such as the Internet Engineering Task Force, which makes specifications of the
standards publicly and freely available. This is in contrast to closed standards where a
single firm owns the specifications and only makes them available for price or on the
basis of exclusive contracting. Such proprietary standards represent closed strategies
because they lock in customers and lock out competitors.
Open source has gained a foothold in many communities, particularly in the software
industry. Linux gained notoriety not just because it grew popular enough to be seen as a
potential competitor to Microsoft's Windows, but also because it was maintained and
developed by a network of developers who, in the traditional economic sense, should have
seen each other as competitors. Sun Microsystems gave away its Java source in order to
gather feedback and encourage additional development.
At the 2000 South by Southwest Music Conference in Austin, Texas, Clubcastlive.com broadcast
hundreds of live performances over the Internet. This compares to the conference fee of $400 to attend the
same performances in the clubs.
Compared to Netscape and Sun, Linux presents a slightly different development path.
Whereas Linux was initially started
out by an individual who put out an open call for help for feedback and further
development, both Netscape and Sun Microsystems are examples of how existing
organizations rather than individuals make a product available for free and encourage
developers outside of their organization to provide input and add to the development
(Seidel and Stewart, 2000).
To some, open Internet strategies have meant lack of regulation and interference from the
government. Except for some financial support (e.g., via the NSF), the legislative
branches of the U.S. government have maintained a "hands-off" attitude toward the
Internet. The judicial and executive branches have been more active. The Justice Department, for
example, recently made use of the antitrust laws to limit Microsoft’s efforts to “close”, or
monopolize, the Internet browser market. Courts and many municipalities have frowned
upon AT&T's effort to buy up cable lines for broadband Internet access and then
limit consumers' choice to AT&T's hand-picked Internet service provider (ISP).
Similar concerns are likely to be raised regarding the recently proposed merger of AOL and Time Warner.
To some, the U.S. government is not actively involved enough to preserve the open
character of the Internet. Lessig (1999) argues in his recent book that maintaining an open
Internet will require regulation from the government; if left to commercial interests, the
Internet architecture will evolve into a set of "code" that no longer protects fundamental
freedoms, including free and open trade. According to Lessig (1999), government
policies should encourage, to the greatest extent possible, “open” Internet access and
competition, including new and small business entrepreneurs.
The “open” environment philosophy argues that because the Internet allows for almost
instant participation, innovation will occur quickly if business and other organizations are
not allowed broad proprietary rights to ideas, code, and content. Moreover, there is the
belief that innovation will come from a broader segment of society in an open
environment. Small business entrepreneurs would be expected to thrive because they are
not hampered by a complicated legal framework that advantages large companies who
more easily master the patent and regulatory frameworks with their teams of lawyers and
lobbyists.
The New Closed Internet Environment
Making money on the Internet has proven rather difficult. The Internet's rapid
development demands continuous innovation from firms, and consequently major
financial investment, with little assurance of sustainable revenue streams.
On the other side of the open movement are business entrepreneurs. Aspects of open
strategies have created Achilles’ heels for their commercial interests on the Internet.
Web sites' giving products away free prolongs the idea, dating from the Internet's original
days, that access to, and the content on, the Internet is and should be free. In the minds of
these new Internet entrepreneurs, this idea was viable when the Internet consisted of
researchers and academics, but not when millions of dollars of private funds have been
invested in its commercial development. In fact, many would argue that this value has
remained one of the biggest hurdles to overcome as they move their operations onto the
Internet. Convincing customers to pay after they have become accustomed to getting other
products or services for free can be a challenge. The mindset of the free Internet has also
created an environment that has been rather tolerant of piracy and other violations of
intellectual property. Some users accustomed to free software might think of software
obtained via the Internet as a free download, not as a monetary transaction with legal
responsibilities carried by both sides. Because of fears of piracy, particularly in certain
regions of the world, many businesses have limited
access to their product offerings.
Competitor strategies may also come in open or closed forms. Closed strategies work on
the premise that an e-business has the most to gain by limiting competitors’ ability to
compete for customers. For example, an e-business could acquire an e-patent on a
method of doing business (such as Priceline.com’s patent on reverse auctioning on the
Internet), form strategic alliances (Microsoft’s partnership with Ericsson in building a
mobile portal site), or acquire/design a technology that limits competitor access (such as
AT&T’s purchase and control of cable lines). Each of these prevents competitors from
having access to the pool of potential customers. Open strategies, by contrast, invite
collaboration and sharing with competitors. Original developers of the product and
service may offer, for example, the source code of software free to competitors for further
input, feedback, and development. The Linux operating system is a case in point.
Whether the optimal customer or competitor strategy for the firm is open or closed
depends on the opportunities and constraints of the external environment and the type and
size of the firm. The policy environment, for example, may be favorable for closed
strategies (availability of internet business method patents and domain name trademarks),
or unfavorable for closed strategies (antitrust – Microsoft). The technology and market
environments may similarly vary. If, for example, the best technology for attracting and
retaining customers is uncertain, then firms may engage in open strategies where
information is shared to promote improvements and innovation, especially if there are
many potential and evolving customers for the industry. Large companies can easily
afford to file and defend patents and to register multiple domain names to protect their
brands, but small businesses might rely more on open customer and competitor
strategies. The structure and scope of the firm can also play a role.
The corporate giants of the Internet have been among those leading the way to closed
Internet environments. In the case of instant-messenger technology, AOL used its
know-how of technical protocols to close off Microsoft’s access to AOL users. AOL used
a policy strategy (trademark law) in an attempt to stop AT&T and others from benefiting
from the goodwill that AOL had created with terms like “instant messenger” and
“buddy list.” And AOL used a market strategy, forming exclusive license arrangements with a
select group of ISPs and thereby limiting other ISPs’ and their users’ access to its
technology. These actions on the technology, policy, and market frontiers are all ways of
limiting competitors’ access to AOL’s customers.
Recent developments in intellectual property rights have been a particularly fruitful area
for some firms wishing to close their competitors out. In 1998, in State Street Bank &
Trust Co. v. Signature Financial Group, the Federal Circuit Court of Appeals reversed a
century-long trend of denying patent protection to methods of doing business. The court
held that the practical application of a mathematical algorithm embodied in software that
produces “a useful, concrete, and tangible result” may be patented. The court also laid to
rest the so-called “business method” exception, holding that business methods can be
patented provided they meet the same legal requirements for patentability as any other
method or process. Several subsequent Federal Circuit cases have reinforced the State
Street decision, which has set off a race to obtain Internet business method patents.12
Business method patents give e-businesses the opportunity to legally exclude others from
using the patented methods for up to 20 years.
Many “e-businesses” have succeeded in getting a patent on a business method. These
patents cover more than a technical software application; they cover the actual business
concept as applied to the Internet. These e-patents include patents on business paradigms
such as shopping carts, auctions, brokers, agents, customer profiling, product distribution,
and marketing. Consider a few examples:
• CyberGold, Inc. received a patent for a scheme that rewards customers who
receive and view online advertisements.
• Vocal Tec Communications has developed software technology designed to
deliver quality voice and enhanced communications services over Internet
protocol networks, using a patent-pending set of algorithms that monitors and
dynamically adjusts to network conditions to overcome delays or distortions
in the transmission of real-time voice and multimedia communications.
• IBM has been granted a patent for a method of accessing data that resides
in a World Wide Web server.
• Trilogy has a patent that covers a method by which online shoppers choose
the features and options they want when ordering a car from CarOrder.com.
The system, the patent states, functions as an online auto salesperson, helping
buyers choose the configuration of their next car.
• Juno Online Services Inc. received patents for interactive advertisement
scheduling and authentication. It previously received a patent for offline
processing aspects of its Internet business model.
• The Wedding Channel has a patent for planning weddings over the Internet.
• Home Gambling Network has a patent for remote, live wagering over the
Internet.
• BeFree, Inc. announced a patent for developing a profile of computer users.
E-businesses have not been shy about taking their patent rights to court to defend their
business models against competitors. Several cases have been noted in the press:
• Sightsound.com, which was granted a patent on the sale of audio or video
recordings downloaded over the Internet, is demanding royalties from
other online music distributors (N2K and MP3) on all U.S. sales involving the
downloading of music over the web. Sightsound is requesting 1% of all revenue
that other companies derive from the sale of downloaded audio files.13
• Priceline.com is suing Microsoft for launching Hotel Price Matcher, a service that
operates much like Priceline.com’s name-your-own-price system.
• Trilogy filed a complaint to block CarsDirect.com from using an option-selection
feature patented by Trilogy.

12 The groundswell of new business-method patent applications has been so significant since State Street
that the U.S. Patent and Trademark Office typically does not examine a business-method application for the
first time until about two years after it is filed.
These “e-patents” on business models have the potential to tie whole Internet-based
industries, as well as more generic business practices that span several Internet-based
industries, to a single patent holder.
The use of trademark law and domain name registration has been another means of
closing the Internet environment. A domain name, like a trademark, is a valuable asset in a
firm’s communications and marketing strategy. Domain names establish the firm’s
persona. Without intuitive domain names, customers will spend valuable time finding
their way around through guides, indexes, and search tools rather than spending that time
on purchasing activities at the site itself. An effective domain name translates into more
web customers reaching the firm’s business. In essence, domain names are to e-commerce
what “location, location, location” was to traditional brick-and-mortar commerce. E-
businesses are aggressively using a variety of new and old trademark laws (the Lanham
Act, the Federal Trademark Anti-Dilution Act, and the Anti-Cybersquatting Consumer
Protection Act, for example) and related institutions (U.S. federal and state courts, the
World Intellectual Property Organization, and the Internet Corporation for Assigned
Names and Numbers) as strategies to protect their Internet space, brands, and advertising.
When should a firm use open versus closed customer and competitor strategies? For any
particular organization, the degree of openness or closedness can vary substantially. By
dichotomizing between open and closed customer and competitor strategies, a firm can
consider various strategic models and explore their performance implications. Under
some conditions one model might prevail over the other, while under others the two
models can be used in a complementary fashion.
References

Barlow, J.P., “Napster’s Enormous Music Room,” New York Times.

Computer Reseller News, “Face Off! – AOL, Microsoft Battle over Ownership of Instant
Messaging Users,” August 2, 1999.

Informationweek, “The Internet’s Apostle of Free Content,” March 25, 1996.

Leiner, B.M., Cerf, V.G., Clark, D.D., Kahn, R.E., et al., “The Past and Future History
of the Internet,” Communications of the ACM, 40, 2 (February 1997), 102-108.

Lessig, L., Code and Other Laws of Cyberspace, Basic Books, New York, 1999.

New York Times, “Long Line Online for Stephen King E-Novella,” March 16, 2000,
pp. A1, C10.

Yoffie, D.B. and Cusumano, M.A., “Building a Company on Internet Time: Lessons from
Netscape,” California Management Review, 41, 3 (Spring 1999), 8-28.