Last Updated Jan 3, 2008 2:10 PM EST
Peer-to-peer embraces the networking capabilities of the Internet. "P2P" (as dubbed by advocates) enables the sharing and direct publication of resources and also allows the unused processing capability of computers to be shared and used more productively. While the peer-to-peer concept has, in fact, been around for decades, it's grown as the Internet has become part of daily life. The concept first attracted serious attention as the original Napster music-swapping service gained popularity; peer-to-peer garnered more attention with the growth of Skype, a service that provides high-quality voice calls over the Internet.
When considering peer-to-peer networks, remember that this is a still-emerging technology; in other words, approach it with caution. Peer-to-peer looks promising for collaborating with other individuals and for making better use of computer processing capacity, but it is also beset by the threat of major security breaches that must be thoroughly addressed before adopting the concept.
Peer-to-peer puts every computer on an equal footing, enabling each to be both a publisher and a consumer of information. The traditional model on the Web is client-server: the client (a computer running a browser) can only receive and consume information, while the server delivers and publishes information from a given web site. Peer-to-peer makes a computer both server and client.
Perhaps the best-known example of a peer-to-peer network is Skype. A communications service that is now an eBay company, it allows people to make a voice call directly to any other Skype user in the world. There are also any number of P2P file-sharing programs. These generally work in this fashion: Person A can search for and download music from Person B's computer, while Person B can search for and download music from Person A's. At first glance, that's fine. However, none of these P2P file-sharing programs has any way to control the illegal distribution of copyrighted material.
Peer-to-peer can be put to several uses, but three distinct options are best suited to the technology:
- Information/content: the Napster example. In a peer-to-peer environment, the content on your computer becomes accessible to everyone else, and everyone else's content becomes accessible to you.
- Processing capacity sharing: computers with spare processing capacity are joined together electronically to pool their resources. With a large number of computers, this can create very significant processing power.
- Communication: a computer user can communicate in various ways with other people in the peer-to-peer network.
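The processing-capacity option above can be illustrated with a small sketch: a coordinator splits a large job into work units and farms them out to idle peers, then combines the partial results. The peer names, the job (summing numbers), and the chunking scheme are illustrative assumptions, not part of any real P2P product; in practice the work units would travel over the network.

```python
def split_job(numbers, n_peers):
    """Divide a list of numbers into roughly equal work units."""
    size = max(1, len(numbers) // n_peers)
    return [numbers[i:i + size] for i in range(0, len(numbers), size)]

def peer_process(work_unit):
    """Each peer contributes spare cycles -- here, summing its unit."""
    return sum(work_unit)

def run_distributed(numbers, peers):
    """Send one work unit to each idle peer and combine the results."""
    units = split_job(numbers, len(peers))
    # In a real deployment these calls would be remote, not local.
    partials = [peer_process(unit) for unit in units]
    return sum(partials)

idle_peers = ["peer-a", "peer-b", "peer-c"]  # hypothetical idle machines
print(run_distributed(list(range(1, 101)), idle_peers))  # → 5050
```

The same divide-and-combine pattern underlies real distributed-computing projects, though they add scheduling, fault tolerance, and result verification on top.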
Peer-to-peer and client-server can work together; one does not exclude the other. The best scenario is to exploit the strengths of the client-server model (order, structure, and management) and combine them with the flexibility and enabling capacity of peer-to-peer technology.
To begin with, certain capabilities need to be in place for peer-to-peer technology to function. They include:
- Documenting resources. To make a resource available, it must be accessible on a computer and identified as a resource that can be shared. Part of this identification process involves properly describing the resource so others can identify it quickly and accurately.
- Locating resources. A person seeking a resource first must locate it. This can be a major challenge in a large peer-to-peer environment where many millions of resources might be available. Some form of directory thus becomes essential. Creating one can be a challenge in itself.
- Utilizing resources. Once the resource has been located, there must be a method by which it can be utilized. If the resource is content, such as music, it can simply be downloaded. Tapping processing power, however, requires more complex interaction and enabling procedures.
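The first two capabilities above, documenting and locating resources, can be sketched as a shared directory that maps resource descriptions to the peers holding them. This is a minimal in-memory illustration; the field names and peer addresses are assumptions, and a real peer-to-peer directory would be distributed across many machines rather than held in one object.

```python
class ResourceDirectory:
    """A toy directory for documenting and locating shared resources."""

    def __init__(self):
        self._entries = []  # each entry: {"title", "kind", "peer"}

    def register(self, title, kind, peer):
        """Document a resource: describe it so others can find it."""
        self._entries.append({"title": title, "kind": kind, "peer": peer})

    def locate(self, keyword):
        """Locate resources whose description matches a keyword."""
        keyword = keyword.lower()
        return [e for e in self._entries if keyword in e["title"].lower()]

directory = ResourceDirectory()
directory.register("Quarterly sales report", "document", "peer-17")
directory.register("Sales forecast model", "spreadsheet", "peer-42")

matches = directory.locate("sales")
print([m["peer"] for m in matches])  # → ['peer-17', 'peer-42']
```

Note how the quality of the description drives the quality of the search: a poorly documented resource is effectively invisible, which is why the documenting step matters as much as the locating one.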
Studies have indicated that 50 percent or more of a typical organization's processing power may be unused. Peer-to-peer technology offers a way of tapping this unused resource for productive purposes. Indications are that business organizations see this as one of the most practical uses of peer-to-peer technology. But bear in mind that such an application becomes relevant only when there are major processing needs that an organization finds difficult to meet.
In these situations, it can be more efficient and cost-effective to spread processing across an organization's existing computers than to buy more powerful new ones. The primary drawback is the setup cost of installing peer-to-peer technology. Education and training are required, too. Add ongoing maintenance to these bills and the costs begin to mount, sometimes to the point where a centralized solution becomes more cost-effective.
Whether peer-to-peer is genuinely cost-effective in such situations depends on the amount of additional processing required. Careful analysis is required to establish when a peer-to-peer approach is worth considering.
The rapid growth in partnerships, plus the need to be more flexible and adaptive, has made collaboration a key attribute of a progressive organization. Peer-to-peer networks can prove especially helpful to people who are collaborating and routinely sharing resources and content. If there is a need to establish a group that might span several organizations, peer-to-peer technology can be faster and easier to implement and then operate than more traditional approaches.
The classic example of file sharing is someone downloading a file from a central server. This procedure can put a lot of strain on bandwidth if large numbers of individuals need to download files. Peer-to-peer file sharing seeks to use bandwidth more effectively.
Let's say Person A and Person B are located close together on a network. Person A downloads an e-learning course. Later, Person B wants to download the same course. On a peer-to-peer network, instead of acting on B's request by looking in a central server, the system looks to see if anyone on the network near Person B has downloaded the same course. The system finds that Person A has. So, instead of B downloading from the central server, B downloads directly from A's computer. This saves time and network resources.
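The lookup described above can be sketched as a simple function: before going to the central server, check whether a nearby peer already holds the file. The cache layout and peer names are assumptions for illustration; real systems also weigh network distance, peer availability, and file integrity.

```python
CENTRAL_SERVER = "central-server"  # hypothetical fallback source

def find_source(filename, nearby_peer_caches):
    """Prefer a nearby peer that has already downloaded the file;
    otherwise fall back to the central server."""
    for peer, cache in nearby_peer_caches.items():
        if filename in cache:
            return peer
    return CENTRAL_SERVER

# Person A has already downloaded the course; Person C has nothing cached.
caches = {
    "person-a": {"elearning-course.zip"},
    "person-c": set(),
}
print(find_source("elearning-course.zip", caches))  # → person-a
print(find_source("other-file.zip", caches))        # → central-server
```

The bandwidth saving comes from the first branch: a local transfer between Person B and Person A never touches the link back to the central server.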
More content today resides on individual computers than on servers. Accordingly, peer-to-peer technology that enables you to see all such content within an organization, rather than just what's available in common files or on web sites, may be quite helpful when looking for specific information. Understand, however, that there are substantial drawbacks.
For instance, much of the content on an individual's computer is either private, in draft form, outdated, or simply not ready for publication or sharing. As far back as 2001, there were an estimated 550 billion-plus documents on intranets, extranets, and public web sites. That in itself is an unimaginable quantity of content—yet it still is dwarfed by the quantity of content on individual computers around the world!
At first glance, being able to access it all may sound valuable. In practice, though, it could render information overload far, far worse than it already is.
Moreover, peer-to-peer advocates tend to overlook a fundamental publishing consideration. Publishing is not, and never has been, a matter of providing "as much as you can read," but rather of selecting and publishing only the best content. A quality publishing house will reject up to 90 percent of what's presented to it, then polish the remaining 10 percent and present it in a way that is easy to find and just as easy to read.
Peer-to-peer technology thrives in an open network environment. But so do hackers and viruses, and that's an opportunity for continuing trouble. Within any organization are a variety of operating systems and security protocols. Linking them in a single cohesive and secure manner is no simple task. Indeed, many believe this weakness to be the Achilles' heel of peer-to-peer networks.
Peer-to-peer network security depends on the authentication of users. Knowing that the "peer" with whom you want to share is reputable and trustworthy is critical. To improve security, many peer-to-peer interactions now use encryption, to ensure that communication links are secure as information passes from computer to computer.
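One common way to authenticate a peer without ever transmitting a password is a shared-secret challenge-response using an HMAC. This is a hedged sketch of that general technique, not of any particular P2P product: the secret, which is assumed to have been agreed out of band, never crosses the wire; only a keyed hash of a fresh random challenge does.

```python
import hashlib
import hmac
import os

SHARED_SECRET = b"pre-agreed-secret"  # assumption: exchanged out of band

def respond(challenge, secret):
    """The peer proves knowledge of the secret without sending it."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(challenge, response, secret):
    """The verifier recomputes the HMAC and compares in constant time."""
    expected = hmac.new(secret, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh random challenge per session
response = respond(challenge, SHARED_SECRET)
print(verify(challenge, response, SHARED_SECRET))    # → True
print(verify(challenge, response, b"wrong-secret"))  # → False
```

Using a fresh random challenge each time defeats simple replay attacks; encrypting the subsequent traffic, as the text notes, is a separate step layered on top of authentication.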
Privacy is a major issue for persons whose computers will be part of peer-to-peer networks. While computer enthusiasts tend to be technically competent, and better able to deal with security and privacy problems, most computer users are technical novices. In turn, they become very dependent on their IT department to make sure nothing is going wrong. The average IT manager has no love for this situation. In addition, the idea that someone else can root around within his or her computer can unnerve an individual.
Making sure that private files are fully protected is only part of the problem. In essence, "P2P" requires thinking about the computer differently: it is both a public and private domain—at the same time. This requires a change in work practices that many people are not willing to accept.
Peer-to-peer technology can allow an organization to investigate its computer capabilities to see what resources it truly has. On the one hand, this enables an organization to monitor software continually and to distribute upgrades as they become available. On the other hand, it also enables an organization to examine content being created or downloaded by particular individuals, thus giving it more control. Some individuals might lament this as "Big Brother" in action.
Peer-to-peer technology works best in an open network, but an open network is open to attack. The peer-to-peer structure also allows viruses to spread more easily. If accreditation and authentication of users are not carried out properly, hackers and other malicious persons can gain access to the network.
Part of the original Napster rationale was bringing unknown and unsigned artists to the masses, but the reality was that the majority of Napster aficionados just wanted to hear well-known artists. Once Napster was stopped from illegally swapping commercial music, its usage dropped dramatically. The theory of peer-to-peer is that people are willing to wade through millions of documents and related content to find that precious gem; in fact, most people just want what is significant and well accepted.
Standards tend to vary widely among peer-to-peer technologies, which makes it more difficult to share resources. Without proper standards, a peer-to-peer environment can quickly become chaotic. Because P2P applications are not based on open standards and have a complex architecture, they are difficult to support. Unless you are satisfied that a given P2P vendor will give your organization a support contract, it's not wise to rely on its system for business-critical purposes.