April 16, 2022
What Is a P2P Protocol?
P2P file sharing changed all that: suddenly, you had direct access to the data shared by other people. But let's go back a bit: what is P2P, how does it work, and where did it start?

There are pros and cons to P2P networks related to data backup, recovery, and availability. In a centralized network, system administrators are the only forces that control the availability of shared files. If administrators decide to stop distributing a file, they simply delete it from their servers and it is no longer available to users. Not only does this leave users powerless to decide what is distributed in the community, it also makes the entire system vulnerable to threats and demands from governments and other major forces. For example, YouTube has come under pressure from the RIAA, the MPAA, and the entertainment industry to filter out copyrighted content. While server-client networks can monitor and manage the availability of content, they also tend to offer more stability in the availability of the content they choose to host: a client should have no problem accessing obscure content shared over a stable centralized network. P2P networks, by contrast, are less reliable for sharing unpopular files, because retrieval requires that at least one node on the network holds the requested data and that this node can connect to the node requesting it. This requirement is sometimes difficult to meet, because users can delete or stop sharing data at any time. [43]

Computers on a peer-to-peer network typically reside physically side by side and run similar network protocols and software. Before home networks became popular, only small businesses and schools built peer-to-peer networks.
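The availability constraint described above (a file remains retrievable only while at least one peer still shares it) can be sketched in a toy model. All names here are illustrative, not part of any real protocol:

```python
# Toy model: in a P2P network a file stays available only while
# at least one peer still shares it; peers may leave at any time.
class P2PNetwork:
    def __init__(self):
        self.shared = {}  # file name -> set of peers currently sharing it

    def share(self, peer, filename):
        self.shared.setdefault(filename, set()).add(peer)

    def stop_sharing(self, peer, filename):
        self.shared.get(filename, set()).discard(peer)

    def is_available(self, filename):
        # Available iff the set of sharers is non-empty
        return bool(self.shared.get(filename))

net = P2PNetwork()
net.share("alice", "song.mp3")
net.share("bob", "song.mp3")
print(net.is_available("song.mp3"))   # True: two peers share it
net.stop_sharing("alice", "song.mp3")
print(net.is_available("song.mp3"))   # True: bob still shares it
net.stop_sharing("bob", "song.mp3")
print(net.is_available("song.mp3"))   # False: the last sharer left
```

A centralized network, in this model, is the special case where a single administrator peer decides what is in `shared` for everyone.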
In an advanced P2P network, the software establishes protocols that handle direct connections between multiple devices over the Internet. The name Napster referred to both the P2P network and the file-sharing client it supported. In addition to being limited to a single client application, Napster used a proprietary network protocol, but these technical details did not significantly affect its popularity.

Some peer-to-peer networks (e.g. Freenet) place great importance on confidentiality and anonymity: the content of the communication is hidden from eavesdroppers, and the identities and locations of the participants are concealed. Public-key cryptography can be used to provide encryption, data validation, authorization, and authentication of data and messages. Onion routing and other mix-network protocols (e.g. Tarzan) can be used to ensure anonymity. [60]

Peer-to-peer applications are one of the main issues in the net neutrality controversy.
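The direct connections between devices described above can be sketched with plain TCP sockets: each peer runs a listener and can also dial out to other peers, with no central server in between. This is a minimal loopback sketch, not any real P2P protocol; the echo message format is made up:

```python
import socket
import threading

def serve(ready, state):
    """One peer's listening side: accept a connection, echo what arrives."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    state["port"] = srv.getsockname()[1]
    ready.set()                          # tell the dialing side we are listening
    conn, _ = srv.accept()
    data = conn.recv(1024)               # sketch only: assumes one small message
    conn.sendall(b"echo:" + data)
    state["received"] = data
    conn.close()
    srv.close()

def connect(port, message):
    """Another peer dialing directly into the first peer."""
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(message)
    reply = cli.recv(1024)
    cli.close()
    return reply

ready = threading.Event()
state = {}
t = threading.Thread(target=serve, args=(ready, state))
t.start()
ready.wait()                             # avoid dialing before the bind
reply = connect(state["port"], b"hello peer")
t.join()
print(reply)  # b'echo:hello peer'
```

In a real P2P client every node runs both roles at once, and discovery of other peers' addresses is the job of the overlay protocol.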
Internet service providers (ISPs) are known to throttle P2P file-sharing traffic because of its high bandwidth usage. [67] Compared to web browsing, email, or many other Internet uses, where data is transferred only at short intervals and in relatively small amounts, P2P file sharing often produces sustained high bandwidth usage due to ongoing file transfers and swarm/network coordination packets. In October 2007, Comcast, one of the largest broadband ISPs in the United States, began blocking P2P applications such as BitTorrent. Its reasoning was that P2P is primarily used to share illegal content and that its infrastructure is not designed for continuous high-bandwidth traffic. Critics point out that P2P networks have legitimate legal uses, and that this is another way major vendors try to control usage and content on the Internet and steer people toward a client-server application architecture. The client-server model imposes financial barriers to entry on small publishers and individuals and can be less efficient for sharing large files.

In response to this bandwidth throttling, several P2P applications began implementing protocol obfuscation, such as BitTorrent protocol encryption. Techniques for achieving "protocol obfuscation" involve removing otherwise easily identifiable properties from protocols, such as deterministic byte sequences and packet sizes, making the traffic appear random. [68] The ISPs' answer to the high bandwidth is P2P caching, where an ISP stores the portions of files most accessed by P2P clients in order to save Internet transit.

Developed in 2001, BitTorrent is an open-source protocol in which users create a metafile (called a .torrent file) that contains information about the download without containing the download data itself. A tracker was needed to host these metafiles, as well as at least one peer holding the complete file.
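The obfuscation idea above can be sketched as follows: XOR the payload with a keystream and append random-length padding, so that neither fixed byte sequences nor packet sizes survive on the wire. This framing is illustrative only; the real BitTorrent Message Stream Encryption uses a Diffie-Hellman key exchange and RC4, not a shared key passed around like this:

```python
import os

def xor_stream(data, key):
    """XOR data against a repeating keystream (stand-in for a stream cipher)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def obfuscate(payload, key):
    # Scrambled 4-byte length header, then scrambled payload, then
    # 0-31 bytes of random padding so packet sizes vary per message.
    pad = os.urandom(os.urandom(1)[0] % 32)
    header = len(payload).to_bytes(4, "big")
    return xor_stream(header + payload, key) + pad

def deobfuscate(packet, key):
    # Recover the real length first, then strip header and padding.
    length = int.from_bytes(xor_stream(packet[:4], key), "big")
    return xor_stream(packet[:4 + length], key)[4:]

key = os.urandom(16)                      # in practice: negotiated, not preshared
msg = b"\x13BitTorrent protocol"          # the easily fingerprinted handshake
wire = obfuscate(msg, key)
print(deobfuscate(wire, key) == msg)      # True: the peer can still read it
```

The point is exactly what the text states: a deep-packet-inspection box no longer sees the telltale `\x13BitTorrent protocol` prefix or a predictable handshake size.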
However, because BitTorrent is an open protocol, anyone could write the client or tracker software.

The prevalence of malware varies between peer-to-peer protocols. For example, studies analyzing malware on P2P networks found that 63% of answered download requests on the Gnutella network contained some form of malware, while only 3% of content on OpenFT did. In both cases, the three most common types of malware accounted for the vast majority of cases (99% on Gnutella and 65% on OpenFT). Another study analyzing traffic on the Kazaa network found that 15% of the 500,000 file samples collected were infected by one or more of the 365 computer viruses tested for. [39]

In its simplest form, a peer-to-peer (P2P) network is created when two or more PCs connect and share resources without going through a separate server computer.
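The .torrent metafile mentioned earlier is a "bencoded" dictionary, and the openness of the format is why anyone could write their own client or tracker. Below is a minimal bencoder sketch; the field names follow the BitTorrent convention, but the tracker URL and file values are made up for illustration:

```python
def bencode(value):
    """Minimal bencoder for the four bencode types used by .torrent files.
    Assumes dict keys are ASCII strings (real bencode sorts raw byte keys)."""
    if isinstance(value, int):
        return b"i%de" % value                       # integers: i<n>e
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)        # strings: <len>:<bytes>
    if isinstance(value, str):
        return bencode(value.encode())
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        out = b"d"
        for k in sorted(value):                      # bencode requires sorted keys
            out += bencode(k) + bencode(value[k])
        return out + b"e"
    raise TypeError("cannot bencode %r" % type(value))

# Illustrative metafile: it describes the download (name, sizes, piece
# hashes, where the tracker lives) without containing the data itself.
meta = {
    "announce": "http://tracker.example/announce",   # hypothetical tracker URL
    "info": {"name": "example.iso", "length": 4096,
             "piece length": 2048, "pieces": b"\x00" * 40},
}
print(bencode(meta)[:14])  # b'd8:announce31:'
```

A tracker then only has to serve these small metafiles and peer lists; the bulk data flows peer to peer.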