What is the World Wide Web, in brief. WWW – the World Wide Web. Questions and tasks

As the Internet developed, more and more information circulated through it, and navigating the network became increasingly difficult. The task then arose of creating a simple and understandable way to organize the information posted on Internet sites. The new WWW (World Wide Web) service fully met this challenge.

The World Wide Web is a system of documents containing text and graphic information, located on Internet sites and interconnected by hyperlinks. This service is perhaps the most popular of all, and for many users it is synonymous with the word Internet itself. Novice users often confuse the two concepts: the Internet and the WWW (or the Web). It should be remembered that the WWW is only one of the many services available to Internet users.

The main idea used in developing the WWW system is that of accessing information through hypertext links. Its essence is to embed in the text of a document links to other documents, which can be located either on the same server or on remote information servers.

The history of the WWW begins in 1989, when Tim Berners-Lee, an employee of the famous scientific organization CERN, proposed to his management the creation of a database in the form of an information network consisting of documents that would include both the information itself and links to other documents. Such documents are nothing other than hypertext.

Another feature that sets the WWW apart from other services is that through this system you can reach almost all other types of Internet services, such as FTP, Gopher, and Telnet.

The WWW is a multimedia system. This means that using the WWW you can, for example, watch a video about historical monuments or find information about the World Cup. It is also possible to access library holdings and recent photographs of the globe taken five minutes ago by meteorological satellites.

The idea of organizing information in the form of hypertext is not new. Hypertext existed long before the advent of computers. The simplest example of non-computer hypertext is the encyclopedia: some words in its articles are set in italics, meaning that you can turn to the related article and obtain more detailed information. But whereas in non-computer hypertext you have to turn pages, on a monitor screen following a hypertext link is instantaneous: you just click on the linked word.

The main merit of the above-mentioned Tim Berners-Lee is that he not only put forward the idea of creating an information system based on hypertext but also proposed a number of methods that formed the basis of the future WWW service.

In 1991, the ideas that originated at CERN began to be actively developed by the National Center for Supercomputing Applications (NCSA). It was NCSA that created HTML, the hypertext document language, as well as the Mosaic program designed for viewing such documents. Mosaic, developed by Marc Andreessen, became the first browser and opened up a new class of software products.

In 1994, the number of WWW servers began to grow rapidly, and the new Internet service not only gained worldwide recognition but also attracted a huge number of new users to the Internet.

Now let's give the basic definitions.

WWW is a set of web pages located on Internet sites and interconnected by hyperlinks (or simply links).

A web page is a structural unit of the WWW that includes the information itself (text and graphics) and links to other pages.

A website is a set of web pages physically located on one Internet node.

The WWW hyperlink system is based on the fact that selected sections of one document (which can be fragments of text or illustrations) act as links to other documents that are logically related to them.

The documents being linked to can be located on the local computer or on a remote one. In addition, traditional hypertext links are also possible, that is, links within the same document.

Linked documents may, in turn, contain cross-references to each other and to other information resources. This makes it possible to gather documents on similar topics into a single information space (for example, documents containing medical information).

WWW architecture

The architecture of the WWW, like that of many other Internet services, is built on the client-server principle.

The main task of the server program is to organize access to the information stored on the computer where the program runs. After startup, the server program waits for requests from client programs. The client programs are typically web browsers, which ordinary WWW users work with. When such a program needs to obtain some information from the server (usually a document stored there), it sends the server a corresponding request. If the access rights are sufficient, a connection is established between the programs, and the server program sends the client program a response to the request, after which the connection between them is closed.

To transfer information between these programs, the HTTP protocol (HyperText Transfer Protocol) is used.
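As an illustration, here is a minimal sketch of one such request-response cycle, using Python's standard http.client module; the host name example.com is only a placeholder.

import http.client

# Open a connection to the server and request a document.
conn = http.client.HTTPConnection("example.com", 80)
conn.request("GET", "/index.html")       # the client sends its request
response = conn.getresponse()            # the server answers the request
print(response.status, response.reason)  # e.g. 200 OK
document = response.read()               # the requested document itself
conn.close()                             # the connection is then closed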

WWW server functions

A WWW server is a program that runs on a host computer and processes requests coming from WWW clients. On receiving a request from a WWW client, this program establishes a connection over the TCP/IP transport protocol and exchanges information using the HTTP protocol. In addition, the server controls access rights to the documents located on it.

To reach information that the server cannot process directly, a system of gateways is used. Using a special interface for exchanging information with gateways, CGI (Common Gateway Interface), the WWW server can obtain information from sources that would be inaccessible to other types of Internet services. At the same time, the operation of the gateways is "transparent" to the end user: when viewing web resources in a favorite browser, an inexperienced user will not even notice that some of the information was delivered through the gateway system.
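To make the idea concrete, here is a minimal sketch of a CGI gateway script in Python. It assumes a web server configured to execute files in its cgi-bin directory; the server runs the script on each request and returns its output to the browser as an ordinary HTTP response, so the gateway stays invisible to the user.

#!/usr/bin/env python3
# A minimal CGI script: it generates a page dynamically, from a
# source (here, the system clock) that a plain file server could
# not serve directly.
import datetime

print("Content-Type: text/html")  # header required by the CGI convention
print()                           # a blank line separates headers from the body
print("<html><body>")
print(f"<p>Server time: {datetime.datetime.now()}</p>")
print("</body></html>")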

WWW client functions

There are two main types of WWW clients: web browsers and service applications.

Web browsers are used to work with the WWW directly and to obtain information from it.

Service web applications can interact with the server either to obtain statistics or to index the information stored there (this is how information gets into the databases of search engines). In addition, there are service web clients whose work concerns the technical side of storing information on a given server.


    History of the creation and development of the Internet.

    The Internet owes its origins to the US Department of Defense and its secret research, conducted in 1969, into methods that would allow computer networks to survive hostilities by dynamically rerouting messages. The first such network was ARPAnet, which united three networks in California with a network in Utah under a set of rules called the Internet Protocol (IP for short).

    In 1972, access was opened to universities and research organizations, as a result of which the network began to unite 50 universities and research organizations that had contracts with the US Department of Defense.

    In 1973, the network grew to international scale, incorporating networks located in England and Norway. A decade later, IP was expanded into a set of communication protocols supporting both local and global networks. Thus TCP/IP was born. Shortly thereafter, the National Science Foundation (NSF) launched NSFnet with the goal of linking five supercomputing centers. With the introduction of the TCP/IP protocol, the new network soon replaced ARPAnet as the backbone of the Internet.

    How, then, did the Internet become so popular and developed? The impetus for this, and for its transformation into an environment for doing business, came from the emergence of the World Wide Web (WWW, 3W) - a hypertext system that made surfing the Internet fast and intuitive.

    The idea of linking documents through hypertext had first been proposed and promoted by Ted Nelson back in the 1960s, but the level of computer technology at that time did not allow it to be brought to life - although who knows how things would have turned out had the idea found application then.

    The foundations of what we understand today as the WWW were laid in the 1980s by Tim Berners-Lee while working on a hypertext system at CERN, the European Laboratory for Particle Physics.

    As a result of this work, in 1990 the scientific community was presented with the first text-mode browser, which allowed hypertext files to be viewed on-line. The browser was made available to the general public in 1991, but its adoption outside academia was slow.

    A new historical stage in the development of the Internet came with the release of the first Unix version of the graphical browser Mosaic in 1993, developed in 1992 by Marc Andreessen, a student interning at the National Center for Supercomputing Applications (NCSA), USA.

    Starting in 1994, after the release of Mosaic versions for the Windows and Macintosh operating systems, followed soon afterwards by the Netscape Navigator and Microsoft Internet Explorer browsers, the popularity of the WWW, and with it of the Internet, began to spread explosively among the general public, first in the United States and then throughout the world.

    In 1995, NSF transferred responsibility for the Internet to the private sector, and since that time the Internet has existed as we know it today.


    Internet services.

    Services are the types of service provided by Internet servers.
    Over the history of the Internet there have been different types of services: some are no longer in use, others are gradually losing their popularity, while still others are in their heyday.
    Let us list those services that have not lost their relevance at the moment:
    - World Wide Web - a service for searching and viewing hypertext documents, including graphics, sound and video.
    - E-mail - electronic mail - a service for transmitting electronic messages.
    - Usenet, News - teleconferences, newsgroups - a kind of online newspaper or bulletin board.
    - FTP - a file transfer service.
    - ICQ - a service for real-time communication using a keyboard.
    - Telnet - a service for remote access to computers.
    - Gopher - a service for accessing information using hierarchical directories.

    Among these services we can single out those designed for communication, that is, for interaction and the transfer of information (E-mail, ICQ), and those whose purpose is to store information and provide users with access to it.

    Among the latter, the leading place in terms of the volume of stored information is occupied by the WWW service, since it is the most convenient for users and the most technically advanced. In second place is the FTP service: no matter what interfaces and conveniences are developed for the user, information is still stored in files, and access to those files is what this service provides. The Gopher and Telnet services can currently be considered "dying": almost no new information arrives on their servers, and the number of such servers and the size of their audience are practically not growing.

    The World Wide Web

    The World Wide Web (WWW) is a hypertext - or, more precisely, a hypermedia - information system for searching for Internet resources and accessing them.

    Hypertext is an information structure that allows semantic connections to be established between elements of text on a computer screen in such a way that you can easily move from one element to another.
    In practice, in hypertext some words are highlighted by underlining or by coloring them differently. The highlighting of a word indicates a connection between that word and some document in which the topic associated with the word is discussed in more detail.

    Hypermedia is what you get if you replace the word "text" in the definition of hypertext with "any type of information": sound, graphics, video.
    Such hypermedia links are possible because, along with textual information, any other binary information can be linked as well, for example, encoded sound or graphics. Thus, if a program displays a map of the world and the user selects a continent on the map with the mouse, the program can immediately provide graphic, sound and text information about it.

    The WWW system is built on a special data transfer protocol called the HyperText Transfer Protocol (HTTP).
    All the content of the WWW system consists of WWW pages.

    WWW pages are the hypermedia documents of the World Wide Web system. They are created using the hypertext markup language HTML (HyperText Markup Language). One WWW page is usually, in fact, a set of hypermedia documents located on one server, interwoven with mutual links and related in meaning (for example, containing information about one educational institution or one museum). Each document of the page can, in turn, contain several screenfuls of text and illustrations. Each WWW page has its own "title page" ("homepage") - a hypermedia document containing links to the main components of the page. Title page addresses are distributed on the Internet as the addresses of the pages themselves.

    A set of Web pages interconnected by links and designed to achieve a common goal is called a Web site.

    Email.

    Email appeared about 30 years ago. Today it is the most widespread means of exchanging information on the Internet. The ability to receive and send email is useful not only for communicating with friends in other cities and countries but also in a business career. For example, when applying for jobs, you can quickly send your resume to various companies by e-mail. In addition, many sites that require registration (online games, online stores, etc.) ask you to provide an e-mail address. In short, e-mail is a very useful and convenient thing.

    Electronic mail (e-mail, from the English "mail") is used to transmit text messages within the Internet, as well as between other email networks (Figure 1).

    Using e-mail, you can send messages, receive them in your electronic mailbox, reply to letters from correspondents, send copies of a letter to several recipients at once, forward a received letter to another address, use logical names instead of addresses, create several mailbox subsections for different kinds of correspondence, and include in letters various sound and graphic files as well as binary files - programs.
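    As an illustration, here is a minimal sketch of sending a message using Python's standard smtplib module; the mail server name and both addresses are placeholders, not real accounts.

    import smtplib
    from email.message import EmailMessage

    # Compose the letter: sender, recipient, subject and body.
    msg = EmailMessage()
    msg["From"] = "sender@example.com"
    msg["To"] = "recipient@example.com"
    msg["Subject"] = "Hello"
    msg.set_content("A short text message sent over SMTP.")

    # Hand the letter to the mail server for delivery.
    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(msg)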

    To use e-mail, the computer must be connected to the telephone network via a modem.
    A computer connected to a network is considered a potential sender and receiver of packets. When sending a message to another node, each Internet node splits it into fixed-length packets, usually 1500 bytes in size. Each packet is supplied with the recipient's address and the sender's address. The packets prepared in this way are sent over communication channels to other nodes. When a node receives a packet, it analyzes the recipient's address; if it matches the node's own address, the packet is accepted, otherwise it is sent onward. Received packets belonging to the same message are accumulated. Once all the packets of a message have arrived, they are concatenated and delivered to the recipient. Copies of the packets are stored on the sending nodes until a response arrives from the recipient node confirming successful delivery of the message. This ensures reliability. To deliver a letter to an addressee, you only need to know the address and the coordinates of the nearest mailbox. On the way to the addressee, the letter passes through several "post offices" (nodes).
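    A toy sketch of the packet mechanism just described: a message is split into fixed-length packets, each carrying the sender's and recipient's addresses, and the packets are reassembled on arrival. The node names are invented for the example.

    PACKET_SIZE = 1500  # bytes of payload per packet, as in the text above

    def split_into_packets(message: bytes, sender: str, recipient: str):
        # Each packet carries both addresses and a sequence number.
        return [
            {"from": sender, "to": recipient, "seq": i,
             "data": message[i:i + PACKET_SIZE]}
            for i in range(0, len(message), PACKET_SIZE)
        ]

    def reassemble(packets):
        # Packets of one message are accumulated, ordered and concatenated.
        return b"".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

    message = b"x" * 4000                     # a 4000-byte message -> 3 packets
    packets = split_into_packets(message, "node1", "node2")
    assert reassemble(packets) == message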

    FTP service

    The name of the FTP Internet service stands for File Transfer Protocol, but when FTP is considered as an Internet service, what is meant is not just the protocol but a service: access to files in file archives.

    On UNIX systems, FTP is a standard program that works over the TCP protocol and is always supplied with the operating system. Its original purpose is to transfer files between different computers operating in TCP/IP networks: a server program runs on one of the computers, and on the second the user runs a client program that connects to the server and sends or receives files over FTP (Figure 2).

    Figure 2. FTP protocol diagram

    The FTP protocol is optimized for file transfer, and so FTP programs have become part of a separate Internet service. An FTP server can be configured in such a way that you can connect to it not only under a specific name but also under the conventional name "anonymous". In that case the client gains access not to the computer's entire file system but to a certain set of files on the server, which makes up the contents of the anonymous FTP server - a public file archive.
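    A minimal sketch of such an anonymous session, using Python's standard ftplib module; the server name and the file name README are placeholders.

    from ftplib import FTP

    ftp = FTP("ftp.example.com")  # connect to the FTP server
    ftp.login()                   # no arguments means the "anonymous" user
    print(ftp.nlst())             # list the files in the public archive

    # Download one file from the archive to the local disk.
    with open("README", "wb") as f:
        ftp.retrbinary("RETR README", f.write)
    ftp.quit()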

    Today, public file archives are organized primarily as anonymous FTP servers. A huge amount of information and software is available on such servers. Almost everything that can be offered to the public in the form of files is accessible from anonymous FTP servers: programs (freeware and demo versions), multimedia, and finally simply texts - laws, books, articles, reports.

    Despite its popularity, FTP has many disadvantages. FTP client programs are not always convenient or easy to use. It is not always possible to tell what kind of file is in front of you - whether or not it is the file you are looking for. There is no simple and universal search tool for anonymous FTP servers; special programs and services do exist for this, but they do not always give the desired results.

    FTP servers can also provide access to files under a password - for example, to their clients.

    TELNET service

    The purpose of the TELNET protocol is to provide a fairly general, bidirectional, eight-bit byte-oriented means of communication. Its main purpose is to allow terminal devices and terminal processes to communicate with each other. It is intended that the protocol can also be used for terminal-to-terminal communication ("linking") or for process-to-process communication ("distributed computation").

    Figure 3. Telnet terminal window

    Although a Telnet session has a client side and a server side, the protocol is actually completely symmetrical. After a transport connection (usually TCP) is established, both of its ends play the role of "network virtual terminals" (Network Virtual Terminal, NVT), exchanging two types of data:

    Application data (that is, data that goes from the user to the text application on the server side and back);

    Telnet protocol commands, a special case of which are options, which serve to negotiate the capabilities and preferences of the parties (Figure 3).

    Although a Telnet session running over TCP is full duplex, the NVT should be considered a half-duplex device operating in line-buffered mode by default.

    Application data passes through the protocol unchanged; that is, at the output of the second virtual terminal we see exactly what was entered at the input of the first. From the protocol's point of view, the data is simply a sequence of bytes (octets), which by default belong to the ASCII set but, when the Binary option is enabled, can be arbitrary. Although extensions for identifying character sets have been proposed, they are not used in practice.

    All octet values of application data except \377 (decimal 255) are transmitted over the transport as is. The \377 octet is transmitted as the two-octet sequence \377\377, because the \377 octet is used at the transport level to encode options.
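    A minimal sketch of this escaping rule in Python: the \377 octet (255) is doubled whenever it occurs in application data, so that it is never mistaken for the start of a command.

    IAC = 0xFF  # the \377 octet used to introduce Telnet commands

    def escape_telnet_data(data: bytes) -> bytes:
        # Replace every single 0xFF byte with the pair 0xFF 0xFF.
        return data.replace(bytes([IAC]), bytes([IAC, IAC]))

    assert escape_telnet_data(b"abc") == b"abc"
    assert escape_telnet_data(b"a\xffb") == b"a\xff\xffb"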

    By default the protocol provides minimal functionality, together with a set of options that extend it. The principle of negotiated options requires negotiation whenever an option is enabled. One party initiates a request, and the other party can either accept or reject the offer. If the request is accepted, the option takes effect immediately. Options are described separately from the protocol itself, and their support by software is optional. The protocol client (network terminal) is instructed to reject requests to enable unsupported or unknown options.

    Historically, Telnet was used for remote access to the command-line interface of operating systems. Later it came to be used for other text interfaces as well, including MUD games. In theory, both sides of the protocol can even be programs rather than people.

    Sometimes telnet clients are used to access other protocols based on the TCP transport, see Telnet and other protocols.

    The Telnet protocol is used in the FTP control connection; that is, connecting to a server with the command telnet ftp.example.net ftp in order to debug and experiment is not only possible but also correct (unlike the use of Telnet clients to access HTTP, IRC and most other protocols).

    The protocol provides for neither encryption nor data authentication, so it is vulnerable to any attack to which its transport, the TCP protocol, is vulnerable. For remote access to a system, the SSH network protocol (especially its version 2) is now used; it was created with the emphasis placed specifically on security issues. So keep in mind that a Telnet session is very insecure unless it takes place on a fully controlled network or with network-level protection (various VPN implementations). Because of this unreliability, Telnet was long ago abandoned as a means of managing operating systems.

    The World Wide Web is a global information space based on the physical infrastructure of the Internet and the HTTP data transfer protocol. The World Wide Web has caused a real revolution in information technology and a boom in the development of the Internet. When people talk about the Internet, they often mean the World Wide Web. The word "web" and the abbreviation "WWW" are also used to refer to it.

    The World Wide Web is made up of millions of Internet web servers located around the world. A web server is a program that runs on a computer connected to the network. In its simplest form, such a program receives an HTTP request for a specific resource over the network, finds the corresponding file on the local hard drive and sends it over the network to the requesting computer. More complex web servers are capable of generating resources dynamically in response to an HTTP request.
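    A minimal sketch of the simplest kind of web server described above, using Python's standard http.server module: it maps each HTTP GET request to a file in the current directory and sends the file back to the requesting computer. The port number is arbitrary.

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Listen on port 8000 and serve files from the current directory.
    HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()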

    To view the information received from a web server, a special client program - a web browser - is used on the client computer. The main function of a web browser is to display hypertext.

    The World Wide Web is inextricably linked with the concept of hypertext. Hypertext is a document or a system of documents with cross-references (hyperlinks). Such a document need not be read sequentially: by activating hyperlinks, you can follow them to the texts or files associated with those links.

    To create, store and display hypertext, the language HTML (HyperText Markup Language), a hypertext markup language, is used. The work of marking up hypertext is called layout; markup specialists are called webmasters.

    An HTML file is the most common resource on the World Wide Web. An HTML file made available to a web server is called a "web page". A set of web pages related by theme, design or owner forms a website.

    Information on the web can be displayed either passively (that is, the user can only read it) or actively - in which case the user can add to and edit the information. Methods for actively displaying information on the World Wide Web include:

    • guest books,

    • wiki projects,

    • content management systems.

    HTML markup

    HTML (HyperText Markup Language) is not a programming language; it is a formatting language, i.e. one that gives a web page its appearance when viewed in a browser. Tags are used to mark up a document. Tags are enclosed in angle brackets and, with rare exceptions, come in pairs, i.e. there is an opening and a closing tag. For example, to mark the beginning of a new paragraph in a document, the tag <p> (from "paragraph") is placed; at the end of the paragraph there must then be the closing tag </p>.

    When placing tags, the following rule is observed: tags are closed in the reverse order of their appearance. For example, if a word in the text should be set both in bold (the tag <b>, from "bold") and in italics (the tag <i>, from "italic"), this can be done in one of the following ways: <b><i>word</i></b> or <i><b>word</b></i>.

    Below is the text of a simple HTML document and the result of its display in the browser:

    <html>
    <body>
    <p>Good day, dear <b>visitor</b>!</p>
    <p>I hope you got exactly where you wanted.</p>
    <p>Here you will find <b>poetry</b>, <b>songs</b> and <b>scenarios</b> for organizing any holidays.</p>
    <p><b>And now a special gift for September 1</b></p>
    <p>He's used to "A" grades -</p>
    <p>Russian five and singing.</p>
    <p>But his diary always</p>
    <p>Spoils my mood.</p>
    </body>
    </html>

    Structure and principles of the World Wide Web

    Figure: The World Wide Web around Wikipedia

    The World Wide Web is made up of millions of Internet web servers located around the world. A web server is a program that runs on a computer connected to a network and uses the HTTP protocol to transfer data. In its simplest form, such a program receives an HTTP request for a specific resource over the network, finds the corresponding file on the local hard drive and sends it over the network to the requesting computer. More complex web servers are capable of generating resources dynamically in response to an HTTP request. To identify resources (often files or parts of them) on the World Wide Web, Uniform Resource Identifiers (URIs) are used. Uniform Resource Locators (URLs) are used to locate resources on the web. URL locators combine the URI identification technology with the Domain Name System (DNS): the domain name (or the address directly in numeric notation) is the part of the URL that designates the computer (more precisely, one of its network interfaces) running the code of the desired web server.
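    A minimal sketch of how a URL combines these pieces, using Python's standard urllib.parse module; the URL itself is only an illustrative placeholder.

    from urllib.parse import urlparse

    parts = urlparse("http://www.example.com/path/page.html")
    print(parts.scheme)  # 'http' - the data transfer protocol
    print(parts.netloc)  # 'www.example.com' - the domain name, resolved via DNS
    print(parts.path)    # '/path/page.html' - the resource on that server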

    To view the information received from a web server, a special program - a web browser - is used on the client computer. The main function of a web browser is to display hypertext. The World Wide Web is inextricably linked with the concepts of hypertext and hyperlinks. Most of the information on the Internet is hypertext. To facilitate the creation, storage and display of hypertext on the World Wide Web, HTML (HyperText Markup Language), a hypertext markup language, is traditionally used. The work of marking up hypertext is called layout; the markup specialist is called a webmaster. After HTML markup, the resulting hypertext is placed in a file; such an HTML file is the main resource of the World Wide Web. Once an HTML file is made available to a web server, it is called a "web page". A collection of web pages makes up a website. Hyperlinks are added to the hypertext of web pages. Hyperlinks help World Wide Web users move easily between resources (files), regardless of whether the resources are located on the local computer or on a remote server. Web hyperlinks are based on URL technology.

    World Wide Web Technologies

    To improve the visual perception of the web, CSS technology has become widely used; it allows uniform design styles to be set for many web pages. Another innovation worth noting is the URN (Uniform Resource Name) system for naming resources.

    A popular concept for the development of the World Wide Web is the creation of the Semantic Web. The Semantic Web is an add-on to the existing World Wide Web that is designed to make information posted on the network more understandable to computers. It is a concept of a network in which every resource in human language would be provided with a description that a computer can understand. The Semantic Web opens access to clearly structured information for any application, regardless of platform and programming language. Programs will be able to find the necessary resources themselves, process information, classify data, identify logical connections, draw conclusions and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create a computer-readable description of a resource, the Semantic Web uses the RDF (Resource Description Framework) format, which is based on XML syntax and uses URIs to identify resources. New in this area are RDFS (RDF Schema) and SPARQL (Protocol and RDF Query Language, pronounced "sparkle"), a new query language for quick access to RDF data.
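    As a small illustration of such a computer-readable resource description, here is a sketch that assumes the third-party rdflib package (not part of the standard library); the URIs are placeholders.

    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import FOAF, RDF

    g = Graph()
    page = URIRef("http://www.example.com/page.html")
    g.add((page, RDF.type, FOAF.Document))                # "this resource is a document"
    g.add((page, FOAF.topic, Literal("World Wide Web")))  # "its topic is the WWW"

    # A SPARQL query over the description: list every resource and its topic.
    q = "SELECT ?s ?t WHERE { ?s <http://xmlns.com/foaf/0.1/topic> ?t }"
    for row in g.query(q):
        print(row.s, row.t)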

    History of the World Wide Web

    Tim Berners-Lee and, to a lesser extent, Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980 he worked at the European Organization for Nuclear Research (French: Conseil Européen pour la Recherche Nucléaire, CERN) as a software consultant. It was there, in Geneva, Switzerland, that he wrote for his own needs the program Enquire (loosely translatable as "Interrogator"), which used random associations to store data and laid the conceptual foundation for the World Wide Web.

    The world's first website was put online by Berners-Lee on August 6, 1991, on the first web server, available at http://info.cern.ch/. The resource defined the concept of the World Wide Web and contained instructions for setting up a web server, using a browser, and so on. This site was also the world's first Internet directory, because Tim Berners-Lee later posted and maintained there a list of links to other sites.

    The first photograph on the World Wide Web was of the parody filk band Les Horribles Cernettes. Tim Berners-Lee asked the band's leader for scanned photographs of them after the CERN Hardronic Festival.

    Still, the theoretical foundations of the web were laid much earlier than Berners-Lee's work. Back in 1945, Vannevar Bush developed the concept of the Memex: an auxiliary mechanical means of "expanding human memory". The Memex is a device in which a person stores all his books and records (and, ideally, all of his knowledge that can be formally described) and which supplies the needed information with sufficient speed and flexibility. It is an extension and supplement to human memory. Bush also predicted the comprehensive indexing of text and multimedia resources with the ability to quickly search them for the necessary information. The next significant step towards the World Wide Web was the creation of hypertext (a term coined by Ted Nelson in 1965).

    Two directions in the development of the World Wide Web are usually distinguished:

    • The Semantic Web involves improving the coherence and relevance of information on the World Wide Web through the introduction of new metadata formats.
    • The Social Web relies on the work of organizing the information available on the Web, carried out by Web users themselves. In this second direction, developments belonging to the Semantic Web are actively used as tools (RSS and other web feed formats, OPML, XHTML microformats). Partially semanticized sections of the Wikipedia category tree help users navigate the information space consciously; however, the very lax requirements placed on subcategories give no reason to hope that such areas will expand. In this regard, attempts at compiling knowledge atlases may be of interest.

    There is also the popular concept of Web 2.0, which brings together several directions in the development of the World Wide Web.

    Methods for actively displaying information on the World Wide Web

    Information on the web can be displayed either passively (that is, the user can only read it) or actively - in which case the user can add to and edit the information. Methods for actively displaying information on the World Wide Web include guest books, forums, blogs, wiki projects and content management systems.

    It should be noted that this division is very arbitrary. A blog or a guest book, say, can be considered a special case of a forum, which in turn is a special case of a content management system. Usually the difference lies in the purpose, approach and positioning of a particular product.

    Some information on websites can also be accessed through speech. In India, testing has already begun of a system that makes the text content of pages accessible even to people who cannot read or write.

    The World Wide Web is sometimes ironically called the Wild Wild Web, in reference to the title of the film Wild Wild West.


    Links

    • Official website of the World Wide Web Consortium (W3C)
    • Tim Berners-Lee, Mark Fischetti. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. - New York: HarperCollins Publishers. - 256 p. - ISBN 0-06-251587-X, ISBN 978-0-06-251587-2.