How do we model the transient or ephemeral Web? Billions of Web pages are generated dynamically; they exist only for the duration of a particular query or transaction. How do we model this graph that lies beneath the graph that is the Web?
How are Bayesian or other uncertainty representations best used within the Web?
What is the topological structure of the Web? Can connections always be established between its various parts, or do particular dynamic and time-dependent conditions create disconnected sub-regions within it?
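As a concrete illustration of the connectivity question, the sketch below applies the networkx library to a hypothetical toy link graph (not real crawl data): it splits a directed snapshot into regions with no paths between them at all, and, within a region, into groups of pages that can reach one another by following links. Whether the real, time-varying Web decomposes this way is exactly the open question.

```python
import networkx as nx

# Toy directed snapshot standing in for a small Web crawl (illustrative only).
edges = [
    ("a", "b"), ("b", "c"), ("c", "a"),   # a mutually reachable core
    ("c", "d"), ("d", "e"),               # pages reachable from the core
    ("f", "a"),                           # a page that links in but is never linked to
    ("x", "y"),                           # a component disconnected from everything else
]
G = nx.DiGraph(edges)

# Weakly connected components: parts of the graph with no path between them at all.
print("disconnected regions:",
      [sorted(c) for c in nx.weakly_connected_components(G)])

# Strongly connected components: within one region, which pages can reach each
# other in both directions, i.e. the core versus its periphery.
print("mutually reachable cores:",
      [sorted(c) for c in nx.strongly_connected_components(G)])
```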
A particular query about a given subject may organize Web pages, existing or virtual, according to how close they are to the given search criteria. This changes the virtual "shape" of the Web as observed by that user. Given the huge number of searches performed simultaneously, the Web will, at any given moment, present a different structure to different users. It is a mathematical challenge to develop tools to describe this structure.
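The following minimal sketch shows, on a hypothetical three-page corpus, how a query induces its own ordering of pages by closeness; the bag-of-words cosine measure is only an assumed stand-in for "how close they are," not a claim about how search engines actually score pages.

```python
import math
from collections import Counter

# Hypothetical mini-corpus standing in for Web pages (illustrative only).
pages = {
    "p1": "graph theory and the structure of the web",
    "p2": "bayesian methods for uncertainty on the web",
    "p3": "dynamic pages generated for a single transaction",
}

def vector(text):
    """Bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def query_view(query):
    """Order pages by closeness to the query: each query induces its own 'shape'."""
    q = vector(query)
    return sorted(pages, key=lambda p: cosine(q, vector(pages[p])), reverse=True)

# Two different queries see the same pages in two different arrangements.
print(query_view("web graph structure"))
print(query_view("uncertainty and bayesian reasoning"))
```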
How do we measure the level of complexity of the Web? For a graph, one approach is to find the linear space of lowest dimension in which the graph, viewed as a metric space, fits as a metric subspace. Such embedding techniques are studied both in pure mathematics and in computer science.
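One way to make the embedding idea concrete, under the assumption that the relevant metric is the shortest-path (hop-count) metric and the target space is Euclidean: compute the graph's distance matrix and apply classical multidimensional scaling. The number of significantly positive eigenvalues of the double-centered squared-distance matrix suggests how many Euclidean dimensions the metric needs; negative eigenvalues signal that the graph metric is not exactly Euclidean. This is a sketch on a toy graph, not a method for the Web itself.

```python
import numpy as np
import networkx as nx

# Toy graph standing in for a small Web snapshot (illustrative only).
G = nx.cycle_graph(6)

# Shortest-path (hop-count) metric as a distance matrix.
nodes = list(G.nodes())
n = len(nodes)
sp = dict(nx.all_pairs_shortest_path_length(G))
D = np.zeros((n, n))
for i, u in enumerate(nodes):
    for j, v in enumerate(nodes):
        D[i, j] = sp[u][v]

# Classical MDS: double-center the squared distances and inspect the spectrum.
# The count of clearly positive eigenvalues is a proxy for the number of
# Euclidean dimensions in which the graph metric can (approximately) live.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
eigvals = np.sort(np.linalg.eigvalsh(B))[::-1]
effective_dim = int(np.sum(eigvals > 1e-9 * eigvals.max()))
print("eigenvalues:", np.round(eigvals, 3))
print("effective Euclidean dimension:", effective_dim)
```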