Natural and man-made crises have become distressingly familiar features of the globalised world and of God’s creation. Often these crises are complex in nature, global in scale, and the result of multiple, unanticipated, inter-related events with both spatial and temporal dimensions. Scientists are strongly motivated to understand and anticipate such crises in order to mitigate their effects and recover from their consequences. To confront them, governments rely on several tools, the most important of which are science and technology.
Science aims to detect the latent features of created things and to provide the ground for employing their capabilities. Our moral obligation is to generate possibilities, to explore infinitely many complex and multi-dimensional paths, to engage with them, and, ultimately, to reason about the future of human life and interaction on the basis of the best existing knowledge. It will take every possible species of intelligence for the universe to understand itself. Science, in this sense, is holy; it is a divine journey.
Meanwhile, the term numerical analysis has become too restrictive, failing to capture the intellectual breadth required of modern computational scientists who must anticipate the future from existing knowledge. Although numerical analysis remains a useful label for an established body of knowledge, the consensus among scientists is that the age in which a researcher could spend a career within the narrow confines of that body of knowledge – if it ever existed – has long passed. To move the field forward and to advance in algorithms, analysis and modeling, we must deepen our understanding of both pure mathematics and applications.
Numerical analysis has grown considerably over the past forty years, as one may measure by the impact factors and page counts of the relevant journals. Whatever the merits of such metrics, the field’s utility is better gauged by the extent to which our algorithms and software are applied, even when the eventual users are unaware of the ideas upon which they rely. This is real impact.
Contemporary problems, particularly in civil engineering, incorporate increasingly sophisticated physical models, and even these are often only objective functions for a design optimization that is the true goal of the computation. The models developed by engineers and scientists grow ever more demanding, while current methods appear inadequate for them. Conceptually, the discipline can only prosper if mathematics is understood and used correctly in numerical models. In addition, the role that uncertainty quantification now plays in computational science is receiving increased attention. It is no longer sufficient to compute a "best guess" for a solution; the result must be qualified by how confident the researcher is in that best guess. In the coming years, we will see growing use of numerical algorithms in weak and strong forms on discrete models, networks, Monte Carlo simulation and databases. These conditions may encourage the integration of real-world problems with the "mathematical sciences" as a whole, and a relaxation of the borders that separate the field from statistics and operations research.
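The shift from a "best guess" to a quantified guess can be illustrated with a minimal Monte Carlo sketch. The model, distributions and parameter values below are hypothetical, chosen only for illustration: a beam deflection computed as load divided by stiffness, with both inputs assumed normally distributed.

```python
import random
import statistics

def simulate_deflection(n_samples=10_000, seed=42):
    """Monte Carlo sketch: propagate input uncertainty through a model.

    Hypothetical model: deflection = load / stiffness, with the load and
    stiffness drawn from assumed normal distributions.
    """
    rng = random.Random(seed)
    deflections = []
    for _ in range(n_samples):
        load = rng.gauss(100.0, 10.0)     # kN; assumed mean and spread
        stiffness = rng.gauss(50.0, 2.0)  # kN/mm; assumed
        deflections.append(load / stiffness)
    mean = statistics.fmean(deflections)
    std = statistics.stdev(deflections)
    return mean, std

mean, std = simulate_deflection()
# Report the best guess together with a confidence statement about it.
print(f"best guess: {mean:.3f} mm, approx. 95% interval: "
      f"[{mean - 1.96 * std:.3f}, {mean + 1.96 * std:.3f}] mm")
```

The point is the reporting discipline rather than the model: the same sampling loop wraps around any deterministic solver, turning a single answer into an answer with a stated spread.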
In addition to changes in mathematical tools and applications, numerical analysts must anticipate the significant changes in computer architecture on the horizon and be ready with appropriate algorithms to match. The real gains come from algorithms rather than from faster processors.
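The claim that gains come from algorithms rather than processors can be made concrete with a classic, minimal sketch: evaluating the same polynomial naively versus with Horner's rule. Counting multiplications shows a hardware-independent saving (quadratic versus linear in the degree).

```python
def eval_naive(coeffs, x):
    """Evaluate sum(c_k * x**k) by repeated multiplication: O(n^2) multiplies."""
    total, mults = 0.0, 0
    for k, c in enumerate(coeffs):
        term = c
        for _ in range(k):   # build x**k one factor at a time
            term *= x
            mults += 1
        total += term
    return total, mults

def eval_horner(coeffs, x):
    """Horner's rule: the same polynomial in n multiplies."""
    total, mults = 0.0, 0
    for c in reversed(coeffs):
        total = total * x + c
        mults += 1
    return total, mults

coeffs = [1.0, 2.0, 3.0, 4.0]   # 1 + 2x + 3x^2 + 4x^3
v_naive, m_naive = eval_naive(coeffs, 2.0)
v_horner, m_horner = eval_horner(coeffs, 2.0)
print(f"naive: {v_naive} in {m_naive} multiplies; "
      f"Horner: {v_horner} in {m_horner} multiplies")
```

No change of processor shrinks the naive operation count; the improvement lives entirely in the algorithm.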
The development of numerical methods, followed by science, will continue to surprise us with what it discovers and creates; then it will astound us again by devising novel methods. At the core of science’s self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. Whereas the achievement of science is to know new things, the evolution of science is to know them in new ways. What evolves is less the body of what we know than the nature of our knowing.
The two branches, science and numerical analysis, are foundations of our culture and society. While civilizations appear and disappear, science grows steadily onward. In other words, recursion is the essence of science: scientific papers cite each other, and that process of research pointing at itself invokes higher levels, forming the emergent shape of citation space. Recursion always does that. It is the engine of scientific progress and, thus, of the progress of society.
A particularly productive way to look at the history of science is to study how science itself has changed over time, with an eye to what that trajectory suggests about the future.
Projecting forward, five predictions can be made about the next 100 years in science:
1) There will be more change in the next 50 years of science than in the last 400 years.
2) This will be a century of biology. It is the domain with the most scientists, the most new results, the most economic value, the most ethical importance and the most to learn.
3) Computers will continue to lead to new ways of doing science. Information is growing by 66% per year while physical production grows by only 7% per year. Data volumes are growing to such "zillionic" levels that we can expect science to compile vast combinatorial libraries, to run combinatorial sweeps through possibility space (as Stephen Wolfram has done with cellular automata), and to run multiple competing hypotheses in a matrix. Deep real-time simulations and hypothesis search will drive data collection in the real world.
4) New ways of knowing will emerge. "Wikiscience" is leading to perpetually refined papers with a thousand authors. Distributed instrumentation and experiment, thanks to minuscule transaction costs, will yield smart-mob, hive-mind science operating "fast, cheap and out of control." Negative results will have positive value. Triple-blind experiments will emerge through massive non-invasive statistical data collection – no one, neither the subjects nor the experimenters, will realize an experiment was going on until later. We may expect zero-author papers, generated wholly by computers.
5) Science will create new levels of meaning. The Internet is already made of one quintillion transistors, a trillion links, a million emails per second, 20 exabytes of memory. It is approaching the scale of the human brain and is doubling every year, while the brain is not. It is all becoming, effectively, one machine. And we are the machine.
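The combinatorial sweep mentioned in point 3 can be sketched in a few lines. The example below, a hypothetical illustration rather than Wolfram's actual experiments, enumerates all 256 elementary cellular-automaton rules and records which ones drive a single live cell to extinction; the "possibility space" here is simply the rule space.

```python
def step(cells, rule):
    """One synchronous update of an elementary CA (Wolfram rule numbering)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def sweep(width=31, steps=15):
    """Sweep all 256 elementary rules from a single live cell.

    Returns the rules whose evolution reaches the all-zero state,
    a crude first classification of the rule space.
    """
    initial = [0] * width
    initial[width // 2] = 1
    dead = []
    for rule in range(256):
        cells = initial
        for _ in range(steps):
            cells = step(cells, rule)
        if not any(cells):
            dead.append(rule)
    return dead

dead_rules = sweep()
print(f"{len(dead_rules)} of 256 rules reach the all-zero state")
```

The same pattern, enumerate a discrete design space and classify each point by simulation, scales from this toy to the library-sized sweeps the prediction envisions.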
Technology is, in its essence, a new way of thinking. The most powerful kind of technology, sometimes called enabling technology, is thought incarnate: it enables new knowledge to find and develop new ways to know. This recursive bootstrapping is how science evolves. Like every kind of knowledge, it accrues layers of self-reference to its former state.
Too often, we underwhelm numerical users with first-order approximations to derivatives, with Euler's method and Simpson's rule, thus training scientists who fail to appreciate the elegance and remarkable accuracy that typify our best solutions to well-behaved problems. Yet some of our users first encounter numerical methods through a particular application, only to be hooked by the computational questions. The future of numerical analysis will be determined by its future users.
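The accuracy gap between the first-order tools we teach and even slightly better ones is easy to demonstrate. A minimal sketch, using the standard forward and central difference formulas on a well-behaved function:

```python
import math

def forward_diff(f, x, h):
    """First-order forward difference: truncation error O(h)."""
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    """Second-order central difference: truncation error O(h^2)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-4
exact = math.cos(x)  # d/dx sin(x)
err_fwd = abs(forward_diff(math.sin, x, h) - exact)
err_cen = abs(central_diff(math.sin, x, h) - exact)
print(f"forward-difference error: {err_fwd:.2e}")
print(f"central-difference error: {err_cen:.2e}")
```

At the same cost of two function evaluations, the central formula is several orders of magnitude more accurate here, exactly the kind of elegance on well-behaved problems that first-order examples hide from students.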
K.N.Toosi University of Technology is one of the country’s sponsors of the development of policy-relevant scientific tools and has devoted part of its resources to developing innovative tools for communicating results to users, with special attention to the needs of non-experts who are subject to a variety of external “non-scientific” constraints. Communication and visualization tools that are accessible to the general public are especially important, since public consent is a prerequisite for successful policy development and implementation. Wherever pertinent, input from non-expert groups should be sought during the design and implementation of research programs whose outcomes may influence public policy.