Posts Tagged ‘Fractals’

Last evening we watched an episode of Inspector Morse in which chaos theory was briefly discussed. A few years ago I got interested in this subject, which led inevitably to the wondrous world of fractals. For anyone who may not have heard the word “fractal”, it is the name for an unsmooth geometric object. Classical geometry describes a tabletop as a plane and ignores the roughness of its surface, while fractal geometry is concerned with the local irregularities. Fractals have self-similarity: each part of a fractal object is similar to the whole object – they are clones of themselves. The similarity can be exact or only approximate.
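To make self-similarity a little more concrete, here is a small Python sketch based on the Koch curve, a classic exactly self-similar fractal (my own illustration, not from the introduction linked below). Each iteration replaces every line segment with four copies at one-third scale, so the curve looks the same at every magnification while its length grows without bound:

```python
# The Koch curve: each iteration replaces every segment with 4 segments
# of 1/3 the length, so total length grows by a factor of 4/3 per step.
# This unbounded growth is one way the curve's "unsmoothness" shows up.
def koch_length(iterations, base_length=1.0):
    """Total length of the Koch curve after a given number of iterations."""
    return base_length * (4.0 / 3.0) ** iterations

for n in (0, 1, 2, 5, 10):
    print(f"iteration {n:2d}: length {koch_length(n):.4f}")
```

Because every part is a scaled copy of the whole, zooming in on any piece of the curve reproduces the same growth rule.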
I won’t go further into the fascinating theory of fractals. Those interested can find a good introduction here.
Edward Lorenz, an American mathematician and meteorologist, discovered in the 1960s, while considering the ability of computer models to predict the weather, that small variations in the values of the input parameters of certain mathematical models can have an enormous influence on the result. This appeared to be a property of models using non-linear equations, which are necessary to describe natural processes.
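Lorenz's observation is easy to reproduce with a toy model. The sketch below uses the logistic map, a standard textbook example of chaos (not Lorenz's own weather equations), and perturbs the starting value in the ninth decimal place:

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r*x*(1-x), with r = 4.0 in the fully chaotic regime.
def logistic_orbit(x0, steps, r=4.0):
    """Return the trajectory [x0, x1, ..., x_steps] of the logistic map."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2, 50)
b = logistic_orbit(0.2 + 1e-9, 50)  # perturb the 9th decimal place

# The two trajectories start indistinguishable, but after a few dozen
# iterations the tiny difference has been amplified to order one.
for i in (0, 10, 30, 50):
    print(f"step {i:2d}: difference {abs(a[i] - b[i]):.6f}")
```

The system is perfectly deterministic – the same starting value always gives the same trajectory – yet any error in the starting value, however small, eventually ruins the prediction.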
It must be said that Henri Poincaré, the French mathematician, physicist and philosopher, was the first to discover this form of model behavior in the 1880s, during his research into the mathematical equations describing the orbits of three gravitationally bound objects in space.
Study has shown that these non-linear systems are not random but chaotic. Though they appear random, they are actually deterministic systems governed by physical or mathematical laws (predictable in principle, if you have exact information) that are impossible to predict in practice beyond a certain point. As Wikipedia puts it: “…, a nonlinear system is any problem where the variable(s) to be solved for cannot be written as a linear combination of independent components.”
The uncertainty involved in the application of models to non-linear systems is the reason why weather forecasting centers use what are called “ensemble forecasts” for their daily long-term forecasts. This involves running the computer model a number of times using slightly different values of the input parameters each time and grouping the output results according to similarity. The members of the largest group can then be taken as providing the most reliable forecast. In doubtful cases the results of other computer models from other centers can also be taken into account.
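The ensemble idea can be sketched in a few lines of Python. This is only an illustration of the grouping procedure using the logistic map as a stand-in; the perturbation size, bin width and ensemble size are arbitrary choices of mine, not anything a real forecasting center uses:

```python
import random
from collections import Counter

# A toy "ensemble forecast": run the same non-linear model many times
# with slightly perturbed initial conditions, then group the outcomes.
def logistic_forecast(x0, steps=20, r=3.9):
    """Advance the logistic map 'steps' times and return the final state."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

random.seed(42)  # fixed seed so the sketch is reproducible
base = 0.3
ensemble = [logistic_forecast(base + random.uniform(-1e-3, 1e-3))
            for _ in range(50)]

# Group the 50 outcomes into coarse bins; the largest group is taken
# as the most reliable forecast, as described above.
bins = Counter(round(x, 1) for x in ensemble)
most_common_bin, count = bins.most_common(1)[0]
print(f"largest cluster near {most_common_bin} with {count} of 50 members")
```

The spread of the ensemble members also gives a rough measure of how much confidence the forecast deserves: a tight cluster suggests a stable situation, a wide scatter a chaotic one.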
The above also applies to other natural systems such as evolution. Even economic models are susceptible due to their dependence on natural processes.
There are those in high places who believe that Nature can be positively influenced by man, using his technology, to produce something superior. Indeed, it is well known that weather modification programs have been around for decades. Publicized results on, for example, the artificial production of rain, dissipation of clouds and reduction of hurricane intensity have been inconclusive.
Man is a part of Nature, not external to it, and when he applies his thought process through action to something in the “external” environment there is an immediate reaction which can never be entirely predicted, due to its complexity. Outcomes of earlier interactions can be a guide, but no more than that. My point is that the use of models to predict natural processes will always involve an element of uncertainty, and their practical application an element of risk to the environment and man himself.