
One would not even be able to understand the contingent properly without drawing upon the idealized model. More generally, understanding physical communication through the mathematical model provided a mental model to reconcile the idealistic and the mechanistic interpretations of experimental facts.

The possibility of reconciliation between mathematical clarity and empirical uncertainty is warranted ex ante, but it has to be realized through a historical process: the dynamic interaction between uncertainty and unclarity. The cosmology warrants order within each of the substances and between them, so that what seemed at first to be different can be reconciled.

The harmonic solution at the cosmological level warrants reconciliation at the metaphysical one: nature is revealed to us by God's grace, and therefore we are able to reconcile our mathematical image with physical reality. While there is initially a gap between the complexity of the contingencies and the idealization in the model system, the two dimensions of mathematical clarity and empirical uncertainty can be brought to interact, and we are warranted in achieving scientific understanding.

Such interaction, however, requires a process in the time dimension.


Taking time as a transcendental given, Newton and Huygens could then formulate the two central positions on how to achieve more clarity and certainty by scientific investigation. On the one hand, Newton tended towards the empiricist position when he formulated his well-known "hypotheses non fingo". As soon as there are more than two systems to synchronize, the interaction can in principle be decomposed in more than one way, and therefore the transcendental relation may itself become uncertain.

In the absence of a single metaphysical guarantee for pre-established harmony and cosmos, asynchronicity and chaos will prevail. I showed above in terms of Huygens' critique that the question of how clarity can be related to uncertainty was raised in the 17th century, but was then answered in a specific way in order to secure the progress of physics.

I shall argue in the second part of this study that one can nowadays specify the conditions under which clarity can be generated in relations among systems which contain and process uncertainties. Indeed, this question has been raised anew in the philosophy of science, in the social sciences, and most pronouncedly in the reflexive sociology of science.


Can anything more than informed opinion be formed in sociological theorizing? Does this imply that one can ultimately achieve only uncertainty? As noted above, "uncertainty" may substantively mean something different in various dimensions. Therefore, we need a definition which leaves room for the variance in the substantive meaning of uncertainty, i.e., a definition without reference to any specific system.

A definition without reference to a system has to be content-free, i.e., purely formal. In 1948, Shannon provided us with such a definition of "uncertainty" as part of the mathematical theory of communication [44]. Shannon defined "information" as the uncertainty contained in a finite sequence of signals or, more generally, in a distribution.
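Shannon's definition can be made concrete in a few lines. The sketch below computes the uncertainty of a probability distribution, H = −Σ p·log₂(p), in bits; the distributions are illustrative examples, not taken from the text:

```python
import math

def shannon_entropy(distribution):
    """Shannon's uncertainty H = -sum(p * log2(p)) of a probability
    distribution, measured in bits. Zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in distribution if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin contains less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

Note that the function is content-free in exactly Shannon's sense: the probabilities may describe signals, molecules, or social events; the measure is indifferent to the substance.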

Whether one should call this quantity "information" has been heavily debated. More important than these semantic problems, however, was Shannon's equation of the concept with probabilistic entropy [16]. In contrast to thermodynamic entropy, probabilistic uncertainty is defined mathematically yet content-free, i.e., without reference to a specific system. Thermodynamic entropy is a measure of disorder among molecules, and it can also be used to describe the direction of time in evolutionary processes. In the social sciences, however, one is usually not interested in the non-equilibrium thermodynamics of a physico-chemical system, but in the development of uncertainty, disorder, and complexity in social systems.

Thus, the uncertainty refers to a different substance, and it can be reflected only by a different theory of communication. But how can substances communicate if there is no pre-established harmony and synchronicity?

### The Probabilistic Interpretation of Communication

The envisaged generalization of concepts like "entropy" and "communication" to the dynamics of systems other than the physico-chemical one requires further reflection on the assumptions contained in the mathematization of physics. As noted, the concept of communication is much older than the thermodynamic concept of entropy [3] or its probabilistic interpretation in the mathematical theory of communication [44]. Descartes and Huygens, for example, had to assume that "motion" (momentum and energy) is communicated in a collision in order to be conserved, and thus they discussed this conservation in terms of the "laws of communication of motion."

I shall now use the example of the collision in a classical system to infer the probabilistic concept of communication from this older notion of communication as a special case. In a system of colliding balls, momentum and energy have to be conserved, and thus have to be communicated upon collision. As we know nowadays, the efficiency of the communication of momenta in a physical realization depends on the amount of free energy which dissipates as thermodynamic entropy.
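The conservation constraint in the ideal (dissipation-free) case can be sketched numerically. The standard one-dimensional elastic-collision formulas below are a textbook illustration, not the author's own formalism; the masses and velocities are arbitrary example values:

```python
def elastic_collision(m1, v1, m2, v2):
    """1-D elastic collision: returns post-collision velocities that
    conserve both total momentum and total kinetic energy."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

m1, v1, m2, v2 = 1.0, 2.0, 1.0, 0.0  # equal masses: velocities are exchanged
u1, u2 = elastic_collision(m1, v1, m2, v2)
print(u1, u2)  # 0.0 2.0

# Momentum is fully "communicated" in the ideal case: nothing is lost.
assert m1 * v1 + m2 * v2 == m1 * u1 + m2 * u2
```

In a real billiard collision some free energy dissipates as heat, so the communicated momenta deviate from these ideal values; that deviation is the thermodynamic entropy production discussed in the next paragraph.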

The ideal communication of momenta and kinetic energies of the colliding balls is thus damped by this dissipation. When the physical realization approximates the ideal case, the thermodynamic entropy vanishes, but the redistribution of momenta and energies at the macro-level becomes more pronounced, since there is less dissipation. Correspondingly, the message that the collision has taken place contains a larger amount of information.

Thus, the two types of entropy can vary independently: the one may increase and the other vanish in the same event. The reason for this independence is that the systems of reference for the two entropies are different: thermodynamic entropy refers exclusively to the distribution of, for example, momenta and positions among molecules, while the reference system for probabilistic entropy in this case is the system which conserves macroscopic momenta and energy.

Thermodynamic entropy is generated only in the special case where the communication has the physico-chemical system as its substantive reference. Shannon's probabilistic definition of entropy enables us to develop a content-free definition of communication systems which operate by processing distributions. In the example above, the macroscopic energy system communicated in terms of the kinetic energies of billiard-type balls, the momentum system in terms of momenta. Social communication systems communicate in terms of the various means of social communication.


In these cases the probabilistic entropy is defined with reference to systems other than the physico-chemical one. The translation of contingent uncertainty into mathematical clarity by Descartes has been generalized by Shannon to the understanding of a contingency as a probability distribution.


Like the uncertainty in the act of doubt, the mathematical awareness of a probabilistic event cannot be given a substantive meaning internally by this theoretical system; it needs an external reference. However, the external reference again need not be physical existence. In systems other than the physical one, other quantities than "motion" may have to be conserved, and therefore communicated. For example, in classical chemistry a mass balance for each element involved in the reaction is assumed. In this case, the atoms of the elements are redistributed. One can express the communication of any redistributed quantities as a message which contains information, and thus in terms of probabilistic entropy.

The systems and subsystems [Note 19] are different with respect to the quality of what is being communicated, not with respect to the generation of probabilistic entropy. If the system under study generates probabilistic entropy with respect to two communications, the message of an event accordingly contains a two-dimensional uncertainty.

In general, the number of dimensions of the information in the message that the event happened is equal to the number of systems of reference for the information. Each system of reference adds another quality to the uncertainty, and therefore another dimension to the communication. Thus we arrive at a general formulation of the problem noted by Huygens that the dimensionality of the uncertainty has to be specified.

When Huygens referred to mathematical space and physical extension, he hypothesized two dimensions, i.e., two systems of reference. If, for example, in a chemical reaction three qualitatively different elements have to be balanced in terms of their respective total mass, the message of this event will analogously contain a three-dimensional uncertainty. Information is never free-floating, but is necessarily itself processed within a contingent communication system.
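The claim that each system of reference adds a dimension to the uncertainty can be sketched as follows. The three "elements" and their distributions are hypothetical placeholders; the point is only that, for independent dimensions, the uncertainties measured in bits add:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical example: three elements to be balanced in a reaction,
# each redistribution described by its own probability distribution.
dims = {
    "element_A": [0.5, 0.5],
    "element_B": [0.25, 0.75],
    "element_C": [0.1, 0.9],
}

# If the dimensions are independent, the joint uncertainty of the
# message is the sum of the uncertainties per system of reference.
total = sum(entropy(p) for p in dims.values())
print(len(dims), "dimensions, total uncertainty:", round(total, 3), "bits")
```

Each additional system of reference thus contributes its own (possibly zero) share of uncertainty to the message that the event happened.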

The communication systems are delineated in terms of what they communicate. Whatever they communicate is redistributed in the communication, and this redistribution is itself a message which is sent to all the communication systems with which this system can communicate externally. In a single communication, i.e., one operation, the sending system can send a message only by operating, and thus by redistributing its own information content. Analogously, the receiving systems can only receive the message by operating, and thus by redistributing their own information contents.

Cycles of communication are thus generated. The complexity increases rapidly with the number of communicating systems.

## The Arrow of Time

What are the conditions under which communication systems can also organize their chaos, either among one another or internally? In other words: what are the conditions under which networks can retain and organize information? As noted, some systems are conservative, i.e., they conserve what they communicate. In general, the number of elements n which a system contains sets a limit to the information which the system can hold. One may also express this as the maximal entropy, viz. log(n). As noted above, the number of elements in systems can be multiplied by adding other systems of reference to the communication, and thus by increasing the number of dimensions in the information (n × m).
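The capacity bound and its multiplication by added dimensions can be checked directly. This is a standard property of the logarithm, not specific to the author's framework; the element counts n and m are arbitrary example values:

```python
import math

def max_entropy(n):
    """Maximal entropy of a system with n equally probable elements,
    in bits: log2(n)."""
    return math.log2(n)

n, m = 8, 4  # hypothetical element counts for two systems of reference

# Adding a second system of reference multiplies the number of possible
# states (n * m), so the maximal entropies add in bits:
assert math.isclose(max_entropy(n * m), max_entropy(n) + max_entropy(m))
print(max_entropy(n), max_entropy(m), max_entropy(n * m))  # 3.0 2.0 5.0
```

In other words, each added dimension raises the ceiling on the information the combined system can hold, without any single subsystem having to grow.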

Furthermore, open systems like social communication systems can be defined only in terms of the communication, and consequently these systems have uncertain boundaries. Each additional node n of the network adds (n − 1) possible links. In general, when the number of elements increases more rapidly than the information content of the system, the redundancy, which can be defined as the complement of the information content, also increases.