Entropy and Information Transmission in Causation and Retrocausation
Abstract
Although experimental evidence for retrocausation exists, there are clearly subtleties to the phenomenon. The bilking paradox, in which one intervenes to eliminate a subsequent cause after its preceding effect has occurred, appears on the surface to show that retrocausation is logically impossible. In a previous paper, the second law of thermodynamics was invoked to show that the entropy in each process of a psi interaction (presentiment, telepathy, remote perception, and psychokinesis) cannot decrease, prohibiting psi processes in which signals condense from background fluctuations. Here it is shown, perhaps contrary to intuition, that reversible processes cannot be influenced through retrocausation, but irreversible processes can. The increase in thermodynamic entropy in irreversible processes, which are generally described by Newtonian mechanics but not by Lagrangian dynamics and Hamilton's principle, is required for causation. Thermodynamically reversible processes cannot be causal, and hence also cannot be retrocausal. The role of entropy in psi interactions is extended by using the bilking paradox to consider information transmission in retroactive psychokinesis (PK). A PK efficiency, η_PK, is defined. A prediction of the analysis is that η_PK ≤ H/H₀, where H is the information uncertainty, or entropy, in the retro-PK agent's knowledge of the event that is to be influenced retrocausally. The information entropy can provide the necessary ingredient for irreversibility, and hence for retrocausation. Noise and bandwidth limitations in communicating the outcome of the event to the agent increase the maximum PK efficiency. Avoidance of the bilking paradox does not bar a subject from using the premonition of an event to prevent it from occurring. The necessity for large information entropy, which is the expected value of the surprisal, is likely to be essential for any successful PK process, not just retro-PK processes. Hence uncertainty in the communication process appears to be a necessary component of retrocausation in particular, and of PK in general.
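The quantity H invoked above is the standard Shannon information entropy, i.e. the expected value of the surprisal of the outcomes as seen from the agent's state of knowledge. As a minimal illustration of that definition only (not of the paper's own calculation), the following sketch computes H in bits for a discrete outcome distribution; the function names are chosen here for clarity and are not from the source:

```python
import math

def surprisal(p: float) -> float:
    """Surprisal (self-information) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Shannon entropy H = expected surprisal = -sum_i p_i * log2(p_i)."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair binary event is maximally uncertain to the agent: H = 1 bit.
print(entropy([0.5, 0.5]))    # → 1.0
# A nearly certain outcome carries little entropy, so on the abstract's
# account it leaves little room for retro-PK influence.
print(entropy([0.99, 0.01]))
```

On this reading, noise and bandwidth limits in reporting the outcome to the agent keep the distribution `probs` broad, which keeps H, and with it the maximum PK efficiency, large.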