
    Why mathematicians should learn more physics and physicists should learn more mathematics and all of us should learn more philosophy

    Under Review Since : 2019-08-04

    The idea of the paper is to think about the result presented in the Numberphile (http://www.numberphile.com/) talk (https://www.youtube.com/watch?v=w-I6XTVZXww) where they claim that 1 + 2 + 3 + ..., the Gauss sum, converges to −1/12. In the video they make two strong statements: first, that Grandi’s Series 1 − 1 + 1 − 1 + 1 − 1 + ... tends to 1/2, and second, that as bizarre as the −1/12 result for the Gauss sum might appear, since it is connected to Physics (the result is related to the number of dimensions in String Theory), it is plausible. In this work we argue that these two statements reflect adhesion to a particular probability narrative and to a particular scientific philosophical posture. We argue that, by doing so, these results (for the Gauss and Grandi series) and, ultimately, String Theory may be mathematically correct but are scientifically (in the Galileo-Newton-Einstein tradition) inconsistent, at least. The philosophical implications of this problem are also discussed, focusing on the role of evidence and scientific demarcation.
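    For readers who want the two results stated precisely, here is a brief, standard summary of the summability facts the video leans on (background only, not part of the paper's argument):

```latex
% Grandi's series: the partial sums alternate 1, 0, 1, 0, ..., so the Cesàro
% (arithmetic-mean) limit is 1/2, even though the ordinary limit does not exist:
S_N = \sum_{n=1}^{N} (-1)^{n+1}, \qquad \frac{1}{N}\sum_{k=1}^{N} S_k \longrightarrow \tfrac{1}{2}.

% The -1/12 value is the analytic continuation of the Riemann zeta function,
% \zeta(s) = \sum_{n \ge 1} n^{-s} (convergent only for \Re(s) > 1), evaluated at s = -1:
\zeta(-1) = -\tfrac{1}{12}
% i.e., a regularized value assigned to 1 + 2 + 3 + \cdots, not a limit of its partial sums.
```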

    An Upper Bound for Lebesgue’s Covering Problem

    Under Review Since : 2019-08-01

    A covering problem posed by Henri Lebesgue in 1914 seeks to find the convex shape of smallest area that contains a subset congruent to any point set of unit diameter in the Euclidean plane.  Methods used previously to construct such a covering can be refined and extended to provide an improved upper bound for the optimal area. An upper bound of 0.8440935944 is found.

    Model misspecification, Bayesian versus credibility estimation, and Gibbs posteriors

    Under Review Since : 2019-07-19

    In the context of predicting future claims, a fully Bayesian analysis---one that specifies a statistical model, prior distribution, and updates using Bayes's formula---is often viewed as the gold standard, while Buhlmann's credibility estimator serves as a simple approximation. But those desirable properties that give the Bayesian solution its elevated status depend critically on the posited model being correctly specified. Here we investigate the asymptotic behavior of Bayesian posterior distributions under a misspecified model, and our conclusion is that misspecification bias generally has damaging effects that can lead to inaccurate inference and prediction. The credibility estimator, on the other hand, is not sensitive at all to model misspecification, giving it an advantage over the Bayesian solution in those practically relevant cases where the model is uncertain. This raises the question: does robustness to model misspecification require that we abandon uncertainty quantification based on a posterior distribution? Our answer to this question is No, and we offer an alternative Gibbs posterior construction. Furthermore, we argue that this Gibbs perspective provides a new characterization of Buhlmann's credibility estimator.
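    For readers outside actuarial science, the textbook form of Buhlmann's credibility estimator referred to above is the linear shrinkage rule below (standard form only; the paper's notation and assumptions may differ):

```latex
\widehat{\mu}_{\mathrm{cred}} = Z\,\bar{X} + (1 - Z)\,\mu, \qquad
Z = \frac{n}{n + k}, \qquad k = \frac{\sigma^2}{\tau^2}
```

    where $\bar{X}$ is the individual's average claim over $n$ periods, $\mu$ is the collective (prior) mean, $\sigma^2$ is the expected within-individual (process) variance, and $\tau^2$ is the variance of the individual risk means across the collective.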

    Fast and scalable non-parametric Bayesian inference for Poisson point processes

    Under Review Since : 2019-06-24

    We study the problem of non-parametric Bayesian estimation of the intensity function of a Poisson point process. The observations are $n$ independent realisations of a Poisson point process on the interval $[0,T]$. We propose two related approaches. In both approaches we model the intensity function as piecewise constant on $N$ bins forming a partition of the interval $[0,T]$. In the first approach the coefficients of the intensity function are assigned independent Gamma priors, leading to a closed-form posterior distribution. On the theoretical side, we prove that, as $n\rightarrow\infty$, the posterior asymptotically concentrates around the ``true'', data-generating intensity function at the optimal rate for $h$-H\"older regular intensity functions ($0 < h\leq 1$).

    In the second approach we employ a Gamma Markov chain prior on the coefficients of the intensity function. The posterior distribution is no longer available in closed form, but inference can be performed using a straightforward version of the Gibbs sampler. Both approaches scale well with sample size, but the second is much less sensitive to the choice of $N$.

    Practical performance of our methods is first demonstrated via synthetic data examples.
    We compare our second method with other existing approaches on the UK coal mining disasters data. Furthermore, we apply it to the US mass shootings data and Donald Trump's Twitter data.
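    A minimal sketch of the closed-form update behind the first approach described above (illustrative only, with assumed notation; not the authors' code): with a piecewise-constant intensity on $N$ bins and independent Gamma priors, the Gamma-Poisson conjugacy gives each bin's posterior directly.

```python
import numpy as np

def gamma_posterior_intensity(events, n, T, N, alpha=1.0, beta=1.0):
    """Closed-form posterior for a piecewise-constant Poisson intensity.

    events : pooled event times from n independent realisations on [0, T]
    Returns shape/rate per bin: lambda_k | data ~ Gamma(alpha + c_k, beta + n * T / N).
    """
    edges = np.linspace(0.0, T, N + 1)
    counts, _ = np.histogram(events, bins=edges)   # c_k: pooled events falling in bin k
    shape = alpha + counts                         # conjugate Gamma update
    rate = beta + n * (T / N)                      # total exposure per bin across n realisations
    return shape, rate, shape / rate               # posterior mean intensity per bin

# toy usage: n = 50 realisations of a unit-rate Poisson process on [0, 10]
rng = np.random.default_rng(0)
n, T = 50, 10.0
events = rng.uniform(0, T, size=rng.poisson(n * T))   # pooled homogeneous rate-1 process
shape, rate, post_mean = gamma_posterior_intensity(events, n, T, N=20)
print(post_mean.round(2))   # should hover around 1.0
```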

    Pairwise Comparisons Using Ranks in the One-Way Model

    Under Review Since : 2019-05-27

    The Wilcoxon Rank Sum is a very competitive robust alternative to the two-sample t-test when the underlying data have tails longer than the normal distribution. Extending to the one-way model with k independent samples, the Kruskal-Wallis rank test is a competitive alternative to the usual F for testing if there are any location differences. However, these positives for rank methods do not extend as readily to methods for making all pairwise comparisons used to reveal where the differences in location may exist. We demonstrate via examples and simulation that rank methods can have a dramatic loss in power compared to the standard Tukey-Kramer method of normal linear models even for non-normal data.
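    A small simulation sketch of the kind of power comparison described above (illustrative settings of my own choosing, not the authors' design), contrasting Tukey-Kramer all-pairwise comparisons with Bonferroni-adjusted pairwise rank-sum tests on long-tailed data:

```python
import numpy as np
from scipy.stats import tukey_hsd, mannwhitneyu   # tukey_hsd requires SciPy >= 1.8

rng = np.random.default_rng(1)
k, n, shift, reps, alpha = 4, 20, 1.0, 500, 0.05
n_pairs = k * (k - 1) // 2
tukey_hits = rank_hits = 0

for _ in range(reps):
    # k groups with exponential (long-tailed) errors; only the last group is shifted
    groups = [rng.exponential(1.0, n) + (shift if g == k - 1 else 0.0) for g in range(k)]
    # Tukey-Kramer: any pairwise difference detected?
    res = tukey_hsd(*groups)
    tukey_hits += np.any(res.pvalue[np.triu_indices(k, 1)] < alpha)
    # Rank-based pairwise comparisons with a Bonferroni adjustment
    pvals = [mannwhitneyu(groups[i], groups[j]).pvalue
             for i in range(k) for j in range(i + 1, k)]
    rank_hits += np.any(np.array(pvals) < alpha / n_pairs)

print(f"Tukey-Kramer power: {tukey_hits / reps:.2f}")
print(f"Pairwise rank-sum (Bonferroni) power: {rank_hits / reps:.2f}")
```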

    Comment on D. Bernoulli (1738)

    Under Review Since : 2019-05-25

    Daniel Bernoulli’s study of 1738 [1] is considered the beginning of expected utility theory. Here I point out that in spite of this, it is formally inconsistent with today’s standard form of expected utility theory. Bernoulli’s criterion for participating in a lottery, as written in [1], is not the expected change in utility.

    VARIANCES: Life Unpercepted. Against Teleological Entrenchment in Evolution and Design Alike

    Under Review Since : 2019-04-20

    A philosophical version

     

    In this work of foresight, I communicated my perception of Taleb's policy paper and the Black Swan problem discussed in it. To this effect, I:

    1. Re-conceptualized the concept of “foresight,” non-teleologically, and its “method”;
    2. Revived Empedocles’ non-teleological philosophy of evolution with modern scientific data;
    3. Located the real GMO safety problem in (you guessed it) teleology: in the suppression of dissent within institutions under a seeming assumption of knowing what waste is.

    On GMO & Overflow: Reduced Generation Is .. Aging, Not Foresight

    Under Review Since : 2019-03-28

    The work argues that Nassim Taleb's precautionary principle should not apply to the domain of ‘GMOs’ any more than to other monopolizing economic domains, because the probability of systemic ruin stemming from the GM technology itself is dwarfed by other systemic risks of the Deductive-Optimization Economy of today.

    The author proposes, as a solution, reinventing our imagination within specialized bodies of expertise by replacing the socially constructed fear of losing face with a fear of missing out on an intellectual contribution. This may strengthen public trust in institutions and delay their demise.

    Increased generation at the agricultural level (with the GM technology) absolutely must be accompanied by an even greater idea and dissent generation among professionals charged with developing and sustaining this complex system. Life starts with generation, not precaution; limiting the options is a path to extinction. We must limit the fear of loss instead. We will be less unsafe, insofar as it is possible to be safe from oneself, as long as the pace of idea generation within professional bodies outstrips the pace of complexity introduction into our life support systems, such as in agriculture. 

    Validity-preservation properties of rules for combining inferential models

    Under Review Since : 2019-03-09

    An inferential model encodes the data analyst's degrees of belief about an unknown quantity of interest based on the observed data, posited statistical model, etc. Inferences drawn based on these degrees of belief should be reliable in a certain sense, so we require the inferential model to be valid. The construction of valid inferential models based on individual pieces of data is relatively straightforward, but how can these be combined so that the validity property is preserved? In this paper we analyze some common combination rules with respect to this question, and we conclude that the best strategy currently available is one that combines via a certain dimension reduction step before the inferential model construction.

    An evolutionary advantage of cooperation

    Under Review Since : 2019-03-05

    Cooperation is a persistent behavioral pattern of entities pooling and sharing resources. Its ubiquity in nature poses a conundrum: whenever two entities cooperate, one must willingly relinquish something of value to the other. Why is this apparent altruism favored in evolution? Classical treatments assume a priori a net fitness gain in a cooperative transaction which, through reciprocity or relatedness, finds its way back from recipient to donor. Our analysis makes no such assumption. It rests on the insight that evolutionary processes are typically multiplicative and noisy. Fluctuations have a net negative effect on the long-time growth rate of resources but no effect on the growth rate of their expectation value. This is a consequence of non-ergodicity. Pooling and sharing reduces the amplitude of fluctuations and, therefore, increases the long-time growth rate for cooperators. Put simply, cooperators' resources outgrow those of similar non-cooperators. This constitutes a fundamental and widely applicable mechanism for the evolution of cooperation. Furthermore, its minimal assumptions make it a candidate explanation in simple settings, where other explanations, such as emergent function and specialization, are implausible. An example of this is the transition from single cells to early multicellular life.
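    A minimal numerical sketch of the mechanism described above (a simple geometric-growth model with hypothetical parameters, not the authors' exact setup): two entities whose resources grow multiplicatively with noise, comparing long-time growth rates with and without pooling-and-sharing.

```python
import numpy as np

rng = np.random.default_rng(42)
T, dt = 10_000, 0.1
mu, sigma = 0.05, 0.6          # hypothetical drift and volatility of resource growth

def grow(x):
    # one step of noisy multiplicative growth, with independent noise per entity
    return x * np.exp((mu - 0.5 * sigma**2) * dt
                      + sigma * np.sqrt(dt) * rng.standard_normal(x.shape))

solo = np.ones(2)   # two entities that never share
coop = np.ones(2)   # two entities that pool and share their resources every step

for _ in range(T):
    solo = grow(solo)
    coop = grow(coop)
    coop[:] = coop.mean()      # pooling-and-sharing step damps fluctuations

total_time = T * dt
print("long-time growth rate, solo:", np.log(solo).mean() / total_time)
print("long-time growth rate, coop:", np.log(coop[0]) / total_time)
# The expectation-value growth rate is mu for everyone; the time-average rate is roughly
# mu - sigma^2/2 for solo entities and noticeably higher for the cooperating pair.
```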

    Prometheus Shaken Baby Debate

    Published Date : 2019-03-04

    Abstract

    These papers - one proposition paper and ten responses - comprise a debate on shaken baby syndrome. This is the hypothesis that a Triad of indicators in the head of a dead baby reveal that it has been shaken to death, and that the killer was the person last in charge of the baby. The debate was scheduled to have appeared in Prometheus, a journal concerned with innovation rather than matters medical. It struck the editors of Prometheus that a hypothesis that had survived nearly half a century and was still resistant to challenge and change was well within the tradition of Prometheus debate. The debate focuses on the role of the expert witness in court, and especially the experiences of Waney Squier, a prominent paediatric pathologist, struck from the medical register in the UK for offering opinions beyond her core expertise and showing insufficient respect for established thinking and its adherents. The debate’s responses reveal much about innovation, and most about the importance of context, in this case the incompatibility of medicine and the law, particularly when constrained by the procedures of the court. Context was also important in the reluctance of Taylor & Francis, the publisher of Prometheus, to publish the debate on the grounds that its authors strayed from their areas of expertise and showed insufficient respect for established thinking.

    Prometheus shaken baby debate

    Contents

    Introduction

    The shaken baby debate - Stuart Macdonald

    Proposition paper

    Shaken baby syndrome: causes and consequences of conformity - Waney Squier

    Response papers

    Shaken baby syndrome: a fraud on the courts - Heather Kirkwood

    Shaken baby: an evolving diagnosis deformed by the pressures of the courtroom - Susan Luttner

    Waney Squier’s ordeal and the crisis of the shaken baby paradigm - Niels Lynøe

    Another perspective - simply my brief thoughts - Dave Marshall

    Has Squier been treated fairly? - Brian Martin

    Commentary on the paper by Waney Squier: ‘Shaken baby syndrome: causes and consequences of conformity’ - Michael J Powers

    Waney Squier and the shaken baby syndrome case: a clarion call to science, medicine and justice - Toni C Saad

    The role of the General Medical Council - Terence Stephenson

    When experts disagree - Stephen J. Watkins

    The General Medical Council’s handling of complaints: the Waney Squier case - Peter Wilmshurst

    Empirical priors for prediction in sparse high-dimensional linear regression

    Under Review Since : 2019-03-03

    Often the primary goal of fitting a regression model is prediction, but the majority of work in recent years focuses on inference tasks, such as estimation and feature selection. In this paper we adopt the familiar sparse, high-dimensional linear regression model but focus on the task of prediction. In particular, we consider a new empirical Bayes framework that uses the data to appropriately center the prior distribution for the non-zero regression coefficients, and we investigate the method's theoretical and numerical performance in the context of prediction. We show that, in certain settings, the asymptotic posterior concentration in metrics relevant to prediction quality is very fast, and we establish a Bernstein--von Mises theorem which ensures that the derived prediction intervals achieve the target coverage probability. Numerical results complement the asymptotic theory, showing that, in addition to having strong finite-sample performance in terms of prediction accuracy and uncertainty quantification, the computation time is considerably faster compared to existing Bayesian methods.

    Expanding Universe Controversy Summarised by Dr Dennis Overbye: An Alternative Explanation

    Under Review Since : 2019-02-28

    This communication outlines the potential for a novel, alternative model to rationalise the quantitative difficulties with the Hubble constant. This model, the Proto-Quantum Field (PQF) model, is an alternative to the singularity-big-bang (SBB) model, and the two are mutually incompatible. A justification is that the theoretical developments required to validate the PQF hypothesis are more closely derived from the standard model of particles and forces than are those required to modify the SBB hypothesis.

    A New Hypothesis for the Universe: The Role of Thermal Gradients in Setting Time, Forming Three Dimensional Space and Populating with Matter and Energy.

    Published Date : 2019-02-15

    The universe is formed from proto-quantum field(s) (PQFs). The initiating event is the formation of a thermal gradient which establishes synchronous oscillations that describe time. Time is not quantised. Concomitantly, PQFs, either directly or indirectly, differentiate into all the quantum fields required for the standard model of particles and forces, and into three-dimensional space. The transition of PQFs to functional quantum fields is a continuous process at the boundary of a spherical universe, a “ring of fire”, necessary to maintain time.

    Matching Supernova Redshifts with Special Relativity and no Dark Energy

    Under Review Since : 2019-02-11

    This analysis shows that a special relativity interpretation matches observed type 1a supernova redshifts. Davis & Lineweaver reported in 2003 that a special relativity match to supernova redshift observations can be ruled out at more than 23σ, but MacLeod’s 2004 conclusion that this finding was incorrect and due to a mathematical error is confirmed. MacLeod’s plot of special relativity against observation has been further improved by using celerity (aka proper velocity) instead of peculiar velocity. A Hubble plot of type 1a supernova celerity against retarded distance is a straight line with slope 70 km s^-1 Mpc^-1 for as far back in time as we can observe, indicating that, with a special relativity interpretation of cosmological redshift, the expansion of the universe is neither accelerating nor decelerating, and it is not necessary to invoke the existence of dark energy.
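    For reference, the standard special-relativity relations behind the quantities named above (textbook forms, not the paper's derivation) are the relativistic Doppler redshift for a receding source and the celerity (proper velocity):

```latex
1 + z = \sqrt{\frac{1 + \beta}{1 - \beta}}, \qquad \beta = \frac{v}{c}, \qquad
w \equiv \gamma v = \frac{v}{\sqrt{1 - \beta^2}}
```

    so a recession velocity inferred from an observed redshift can be converted to celerity before plotting against distance.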

    Time is Ticking and Tocking

    Under Review Since : 2019-02-05

    The words used to describe time don’t clearly distinguish when time is a dimension, used for locating objects and events in spacetime, and when it is a property of objects that can age at different rates. It is proposed to revise the terminology of time to distinguish between ‘tick-time’ (a dimensional property) and ‘tock-time’ (a property of objects related to energy).

    False confidence, non-additive beliefs, and valid statistical inference

    Under Review Since : 2019-02-03

    Statistics has made tremendous advances since the times of Fisher, Neyman, Jeffreys, and others, but the fundamental and practically relevant questions about probability and inference that puzzled our founding fathers remain unanswered. To bridge this gap, I propose to look beyond the two dominating schools of thought and ask the following three questions: what do scientists need out of statistics, do the existing frameworks meet these needs, and, if not, how to fill the void? To the first question, I contend that scientists seek to convert their data, posited statistical model, etc., into calibrated degrees of belief about quantities of interest. To the second question, I argue that any framework that returns additive beliefs, i.e., probabilities, necessarily suffers from false confidence---certain false hypotheses tend to be assigned high probability---and, therefore, risks systematic bias. This reveals the fundamental importance of non-additive beliefs in the context of statistical inference. But non-additivity alone is not enough so, to the third question, I offer a sufficient condition, called validity, for avoiding false confidence, and present a framework, based on random sets and belief functions, that provably meets this condition. Finally, I discuss characterizations of p-values and confidence intervals in terms of valid non-additive beliefs, which imply that users of these classical procedures are already following the proposed framework without knowing it.

    What does theoretical Physics tell us about Mexico's December Error crisis?

    Under Review Since : 2019-01-30

    A perfect economic storm emerged in M\'exico in what was called (mistakenly, under our analysis) The December Error (1994), in which Mexico's economy collapsed. In this paper, we show how theoretical Physics may help us understand the underlying processes of this kind of economic crisis and, eventually, perhaps develop an early warning. We specifically analyze monthly historical time series for inflation from January 1969 to November 2018. We find that Fisher information is insensitive to the inflation growth of the 1980s but captures The December Error (TDE) quite well. Our results show that under the Salinas administration the Mexican economy was characterized by unstable stability, most probably due to hidden-risk policies in the form of macroeconomic controls that artificially suppressed randomness out of the system, making it fragile. We therefore conclude that it was not a December error at all but a sustained, sexennial error of fragilization.
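    One common way to track Fisher information over a time series is to estimate the distribution of values in a sliding window and apply a discrete amplitude-based approximation; the sketch below is a generic version of that idea with hypothetical window and bin choices, not necessarily the authors' exact estimator.

```python
import numpy as np

def fisher_information(window, n_bins=10):
    """Discrete Fisher information of the empirical distribution of values in a window.

    Uses the amplitude form FI ~ 4 * sum_i (sqrt(p_{i+1}) - sqrt(p_i))^2,
    a common discretization in time-series regime analyses.
    """
    p, _ = np.histogram(window, bins=n_bins)
    p = p / p.sum()
    q = np.sqrt(p)
    return 4.0 * np.sum(np.diff(q) ** 2)

def sliding_fisher(series, window=24, step=1, n_bins=10):
    """Fisher information over sliding windows of a (monthly) time series."""
    series = np.asarray(series, dtype=float)
    idx = range(0, len(series) - window + 1, step)
    return np.array([fisher_information(series[i:i + window], n_bins) for i in idx])

# toy usage with synthetic 'inflation-like' data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
calm = rng.normal(0.5, 0.1, 300)     # low-variability regime
crisis = rng.normal(2.0, 1.0, 60)    # high-variability shock
fi = sliding_fisher(np.concatenate([calm, crisis]), window=24)
print(fi[:3].round(2), fi[-3:].round(2))
```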

    How to teach complexity? Do it by facing complex problems: a case study with weather data in natural protected areas in Mexico

    Under Review Since : 2019-01-24

    Here we present a proposal for how to teach complexity using a Problem-Based Learning approach, under a set of philosophical principles inspired by pedagogical experience in the sustainability sciences. We describe the context in which we put these ideas into practice: a graduate course on Complexity and Data Science applied to Ecology. In part two we present the final work produced by the students, as they wrote it, which we believe could be submitted to a journal on its own merits.

    The rise of the technobionts: toward a new ontology to understand current planetary crisis

    Under Review Since : 2019-01-21

    Here we expand the concept of the Holobiont to incorporate niche construction theory in order to increase our understanding of the current planetary crisis. On this basis we propose a new ontology, the Ecobiont, as the basic evolutionary unit of analysis. We make the case that \textit{Homo sapiens} organized around modern cities (technobionts) constitutes a different Ecobiont from classical \textit{Homo sapiens} (i.e., hunter-gatherer \textit{Homo sapiens}). We consider that the Ecobiont ontology helps make visible the coupling of \textit{Homo sapiens} with other biological entities under processes of natural and cultural evolution. Failing to see this coupling hides systemic risks and enhances the probability of catastrophic events, so the Ecobiont ontology is necessary to understand and respond to the current planetary crisis.

    Accuracy Assessments For Headwater Stream Maps In Western North Carolina

    Under Review Since : 2018-12-31

    Headwater streams are essential to downstream water quality; therefore, it is important that they are properly represented on maps used for stream regulation. Current maps used for stream regulation, such as the United States Geological Survey (USGS) topographic maps and Natural Resources Conservation Service (NRCS) soil survey maps, are outdated and neither accurately nor consistently depict headwater streams. In order for new stream maps to be used for regulatory purposes, their accuracy must be known and they must show streams with a consistent level of accuracy. This study assessed the valley presence/absence and stream length accuracy of the new stream maps created by the North Carolina Center for Geographic Information and Analysis (CGIA) for western North Carolina. The CGIA stream map does not depict headwater streams with a consistent level of accuracy. This study also compared the accuracy of stream networks modeled using the computer software program Terrain Analysis using Digital Elevation Models (TauDEM) to the CGIA stream map. The stream networks modeled in TauDEM also do not consistently predict the location of headwater streams across the mountain region of the state. The location of headwater streams could not be accurately or consistently predicted using aerial photography or elevation data alone. Other factors such as climate, soils, geology, land use, and vegetation cover should be considered to accurately and consistently model headwater stream networks.

    Online Monitoring of Metric Temporal Logic using Sequential Networks

    Under Review Since : 2018-12-28

    Metric Temporal Logic (MTL) is a popular formalism for specifying patterns with timing constraints over the behavior of cyber-physical systems. In this paper, I propose sequential networks for online monitoring applications and construct network-based monitors from the past fragment of MTL over discrete and dense time behaviors. This class of monitors is more compositional, extensible, and easily implementable than other monitors based on rewriting and automata. I first explain the sequential network construction over discrete time behaviors and then extend it towards dense time by adopting a point-free approach. The formulation for dense time behaviors and MTL differs radically from the traditional pointwise definitions and, in return, avoids some longstanding complications. I argue that the point-free approach is more natural and practical and therefore should be preferred for dense time. Finally, I present my implementation together with some experimental results that show the performance of the network-based monitors compared to similar existing tools.
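    A toy illustration of the state-based idea for discrete time (my own sketch, not the paper's construction): an online monitor for the past-MTL formula "p held at some point within the last k steps" that keeps a single counter and updates it sequentially as each input symbol arrives.

```python
from typing import Iterable, Iterator

def once_within(p_stream: Iterable[bool], k: int) -> Iterator[bool]:
    """Online monitor for the bounded-past formula  P_[0,k] p  over discrete time.

    State: number of steps since p was last true (capped at k+1), updated per symbol.
    Emits, at every step, whether p held at some point within the last k steps.
    """
    since_p = k + 1                      # "infinity": p not seen recently
    for p in p_stream:
        since_p = 0 if p else min(since_p + 1, k + 1)
        yield since_p <= k

# usage: p is true at steps 2 and 7; with k = 3 the verdict stays true for 3 extra steps
trace = [False, False, True, False, False, False, False, True, False]
print(list(once_within(trace, k=3)))
# [False, False, True, True, True, True, False, True, True]
```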

    Ether Interpretation of General Relativity

    Under Review Since : 2018-12-17

    Analogue gravity models are attempts to model general relativity using such things as acoustic waves propagating through an ideal fluid. In this work, we take inspiration from these models to re-interpret general relativity in terms of an ether continuum moving and changing against a background of absolute space and time. We reformulate the metric, the Ricci tensor, the Einstein equation, and continuous matter dynamics in terms of the ether. We also reformulate general relativistic electrodynamics in terms of the ether, which takes the form of electrodynamics in an anisotropic moving medium. Some degree of simplification is achieved by assuming that the speed of light is uniform and isotropic with respect to the ether coordinates. Finally, we speculate on the nature of under-determination in general relativity.

    On nonparametric estimation of a mixing density via the predictive recursion algorithm

    Under Review Since : 2018-12-05

    Nonparametric estimation of a mixing density based on observations from the corresponding mixture is a challenging statistical problem. This paper surveys the literature on a fast, recursive estimator based on the predictive recursion algorithm. After introducing the algorithm and giving a few examples, I summarize the available asymptotic convergence theory, describe an important semiparametric extension, and highlight two interesting applications. I conclude with a discussion of several recent developments in this area and some open problems.
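    A minimal grid-based sketch of the predictive recursion update in its generic form (tuning choices such as the weight sequence, kernel, and grid are my own illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

def predictive_recursion(x, grid, gamma=0.67, kernel_sd=1.0):
    """Predictive recursion estimate of a mixing density f on a uniform grid.

    Model: each x_i is drawn from the mixture  integral N(x | u, kernel_sd^2) f(u) du.
    Update: f_i(u) = (1 - w_i) f_{i-1}(u) + w_i k(x_i | u) f_{i-1}(u) / m_{i-1}(x_i),
    with weights w_i = (i + 1)^(-gamma) and m_{i-1} the current marginal density at x_i.
    """
    du = grid[1] - grid[0]
    f = np.full_like(grid, 1.0 / (grid[-1] - grid[0]))   # flat initial guess
    for i, xi in enumerate(x, start=1):
        k = norm.pdf(xi, loc=grid, scale=kernel_sd)      # kernel evaluated over the grid
        m = np.sum(k * f) * du                           # marginal density of xi under current f
        w = (i + 1.0) ** (-gamma)
        f = (1 - w) * f + w * k * f / m
    return f

# toy usage: a two-point mixing distribution at -2 and +2
rng = np.random.default_rng(0)
x = rng.normal(rng.choice([-2.0, 2.0], size=500), 1.0)
grid = np.linspace(-6.0, 6.0, 241)
f_hat = predictive_recursion(rng.permutation(x), grid)   # the estimate is order-dependent
print(grid[np.argmax(f_hat)])   # a mode of the estimate, expected near -2 or +2
```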

    Empirical priors and coverage of posterior credible sets in a sparse normal mean model

    Under Review Since : 2018-12-05

    Bayesian methods provide a natural means for uncertainty quantification, that is, credible sets can be easily obtained from the posterior distribution. But is this uncertainty quantification valid in the sense that the posterior credible sets attain the nominal frequentist coverage probability? This paper investigates the validity of posterior uncertainty quantification based on a class of empirical priors in the sparse normal mean model. We prove that there are scenarios in which the empirical Bayes method provides valid uncertainty quantification while other methods may not, and finite-sample simulations confirm the asymptotic findings.

    Integrating Mind and Body

    Under Review Since : 2018-12-04

    Reality has two different dimensions: information-communication, and matter-energy. They relate to each other in a figure-ground gestalt that gives two different perspectives on the one reality.

    1. We learn from modern telecommunications that matter and energy are the media of information and communication; here information is the figure and matter/energy is the ground.

    2. We learn from cybernetics that information and communication control matter and energy; here matter/energy is the figure and information/communication is the ground.

    The General Theory of Control

    Under Review Since : 2018-12-04

    The cybernetic control loop can be understood as a decision making process, a command and control process. Information controls matter and energy. Decision making has four necessary and sufficient communication processes: act, sense, evaluate, choose. These processes are programmed by the environment, the model, values and alternatives.

    The Great Information Error of 1948

    Under Review Since : 2018-12-04

    The idea that information is entropy is an error. The proposal is neither mathematically nor logically valid. There are numerous definitions of entropy, but there is no definition of entropy for which the equation "information = entropy" holds. Information is information. It is neither matter nor energy.

    The Political Economy of Net Neutrality

    Under Review Since : 2018-12-03

    The internet is increasingly important for our economies and societies. This is the reason for the growing interest in internet regulation. The stakes in network neutrality - the principle that all traffic on the internet should be treated equally - are particularly high. This paper argues that technological, civil-libertarian, legal and economic arguments exist both for and against net neutrality and that the decision is ultimately political. We therefore frame the issue of net neutrality as an issue of political economy. The main political economy arguments for net neutrality are that a net-neutral internet contributes to the reduction of inequality, preserves its openness and prevents artificial scarcity. With these arguments Slovenia, after Chile and the Netherlands, adopted net neutrality legislation. We present it as a case study for examining how political forces affect the choice of economic and technological policies. After a few years, we find that proper enforcement is just as important as legislation.

    Consciousness and the Incompleteness of Science

    Under Review Since : 2018-11-22

    Physicalism, which provides the philosophical basis of modern science, holds that consciousness is solely a product of brain activity, and more generally, that mind is an epiphenomenon of matter, that is, derivable from and reducible to matter. If mind is reducible to matter, then it follows that identical states of matter must correspond to identical states of mind.

    In this discourse, I provide a cogent refutation of physicalism by showing examples of physically identical states which, by definition, cannot be distinguished by any method available to science but can nevertheless be distinguished by a conscious observer. I conclude by giving an example of information that is potentially knowable by an individual but is beyond the ken of science.

    Hylomorphic Functions

    Under Review Since : 2018-11-22

    Philosophers have long pondered the Problem of Universals. One response is Metaphysical Realism, such as Plato's Doctrine of the Forms and Aristotle's Hylomorphism. We postulate that Measurement in Quantum Mechanics forms the basis of Metaphysical Realism. It is the process that gives rise to the instantiation of Universals as Properties, a process we refer to as Hylomorphic Functions. This combines substance metaphysics and process metaphysics by identifying the instantiation of Universals as causally active processes along with physical substance, forming a dualism of both substance and information. Measurements of fundamental properties of matter are the Atomic Universals of metaphysics, which combine to form the whole taxonomy of Universals. We look at this hypothesis in relation to various interpretations of Quantum Mechanics grouped under two exemplars: the Copenhagen Interpretation, a version of Platonic Realism based on wave function collapse, and the Pilot Wave Theory of Bohm and de Broglie, where particle--particle interactions lead to an Aristotelian metaphysics. This view of Universals explains the distinction between pure information and the medium that transmits it and establishes the arrow of time. It also distinguishes between universally true Atomic Facts and the more conditional Inferences based on them. Hylomorphic Functions also provide a distinction between Universals and Tropes based on whether a given Property is a physical process or is based on the qualia of an individual organism. Since the Hylomorphic Functions are causally active, it is possible to suggest experimental tests that can verify this viewpoint of metaphysics.

    Principia Qualia

    Under Review Since : 2018-12-05

    The nature of consciousness has been one of the longest-standing open questions in philosophy. Advancements in physics, neuroscience, and information theory have informed and constrained this topic, but have not produced any consensus. What would it mean to ‘solve’ or ‘dissolve’ the mystery of consciousness?

    Part I begins with grounding this topic by considering a concrete question: what makes some conscious experiences more pleasant than others? We first review what’s known about the neuroscience of pain & pleasure, find the current state of knowledge narrow, inconsistent, and often circular, and conclude we must look elsewhere for a systematic framework (Sections I & II). We then review the Integrated Information Theory (IIT) of consciousness and several variants of IIT, and find each of them promising, yet also underdeveloped and flawed (Sections III-V).

    We then take a step back and distill what kind of problem consciousness is. Importantly, we offer eight sub-problems whose solutions would, in aggregate, constitute a complete theory of consciousness (Section VI).

    Armed with this framework, in Part II we return to the subject of pain & pleasure (valence) and offer some assumptions, distinctions, and heuristics to clarify and constrain the problem (Sections VII-IX). Of particular interest, we then offer a specific hypothesis on what valence is (Section X) and several novel empirical predictions which follow from this (Section XI). Part III finishes with discussion of how this general approach may inform open problems in neuroscience, and the prospects for building a new science of qualia (Sections XII & XIII). Lastly, we identify further research threads within this framework (Appendices A-F).

    Interaction is Everything

    Published Date : 2018-11-13

    This paper argues that experimental evidence, quantum theory, and relativity theory, taken together, suggest that reality is relational: Properties and behaviors of phenomena do not have a priori, intrinsic values; instead, these properties and behaviors emerge through interactions with other systems.

    Polls, Pundits, or Prediction Markets: An assessment of election forecasting

    Under Review Since : 2018-11-05

    I compare forecasts of the 2018 U.S. midterm elections based on (i) probabilistic predictions posted on the FiveThirtyEight blog and (ii) prediction market prices on PredictIt.com. Based on empirical forecast and price data collected prior to the election, the analysis assesses the calibration and accuracy according to Brier and logarithmic scoring rules.  I also analyze the performance of a strategy that invests in PredictIt based on the FiveThirtyEight forecasts. 
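    A small sketch of the two scoring rules used in the assessment, applied to binary race outcomes (the numbers below are hypothetical placeholders, not data from FiveThirtyEight or PredictIt):

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probability and binary outcome (lower is better)."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    return np.mean((probs - outcomes) ** 2)

def log_score(probs, outcomes):
    """Mean log probability assigned to the realised outcome (higher is better)."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    return np.mean(outcomes * np.log(probs) + (1 - outcomes) * np.log(1 - probs))

# hypothetical example: three races, forecast win probabilities and realised outcomes
model_probs = [0.80, 0.55, 0.30]
market_probs = [0.70, 0.60, 0.40]
outcomes = [1, 0, 0]
for name, p in [("model", model_probs), ("market", market_probs)]:
    print(name, round(brier_score(p, outcomes), 3), round(log_score(p, outcomes), 3))
```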

    The Collected Works of Francis Perey on Probability Theory

    Published Date : 2018-11-01

    Francis Perey, of the Engineering Physics Division of Oak Ridge National Lab, left a number of unpublished papers upon his death in 2017. They revolve around the idea of probabilities arising naturally from basic physical laws. One of his papers, Application of Group Theory to Data Reduction, was published as an ORNL white paper in 1982. This collection includes two earlier works and two that came later, as well as a relevant presentation. They are being published now so that the ideas in them will be available to interested parties.

    Self-Limiting Factors in Pandemics and Multi-Disease Syndemics

    Under Review Since : 2018-10-22

    The potential for an infectious disease outbreak that is much worse than those which have been observed in human history, whether engineered or natural, has been the focus of significant concern in biosecurity. Fundamental dynamics of disease spread make such outbreaks much less likely than they first appear. Here we present a slightly modified formulation of the typical SEIR model that illustrates these dynamics more clearly, and shows the unlikely cases where concern may still be warranted. This is then applied to an extreme version of proposed pandemic risk, multi-disease syndemics, to show that (absent much clearer reasons for concern) the suggested dangers are overstated.
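    For orientation, the baseline compartmental model that the paper's slightly modified formulation builds on is the standard SEIR system; a minimal sketch with hypothetical parameter values follows (not the authors' modified model).

```python
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    """Standard SEIR dynamics (the baseline that the paper's formulation modifies)."""
    S, E, I, R = y
    N = S + E + I + R
    dS = -beta * S * I / N              # susceptibles become exposed at rate beta * I / N
    dE = beta * S * I / N - sigma * E   # exposed become infectious after ~1/sigma days
    dI = sigma * E - gamma * I          # infectious recover after ~1/gamma days
    dR = gamma * I
    return [dS, dE, dI, dR]

# hypothetical parameters: R0 = beta/gamma = 3, 5-day latency, 10-day infectious period
beta, sigma, gamma = 0.3, 1 / 5, 0.1
y0 = [9999, 0, 1, 0]
sol = solve_ivp(seir, (0, 365), y0, args=(beta, sigma, gamma), dense_output=True)
S, E, I, R = sol.sol(np.linspace(0, 365, 366))
print(f"peak infectious fraction: {I.max() / 10000:.2%}")
print(f"final attack rate: {R[-1] / 10000:.2%}")
```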

    Law of total probability and Bayes' theorem in Riesz spaces

    Under Review Since : 2018-10-03

    This note generalizes the notion of conditional probability to Riesz spaces using the order-theoretic approach. With the aid of this concept, we establish the law of total probability and Bayes' theorem in Riesz spaces; we also prove an inclusion-exclusion formula in Riesz spaces. Several examples are provided to show that the law of total probability, Bayes' theorem and inclusion-exclusion formula in probability theory are special cases of our results.
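    For reference, the classical statements that the note generalizes (and, as the abstract says, recovers as special cases) are, for a countable partition $\{B_i\}$ of the sample space with $P(B_i) > 0$:

```latex
P(A) = \sum_{i} P(A \mid B_i)\,P(B_i), \qquad
P(B_j \mid A) = \frac{P(A \mid B_j)\,P(B_j)}{\sum_{i} P(A \mid B_i)\,P(B_i)}
```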

    What a t-test easily hides

    Under Review Since : 2018-09-18

    To justify the effort of developing a theoretical construct, a theoretician needs empirical data that support a non-random effect of sufficiently high replication probability. To establish these effects statistically, researchers (rightly) rely on a t-test. But many pursue questionable strategies that lower the cost of data collection. Our paper reconstructs two such strategies. Both reduce the minimum sample size (NMIN) sufficing under conventional errors (α, β) to register a given effect size (d) as a statistically significant non-random data signature. The first strategy increases the β-error; the second treats the control group as a constant, thereby collapsing a two-sample t-test into its one-sample version. (A two-sample t-test for d=0.50 under α=0.05 with NMIN=176, for instance, becomes a one-sample t-test under α=0.05, β=0.20 with NMIN=27.) Not only does this decrease the replication probability of data from (1−β)=0.95 to (1−β)=0.80; the second strategy in particular cannot corroborate hypotheses meaningfully. The ubiquity of both strategies arguably makes them partial causes of the confidence crisis. But as resource pooling would allow research groups to reach NMIN jointly, a group's individually limited resources justify neither strategy.
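    For orientation, the usual normal-approximation sample-size formulas behind NMIN figures of this kind are (per group for the two-sample test, total for the one-sample test; one-sided form shown, with $z_{1-\alpha}$ replaced by $z_{1-\alpha/2}$ for two-sided tests):

```latex
n_{\text{two-sample}} \approx \frac{2\,(z_{1-\alpha} + z_{1-\beta})^{2}}{d^{2}} \ \text{per group},
\qquad
n_{\text{one-sample}} \approx \frac{(z_{1-\alpha} + z_{1-\beta})^{2}}{d^{2}}
```

    With d = 0.50 and conventional (α, β) choices, these give sample sizes of the same order as the NMIN values quoted above (the exact figures depend on sidedness and the t-distribution correction).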

    The RESEARCHERS.ONE mission

    Published Date : 2018-09-15

    This article describes our motivation behind the development of RESEARCHERS.ONE, our mission, and how the new platform will fulfill this mission. We also compare our approach with other recent reform initiatives such as post-publication peer review and open access publications.

    Logic of Probability and Conjecture

    Under Review Since : 2018-09-15

    I introduce a formalization of probability in intensional Martin-Löf type theory (MLTT) and homotopy type theory (HoTT) which takes the concept of ‘evidence’ as primitive in judgments about probability. In parallel to the intuitionistic conception of truth, in which ‘proof’ is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here ‘evidence’ is primitive and A is judged to be probable just in case there is evidence supporting it. To formalize this approach, we regard propositions as types in MLTT and define for any proposition A a corresponding probability type Prob(A) whose inhabitants represent pieces of evidence in favor of A. Among several practical motivations for this approach, I focus here on its potential for extending meta-mathematics to include conjecture, in addition to rigorous proof, by regarding a ‘conjecture in A’ as a judgment that ‘A is probable’ on the basis of evidence. I show that the Giry monad provides a formal semantics for this system.

    Why Did The Crisis of 2008 Happen?

    Under Review Since : 2018-09-14

    This paper is a synthesis of the deposition in front of the Financial Crisis Inquiry Commission by the Obama Administration in 2010. Note that none of its ideas made it to the report.

    In peer review we (don't) trust: How peer review's filtering poses a systemic risk to science

    Under Review Since : 2018-09-14

    This article describes how the filtering role played by peer review may actually be harmful rather than helpful to the quality of the scientific literature. We argue that, instead of trying to filter out the low-quality research, as is done by traditional journals, a better strategy is to let everything through but with an acknowledgment of the uncertain quality of what is published, as is done on the RESEARCHERS.ONE platform.  We refer to this as "scholarly mithridatism."  When researchers approach what they read with doubt rather than blind trust, they are more likely to identify errors, which protects the scientific community from the dangerous effects of error propagation, making the literature stronger rather than more fragile.  

    Adaptive inference after model selection

    Published Date : 2018-09-14

    Penalized maximum likelihood methods that perform automatic variable selection are now ubiquitous in statistical research. It is well-known, however, that these estimators are nonregular and consequently have limiting distributions that can be highly sensitive to small perturbations of the underlying generative model. This is the case even for the fixed “p” framework. Hence, the usual asymptotic methods for inference, like the bootstrap and series approximations, often perform poorly in small samples and require modification. Here, we develop locally asymptotically consistent confidence intervals for regression coefficients when estimation is done using the Adaptive LASSO (Zou, 2006) in the fixed “p” framework. We construct the confidence intervals by sandwiching the nonregular functional of interest between two smooth, data-driven, upper and lower bounds and then approximating the distribution of the bounds using the bootstrap. We leverage the smoothness of the bounds to obtain consistent inference for the nonregular functional under both fixed and local alternatives. The bounds are adaptive to the amount of underlying nonregularity in the sense that they deliver asymptotically exact coverage whenever the underlying generative model is such that the Adaptive LASSO estimators are consistent and asymptotically normal, and conservative otherwise. The resultant confidence intervals possess a certain tightness property among all regular bounds. Although we focus on the Adaptive LASSO, our approach generalizes to other penalized methods. (Originally published as a technical report in 2014.)
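    For readers unfamiliar with the estimator being studied, a standard way to compute the Adaptive LASSO (Zou, 2006) is to rescale the design by weights from an initial consistent fit and then run an ordinary lasso; the sketch below shows that reduction only (a generic illustration, not the authors' inference procedure, with my own choice of initial estimator and tuning).

```python
import numpy as np
from sklearn.linear_model import LassoCV, Ridge

def adaptive_lasso(X, y, gamma=1.0, eps=1e-6):
    """Adaptive LASSO: minimise ||y - Xb||^2 + lam * sum_j |b_j| / |b_init_j|^gamma."""
    b_init = Ridge(alpha=1.0).fit(X, y).coef_     # initial consistent estimate (illustrative choice)
    w = 1.0 / (np.abs(b_init) ** gamma + eps)     # adaptive penalty weights
    Xw = X / w                                    # column rescaling absorbs the weights
    lasso = LassoCV(cv=5).fit(Xw, y)              # ordinary lasso on the rescaled problem
    return lasso.coef_ / w                        # map coefficients back to the original scale

# toy usage: sparse truth with two nonzero coefficients
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.array([2.0, 0.0, 0.0, -1.5] + [0.0] * (p - 4))
y = X @ beta + rng.standard_normal(n)
print(np.round(adaptive_lasso(X, y), 2))   # nonzero entries expected near 2.0 and -1.5
```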

    A Research Prediction Market Framework

    Under Review Since : 2018-09-09

    Prediction markets are currently used in three fields:

    1. Economic, political and sporting event outcomes (IEW, PredictIt, PredictWise);
    2. Risk evaluation, product development and marketing (Cultivate Labs/Consensus Point);
    3. Research replication (the Replication Prediction Project, the Experimental Economics Prediction Project, and Brian Nosek’s latest replicability study).

    The latter application of prediction markets has remained closed and/or proprietary despite the promising results of these methods. In this paper, I construct an open research prediction market framework to incentivize replication-study research and align the motivations of research stakeholders.

    Modeling of multivariate spatial extremes

    Under Review Since : 2018-09-06

    Extreme values are by definition rare, and therefore a spatial analysis of extremes is attractive because a spatial analysis makes full use of the data by pooling information across nearby locations. In many cases, there are several dependent processes with similar spatial patterns. In this paper, we propose the first multivariate spatial models to simultaneously analyze several processes. Using a multivariate model, we are able to estimate joint exceedance probabilities for several processes, improve spatial interpolation by exploiting dependence between processes, and improve estimation of extreme quantiles by borrowing strength across processes. We propose models for separable and non-separable, and spatially continuous and discontinuous processes. The method is applied to French temperature data, where we find an increase in the extreme temperatures over time for much of the country.

    Empirical priors and posterior concentration rates for a monotone density

    Under Review Since : 2018-09-04

    In a Bayesian context, prior specification for inference on monotone densities is conceptually straightforward, but proving posterior convergence theorems is complicated by the fact that desirable prior concentration properties often are not satisfied. In this paper, I first develop a new prior designed specifically to satisfy an empirical version of the prior concentration property, and then I give sufficient conditions on the prior inputs such that the corresponding empirical Bayes posterior concentrates around the true monotone density at nearly the optimal minimax rate. Numerical illustrations also reveal the practical benefits of the proposed empirical Bayes approach compared to Dirichlet process mixtures.

    Gibbs posterior inference on value-at-risk

    Under Review Since : 2018-09-04

    Accurate estimation of value-at-risk (VaR) and assessment of associated uncertainty is crucial for both insurers and regulators, particularly in Europe. Existing approaches link data and VaR indirectly by first linking data to the parameter of a probability model, and then expressing VaR as a function of that parameter. This indirect approach exposes the insurer to model misspecification bias or estimation inefficiency, depending on whether the parameter is finite- or infinite-dimensional. In this paper, we link data and VaR directly via what we call a discrepancy function, and this leads naturally to a Gibbs posterior distribution for VaR that does not suffer from the aforementioned biases and inefficiencies. Asymptotic consistency and root-n concentration rate of the Gibbs posterior are established, and simulations highlight its superior finite-sample performance compared to other approaches.
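    For orientation, the generic Gibbs posterior construction referred to above has the schematic form below (the paper's specific discrepancy function and learning-rate choice may differ):

```latex
\pi_n(\theta) \;\propto\; \exp\{-\omega\, n\, R_n(\theta)\}\, \pi(\theta),
\qquad R_n(\theta) = \frac{1}{n}\sum_{i=1}^{n} \ell(\theta; X_i)
```

    where $\ell$ is a loss (discrepancy) linking $\theta$, here the VaR, directly to the data, $\pi$ is a prior, and $\omega > 0$ is a learning rate.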

    Homotopy Equivalence as FOLDS Equivalence

    Under Review Since : 2018-09-02

    We prove an observation of Makkai that FOLDS equivalence coincides with homotopy equivalence in the case of semi-simplicial sets.

    The Art of The Election: A Social Media History of the 2016 Presidential Race

    Under Review Since : 2018-09-01


    Abstract

    The book is 700 pages, comprising Donald Trump’s tweets from June 2015 to November 2016 together with footnotes, attached to 70-80% of the tweets, that explain the context of each tweet. The book has a 100-page bibliography.

    It is highly likely that Trump would not have been elected President were it not for social media. This is unprecedented: it is the first time a presidential candidate utilized a social network to get his message out directly to voters and, moreover, to shape the media feedback loop. His tweets became news. This is primary source material on the 2016 election. No need for narratives, outside “experts” or political “science”.

    The file is too large to post on this website. But you can download the book under this link:

    https://www.dropbox.com/s/bxvsh7eqh2ueq6j/Trump%20Book.docx?dl=0

    Keywords and phrases: 2016, book, Trump, election, social media.

    The Evolutionary Theory Of Value

    Under Review Since : 2018-09-01

    We propose the first economic theory of value that actually works. We explain the evolutionary causes of trade and demonstrate how goods have value from an evolutionary perspective, and how this value is increased by trade. This "Darwinian" value of goods exists before humans assign monetary value (or any other value estimate) to traded goods. We propose an objective value estimate expressed in energy units.

    On valid uncertainty quantification about a model

    Under Review Since : 2018-09-14

    Inference on parameters within a given model is familiar, as is ranking different models for the purpose of selection. Less familiar, however, is the quantification of uncertainty about the models themselves. A Bayesian approach provides a posterior distribution for the model but it comes with no validity guarantees, and, therefore, is only suited for ranking and selection. In this paper, I will present an alternative way to view this model uncertainty problem, through the lens of a valid inferential model based on random sets and non-additive beliefs. Specifically, I will show that valid uncertainty quantification about a model is attainable within this framework in general, and highlight the benefits in a classical signal detection problem.

    The Fundamental Principle of Probability

    Under Review Since : 2018-09-17

    I make the distinction between academic probabilities, which are not rooted in reality and thus have no tangible real-world meaning, and real probabilities, which attain a real-world meaning as the odds that the subject asserting the probabilities is forced to accept for a bet against the stated outcome.  With this I discuss how the replication crisis can be resolved easily by requiring that probabilities published in the scientific literature are real, instead of academic.  At present, all probabilities and derivatives that appear in published work, such as P-values, Bayes factors, confidence intervals, etc., are the result of academic probabilities, which are not useful for making meaningful assertions about the real world.

    The Logic of Typicality

    Under Review Since : 2018-08-30

    The notion of typicality appears in scientific theories, philosophical arguments, mathematical inquiry, and everyday reasoning. Typicality is invoked in statistical mechanics to explain the behavior of gases. It is also invoked in quantum mechanics to explain the appearance of quantum probabilities. Typicality plays an implicit role in non-rigorous mathematical inquiry, as when a mathematician forms a conjecture based on personal experience of what seems typical in a given situation. Less formally, the language of typicality is a staple of the common parlance: we often claim that certain things are, or are not, typical. But despite the prominence of typicality in science, philosophy, mathematics, and everyday discourse, no formal logics for typicality have been proposed. In this paper, we propose two formal systems for reasoning about typicality. One system is based on propositional logic: it can be understood as formalizing objective facts about what is and is not typical. The other system is based on the logic of intuitionistic type theory: it can be understood as formalizing subjective judgments about typicality.

    Is statistics meeting the needs of science?

    Under Review Since : 2018-08-28

    Publication of scientific research all but requires a supporting statistical analysis, anointing statisticians the de facto gatekeepers of modern scientific discovery. While the potential of statistics for providing scientific insights is undeniable, there is a crisis in the scientific community due to poor statistical practice. Unfortunately, widespread calls to action have not been effective, in part because of statisticians’ tendency to make statistics appear simple. We argue that statistics can meet the needs of science only by empowering scientists to make sound judgments that account for both the nuances of the application and the inherent complexity of the fundamentals of effective statistical practice. In particular, we emphasize a set of statistical principles that scientists can adapt to their ever-expanding scope of problems.

    Rethinking probabilistic prediction: lessons learned from the 2016 U.S. presidential election

    Under Review Since : 2018-09-08

    Whether the predictions put forth prior to the 2016 U.S. presidential election were right or wrong is a question that led to much debate. But rather than focusing on right or wrong, we analyze the 2016 predictions with respect to a core set of {\em effectiveness principles}, and conclude that they were ineffective in conveying the uncertainty behind their assessments. Along the way, we extract key insights that will help to avoid, in future elections, the systematic errors that lead to overly precise and overconfident predictions in 2016. Specifically, we highlight shortcomings of the classical interpretations of probability and its communication in the form of predictions, and present an alternative approach with two important features.  First, our recommended predictions are safer in that they come with certain guarantees on the probability of an erroneous prediction; second, our approach easily and naturally reflects the (possibly substantial) uncertainty about the model by outputting plausibilities instead of probabilities.

    Imprecise probabilities as a semantics for intuitive probabilistic reasoning

    Under Review Since : 2019-03-26

    I prove a connection between the logical framework for intuitive probabilistic reasoning (IPR) introduced by Crane (2017) and sets of imprecise probabilities. More specifically, this connection provides a straightforward interpretation to sets of imprecise probabilities as subjective credal states, giving a formal semantics for Crane's formal system of IPR. The main theorem establishes the IPR framework as a potential logical foundation for imprecise probability that is independent of the traditional probability calculus.