
    On nonparametric estimation of a mixing density via the predictive recursion algorithm

    Under Review Since : 2018-12-05

    Nonparametric estimation of a mixing density based on observations from the corresponding mixture is a challenging statistical problem. This paper surveys the literature on a fast, recursive estimator based on the predictive recursion algorithm. After introducing the algorithm and giving a few examples, I summarize the available asymptotic convergence theory, describe an important semiparametric extension, and highlight two interesting applications. I conclude with a discussion of several recent developments in this area and some open problems.
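    As a point of reference, here is a minimal sketch of the basic predictive recursion update for a normal location-mixture kernel, assuming a uniform starting guess and a generic decaying weight sequence (both illustrative choices, not prescribed by the survey):

```python
import numpy as np
from scipy.stats import norm

def predictive_recursion(y, grid, gamma=0.67):
    """One pass of the predictive recursion (PR) update for a normal
    location mixture p(y) = int N(y; t, 1) f(t) dt, on a uniform grid."""
    dx = grid[1] - grid[0]
    f = np.full(grid.shape, 1.0 / (dx * grid.size))  # uniform initial guess
    for i, yi in enumerate(y, start=1):
        w = (i + 1.0) ** (-gamma)               # decaying weight sequence
        k = norm.pdf(yi, loc=grid, scale=1.0)   # kernel at the new observation
        m = np.sum(k * f) * dx                  # implied mixture density at yi
        f = (1.0 - w) * f + w * k * f / m       # PR update
    return f

# toy run: two-point mixing distribution at -2 and +2
rng = np.random.default_rng(0)
theta = rng.choice([-2.0, 2.0], size=500)
y = theta + rng.standard_normal(500)
grid = np.linspace(-6.0, 6.0, 400)
f_hat = predictive_recursion(y, grid)
```

    Note that the estimate depends on the order of the data; averaging over random permutations is a common remedy discussed in this literature.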

    Empirical priors and coverage of posterior credible sets in a sparse normal mean model

    Under Review Since : 2018-12-05

    Bayesian methods provide a natural means for uncertainty quantification, that is, credible sets can be easily obtained from the posterior distribution. But is this uncertainty quantification valid in the sense that the posterior credible sets attain the nominal frequentist coverage probability? This paper investigates the validity of posterior uncertainty quantification based on a class of empirical priors in the sparse normal mean model. We prove that there are scenarios in which the empirical Bayes method provides valid uncertainty quantification while other methods may not, and finite-sample simulations confirm the asymptotic findings.

    Integrating Mind and Body

    Under Review Since : 2018-12-04

    Reality has two different dimensions: information-communication, and matter-energy. They relate to each other in a figure-ground gestalt that gives two different perspectives on the one reality.

    1. We learn from modern telecommunications that matter and energy are the media of information and communication; here information is the figure and matter/energy is the ground.

    2. We learn from cybernetics that information and communication control matter and energy; here matter/energy is the figure and information/communication is the ground.

    The General Theory of Control

    Under Review Since : 2018-12-04

    The cybernetic control loop can be understood as a decision making process, a command and control process. Information controls matter and energy. Decision making has four necessary and sufficient communication processes: act, sense, evaluate, choose. These processes are programmed by the environment, the model, values and alternatives.
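    A toy rendering of the four processes as a control loop (a hypothetical thermostat, purely illustrative and not from the paper; the environment, model, values and alternatives appear as the simulated room, the setpoint, the error criterion and the two available actions):

```python
def thermostat(room_temp, setpoint=20.0, steps=10):
    """Toy cybernetic loop: sense, evaluate, choose, act."""
    for _ in range(steps):
        reading = room_temp                               # sense
        error = setpoint - reading                        # evaluate
        action = "heat" if error > 0 else "idle"          # choose
        room_temp += 0.5 if action == "heat" else -0.2    # act on the environment
    return room_temp

print(thermostat(17.0))
```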

    The Great Information Error of 1948

    Under Review Since : 2018-12-04

    The idea that information is entropy is an error. The proposal is neither mathematically nor logically valid. There are numerous definitions of entropy, but there is no definition under which the equation of information with entropy holds. Information is information. It is neither matter nor energy.

    The Political Economy of Net Neutrality

    Under Review Since : 2018-12-03

    The internet is increasingly important to our economies and societies, which explains the growing interest in internet regulation. The stakes in network neutrality, the principle that all traffic on the internet should be treated equally, are particularly high. This paper argues that technological, civil-libertarian, legal and economic arguments exist both for and against net neutrality, and that the decision is ultimately political. We therefore frame net neutrality as an issue of political economy. The main political economy arguments for net neutrality are that a net-neutral internet contributes to the reduction of inequality, preserves the internet's openness and prevents artificial scarcity. On the strength of these arguments Slovenia, after Chile and the Netherlands, adopted net neutrality legislation. We present Slovenia as a case study of how political forces affect the choice of economic and technological policies. A few years on, we find that proper enforcement is just as important as legislation.

    Consciousness and the Incompleteness of Science

    Under Review Since : 2018-11-22

    Physicalism, which provides the philosophical basis of modern science, holds that consciousness is solely a product of brain activity, and more generally, that mind is an epiphenomenon of matter, that is, derivable from and reducible to matter. If mind is reducible to matter, then it follows that identical states of matter must correspond to identical states of mind.

    In this discourse, I provide a cogent refutation of physicalism by showing examples of physically identical states which, by definition, cannot be distinguished by any method available to science but can nevertheless be distinguished by a conscious observer. I conclude by giving an example of information that is potentially knowable by an individual but is beyond the ken of science.

    Hylomorphic Functions

    Under Review Since : 2018-11-22

    Philosophers have long pondered the Problem of Universals. One response is Metaphysical Realism, such as Plato's Doctrine of the Forms and Aristotle's Hylomorphism. We postulate that Measurement in Quantum Mechanics forms the basis of Metaphysical Realism. It is the process that gives rise to the instantiation of Universals as Properties, a process we refer to as Hylomorphic Functions. This combines substance metaphysics and process metaphysics by identifying the instantiation of Universals as causally active processes along with physical substance, forming a dualism of both substance and information. Measurements of fundamental properties of matter are the Atomic Universals of metaphysics, which combine to form the whole taxonomy of Universals. We examine this hypothesis in relation to various interpretations of Quantum Mechanics grouped under two exemplars: the Copenhagen Interpretation, a version of Platonic Realism based on wave function collapse, and the Pilot Wave Theory of Bohm and de Broglie, where particle-particle interactions lead to an Aristotelian metaphysics. This view of Universals explains the distinction between pure information and the medium that transmits it and establishes the arrow of time. It also distinguishes between universally true Atomic Facts and the more conditional Inferences based on them. Hylomorphic Functions also provide a distinction between Universals and Tropes based on whether a given Property is a physical process or is based on the qualia of an individual organism. Since the Hylomorphic Functions are causally active, it is possible to suggest experimental tests that can verify this viewpoint of metaphysics.

    Principia Qualia

    Under Review Since : 2018-12-05

    The nature of consciousness has been one of the longest-standing open questions in philosophy. Advancements in physics, neuroscience, and information theory have informed and constrained this topic, but have not produced any consensus. What would it mean to ‘solve’ or ‘dissolve’ the mystery of consciousness?

    Part I begins with grounding this topic by considering a concrete question: what makes some conscious experiences more pleasant than others? We first review what’s known about the neuroscience of pain & pleasure, find the current state of knowledge narrow, inconsistent, and often circular, and conclude we must look elsewhere for a systematic framework (Sections I & II). We then review the Integrated Information Theory (IIT) of consciousness and several variants of IIT, and find each of them promising, yet also underdeveloped and flawed (Sections III-V).

    We then take a step back and distill what kind of problem consciousness is. Importantly, we offer eight sub-problems whose solutions would, in aggregate, constitute a complete theory of consciousness (Section VI).

    Armed with this framework, in Part II we return to the subject of pain & pleasure (valence) and offer some assumptions, distinctions, and heuristics to clarify and constrain the problem (Sections VII-IX). Of particular interest, we then offer a specific hypothesis on what valence is (Section X) and several novel empirical predictions which follow from this (Section XI). Part III finishes with discussion of how this general approach may inform open problems in neuroscience, and the prospects for building a new science of qualia (Sections XII & XIII). Lastly, we identify further research threads within this framework (Appendices A-F).

    Interaction is Everything

    Published Date : 2018-11-13

    This paper argues that experimental evidence, quantum theory, and relativity theory, taken together, suggest that reality is relational: Properties and behaviors of phenomena do not have a priori, intrinsic values; instead, these properties and behaviors emerge through interactions with other systems.

    Polls, Pundits, or Prediction Markets: An assessment of election forecasting

    Under Review Since : 2018-11-05

    I compare forecasts of the 2018 U.S. midterm elections based on (i) probabilistic predictions posted on the FiveThirtyEight blog and (ii) prediction market prices on PredictIt.com. Based on empirical forecast and price data collected prior to the election, the analysis assesses the calibration and accuracy according to Brier and logarithmic scoring rules.  I also analyze the performance of a strategy that invests in PredictIt based on the FiveThirtyEight forecasts. 
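    The Brier and logarithmic scoring rules mentioned here are standard and easy to compute; the sketch below uses made-up forecasts and outcomes, not FiveThirtyEight's or PredictIt's actual data:

```python
import numpy as np

def brier(p, y):
    """Mean Brier score for binary outcomes y under forecast probabilities p."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    return np.mean((p - y) ** 2)

def log_score(p, y, eps=1e-12):
    """Mean logarithmic score (negative log form; lower is better)."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# hypothetical forecasts for three races and their outcomes
p_blog   = [0.80, 0.55, 0.30]   # blog's forecast probabilities
p_market = [0.75, 0.60, 0.25]   # market prices read as probabilities
won      = [1, 1, 0]
print(brier(p_blog, won), brier(p_market, won))
print(log_score(p_blog, won), log_score(p_market, won))
```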

    The Collected Works of Francis Perey on Probability Theory

    Published Date : 2018-11-01

    Francis Perey, of the Engineering Physics Division of Oak Ridge National Lab, left a number of unpublished papers upon his death in 2017. They center on the idea of probabilities arising naturally from basic physical laws. One of his papers, Application of Group Theory to Data Reduction, was published as an ORNL white paper in 1982. This collection includes two earlier works and two that came later, as well as a relevant presentation. They are being published now so that the ideas in them will be available to interested parties.

    Self-Limiting Factors in Pandemics and Multi-Disease Syndemics

    Under Review Since : 2018-10-22

    The potential for an infectious disease outbreak that is much worse than those which have been observed in human history, whether engineered or natural, has been the focus of significant concern in biosecurity. Fundamental dynamics of disease spread make such outbreaks much less likely than they first appear. Here we present a slightly modified formulation of the typical SEIR model that illustrates these dynamics more clearly, and shows the unlikely cases where concern may still be warranted. This is then applied to an extreme version of proposed pandemic risk, multi-disease syndemics, to show that (absent much clearer reasons for concern) the suggested dangers are overstated.
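    For readers unfamiliar with the baseline, here is the textbook SEIR system (the paper's formulation is a slight modification of this, not reproduced here); the parameter values are purely illustrative:

```python
import numpy as np
from scipy.integrate import odeint

def seir(state, t, beta, sigma, gamma):
    """Standard SEIR right-hand side with proportions S, E, I, R."""
    S, E, I, R = state
    dS = -beta * S * I           # new exposures
    dE = beta * S * I - sigma * E  # exposed become infectious at rate sigma
    dI = sigma * E - gamma * I     # infectious recover at rate gamma
    dR = gamma * I
    return [dS, dE, dI, dR]

t = np.linspace(0, 300, 3001)
sol = odeint(seir, [0.999, 0.001, 0.0, 0.0], t, args=(0.4, 1/5.0, 1/10.0))
# The self-limiting dynamic: as S is depleted, the effective reproduction
# number beta*S/gamma falls below 1 and the outbreak burns out.
```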

    Law of total probability and Bayes' theorem in Riesz spaces

    Under Review Since : 2018-10-03

    This note generalizes the notion of conditional probability to Riesz spaces using the order-theoretic approach. With the aid of this concept, we establish the law of total probability and Bayes' theorem in Riesz spaces; we also prove an inclusion-exclusion formula in Riesz spaces. Several examples are provided to show that the law of total probability, Bayes' theorem and inclusion-exclusion formula in probability theory are special cases of our results.
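    For reference, the classical statements recovered as special cases are standard; for a partition $B_1,\dots,B_n$ of the sample space with $P(B_i) > 0$:

```latex
\[
  P(A) \;=\; \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i),
  \qquad
  P(B_j \mid A) \;=\; \frac{P(A \mid B_j)\, P(B_j)}
                           {\sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)},
\]
\[
  P\Bigl(\bigcup_{i=1}^{n} A_i\Bigr)
  \;=\; \sum_{\emptyset \neq S \subseteq \{1,\dots,n\}} (-1)^{|S|+1}\,
        P\Bigl(\bigcap_{i \in S} A_i\Bigr).
\]
```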

    What a t-test easily hides

    Under Review Since : 2018-09-18

    To justify the effort of developing a theoretical construct, a theoretician needs empirical data that support a non-random effect of sufficiently high replication-probability. To establish these effects statistically, researchers (rightly) rely on a t-test. But many pursue questionable strategies that lower the cost of data-collection. Our paper reconstructs two such strategies. Both reduce the minimum sample-size (NMIN) sufficing under conventional errors (α, β) to register a given effect-size (d) as a statistically significant non-random data signature. The first strategy increases the β-error; the second treats the control-group as a constant, thereby collapsing a two-sample t-test into its one-sample version. (A two-sample t-test for d=0.50 under α=0.05 with NMIN=176, for instance, becomes a one-sample t-test under α=0.05, β=0.20 with NMIN=27.) Not only does this decrease the replication-probability of data from (1-β)=0.95 to (1-β)=0.80; the second strategy in particular cannot corroborate hypotheses meaningfully. The ubiquity of both strategies arguably makes them partial causes of the confidence-crisis. But as resource-pooling would allow research groups to reach NMIN jointly, a group's individually limited resources justify neither strategy.
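    The two NMIN figures quoted above can be reproduced with standard power analysis; treating the alternative as one-sided is my assumption, chosen because it approximately matches the quoted numbers:

```python
from statsmodels.stats.power import TTestIndPower, TTestPower

# two-sample t-test: d = 0.50, alpha = 0.05, beta = 0.05 (power 0.95)
n1 = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                 power=0.95, alternative='larger')
print(2 * n1)   # total N across both groups: roughly 176

# one-sample t-test: d = 0.50, alpha = 0.05, beta = 0.20 (power 0.80)
n = TTestPower().solve_power(effect_size=0.5, alpha=0.05,
                             power=0.80, alternative='larger')
print(n)        # roughly 27
```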

    The RESEARCHERS.ONE mission

    Published Date : 2018-09-15

    This article describes our motivation behind the development of RESEARCHERS.ONE, our mission, and how the new platform will fulfill this mission. We also compare our approach with other recent reform initiatives such as post-publication peer review and open access publications.

    Logic of Probability and Conjecture

    Under Review Since : 2018-09-15

    I introduce a formalization of probability in intensional Martin-Löf type theory (MLTT) and homotopy type theory (HoTT) which takes the concept of ‘evidence’ as primitive in judgments about probability. In parallel to the intuitionistic conception of truth, in which ‘proof’ is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here ‘evidence’ is primitive and A is judged to be probable just in case there is evidence supporting it. To formalize this approach, we regard propositions as types in MLTT and define for any proposition A a corresponding probability type Prob(A) whose inhabitants represent pieces of evidence in favor of A. Among several practical motivations for this approach, I focus here on its potential for extending meta-mathematics to include conjecture, in addition to rigorous proof, by regarding a ‘conjecture in A’ as a judgment that ‘A is probable’ on the basis of evidence. I show that the Giry monad provides a formal semantics for this system.
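    For reference, the standard structure of the Giry monad invoked at the end of the abstract: G(X) is the space of probability measures on X, with

```latex
\[
  \eta_X(x) \;=\; \delta_x
  \qquad\text{(unit: the point mass at } x\text{)},
\]
\[
  (\mu \mathbin{>\!\!>\!\!=} f)(B) \;=\; \int_X f(x)(B)\, \mu(\mathrm{d}x),
  \qquad f : X \to G(Y),\; B \subseteq Y \text{ measurable}
  \quad\text{(Kleisli extension)}.
\]
```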

    Why Did The Crisis of 2008 Happen?

    Under Review Since : 2018-09-14

    This paper is a synthesis of the author's deposition before the Financial Crisis Inquiry Commission, convened by the Obama Administration in 2010. Note that none of its ideas made it into the commission's report.

    In peer review we (don't) trust: How peer review's filtering poses a systemic risk to science

    Under Review Since : 2018-09-14

    This article describes how the filtering role played by peer review may actually be harmful rather than helpful to the quality of the scientific literature. We argue that, instead of trying to filter out the low-quality research, as is done by traditional journals, a better strategy is to let everything through but with an acknowledgment of the uncertain quality of what is published, as is done on the RESEARCHERS.ONE platform.  We refer to this as "scholarly mithridatism."  When researchers approach what they read with doubt rather than blind trust, they are more likely to identify errors, which protects the scientific community from the dangerous effects of error propagation, making the literature stronger rather than more fragile.  

    Adaptive inference after model selection

    Published Date : 2018-09-14

    Penalized maximum likelihood methods that perform automatic variable selection are now ubiquitous in statistical research. It is well-known, however, that these estimators are nonregular and consequently have limiting distributions that can be highly sensitive to small perturbations of the underlying generative model. This is the case even for the fixed “p” framework. Hence, the usual asymptotic methods for inference, like the bootstrap and series approximations, often perform poorly in small samples and require modification. Here, we develop locally asymptotically consistent confidence intervals for regression coefficients when estimation is done using the Adaptive LASSO (Zou, 2006) in the fixed “p” framework. We construct the confidence intervals by sandwiching the nonregular functional of interest between two smooth, data-driven, upper and lower bounds and then approximating the distribution of the bounds using the bootstrap. We leverage the smoothness of the bounds to obtain consistent inference for the nonregular functional under both fixed and local alternatives. The bounds are adaptive to the amount of underlying nonregularity in the sense that they deliver asymptotically exact coverage whenever the underlying generative model is such that the Adaptive LASSO estimators are consistent and asymptotically normal, and conservative otherwise. The resultant confidence intervals possess a certain tightness property among all regular bounds. Although we focus on the Adaptive LASSO, our approach generalizes to other penalized methods. (Originally published as a technical report in 2014.)
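    The paper's bounds-and-bootstrap interval construction is not reproduced here; the sketch below shows only the Adaptive LASSO point estimator itself (Zou, 2006) via the usual column-reweighting reduction to the ordinary LASSO, with an illustrative penalty level and weight exponent:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

def adaptive_lasso(X, y, gamma=1.0, alpha=0.1):
    """Adaptive LASSO via reweighting: pilot OLS fit, column rescaling,
    ordinary LASSO on the rescaled design, then undo the rescaling."""
    beta_init = LinearRegression().fit(X, y).coef_   # root-n consistent pilot
    w = np.abs(beta_init) ** gamma                   # adaptive weights
    Xw = X * w                                       # rescale columns
    lasso = Lasso(alpha=alpha).fit(Xw, y)
    return lasso.coef_ * w                           # coefficients on original scale

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = X[:, 0] * 2.0 + rng.standard_normal(200)         # only one active variable
print(adaptive_lasso(X, y))
```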

    A Research Prediction Market Framework

    Under Review Since : 2018-09-09

    Prediction markets are currently used in three fields:

    1. Economic, political and sporting event outcomes (IEW, PredictIt, PredictWise).

    2. Risk evaluation, product development and marketing (Cultivate Labs/Consensus Point).

    3. Research replication (Replication Prediction Project, Experimental Economics Prediction Project, and Brian Nosek's latest replicability study).

    The last application of prediction markets has remained closed and/or proprietary despite the promising results in the methods. In this paper, I construct an open research prediction market framework to incentivize replication studies and align the motivations of research stakeholders.

    Patrick Matthew (1790 - 1874) and Natural Selection. Historical News about the Forgotten

    Under Review Since : 2018-10-05

    Ever since Charles Darwin admitted Patrick Matthew's priority in 1860, the latter's book On Naval Timber and Arboriculture has been thought to contain the first clear and complete anticipation of the idea of (macro-)evolution through natural selection. Most publications dealing with Matthew, however, merely repeated the little that was known about Matthew through articles by Walther May and William Calman and re-printed the ever same pieces of correspondence and excerpts from Matthew's book. Apart from this repeating of old lore, stronger claims merely jumped to the conclusion that Charles Darwin plagiarised Patrick Matthew from facts that are equivocal and have never been scrutinised within their historical context. This publication attempts to establish a proper historiography of Patrick Matthew, where there is currently mere repetition of old tales or tossing around of tall claims. The tall claims about Matthew raise questions that need to be properly addressed and put in historical perspective before tentative conclusions can be drawn.

    A reader who knows a good deal about evolutionary theory and its history but nothing about Patrick Matthew will naturally ask the following questions when confronted with tall claims: Who, what, where, how? Who was this Patrick Matthew? What exactly did he anticipate? Where did he publish this anticipation? How did his contemporaries receive it? On careful inspection of the evidence and its context, none of these questions can be answered in a simple and definitive way. This should come as no surprise after 150 years without a proper historiography of Patrick Matthew and, instead, a mixture of repeated lore and unwarranted conclusions. The following therefore asks the reader to endure the suspense of not knowing some things for sure, because of the incomplete historical record, and to extend the courtesy of excusing an unknown author enlarging on an unknown protagonist. On the other hand, it highlights how many interesting historical inquiries await talented students. Each question opens a historical panorama when addressed with an open mind rather than prejudices, orthodox or revisionist.

         Many archival sources given below are new to the historical canon of evolutionary biology. They are quoted verbatim when that is the most succinct way to support an argument. Each chapter is preceded by a captivating vignette, in present tense and different font, in order to whet the readers’ appetite. The style of the vignettes also differs from the other sections in that citations are put into footnotes, in order to keep the flow of reading uninterrupted. The citation style is scientific within the chapters, in order to ease the tracking and cross-checking for interested scholars, who want to follow up with their own inquiries. Some dramatising has been introduced in some of the vignettes, like the one that follows below. These extrapolations have been performed without affecting the historical facts, and they were written with sympathy for all the protagonists.

         The heading of each chapter is a question and the chapters are referred to as Q1 to Q9. The chapters themselves are subdivided into a summary part, the evidence that will seem excessive to some readers (they may skip it) and not enough to others (they may follow up with their own inquiries), and a conclusion. As an exception, Q3-Q5 are summarised and concluded together because they form a comparison, as a group, of the respective transmutation mechanisms of Patrick Matthew, Charles Darwin, and Alfred Wallace. [Chapters Q3-Q5 are based on an article that has previously been published in the Biological Journal of the Linnean Society 123(4).] 

    Part 1 addresses Patrick Matthew's life. In particular, Q1 refutes the widespread myth that Patrick Matthew studied medicine at the University of Edinburgh until his father died in 1807. Q2 shows that this student of medicine was a namesake from Newbigging. Part 2 addresses Patrick Matthew's book On Naval Timber and Arboriculture. In particular, Q3-Q5 compare the transmutation mechanisms of Matthew, Darwin and Wallace. This shows that the similarities between their schemes are superficial, amounting to no more than that each includes natural selection in some form and leads to species transmutation in some other way. Without the retrospective view that inflates the importance of natural selection over all other parts of the theories in question, they are as different as Cuvier's and Matthew's, say, or Lyell's and Darwin's theories. Q6 reveals what further information can be gleaned from the book about its kludgy composition. Part 3 sheds light on the reception of Matthew's book by his contemporaries, refuting both the myth of its utter non-reception and the opposite myth of its perfect reception. In particular, Q7-Q9 look at the roles of three popularisers of science who have been claimed to have communicated Matthew's ideas on natural selection and species transformation to Charles Darwin and Alfred Wallace respectively. First, Robert Chambers probably never read Matthew's book. Second, John Loudon may or may not have been the author of an anonymous review of Matthew's book in the Gardener's Magazine. The only remaining question is why Darwin missed the short passage about the origin of species in that review. As the review starts by recounting matters of naval timber, shipbuilding and other issues of no interest to Darwin, however, it is easy to see how he might have inadvertently skipped the crucial passage. Third, Prideaux Selby read Matthew's book but did not understand Matthew's idea of ecological competition, which was a prerequisite for comprehending his evolutionary ideas. Even if he had understood it, however, it is hard to see how he is supposed to have communicated the intelligence to Wallace in the Malay Archipelago.


    Modeling of multivariate spatial extremes

    Under Review Since : 2018-09-06

    Extreme values are by definition rare, and therefore a spatial analysis of extremes is attractive because it makes full use of the data by pooling information across nearby locations. In many cases, there are several dependent processes with similar spatial patterns. In this paper, we propose the first multivariate spatial models to simultaneously analyze several processes. Using a multivariate model, we are able to estimate joint exceedance probabilities for several processes, improve spatial interpolation by exploiting dependence between processes, and improve estimation of extreme quantiles by borrowing strength across processes. We propose models for separable and non-separable, and spatially continuous and discontinuous processes. The method is applied to French temperature data, where we find an increase in the extreme temperatures over time for much of the country.
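    As a univariate point of reference only (the paper's multivariate spatial model goes well beyond this), here is a sketch of fitting a generalized extreme value (GEV) distribution to annual maxima at a single site and reading off an extreme quantile; the data are synthetic:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# synthetic annual maxima at one site (parameters chosen arbitrarily)
annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=2.0, size=60,
                            random_state=rng)

c, loc, scale = genextreme.fit(annual_max)          # maximum likelihood fit
q100 = genextreme.ppf(1 - 1/100, c, loc=loc, scale=scale)
print(q100)   # estimated 100-year return level at this site
```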

    Empirical priors and posterior concentration rates for a monotone density

    Under Review Since : 2018-09-04

    In a Bayesian context, prior specification for inference on monotone densities is conceptually straightforward, but proving posterior convergence theorems is complicated by the fact that desirable prior concentration properties often are not satisfied. In this paper, I first develop a new prior designed specifically to satisfy an empirical version of the prior concentration property, and then I give sufficient conditions on the prior inputs such that the corresponding empirical Bayes posterior concentrates around the true monotone density at nearly the optimal minimax rate. Numerical illustrations also reveal the practical benefits of the proposed empirical Bayes approach compared to Dirichlet process mixtures.

    Gibbs posterior inference on value-at-risk

    Under Review Since : 2018-09-04

    Accurate estimation of value-at-risk (VaR) and assessment of associated uncertainty is crucial for both insurers and regulators, particularly in Europe. Existing approaches link data and VaR indirectly by first linking data to the parameter of a probability model, and then expressing VaR as a function of that parameter. This indirect approach exposes the insurer to model misspecification bias or estimation inefficiency, depending on whether the parameter is finite- or infinite-dimensional. In this paper, we link data and VaR directly via what we call a discrepancy function, and this leads naturally to a Gibbs posterior distribution for VaR that does not suffer from the aforementioned biases and inefficiencies. Asymptotic consistency and root-n concentration rate of the Gibbs posterior are established, and simulations highlight its superior finite-sample performance compared to other approaches.
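    A generic sketch of a grid-based Gibbs posterior for a quantile (VaR), using the check loss as the discrepancy; the flat prior, learning rate omega, and this particular loss are my illustrative choices and not necessarily the paper's:

```python
import numpy as np

def gibbs_posterior_var(y, tau=0.99, omega=1.0, m=1000):
    """Gibbs posterior density for the tau-level VaR over a grid,
    proportional to exp(-omega * summed check loss) times a flat prior."""
    grid = np.linspace(np.min(y), np.max(y), m)
    dx = grid[1] - grid[0]
    u = y[:, None] - grid[None, :]              # residuals at each candidate VaR
    loss = np.sum(u * (tau - (u < 0)), axis=0)  # summed check (quantile) loss
    logpost = -omega * loss
    post = np.exp(logpost - logpost.max())
    return grid, post / (post.sum() * dx)       # normalized density on the grid

y = np.random.default_rng(2).standard_t(df=4, size=500)  # heavy-tailed losses
grid, post = gibbs_posterior_var(y, tau=0.99)
```

    The check loss is minimized at the tau-quantile, which is what lets the data drive the posterior toward VaR directly, without an intermediate probability model.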

    Homotopy Equivalence as FOLDS Equivalence

    Under Review Since : 2018-09-02

    We prove an observation of Makkai that FOLDS equivalence coincides with homotopy equivalence in the case of semi-simplicial sets.

    The Art of The Election: A Social Media History of the 2016 Presidential Race

    Under Review Since : 2018-09-01

    The book runs to 700 pages, comprising Donald Trump's tweets from June 2015 to November 2016 together with footnotes, covering 70-80% of the tweets, that explain each tweet's context. The book has a 100-page bibliography.

    It is highly likely that Trump would not have been elected President were it not for social media; there is no precedent for such a claim. This is the first time a presidential candidate utilized a social network to get his message out directly to voters and, moreover, to shape the media feedback loop. His tweets became news. This is primary source material on the 2016 election. No need for narratives, outside "experts" or political "science".

    The file is too large to post on this website, but you can download the book at the following link:

    https://www.dropbox.com/s/bxvsh7eqh2ueq6j/Trump%20Book.docx?dl=0

    Keywords and phrases: 2016, book, Trump, election, social media.

    The Evolutionary Theory Of Value

    Under Review Since : 2018-09-01

    We propose the first economic theory of value that actually works. We explain the evolutionary causes of trade, and demonstrate how goods acquire value from the evolutionary perspective and how this value is increased by trade. This "Darwinian" value of goods exists before humans assign monetary value (or any other value estimate) to traded goods. We propose an objective value estimate expressed in energy units.

    The Fundamental Principle of Probability: Resolving the Replication Crisis with Skin in the Game

    Under Review Since : 2018-09-17

    I make the distinction between academic probabilities, which are not rooted in reality and thus have no tangible real-world meaning, and real probabilities, which attain a real-world meaning as the odds that the subject asserting the probabilities is forced to accept for a bet against the stated outcome.  With this I discuss how the replication crisis can be resolved easily by requiring that probabilities published in the scientific literature are real, instead of academic.  At present, all probabilities and derivatives that appear in published work, such as P-values, Bayes factors, confidence intervals, etc., are the result of academic probabilities, which are not useful for making meaningful assertions about the real world.
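    A toy illustration of the arithmetic this implies: asserting probability p for an event commits the assertor to accepting odds of p : (1-p) against the stated outcome (the helper below is hypothetical, not from the paper):

```python
def required_stake(p, opponent_stake=1.0):
    """Amount the assertor must risk against a unit stake when
    asserting probability p for an event (fair-odds bet)."""
    return opponent_stake * p / (1.0 - p)

# asserting p = 0.95 means risking 19 to win 1 if the event fails
print(required_stake(0.95))   # 19.0
```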

    The Logic of Typicality

    Under Review Since : 2018-08-30

    The notion of typicality appears in scientific theories, philosophical arguments, mathematical inquiry, and everyday reasoning. Typicality is invoked in statistical mechanics to explain the behavior of gases. It is also invoked in quantum mechanics to explain the appearance of quantum probabilities. Typicality plays an implicit role in non-rigorous mathematical inquiry, as when a mathematician forms a conjecture based on personal experience of what seems typical in a given situation. Less formally, the language of typicality is a staple of the common parlance: we often claim that certain things are, or are not, typical. But despite the prominence of typicality in science, philosophy, mathematics, and everyday discourse, no formal logics for typicality have been proposed. In this paper, we propose two formal systems for reasoning about typicality. One system is based on propositional logic: it can be understood as formalizing objective facts about what is and is not typical. The other system is based on the logic of intuitionistic type theory: it can be understood as formalizing subjective judgments about typicality.

    Is statistics meeting the needs of science?

    Under Review Since : 2018-08-28

    Publication of scientific research all but requires a supporting statistical analysis, anointing statisticians the de facto gatekeepers of modern scientific discovery. While the potential of statistics for providing scientific insights is undeniable, there is a crisis in the scientific community due to poor statistical practice. Unfortunately, widespread calls to action have not been effective, in part because of statisticians' tendency to make statistics appear simple. We argue that statistics can meet the needs of science only by empowering scientists to make sound judgments that account for both the nuances of the application and the inherent complexity fundamental to effective statistical practice. In particular, we emphasize a set of statistical principles that scientists can adapt to their ever-expanding scope of problems.

    Rethinking probabilistic prediction: lessons learned from the 2016 U.S. presidential election

    Under Review Since : 2018-09-08

    Whether the predictions put forth prior to the 2016 U.S. presidential election were right or wrong is a question that led to much debate. But rather than focusing on right or wrong, we analyze the 2016 predictions with respect to a core set of "effectiveness principles", and conclude that they were ineffective in conveying the uncertainty behind their assessments. Along the way, we extract key insights that will help to avoid, in future elections, the systematic errors that led to the overly precise and overconfident predictions of 2016. Specifically, we highlight shortcomings of the classical interpretations of probability and its communication in the form of predictions, and present an alternative approach with two important features. First, our recommended predictions are safer in that they come with certain guarantees on the probability of an erroneous prediction; second, our approach easily and naturally reflects the (possibly substantial) uncertainty about the model by outputting plausibilities instead of probabilities.