#scientificmethod


My Road to Bayesian Stats

By 2015, I had heard of Bayesian stats but didn't bother to go deeper into it. After all, significance stars and p-values worked fine. I started to explore Bayesian statistics when considering small sample sizes in biological experiments. How much can you say when you are comparing means of 6, or even 60, observations? This is the nature of work at the edge of knowledge. Not knowing what to expect is normal. Multiple possible routes to an observed result are normal. Not knowing how to pick among those routes is also normal. Yet our statistics fail to capture this reality and the associated uncertainties. There must be a way, I thought.

Free Curve to the Point: Accompanying Sound of Geometric Curves (1925) print in high resolution by Wassily Kandinsky. Original from The MET Museum. Digitally enhanced by rawpixel.

I started by searching for ways to overcome small sample sizes. There are minimum sample sizes recommended for t-tests; thirty is an often-quoted number, with qualifiers. Bayesian stats has no minimum sample size. That intrigued me. Surely this can't be a thing. But it is. Bayesian stats builds a mathematical model from your observations and then samples from that model to make comparisons. If you have any exposure to AI, you can think of it a bit like training an AI model. Of course, the more data you have, the better the model can be. But even with a little data we can make progress.
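Here is a minimal sketch (my illustration, not from the original post) of that idea in Python: two hypothetical groups of six observations each, a normal likelihood with the standard noninformative (Jeffreys) prior, and comparisons made by sampling each group's posterior mean rather than by running a t-test. The data values and variable names are made up.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

control = np.array([4.1, 3.8, 4.4, 3.9, 4.2, 4.0])    # hypothetical measurements
treatment = np.array([4.6, 4.9, 4.3, 5.1, 4.7, 4.8])  # hypothetical measurements

def posterior_mean_samples(x, draws=50_000):
    # Under a normal likelihood with the Jeffreys prior, the posterior of the
    # group mean is a Student-t: t(n - 1, loc = xbar, scale = s / sqrt(n)).
    n, xbar, s = len(x), x.mean(), x.std(ddof=1)
    return stats.t.rvs(df=n - 1, loc=xbar, scale=s / np.sqrt(n),
                       size=draws, random_state=rng)

mu_control = posterior_mean_samples(control)
mu_treatment = posterior_mean_samples(treatment)
diff = mu_treatment - mu_control

# Instead of a p-value we get direct probability statements about the difference.
print(f"P(treatment mean > control mean) = {np.mean(diff > 0):.3f}")
print(f"95% credible interval for the difference: "
      f"[{np.percentile(diff, 2.5):.2f}, {np.percentile(diff, 97.5):.2f}]")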

How do you say, "there is something happening and it's interesting, but we are only x% sure"? Frequentist stats has no way through. All I knew was to apply the t-test, and if there are "***" in the plot, I'm golden. That isn't accurate though. A low p-value indicates the strength of evidence against the null hypothesis. Let's take a minute to unpack that. The null hypothesis is that nothing is happening: if you have a control set and apply a treatment to the other set, the null hypothesis says there is no difference between them. A low p-value says that data like ours would be unlikely if the null hypothesis were true. But that does not imply that the alternative hypothesis is true. What's worse, there is no way for us to say that the control and the experiment have no difference; we can't accept the null hypothesis using p-values either.

Guess what? Bayesian stats can do all those things. It can measure differences, accept or reject both the null and the alternative hypothesis, and even communicate how uncertain we are (more on this later). All without making distributional assumptions about our data.
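As one concrete (and entirely illustrative) way to do that, a region of practical equivalence (ROPE) turns "accept the null" into a probability statement: if most of the posterior for the difference falls inside a band of negligible effect sizes, we can say how sure we are that nothing practically important is going on. The posterior draws and the band below are stand-ins, assuming something like the sketch above.

import numpy as np

rng = np.random.default_rng(0)
# Stand-in posterior draws for a difference in means, e.g. the `diff` array
# from the sketch above; here they are simulated so the snippet runs on its own.
diff = rng.normal(loc=0.05, scale=0.10, size=50_000)

rope = (-0.1, 0.1)  # hypothetical band of "no meaningful difference"
p_equivalent = np.mean((diff > rope[0]) & (diff < rope[1]))
p_positive = np.mean(diff > 0)

print(f"P(difference is practically zero) = {p_equivalent:.2f}")
print(f"P(difference > 0)                 = {p_positive:.2f}")
# The second number is the explicit "we are only x% sure" statement
# that a p-value cannot give us.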

It's often overlooked, but frequentist analysis also requires the data to have certain properties, like normality and equal variance. Biological processes have complex behavior, and, unless observed, assuming normality and equal variance is perilous. The danger only grows with small sample sizes. Bayes, again, requires no such assumptions about your data: whatever shape the distribution has, so-called outliers and all, it all goes into the model. Small sample sets do produce weaker fits, but this is kept transparent.

Transparency is one of the key strengths of Bayesian stats. It does require you to work a little harder on two fronts, though. First, you have to think about your data generating process (DGP), that is, how the data points you observe came to be. As we said, the process is often unknown; at best we have some guesses at how it could happen. Thankfully, we have a nice way to represent those guesses. DAGs, directed acyclic graphs, are a fancy name for a simple diagram showing what affects what. Most of the time we are trying to discover the DAG, i.e., the pathway to a biological outcome. Even if you don't do Bayesian stats, using DAGs to lay out your thoughts is a great exercise. In Bayesian stats, the DAG can be used to test whether your model fits the data we observe: if the DAG captures the data generating process, the fit is good; if it doesn't, it won't be.
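For example, a guessed data generating process can be written down as a DAG in a few lines. This sketch uses networkx, which the post doesn't mention, and the variable names are hypothetical; the point is only that the assumed "what affects what" becomes explicit and checkable.

import networkx as nx

dgp = nx.DiGraph()
dgp.add_edges_from([
    ("treatment", "gene_expression"),     # the pathway we hypothesize
    ("gene_expression", "phenotype"),
    ("batch_effect", "gene_expression"),  # a nuisance influence we suspect
    ("batch_effect", "phenotype"),
])

# A DAG must not contain feedback loops; this check makes that explicit.
assert nx.is_directed_acyclic_graph(dgp)

# Listing each variable's direct causes is the "what affects what" of the DAG.
for node in nx.topological_sort(dgp):
    parents = list(dgp.predecessors(node))
    print(f"{node} <- {', '.join(parents) if parents else 'exogenous'}")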

The other hard bit is doing the analysis and communicating the results. Bayesian stats forces you to be verbose about the assumptions in your model. This part is almost magicked away in t-tests. Frequentist stats also assumes a model that your data is supposed to follow, but it all happens so quickly that there isn't even a second to think about it. You put in your data, click t-test, and woosh! You see stars. In Bayesian stats, stating the assumptions you make in your model (using DAGs and hypotheses about DGPs) communicates to the world what you think this phenomenon is and why you think it occurs.

Discovering causality is the whole reason for doing science. Knowing the causality allows us to intervene, in the form of treatments and drugs. But if my tools don't allow me to be transparent, and worse, if they block people from correcting me, why bother?

Richard McElreath says it best:

There is no method for making causal models other than science. There is no method to science other than honest anarchy.

IRIS Insights | Nico Formanek: Are hyperparameters vibes?
April 24, 2025, 2:00 p.m. (CEST)
Our second IRIS Insights talk will take place with Nico Formanek.
🟦
This talk will discuss the role of hyperparameters in optimization methods for model selection (currently often called ML) from a philosophy of science point of view. Special consideration is given to the question of whether there can be principled ways to fix hyperparameters in a maximally agnostic setting.
🟦
This is a WebEx talk to which everyone who is interested is cordially invited. It will take place in English. Our IRIS speaker, Jun.-Prof. Dr. Maria Wirzberger, will moderate it. Following Nico Formanek's presentation, there will be an opportunity to ask questions. We look forward to active participation.
🟦
Please join this Webex talk using the following link:
lnkd.in/eJNiUQKV
🟦
#Hyperparameters #ModelSelection #Optimization #MLMethods #PhilosophyOfScience #ScientificMethod #AgnosticLearning #MachineLearning #InterdisciplinaryResearch #AIandPhilosophy #EthicsInAI #ResponsibleAI #AITheory #WebTalk #OnlineLecture #ResearchTalk #ScienceEvents #OpenInvitation #AICommunity #LinkedInScience #TechPhilosophy #AIConversations

TL;DR: I think a specific kind of "backward" reasoning, common in conservative religious traditions in the US, is one of the things driving the current crisis.

Long version:.......

First, my experience is with the LDS church, which has spent a few decades trying to be Evangelical enough to be buddies with the actual Evangelicals, and my experiences so far suggest that LDS church members have a lot in common with Evangelical Christians at this point in time, in regards to the issues in this long-ass post.

I was #LDS (i.e., #Mormon) for the first mumblenumbermumble decades of my life. I was taught--explicitly, not by the also-ubiquitous methods of "read-between-the-lines", "pay attention to consequences instead of words," etc.--that the right and proper way to #reason about all things religious was thus:

  1. Find out what is true
  2. Use all resources after that to support, justify, explain, and believe that truth

The first point is a problem, of course, because it comes before any external evidence. There is #evidence of a kind, but it is 100% subjective: the results of your #spiritual promptings or feelings or inspiration. You get these by praying really hard, thinking the right thoughts, etc. Thinking "negative" thoughts (often any kind of skepticism or doubt is included in this category) will drive the Holy Spirit away and he won't be able to tell you how true all the stuff is.

There are many people--and I truly believe they are almost all sincere and well-meaning--who will help you navigate this difficult process. That means helping you come to the right conclusions (i.e., that Jesus is God and died for our sins, that Joseph Smith was His prophet, that the LDS church is the only true #church, etc.). Coming to the "wrong" conclusions means you aren't doing it right, hard enough, humbly enough, etc., so you will keep at it, encouraged by family, friends, and leaders, until you get the "right" answer.

See, you make up your mind about the truth of things before acquiring any outside #evidence. I was a full-time missionary in Mexico for two years; I am aware that of course evidence does get used, but not the way a scientist or other evidence-informed person would use it. We used scriptures, #logic, personal stories, empirical data, etc. merely as tools to bring another soul to Christ. LDS doctrine is clear on this (where it's notably unclear on a huge range of other things): belief/#faith/testimony does not come from empirical evidence. It comes from the Holy Spirit, and only if you ask just right.

Empirical evidence, clear reasoning, etc. are nice but they're just a garnish; they're only condiments. The main meal is promptings (i.e., feelings) from the Holy Spirit. That is where true knowledge comes from. All other #knowledge is inferior and subordinate. All of it. If the Holy Spirit tells you the moon is cheese, then by golly you now have a cheesemoon. More disturbingly, if the Holy Ghost tells you to kill your neighbors, you should presumably do that. This kind of "prepare for the worst" thinking is a lot more common in conservative Christian groups than I think some people realize.

Anyway, you get these promptings. They're probably not because you're a sleep-deprived, angsty, sincere teenager who has been bathed and baked in this culture your entire life and has no concept of any outcome other than this. You get the promptings. Now you know. You know that Jesus is your Savior, that Joseph Smith was his prophet, that the LDS Church is the only true and living church on the face of the etc. etc.

You don't believe; you know.

So the next step is... nothing specific, really. You're done learning. That step is over. As we were reminded repeatedly as young missionaries: your job is to teach others, not to be taught by them. You go through the rest of your life with this knowledge, and you share it whenever you can. Of course, some events and facts and speech might make you doubt your hard-won knowledge. What to do?

You put the knowledge first and make the #facts fit it. You arrange the facts you see or read or whatever so that they fit this knowledge you acquired on your knees late at night with tears in your eyes, or in Sacrament Meeting the morning after a drama-filled youth conference. If you can't make the facts fit your knowledge, you reject the facts.

You seriously reject facts, and pretty casually. You might decide they aren't facts, or you might get really interested in the origin of an idea so you can discredit it, etc. Some people reject the theory of evolution. Others reject a history in which many of the founding fathers of the USA were #atheist, #agnostic, or Not Very Good People. You can reject anything, really. You can reject the evidence of your eyes and ears, as the Party demands. It's kind of easy, in fact.

Millions of people think like this: they explicitly reject information that does not fit the narrative they have acquired through a process that depended 100% on subjective experiences (and, afterward, is heavily dominated by "authority figures" and trusted friends who tell you what to believe this week).

As a psychologist, even though #DecisionScience is not my area of research, I can tell you various ways in which one's #subjective experience can be manipulated, especially with the support of a life-saturating religious worldview and community. Relegating facts to a supporting role (at best) means giving all kinds of biases free rein in influencing your views. Facts were one of the things that might have minimized that process. In fact, I think facts as correctives for human biases was a main motivation underlying the development of the #ScientificMethod.

This becomes how you live your life: find out what's true, then rearrange your worldview, your attitudes, your specific beliefs, your behavior, and potentially even how you evaluate evidence to fit that knowledge. You aren't faking it, you aren't pretending; you simply believe something different. You see the world differently. I'm guessing you'd pass a lie detector test.

Note that nowhere in this process is there ever what a #philosopher, a #scientist, or a mathematician would call an "honest, open inquiry." That would imply uncertainty about the outcome of the inquiry. It would imply a willingness to accept unexpected answers if the evidence or reasoning led there. That's not possible because there can be only one answer: what you already know. Evidence cannot be allowed to threaten knowledge.

Coincidentally, now you're a perfect member of the Trump/Musk/whoever personality cult. All you need are some trusted sources (e.g. friends, neighbors, celebrities, local church leaders) to tell you that #Trump is a Good #Christian, that #AOC is secretly a communist, that #Obama was born in Africa, that Killary is literally eating babies, that a pizza parlor has a torture basement, that Zelensky is a villain and Putin a hero, etc. Literally anything. You haven't just learned how to do this; it is how your brain works, now. This is how "reasoning" happens. This is how belief and worldview and personal commitment are formed and shifted.

Now you casually accept new concepts like "crisis actors", "alternative facts", the "deep state", and "feelings-based reality." You have no problem doing this. Conspiracy theories are a cakewalk; you could fully believe six impenetrable Qanon ravings before breakfast.

I've seen progressives casually assume that Evangelical-type Christians are hypocrites, or lying, or "virtue signaling" as they state their support for whatever value-violating thing Trump or Musk or any national GOP figure has said or done (e.g., "Hey, I now believe that god doesn't love disabled people, after all!"). I've accused conservatives of those things myself, though I don't actually believe that's what is happening. What we're seeing is not just hypocrisy or dishonesty. What's happening, at least with many religious people, is that a trusted leader has told them they should believe a different thing, so now they do. It's that simple. Many might even die for their new belief in the right circumstances (certain Christians are a little bit obsessed with the possibility of dying for their faith, so this isn't as high a bar as you might think).

Sure, some people who flipflop overnight probably are lying or putting on an act even they don't truly believe. However, many more are simply being who they are, or who they've become by existing in this ideological/cultural system for years.

Obviously, I believe this kind of reasoning is not good and does not make the world a better place. I would like to reduce it or even eliminate it. It is embedded, though, with other dynamics: ingroup/outgroup tribalism, authoritarianism (boy howdy do conservative churches train you to be an authoritarian), prejudices of various kinds, and basic cognitive biases (which run rampant in such environments).

It's also bound up with religious #AntiIntellectualism. In the LDS church, for instance, there's a scripture that gets tossed around at election time saying that being educated is good, but only up to a point (any education that leads a person to question God's words, etc. is by definition "too much" learning). As a person with a graduate degree, my last decade or so in the LDS church was marked by a more or less constant social tension from the possibility that I might "know too much".

Education reliably reduces this problematic kind of thinking/believing system in many people. Specifically, "liberal arts" education (which isn't about liberalism or necessarily arts) is the special sauce; the classes many students are forced to take for "general education" at most US universities are pretty good at teaching students different ways of thinking and helping them try on alternative worldviews. Many of the people who learn multiple worldviews and pick up some tools for reasoning and evidence tend to use them for the rest of their lives. Even truly exploring one or two wrong alternative worldviews or thinking patterns tends to yield big rewards over time. Notably, the GOP's attacks on higher ed have become much worse recently.

Anyway, this is (IMO) what progressives are up against in the USA. It is not just that some people believe different things; it's that many of those people have entirely different cognitive/emotional/social structures and processes for how belief happens and what it means.

Undoing this will take generations. In the meantime, I encourage pushing back on conservative flip-flops. No matter what, not even Evangelical congressmen want to look inconsistent. Even the evangelicalest of Christians will sometimes engage with facts and reasoning to some degree, and pressure simply works, sometimes. Keep your expectations for personal change low, however.

xkcd: Hyphen

#Perth's low-grade, LNP-promoting newspaper/real estate flyer, #TheWestAustralian, had the below.

Suspecting genAI.
No journalist's name, no researcher's name; it speaks of an increase but shows no older figures. And then there's the following ridiculousness:

"More than a third (34 per cent) reported a decline in their desire to work in science."

Yuckie, kinda like dried chook droppings.

#STEMEducation looks like it needs to include training in how to respond to bullying and harassment.

Right now, there is a war on #Science and #Education. We cannot allow this to continue unchallenged. We must unite, we must continue to search for the truth using the #scientificmethod, and we must not give in.

Learn about what we must do in this age of disinformation.

medium.com/@schagi/science-in-

Medium · The War on Science: Why We Must Defend Education and Truth in the Age of Misinformation · By Seth Chagi

Study 4 years for a degree...

Study 3 more for a PhD...

Join a lab, start working...

Spend years studying a problem...

Form hypothesis, gather evidence...

Test hypothesis, form conclusions...

Report findings, clear peer review...

Findings published, reported in press...

...

Some guy on the Internet: "Bullshit"

😞

Harry Potter and the Methods of Rationality is a wonderful take on HP where, instead of Harry being raised by a family full of hate and dysfunction, his adoptive parents are scientists. Harry then brings the Scientific Method to Hogwarts and analyzes how magic works. He also becomes friends with Draco! Highly recommended.

hpmor.com/
fanfiction.net/s/5782108/1/Har

hpmor.com · Harry Potter and the Methods of Rationality | Petunia married a professor, and Harry grew up reading science and science fiction.