Credentist Epistemology: the Yin and Yang of Knowledge






Skepticism:

The skeptics were right – truth is impossible to obtain. But what implications does this have for the concept of knowledge? If we cannot access truth, does that prohibit us from obtaining knowledge?


We cannot be absolutely certain that the world around us is real or that our interpretations of reality are true, because truth transcends our sensory data – there is always the possibility of a deeper layer of reality. A deeper simulation, a deeper matrix, a deeper demonic illusion, a deeper brain in a vat – we have no access to the most ultimate layer of depth.



We cannot be absolutely certain that the future will be like the past, because the truth about the future transcends our memories of the past. The only reason we think the future will be like the past is that we have a collection of memories that sketch a pattern. We see the sun rise on Monday. We see the sun rise on Tuesday. We sketch a pattern that the sun will also rise on Wednesday. Yet, we have no reason to have absolute confidence in this pattern. We are merely correlating events and projecting that correlation.



The laws of gravity, you say. We can trust that the sun will rise tomorrow because of the laws of gravity. Who says that the laws of gravity must remain consistent? Your only appeal to the consistency of gravity is a memory that gravity worked yesterday, and it also worked the day before. You have no certainty that gravity should be a reliable force every single day into the future, you merely project this pattern from your memories.



Yet, who is to say that your memories are reliable? You went to sleep last night, lost consciousness, and suddenly woke up today with all of these memories of patterns for how the world is supposed to work. Who is to say those memories are even real? How do you know that those memories were not implanted in your brain by a mad scientist or a demon? You cannot be 100% certain that your memories are even real.



But what about math! If we can’t be certain about anything else, at least we can be certain about math, right? But what even is math? Math is also just patterns from our memory. We see 2 rocks and we put them into a pile with another 2 rocks and suddenly we notice there are 4 rocks. This act of “putting together” is what we call addition. For some reason, when we put things together, they accumulate instead of annihilating. When things accumulate, they increase along the pattern of mathematical addition. But what if the laws of physics were different? What if putting things together caused annihilation as opposed to accumulation? What if putting things together summoned dragons instead? Again, math is just based on our memories of what happens in reality when things come together. These memories are not reliable, and the reality they are based on is not reliable either!



Okay, but what about Descartes’ “I think, therefore I am!” Surely at rock bottom, at least we can have complete confidence in the idea that we exist. But is it so obvious that it justifies 100% certainty? We must ask ourselves: do we really know what we are? Do we know what it means to think? To feel? Isn’t the concept of “I” so complex that it almost evades accurate definition? What does it even mean to exist? If we can’t have complete confidence in our understanding of the individual words in the statement “I think, therefore I am”, how can we have complete confidence in it when the words are combined into a sentence?



There is a deeper problem with 100% confidence – the idea of an infinite regress. Every time we claim A, we must justify it with explanation B. But then explanation B needs a justification of its own. We must justify B with explanation C. Then C is explained by D, and D by E, and E by F, etc. The problem is that there is no end to this pattern. We can never get to the end, as every new letter requires an additional letter, that is, every new explanation requires a further explanation. If we stop somewhere in the chain, we are lacking an explanation for one of the letters, and hence lack 100% certainty.


In the pursuit of understanding the world around us, the elusive nature of knowledge becomes apparent. Certainty, like a distant mirage, remains unattainable. The correspondence theory of truth, when applied to knowledge, is officially a dead end, because we cannot prove any correspondence between our knowledge and reality.


Knowledge, by its very essence, is fraught with uncertainty. The acknowledgment that "absolute certainty is unattainable" forms the foundation of my epistemological paradigm that I call "credentism" (based on credence). Every expression of knowledge is a truth claim which, at its core, is a manifestation of high confidence, but never an assertion of infallibility.



Epistemology: Definitions of Knowledge


Classical Knowledge = JTB


From the time of Plato, knowledge was thought of as justified true belief (JTB). Justification might mean you have some logical or empirical reasons to believe something. But you could believe the wrong thing, despite thinking that you had good reasons to believe it. Hence the inclusion of “true” in the definition. If you justifiably believe a false thing, you might have thought you had knowledge, but you didn’t really have knowledge.


Gettier Knowledge = JTB - Luck


After thousands of years of people assuming JTB was a sufficient definition for knowledge, Edmund Gettier upended the traditional definition by coming up with a category of epistemological paradoxes known as “Gettier problems”. The gist of Gettier problems is that a person can come to a justified true belief accidentally, and accidentally true beliefs do not instinctively feel like knowledge. It seems more like the person got lucky than that they actually knew.


Example: If you look at your watch to tell the time, you have a justification to believe the time on the watch, because watches are usually reliable sources of truth. Unfortunately, your watch is broken. But it just so happens that a broken clock is right twice a day. Your watch accidentally gives you the correct time because you fortuitously looked at it at one of the two moments in the day when its frozen hands match the actual time. You have a justification. You have truth. You have a belief. But you lack knowledge. Instead of knowledge, you have luck.


As I see it, the solution to Gettier problems is to just acknowledge that there must be a logical relationship between the justification and the truth of the matter. A working clock has a logical relationship with the truth. A broken clock has no logical relationship with the truth. If the reasons behind your beliefs don’t have a logical relationship with the truth, then they are just accidentally true beliefs as opposed to knowledge.


Gettier Knowledge = JTB - Luck = J→TB

(→ being interpreted as "logical relationship")


Side Note: I believe I have discovered that Socrates formulated the first Gettier case, proving the adage, “Everything of importance has been said before by somebody who did not discover it.” ― Alfred North Whitehead. “Everything has been said before. But since nobody listens we have to keep going back and beginning all over again.” ― André Gide.


Socrates' Gettier Case:

THEAETETUS By Plato

SOCRATES: When a person at the time of learning writes the name of Theaetetus, and thinks that he ought to write and does write Th and e; but, again, meaning to write the name of Theodorus, thinks that he ought to write and does write T and e—can we suppose that he knows the first syllables of your two names?

THEAETETUS: We have already admitted that such a one has not yet attained knowledge.


Due to the similar sounds of “Th” and “T” in Ancient Greek, it is quite possible that, when learning someone’s name, you might hear it incorrectly and think you heard “Th” instead of “T”, or vice versa. When spelling the name, a person would have to assume the reliability of their ears when deciding whether to write “Th” or “T”. But their justified confidence in the spelling is really more like a guess. When they guess correctly, they accidentally get it right without really knowing, producing a justified true belief based on luck rather than on a logical connection between their justification and the correctness of their spelling.



The Nature of Truth:

I have never liked the inclusion of “truth” within the definition of knowledge. Truth is that which conforms to reality. But reality is elusive. How can we know what reality is? We view reality through our senses. What if our senses are lying to us? I am skeptical that “truth” can be demonstrated, so it is a useless part of the definition. “Truth,” then, is really “supposed truth,” which is basically just another belief. So JTB reduces to JBB, or justified belief belief.

What is the relationship between the belief and the “supposed truth” belief? It seems to ask for the belief to conform to something outside of belief. If a belief can’t practically conform to reality (since reality can’t be verified), perhaps it can conform to an external understanding of reality, the episteme. This means that other people have to agree that the belief is justified. So a desire for truth becomes a desire for external confirmation, which is a desire for alignment with the episteme – in other words, social justification. Justification then takes on an internal component and an external component: the belief must be internally justified and externally justified.

But often that external justification only comes later, and social approval is useless if the episteme itself is wrong. So why include it in the definition of knowledge in the first place? Subtracting it reduces JTB from JBB down to JB.


Credentist Knowledge = JB


The Subjective Nature of Knowledge:

Knowledge is an aspect of human nature – an outgrowth of our subjectivity. Our brains are pattern-recognition machines. We are constantly deducing patterns. When we find strong patterns, our brains encode those patterns internally with a high degree of confidence.


When the person looks at their watch for the time, they have a high degree of confidence about the time, and hence think they have knowledge. When the person realizes that their watch is broken, they find a problem with the pattern, and lower their confidence in the watch. They now lack knowledge about the time.


In this sense, knowledge is something that fluctuates based on personal feelings. Gettier cases are not only solved by recognizing that justification includes a logical relationship between the reasons for belief and the supposed truth of the matter, but also by recognizing that, in both moments, the individual had the subjective experience of knowledge; the later knowledge simply outcompetes the earlier knowledge by having superior justification.


This is the pattern of knowledge over time, from episteme to episteme, from paradigm to paradigm. Newton had knowledge of gravity. But his knowledge was superseded by Einstein’s superior knowledge.


This means that knowledge can be reduced to justified belief (JB) and justified belief is interpreted as confidence.


Credentist Knowledge = JB = C



The Evolutionary Function of Knowledge:

I think that words function as tools for social utility, which ultimately boils down to evolutionary utility. 'Knowledge' as a word, hence, functions for evolutionary purposes, not metaphysical ones. Knowledge seems to be a degree of confidence that justifies further actions. The higher the risks associated with an action, the higher the confidence needed to justify those actions. So, the degree is relative to the situation.


When a friend asks you "Do you know his name?", you say "Yes, it’s Bob" because he told you his name before. But technically he could have been lying - he could have been operating on a fake name. But since the situation seems trivial, we just assume that our degree of confidence justifies further benign actions like referring to him as 'Bob'.



But if we were to file a government report on Bob, we might want further verification, like a driver’s license: because the stakes are higher, we want a higher degree of confidence to justify further actions. If a government official asks, "Do you know his name is Bob?", you cannot respond with a "yes" unless you have checked his driver's license, whereas before you COULD respond with "yes" even without the driver’s license.
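To make this stakes-relative threshold concrete, here is a minimal sketch in Python. The names and the numeric cutoffs are purely illustrative assumptions, not part of the theory itself:

```python
# A toy credentist model: "knowing" means your confidence clears a bar,
# and the bar rises with the stakes. All numbers here are illustrative.

THRESHOLDS = {
    "casual": 0.90,     # chatting with a friend
    "official": 0.99,   # filing a government report
}

def knows(confidence: float, stakes: str) -> bool:
    """Return True if this degree of confidence counts as knowledge at these stakes."""
    return confidence >= THRESHOLDS[stakes]

# He told you his name is Bob: high confidence, but unverified testimony.
confidence_from_testimony = 0.95

print(knows(confidence_from_testimony, "casual"))    # True  -> "Yes, it's Bob"
print(knows(confidence_from_testimony, "official"))  # False -> check the driver's license first
```

The point of the sketch is only that the same credence can count as knowledge in one context and fall short in another; where exactly the bars sit is itself a pragmatic, evolutionary question.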


Epistemological Paradigms:

Level 1 Epistemology – Dogmatism:

Knowledge = 100% confidence; I have knowledge.

Level 2 Epistemology – Skepticism:

Knowledge = 100% confidence; I have no knowledge.

Level 3 Epistemology – Credentism (relying on credence):

It is not useful to define knowledge in such a way that it ceases to exist. Hence, knowledge = a sufficiently high degree of confidence. I have knowledge.


Credentism:

When we begin to focus on degrees of credence, we have embraced credentism. Now we are looking at all of the patterns and aggregating them probabilistically into degrees of confidence. The more data, the better. With credentism, we begin to apply a coherentist paradigm of truth to knowledge, as that which coheres provides greater confidence.


Imagine that there is a 99% probability that you exist, a 99% probability that your senses are trustworthy, and a 99% probability that your memories are trustworthy.

In order for your reality to be a lie, all of these things must be false. For all three to be false at once, their individual probabilities must be multiplied: 1% multiplied by 1% multiplied by 1% equals 0.0001%. What this means is that the reason for doubt gets exponentially smaller with each factor that must be false. The inverse of this means that as we aggregate facts about the world, we build a foundation of knowledge. In order for some science to be wrong, our understanding of many other things must be wrong as well. The probability that all of our underlying assumptions are wrong becomes exponentially low the more they are aggregated.
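As a quick check on the arithmetic, here is a short Python sketch using the illustrative 99% figures above (and assuming, for simplicity, that the three doubts are independent):

```python
# Probability that each individual assumption is wrong (1% each, illustrative).
p_wrong = {
    "you do not exist": 0.01,
    "your senses are untrustworthy": 0.01,
    "your memories are untrustworthy": 0.01,
}

# For reality to be a total lie, every assumption must fail at once.
# Treating them as independent, the joint probability is the product.
p_all_wrong = 1.0
for p in p_wrong.values():
    p_all_wrong *= p

print(p_all_wrong)        # 1e-06, i.e. 0.0001%
print(1 - p_all_wrong)    # 0.999999 -- confidence that reality is not a total lie
```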



Simply put, we can aggregate data like a Bayesian. We initially don’t know the sun will rise. Then we observe it and collect probability data. Then we observe it rise again. And again. We keep updating our probability data and we start to gain an exponentially strong confidence that the sun will also rise again tomorrow!

Then we build a theory of gravity upon this foundation. Gravity then makes further predictions that support our Bayesian analysis of the sun. If our gravitational predictions are validated, then we have even stronger confidence that the sun will rise again tomorrow. We must now debunk gravity before we can debunk the rising of the sun, adding a new layer of strengthening probability.
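One simple way to formalize this kind of updating, a sketch rather than the only option, is a Beta-Bernoulli model in which each observed sunrise nudges the estimated probability of the next sunrise upward (Laplace's old "rule of succession"):

```python
# Beta-Bernoulli updating for "will the sun rise tomorrow?"
# Start from a uniform Beta(1, 1) prior: before any observation, we genuinely don't know.
alpha, beta = 1.0, 1.0

for day in range(1, 11):                  # observe ten sunrises in a row
    alpha += 1                            # each sunrise counts as a success
    p_tomorrow = alpha / (alpha + beta)   # posterior mean (rule of succession)
    print(f"after {day} sunrises, P(sunrise tomorrow) = {p_tomorrow:.3f}")

# Confidence climbs toward 1.0 but never reaches it: knowledge without certainty.
```

A failed prediction (a morning with no sunrise) would simply increment the other parameter instead, pulling the confidence back down, which is exactly the fluctuation of knowledge described above.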


Credentism as Epistemic Synthesism:

If knowledge is based on confidence, then the question is: where does confidence arise? Under coherentism, the more data we aggregate, the more confidence we have. Methodologically, we can think of confidence as the sum of all the justifications. Philosophically, we might want to think about what types of justification are available and then try to leverage them together synergistically.


The Yin: Knowledge as Sufficiently High Probability:

Within this credentist paradigm, knowledge emerges as an affirmative force — a sufficiently high degree of confidence that confirms something. It can manifest as the aggregation of a sufficiently large amount of data to justify the probability of something, or as a web of cohering logical claims that exponentially support each other, probabilistically. Evolution has designed us with brains that are natural probability calculators that allow us to instinctively build degrees of confidence based on subconscious calculations. These feelings of confidence we can call “knowledge”. It is the probability that exponentially approaches 100%.


The Yang: Absurdity as Sufficiently Low Probability:

Conversely, absurdity takes center stage as the disconfirming counterpart to knowledge. Evolution has given us an instinct for feeling the emotion of absurdity. This is a signal that our subconscious probability calculators have determined that something is so unlikely that it would be laughable to base our decisions upon its possibility. This feeling of absurdity is the inverse of the feeling of knowledge. It is the instinctual knowledge that something cannot be the case – a disconfirmation. It is having a sufficiently high degree of confidence that something is not true. It is the probability that exponentially approaches 0%.


The Yin and Yang Relationship:

The beauty lies in the interconnected dance of knowledge and absurdity—a yin and yang relationship that enriches our understanding of truth. They are not disparate entities but two sides of the same coin, each influencing and shaping the other.



Small Circles Within the Yin and Yang:

Within the yin and yang symbol, small circles emerge as poignant symbols. These circles represent doubt and possibility - the gray areas between knowledge and absurdity. The existence of these small circles shows an element of open-mindedness that transcends a more black-and-white, closed mindset.


The circle within knowledge is the degree of doubt - the probability our assumptions are wrong. Numerically, 100% equals the sum of knowledge and doubt. The larger the doubt, the less knowledge. As we reduce our doubts towards zero, our knowledge increases towards 100% confidence.


Conversely, the circle within absurdity is the degree of possibility: the probability that, despite our confidence that something is false, it may be true after all. Absurdity approaches 100% confidence in the negative direction, confidence that something is not the case. But there is always the small possibility that we are wrong. Numerically, 100% equals the sum of absurdity and possibility. The larger the possibility, the less absurdity. As we reduce possibility towards zero, the level of absurdity increases towards 100%: 100% confidence that something has a 0% chance of occurring.
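Numerically, the small circles are just the complements of the regions that contain them. Here is a tiny sketch of how a single credence splits into these four quantities; the 99% cutoff is an arbitrary placeholder, not a claim about where knowledge "really" begins:

```python
def yin_yang(p: float, threshold: float = 0.99) -> str:
    """Label a credence p (probability that the claim is true) using illustrative cutoffs."""
    doubt = 1 - p         # the small circle inside knowledge
    possibility = p       # the small circle inside absurdity
    if p >= threshold:
        return f"knowledge (doubt = {doubt:.2%})"
    if p <= 1 - threshold:
        return f"absurdity (possibility = {possibility:.2%})"
    return "gray area: neither knowledge nor absurdity"

print(yin_yang(0.999))    # knowledge (doubt = 0.10%)
print(yin_yang(0.0001))   # absurdity (possibility = 0.01%)
print(yin_yang(0.60))     # gray area: neither knowledge nor absurdity
```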



Modelling Knowledge

Depending on your epistemological paradigm, you will have different definitions of knowledge. Agnosticism, meaning a lack of knowledge, will therefore vary in interpretation along with the definition of knowledge.


Some people have very low standards for what is considered knowledge. Any type of personal testimony (even if second-hand) is as good as gospel. For them, absolutely anything that remotely shifts probabilities in the direction of a conclusion is sufficient evidence for knowledge. Had a weird dream? Yep, you have knowledge that demons are real. Friend saw a ghost? Knowledge of the supernatural. From this paradigm, everything is possible, putting the default probability of everything at 50%. Only when things are truly 50/50 can this type of person be agnostic. Once the slightest bit of evidence comes along to push the probability towards 51%, BAM! They now have knowledge.


I am calling this paradigm "epistemic agnosticism" because their lack of knowledge is due to having absolutely zero epistemic resources. Because their knowledge hinges on an epistemic binary (any evidence vs. no evidence), their knowledge strictly follows this epistemic binary. This seems to be in line with a type of dogmatic epistemology that jumps to the conclusion that it has knowledge in an unwarranted way.



After moving past a dogmatic mindset, people often evolve into a skeptical mindset. If one is wrong to jump quickly to the conclusion of knowledge, then perhaps one should do the opposite and never jump to the conclusion of knowledge! This brings us to "solipsistic agnosticism" - the type of agnosticism that is so rigorous in its standards for knowledge that we begin to lose our grasp on common-sense knowledge, like "there are other minds" and "the external world is real". Technically, we can never be 100% confident that there are other minds, so with such an impossible standard of knowledge we are forced to embrace solipsism. By placing the standard so high, we force ourselves to be agnostic about everything and define knowledge out of existence.



Within the scientific paradigm, we begin to find a happier medium between the two extremes of epistemic agnosticism and solipsistic agnosticism. Now we are being mathematically rigorous about the probability that the phenomena we observe are due to genuine causation as opposed to chance. Scientific agnosticism becomes the type of agnosticism wherein you lack knowledge because the studies don't yet show a strong enough link to justify concluding a causal relationship as opposed to a merely correlational one. The more studies you do, the more you can raise your confidence until it lands in a place comfortable enough to call knowledge. Science is a credentist epistemology because it embraces the "sufficiently high confidence" definition of knowledge that is based on probabilities and the accumulation of evidence.



Science is a very rigorous epistemic paradigm that is not appropriate for all types of knowledge. You know your friend's birthday. Did you need to do a study to verify that he was born in January? Or were the stakes low enough, and the probability of trustworthiness high enough, to justify knowledge without scientific experimentation? Yes, you may be wrong. It is possible that he was adopted and his family lied about his birthday because the true birthday was unknown. But you still have a justified belief. For credentism, that is enough. Enter "philosophic agnosticism". Philosophic agnosticism is when you lack that justified belief - when you understand your beliefs are not warranted. Maybe you are aware that your friend isn't trustworthy. Maybe you are aware that there are profit incentives to warp the truth. Maybe you are aware that there is illicit activity that needs to be covered up. These factors make beliefs unjustified, forcing an agnostic conclusion. Philosophic agnosticism is also based on a credentist epistemology, but it is flexible enough to be used informally, without rigorous mathematical calculation.




In each of these paradigms of knowledge, we can see the abstracted structure of the yin and yang of knowledge and absurdity within its probabilistic structure. The gnostic red/orange portions represent knowledge and absurdity. The yellow, green, and blue portions represent the small circles within the yin and yang: doubt and possibility.


Conclusion:

In conclusion, the yin and yang relationship between knowledge and absurdity invites us to navigate the intricate tapestry of human understanding with humility. As we embrace the uncertainties inherent in knowledge, we recognize that our perceptions of truth are shaped by the delicate balance between affirmation and disconfirmation, or between high and low probabilities. This paradigm challenges us to appreciate the interconnected dance of knowledge and absurdity, fostering a deeper and more nuanced appreciation for the dynamic nature of our beliefs. In this dance, the small circles within the yin yang symbol beckon us to explore the subtleties that define our intellectual journey, while drawing the boundaries where justified confidence may be asserted.

