Based on Stefan Mazurkiewicz's paper (1974) and his definition of the stochastic, or probabilistic, distance between two deductive systems:
d(a, b) = d(b, a) = p(a.b' v a'.b) = p(a.b') + p(a'.b),
Popper introduces the definition of distance from the truth (T is the set of true statements of L, t is the strongest true theory):
dT(a) = d(a, t) = p(a.t') + p(a'.t)
He also introduces the nearness function (truthlikeness, verisimilitude):
nT(a) = 1 - dT(a) = dT(a')
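These definitions can be checked on a toy model. The following sketch is my own illustration, not Popper's formalism: statements are modeled as sets of equiprobable "possible worlds", p is the uniform measure, negation is set complement, and conjunction is intersection; the universe W, the theory a, and the true theory t are all hypothetical choices made for the example.

```python
from fractions import Fraction

# Toy model (assumption): eight equiprobable possible worlds.
W = frozenset(range(8))

def p(s):
    """Uniform probability of the set of worlds s."""
    return Fraction(len(s), len(W))

def neg(a):
    """Negation of a statement = complement of its set of worlds."""
    return W - a

def d(a, b):
    """Mazurkiewicz distance: p(a.b') + p(a'.b)."""
    return p(a & neg(b)) + p(neg(a) & b)

t = frozenset({0, 1, 2, 3})   # worlds where the strongest true theory holds
a = frozenset({0, 1, 4})      # some theory a (hypothetical)

dT = d(a, t)                  # distance of a from the truth
nT = 1 - dT                   # nearness (verisimilitude)

# The identity nT(a) = 1 - dT(a) = dT(a') from the text:
assert nT == d(neg(a), t)
print(dT, nT)                 # -> 3/8 5/8
```

The identity nT(a) = dT(a') holds because the four regions a.t, a.t', a'.t, a'.t' partition the probability space, so 1 - [p(a.t') + p(a'.t)] = p(a.t) + p(a'.t') = d(a', t).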
t' is the negation of the strongest true statement; it is the weakest of all false theories (an irreducible theory). If we identify t with the set of true statements T, its complement is the set of false statements F. The nearness to the truth (verisimilitude) of a is represented by the shaded parts of Fig 3. Read from right to left, these two shaded areas represent the truth content of the theory a, CtT(a), together with what remains of the set F of false statements once the falsity content CtF(a) of the theory a is removed; thus:
Vs(a) = CtT(a) + F - CtF(a)
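On one natural reading of this formula (my reading, for illustration), the contents are probability measures: CtT(a) = p(a.t), CtF(a) = p(a.t'), and F = p(t'). On that reading Vs(a) coincides with the nearness nT(a) = p(a.t) + p(a'.t'), which the sketch below verifies over the same kind of toy universe of equiprobable worlds (W, a, and t are hypothetical choices).

```python
from fractions import Fraction

# Toy model (assumption): eight equiprobable possible worlds,
# with contents read as probability measures for illustration.
W = frozenset(range(8))
p = lambda s: Fraction(len(s), len(W))

t = frozenset({0, 1, 2, 3})   # the strongest true theory (assumption)
a = frozenset({0, 1, 4})      # a theory a (hypothetical)

CtT = p(a & t)                # truth content of a:   p(a.t)
CtF = p(a & (W - t))          # falsity content of a: p(a.t')
F   = p(W - t)                # measure of the set of false statements

Vs = CtT + F - CtF            # Vs(a) = CtT(a) + F - CtF(a)

# Agreement with the nearness nT(a) = p(a.t) + p(a'.t'):
nT = p(a & t) + p((W - a) & (W - t))
assert Vs == nT
print(Vs)                     # -> 5/8
```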
Popper thinks that T is too big, and that we should admit into our universe of statements only those we conjecture to be relevant; by confining ourselves to those conjectures we solve the problem of how to avoid strengthening a theory by inserting just any stray irrelevant conjunct.
One problem is that of comparing the verisimilitude of false theories. Popper was not able to show that we can approach truth through better and better approximations, that is, through false theories which come nearer and nearer to the truth.
David Miller argued that this approach implies that we can always define new constants such that, with respect to these, the accuracy of the theories is reversed. He seems to think that in taking a problem P1 and the parameters belonging to it as fundamental, Popper somehow has to reject another problem, say P3, with its set of parameters. Popper rejects neither P3 nor any other problem that may arise out of the critical discussion of the theory; instead he recalls an old schema (where P stands for problem, TT for tentative theory, EE for error elimination):
P1 --> TT1 --> EE1 --> P2 --> TT2 --> EE2 --> P3 etc.
If P3 is such a problem, then TT3 will have to be better not only with respect to P3 but also with respect to all preceding problems: this is demanded by the principles ensuring the rationality of the growth of knowledge. This shows that the way of approaching truth will partly depend upon the succession of problems, that is to say, upon the history of thought. But this is no more backward looking than forward looking: two historically isolated and different chains of problems, with their solutions, may become comparable with respect to verisimilitude only after the two chains have merged; that is, after we have found theories that solve the problems of both chains better than all their predecessors.
A statement like "the theory a is nearer to the truth than the competing theory b" is never demonstrable, but it may be asserted as a conjecture, strongly arguable for or against on the basis of (1) a comparison of the logical strength of the two theories and (2) a comparison of the state of their critical discussion, including the severity of the tests which they have passed or failed (a comparison of their degrees of corroboration). On this basis, Popper can give good support to the conjecture that Einstein's theory of gravitation is nearer to the truth than Newton's.