## Neither meaning nor truth (nor practical value) in formal mathematics

At my IIT (BHU) lecture (see also previous post), I emphasized Bertrand Russell’s remark that there is neither meaning nor truth in (formal) mathematics. Hence, any nonsense proposition one desires (such as “All rabbits have two horns”) can be proved as a formal mathematical theorem from appropriate postulates: Russell’s sole criterion being that the postulates should be “amusing”.

To drive the point home, I pointed out how, long ago, when I still believed in formal math, I used to teach a course (A) on Real Analysis while also teaching a more advanced course (B) on Advanced Functional Analysis, in the math department of Pune University. In the elementary course (A) I taught

**Theorem:** A differentiable function must be continuous. (Therefore, a discontinuous function cannot be differentiated.)

In the more advanced course (B) I taught

**Theorem:** Any (Lebesgue) integrable function can be differentiated infinitely often. (Therefore, a function with simple discontinuities can be differentiated infinitely often.)
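The apparent clash between the two theorems can be checked on a computer. In the course (A) sense the Heaviside step function H(x) has no derivative at 0; in the course (B) (Schwartz distribution) sense its derivative is the Dirac δ, since by definition ⟨H′, φ⟩ = −⟨H, φ′⟩ = −∫₀^∞ φ′(x) dx = φ(0) for any smooth rapidly decaying test function φ. A minimal numerical sketch (my illustration, not from the lecture), using the test function φ(x) = exp(−x²):

```python
import math

# Distributional derivative of the Heaviside step H(x):
# <H', phi> = -<H, phi'> = -integral_0^inf phi'(x) dx = phi(0),
# which is exactly the action of the Dirac delta on phi.
# We verify this numerically for phi(x) = exp(-x^2).

def phi(x):
    return math.exp(-x * x)

def phi_prime(x):
    return -2.0 * x * math.exp(-x * x)

# -integral_0^inf phi'(x) dx, by the trapezoid rule on [0, 10]
n, a, b = 100_000, 0.0, 10.0
h = (b - a) / n
integral = sum(phi_prime(a + i * h) for i in range(1, n))
integral += 0.5 * (phi_prime(a) + phi_prime(b))
integral *= h

action = -integral           # <H', phi>
print(action, phi(0.0))      # both are approximately 1.0: H' acts as the Dirac delta
assert abs(action - phi(0.0)) < 1e-6
```

The classical derivative simply does not exist at the jump, while the distributional derivative is computed without any difficulty; the two "theorems" answer different questions.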

I have made exactly this point earlier in this blog.

“Now, for several years I taught real analysis to students and mathematically proved in class that a discontinuous function cannot be differentiated. I also taught advanced functional analysis (and topological vector spaces and the Schwartz theory according to which every Lebesgue integrable function can be differentiated). In the advanced class, I mathematically proved the exact opposite, that a function with a simple discontinuity can be differentiated infinitely often (and the first derivative is the Dirac δ).”

The question is which definition of the derivative one should use for the differential equations of physics. As pointed out in *Cultural Foundations of Mathematics* (or see this paper), the issue can only be decided empirically, unless the aim, like that of Stephen Hawking and G. F. R. Ellis, is to spread Christian superstitions about creation using bad mathematics.

Superstitions go naturally with ignorance. One such ignorant professor from the IIT mathematics department was present during my lecture. His knowledge was limited to the first of the theorems above, and he ignorantly believed it was some kind of absolute truth, which everyone was obliged to believe. He objected to my claim that a discontinuous function can, of course, be differentiated, and walked out to show his contempt for my claim.

Even the students had heard of the Dirac δ, and agreed with me. The next day during the workshop, I explained that I had engaged with this question since my PhD thesis. But the professor remained absent, though his ignorance was exposed before the students. He is welcome to respond by email; I will post it publicly since it is sure to further expose his ignorance.

Oliver Heaviside first applied this to problems of electrical engineering over a century ago, and Dirac, formerly an electrical engineer, then applied the Dirac δ to physics. It remains very useful because it is the Fourier transform of white noise (flat spectrum, or the unit function), and is used even in the formal mathematical theory of Brownian motion.
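The "flat spectrum" remark is easy to check numerically: the discrete analogue of the Dirac δ is a unit impulse, and its discrete Fourier transform is identically 1. A small sketch with NumPy (my own illustration):

```python
import numpy as np

# A unit impulse: the discrete analogue of the Dirac delta.
n = 8
impulse = np.zeros(n)
impulse[0] = 1.0

# Its discrete Fourier transform is identically 1:
# a flat ("white") spectrum, since X[k] = sum_m x[m] e^{-2*pi*i*k*m/n} = 1.
spectrum = np.fft.fft(impulse)
print(spectrum.real)  # [1. 1. 1. 1. 1. 1. 1. 1.]
assert np.allclose(spectrum, np.ones(n))
```

Conversely, a signal with a perfectly flat spectrum transforms back to an impulse, which is why the δ appears naturally wherever white noise does.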

Earlier in the lecture, the same professor contested my claim that probability was invented in ancient India, and taken from India in the 16th c.; credit for it was later falsely given to people like Pascal and Poisson. I pointed to the aksa sukta in the Rgveda, and to the use of “the theory of dice” for sampling (to count the number of fruits on a tree) in the Mahabharata story of Nala and Damayanti, as explained in the above paper, a copy of which is also posted online. As every entrant to the IITs via the JEE is expected to know, probabilities in games of chance (such as dice, or cards) are calculated using the theory of permutations and combinations, and the above paper also pointed out (and I repeated during the lecture) how the relevant theory of permutations and combinations developed in India.
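The point the JEE syllabus takes for granted, that probabilities in games of chance reduce to counting permutations and combinations, fits in a few lines. A minimal sketch with two standard examples of my own choosing (not taken from the paper):

```python
from itertools import product
from fractions import Fraction
from math import comb

# Probability of throwing a total of 7 with two fair dice,
# by enumerating all ordered outcomes (permutations with repetition).
outcomes = list(product(range(1, 7), repeat=2))          # 36 ordered pairs
favourable = sum(1 for a, b in outcomes if a + b == 7)   # 6 of them
p_seven = Fraction(favourable, len(outcomes))
print(p_seven)  # 1/6

# Probability of exactly 2 aces in a 5-card hand from a 52-card deck,
# using combinations: C(4,2) * C(48,3) / C(52,5).
p_two_aces = Fraction(comb(4, 2) * comb(48, 3), comb(52, 5))
print(p_two_aces)
```

Both answers come from pure counting; no metaphysics of infinity, and no axioms beyond arithmetic, is involved.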

The ignorant professor completely missed my point that our own mathematics was returned to us along with some foolish added metaphysics (of nil value) and a false Western history, that all significant mathematics was done by “Christians and friends” (church) or white men (racist) or the West (colonialists).

He implied that probability was NOT about its practical value for sampling theory or games of chance, but only about slavishly obeying Western axioms related to probability. In fact, in the above paper, I also pointed out that the Kolmogorov axioms for probability as measure fail for quantum probability (of course, the related theory was unknown to our ignorant professor, who did not even know about the Dirac δ).
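The failure of measure-theoretic additivity for quantum probability can be illustrated with a toy two-slit computation (my illustration, with made-up amplitudes, not an example from the cited paper): quantum probabilities are squared amplitudes, and for interfering alternatives the probability of "slit 1 or slit 2" need not be the sum of the individual probabilities, as Kolmogorov-style additivity over disjoint events would demand.

```python
import cmath

# Toy two-slit setup: complex amplitudes for the two paths.
a1 = cmath.exp(1j * 0.0) / 2        # amplitude through slit 1
a2 = cmath.exp(1j * cmath.pi) / 2   # amplitude through slit 2 (opposite phase)

# Kolmogorov-style additivity would give P = |a1|^2 + |a2|^2 ...
p_additive = abs(a1) ** 2 + abs(a2) ** 2   # 0.5

# ... but quantum mechanics gives P = |a1 + a2|^2, and the two
# amplitudes here interfere destructively.
p_quantum = abs(a1 + a2) ** 2              # ~0.0

print(p_additive, p_quantum)
assert abs(p_additive - 0.5) < 1e-12
assert p_quantum < 1e-12
```

With other relative phases the "quantum" value can also exceed the additive one; either way, probability-as-measure does not describe it.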

My point all along has been that (normal) mathematics is about practical value from calculations (e.g. of discrete probability, sampling theory etc.), and not about any silly metaphysics of infinity proposed by formal mathematicians to prove some theorems of little value.

To drive home the uselessness of theorem-proving, I pointed out long ago, in my Hawai’i paper, that formal mathematics cannot even use probability to deliver practical value in the financial market. This was reiterated in my very recent response to some equally ignorant comments in Undark magazine as follows.

“(b) Regarding the Black-Scholes model, it is obviously defective in assuming the normal distribution. Stock-market distributions are fat tailed (e.g. Levy distribution). As I pointed out long ago, with regard to this very example, in “Computers, mathematics education, and the alternative epistemology of the calculus in the Yuktibhasa” (*Philosophy East and West* 51:3 (2001) pp. 325–362), formalists are restricted to the normal distribution and Wiener process, since there is no formal existence and uniqueness proof of solutions of stochastic differential equations driven by Levy motion. (That is still true.) But the solutions can be easily computed within normal math, e.g. using my computer program as described in my article on “Supercomputing in finance” (*Pranjana*, 3 (1&2) 2000, 11–36). As regards the complete failure of statistical inference on the belief that probability is measure, and the prime superstition of formal mathematical proof that logic is two-valued, see e.g. the section on quantum probabilities and quantum logic of my article on “Probability in Ancient India”, in *Handbook of Philosophy of Science*, vol. 7, *Philosophy of Statistics*, Elsevier 2011, pp. 1175–1195. https://www.sciencedirect.com/science/article/pii/B978044451862050037X.”
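The claim that such solutions "can be easily computed" regardless of formal existence proofs can be sketched in a few lines (my own illustration, not the program from the *Pranjana* article): an Euler-Maruyama recursion runs just as readily with fat-tailed increments as with Gaussian ones. Here Student-t increments stand in for Levy-type noise; sampling true stable increments would need e.g. the Chambers-Mallows-Stuck method.

```python
import numpy as np

# Euler-Maruyama for a geometric price model dS = mu*S*dt + sigma*S*dX,
# driven either by Gaussian increments (the Wiener case) or by
# fat-tailed Student-t increments (a simple stand-in for Levy noise).
# Nothing in the computation requires an existence/uniqueness theorem:
# the path is simply computed step by step.
rng = np.random.default_rng(0)
mu, sigma, dt, n = 0.05, 0.2, 1 / 252, 252   # one "trading year"

def simulate(increments):
    s = np.empty(n + 1)
    s[0] = 100.0
    for i in range(n):
        s[i + 1] = s[i] + mu * s[i] * dt + sigma * s[i] * increments[i]
    return s

gauss = simulate(rng.normal(0.0, np.sqrt(dt), n))        # Wiener-driven path
fat = simulate(rng.standard_t(3, n) * np.sqrt(dt))       # fat-tailed path

print(gauss[-1], fat[-1])  # two simulated year-end prices
assert gauss.shape == fat.shape == (n + 1,)
```

The recursion is identical in both cases; only the sampled increments differ, which is the sense in which the computation is indifferent to the formalist's existence proof.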

But, of course, an ignorant professor of mathematics who has not even heard of the Dirac δ cannot possibly have any understanding of even the Wiener process, which involves the Dirac δ, let alone follow the above quote.

What a pity that our elite institutions have such ignorant professors of mathematics! They are just enslaved colonial minds, relaying the demand that the Western master be blindly imitated. Curiously, the ignorant professor’s name is Das!