Calculus was developed by many workers, and their incremental progress was independently systematized by Newton and Leibnitz in the late 1600s. At that time the concept of a limit had not yet been devised, and even the concept of a function was still in development, with no precise definition yet available. The term “function” appears to have been introduced by Leibnitz in 1673. Thus, in its early days calculus was developed by discussing “variable quantities,” which we would nowadays call variables; the currently accepted definitions of a function and of a limit were not formulated until the late 1800s. Place yourself in the shoes of Newton and Leibnitz in the late 1600s, then, striving to make sense of their newly created systems without adequately precise definitions to work with. They were like searchers groping in a dark cave, possessing some very unusual night-vision goggles and yet not quite able to see clearly. In this light, their progress appears all the more remarkable.
To make sense of calculus, Newton and Leibnitz thought in terms of infinitesimals. (It was d’Alembert, in the 1700s, who first thought of a derivative in terms of limits.) They conceived of an infinitesimal as a number that is smaller in magnitude than any positive real number, yet not zero. It should be emphasized that there is no such real number! (A nonzero real number cannot be smaller in magnitude than every positive real number, since it is not smaller in magnitude than itself.) This point was made forcefully by George Berkeley in his 1734 book The Analyst, which was subtitled:
A Discourse Addressed to an Infidel Mathematician: Wherein It Is Examined Whether the Object, Principles, and Inferences of the Modern Analysis Are More Distinctly Conceived, or More Evidently Deduced, Than Religious Mysteries and Points of Faith. “First Cast the Beam Out of Thine Own Eye; and Then Shalt Thou See Clearly to Cast Out the Mote Out of Thy Brother’s Eye.”
To 21st-century readers, accustomed to subtitles of fewer than ten words (if a subtitle appears at all), this is quite a long subtitle!
Berkeley had earlier attacked “free-thinkers” in response to their attacks on Christianity. Edmund Halley, a noted free-thinker and devotee of Newton, mocked Berkeley’s attacks, and apparently a sick friend of Berkeley’s had refused Berkeley’s “spiritual consolation, because Halley had convinced the friend of the untenable nature of Christian doctrine.” (See page 470 of Boyer’s A History of Mathematics.) It is speculated that the “infidel mathematician” of Berkeley’s subtitle is Halley, and that the book was a response to him. (It is doubtful that the devoutly religious Newton was Berkeley’s target.)
On the one hand, one is tempted to say, “Can’t we all just get along?” On the other hand, Berkeley’s criticisms of the foundations of calculus were on point. Berkeley did not dispute that the results of calculus were valid (their applications in astronomy by Newton and others had been empirically supported); he simply, and correctly, pointed out that the reasoning that produced these valid results was dodgy. In Berkeley’s words (fluxions were Newton’s term for instantaneous rates of change, computed from vanishing increments),
And what are these fluxions? The velocities of evanescent increments. And what are these same evanescent increments? They are neither finite quantities, nor quantities infinitely small, nor yet nothing. May we not call them ghosts of departed quantities?
The last sentence is pretty biting, and it forcefully makes Berkeley’s point. You cannot increment some quantity by a small non-zero amount, divide by that increment as if it were indeed non-zero, and then later treat the increment as ignorable (i.e., zero), without some careful justification; the computation below shows the move in action. Newton and his contemporaries did not quite do the job. But let’s not be critical of them; it took two centuries of hard work by many very bright researchers to finally figure this out to the general satisfaction of the community. That is an interesting chapter of mathematics history, and we shall turn to it another time.
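To see the move concretely, here is the classic computation (a standard illustration, not taken from Berkeley’s text) of the derivative of y = x² using an infinitesimal increment dx:
\[
\frac{(x+dx)^2 - x^2}{dx} \;=\; \frac{2x\,dx + (dx)^2}{dx} \;=\; 2x + dx \;=\; 2x \quad\text{(discarding } dx\text{)}.
\]
The increment dx is treated as non-zero when we divide by it and as zero when we discard it at the end; this double standard is exactly what Berkeley attacked, and what the later theory of limits repaired.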
The moral of this story is that creative mathematicians come up with all kinds of interesting ideas, many of which are practical and some of which are even revolutionary. But it is too much to ask of any one person, or even of any one generation of workers, to tidy up every loose end in these new fields of mathematics. The tidying-up process is also creative, but in a different sense: here logic comes to the fore. Once a field is mature, clear definitions and axioms are identified, and theorems are derived in a coherent, systematic way from those foundations. Calculus is by now a very mature field of mathematics, and if you dig more deeply into the subject you will be able to study its foundations to your heart’s content. But when first learning a subject, it is beneficial to focus on numerous examples to internalize the main concepts, problems, and methods, and to save a deeper consideration of foundational issues for later study.