Annotated bibliography of files in the directory `/docs/math/`

.

*2009-01-01*–*2021-04-07*
*in progress*
certainty: *log*
importance: *0*

# Files

`1931-ramsey-foundationsofmathematicsandotherlogicalessays.epub`

`1973-knuth.pdf`

: “The Dangers of Computer-Science Theory”, Donald E. Knuth (1973): This chapter discusses the difficulties associated with computer-science theory.

The theory of automata is slowly changing to a study of random-access computations, and this work promises to be more useful. Any algorithm programmable on a certain kind of pushdown automaton can be performed efficiently on a random-access machine, no matter how slowly the pushdown program runs.

Another difficulty with the theory of languages is that it has led to an overemphasis on syntax as opposed to semantics. For many years there was much light on syntax and very little on semantics; so simple semantic constructions were unnaturally grafted onto syntactic definitions, making rather unwieldy grammars, instead of searching for theories more appropriate to semantics.

Theories are often more structured and more interesting when they are based on real problems; somehow they are more exciting than completely abstract theories will ever be.

`1979-demillo.pdf`

: “Social Processes and Proofs of Theorems and Programs”, Richard A. De Millo, Richard J. Lipton, Alan J. Perlis (1979): Many people have argued that computer programming should strive to become more like mathematics. Maybe so, but not in the way they seem to think. The aim of program verification, an attempt to make programming more mathematics-like, is to increase dramatically one’s confidence in the correct functioning of a piece of software, and the device that verifiers use to achieve this goal is a long chain of formal, deductive logic. In mathematics, the aim is to increase one’s confidence in the correctness of a theorem, and it’s true that one of the devices mathematicians could in theory use to achieve this goal is a long chain of formal logic. But in fact they don’t. What they use is a proof, a very different animal. Nor does the proof settle the matter; contrary to what its name suggests, a proof is only one step in the direction of confidence. We believe that, in the end, it is a social process that determines whether mathematicians feel confident about a theorem—and we believe that, because no comparable social process can take place among program verifiers, program verification is bound to fail. We can’t see how it’s going to be able to affect anyone’s confidence about programs.

`1980-euler-rationalmechanicsflexibleelasticbodies16381788.pdf`

`1996-hoare.pdf`

: “How did software get so reliable without proof?”, C.A.R. Hoare (1996-03): By surveying current software engineering practice, this paper reveals that the techniques employed to achieve reliability are little different from those which have proved effective in all other branches of modern engineering: rigorous management of procedures for design inspection and review; quality assurance based on a wide range of targeted tests; continuous evolution by removal of errors from products already in widespread use; and defensive programming, among other forms of deliberate over-engineering. Formal methods and proof play a small direct role in large scale programming; but they do provide a conceptual framework and basic understanding to promote the best of current practice, and point directions for future improvement.

`2001-borwein.pdf`

: “Some Remarkable Properties of Sinc and Related Integrals”, David Borwein, Jonathan M. Borwein (2001-03): Using Fourier transform techniques, we establish inequalities for integrals of the form

\[\int_0^\infty \prod_{k = 0}^n \frac{\sin(a_k x)}{a_k x}\,dx\]
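(The best-known instance takes \(a_k = 1/(2k+1)\): the resulting “Borwein integrals” equal π⁄2 for *n* = 0 through 6, then fall just short of π⁄2 at *n* = 7, by roughly 2.3 × 10⁻¹¹. A minimal numerical sketch, assuming `scipy` is available; the function name `borwein_integrand` and the truncation point 400 are my choices, not the paper’s:

```python
import math
from scipy.integrate import quad

def borwein_integrand(x, n):
    """Product of sinc(x/(2k+1)) for k = 0..n, with sinc(t) = sin(t)/t."""
    prod = 1.0
    for k in range(n + 1):
        t = x / (2 * k + 1)
        prod *= math.sin(t) / t if t != 0 else 1.0
    return prod

# Truncate the infinite range: for n >= 2 the integrand's amplitude decays
# like 1/x^(n+1), so [0, 400] already gives several correct digits for n = 2.
val, _err = quad(borwein_integrand, 0, 400, args=(2,), limit=2000)
print(val, math.pi / 2)  # both ≈ 1.5708
```

Naive quadrature like this can only confirm the π⁄2 pattern to a few digits; the *n* = 7 deficit is about 10⁻¹¹, far below the truncation error here, so detecting it requires the closed forms the paper derives rather than numerical integration.)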

We then give quite striking closed form evaluations of such integrals and finish by discussing various extensions and applications. [**Keywords**: sinc integrals, Fourier transforms, convolution, Parseval’s theorem]

`2002-dijkstra.pdf`

: “EWD1300: The Notational Conventions I Adopted, and Why”, Edsger W. Dijkstra (2002-12-01): At a given moment, the concept of polite mathematics emerged, the underlying idea of which is that, even if you have only 60 readers, it pays to spend an hour if by doing so you can save your average reader a minute. By inventing an idealized ‘average reader’, we could translate most of the lofty, human goal of politeness into more or less formal criteria we could apply to our texts. This note is devoted to the resulting notational and stylistic conventions that were adopted as the years went by. We don’t want to baffle or puzzle our reader; in particular, it should be clear what has to be done to check our argument, and it should be possible to do so without pencil and paper. This dictates small, explicit steps. On the other hand, it is well known that brevity is the leading characteristic of mathematical elegance, and some fear that this ideal excludes the small, explicit steps; but one of the joys of my professional life has been the discovery that this fear is unfounded, for brevity can be achieved without committing the sin of omission. I should point out that my ideal of crisp clarity is not universally shared. Some consider the puzzles that are created by their omissions as spicy challenges, without which their texts would be boring; others shun clarity lest their work be considered trivial.

`2007-colton.pdf`

: “Computational Discovery in Pure Mathematics”, Simon Colton (2007-01-01): We discuss what constitutes knowledge in pure mathematics and how new advances are made and communicated. We describe the impact of computer algebra systems, automated theorem provers, programs designed to generate examples, mathematical databases, and theory formation programs on the body of knowledge in pure mathematics. We discuss to what extent the output from certain programs can be considered a discovery in pure mathematics. This enables us to assess the state of the art with respect to Newell and Simon’s prediction that a computer would discover and prove an important mathematical theorem.