Recent book reviews

Five recent book reviews

Network Geeks

Author(s): 
Brian E. Carpenter
Publisher: 
Springer Verlag
Year: 
2013
ISBN: 
978-1-4471-5024-4
Price (tentative): 
21.19 € (pbk)
Short description: 

This is an autobiographical account of the career of Brian Carpenter, who played a key role in the development of the Internet. He was chair of the IETF (Internet Engineering Task Force) in the period 2005-2007. Starting from a family history, he sketches his work as a network engineer while employed by CERN, and later by IBM.

URL for publisher, author, or book: 
www.springer.com/computer/general+issues/book/978-1-4471-5024-4
MSC main category: 
68 Computer science
MSC category: 
68-00
Other MSC categories: 
90B18
Review: 

The title is catchy yet not very informative about what to expect from the contents. The current state of the Internet has raised many interesting mathematical problems, but I was not really expecting these to be discussed here. So I started reading with an open mind, but having read the book, it is not easy to describe it in one sentence. The best I can come up with is: an autobiography (and more) of the author, who has played a key role in the development of the Internet as we know it today, partly through his involvement as member and chair of the IETF (Internet Engineering Task Force).

Most chapters deal with his experience in his successive jobs. This is partly technical, yet reads like a novel about the birth of the Internet. However, the opening chapter describes a fictitious meeting of the IETF, putting the reader somewhat off balance. Then comes a rather extensive family history going back several generations. A reader hoping to read about the Internet had better skip this, because it is basically of interest only to the Carpenter family. The nostalgic description of Brian's own youth is a pleasant read if you want to know what life was like for a middle-class family in post-war England: the first family car, the first Meccano set, school days, entering Downing College and studying at Cambridge University, and so on. All this is well written and brings back forgotten memories if you are about the same age as the author. Again, however, if you are essentially interested in the Internet, I can imagine that you consider the name of the horse of the family's milkman a bit of overkill as introductory material. Along the way, though, information about Turing and other beginnings of computer science is skillfully interwoven.

Carpenter's professional career started at CERN in 1971, where he was in charge of controlling the proton beams, which involved communication between computers. He tells us about the succession of computers, computer companies and people of different nationalities, and the emergence of the first (research) computer networks. Some of his collaborators and peers are considered founding fathers of the World Wide Web and the Internet. He stayed at CERN, with an interruption of three years teaching in New Zealand, and grew into his job of managing networks, first after his return to CERN and later, fully, while employed by IBM. A central role in his story is played by packet switching, an idea he attributes to Donald Davies at NPL. This means that data are split up into smaller packets that are sent independently over the net. Another recurrent issue is the TCP/IP concept. TCP and IP are internet protocol standards that describe how the packets should be formatted, addressed, routed, and reassembled at the destination. TCP/IP has by now won the protocol war and become the standard. These are the kinds of problems Carpenter was involved with while he was chair of the IETF (2005-2007), and he carefully recounts their origin, but also the sometimes hidden agendas of the people, companies and organizations involved.
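
The packet idea is simple enough to sketch in a few lines of code. The following Python toy (my own illustration, not anything from the book, and of course nothing like real TCP/IP) splits a message into numbered packets, lets the "network" deliver them in arbitrary order, and reassembles them at the destination:

```python
import random

# A toy illustration of packet switching: split a message into numbered
# packets, let the "network" deliver them in arbitrary order, and
# reassemble them at the destination.

def split_into_packets(data: bytes, size: int):
    """Attach a sequence number to each chunk so the order can be restored."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Sort by sequence number and concatenate the payloads."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets may travel independently and arrive in any order."
packets = split_into_packets(message, 8)
random.shuffle(packets)        # the network may reorder packets in transit
assert reassemble(packets) == message
```

The real protocols add much more, of course: acknowledgements, retransmission of lost packets, flow control, and so on.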

Since 2007 the author has been teaching at the University of Auckland. So most of the book is a round-up of his previous jobs and his active involvement in the growth of the Internet. However, the story doesn't end abruptly. One protocol he was intensively involved with is IPv6. Internet addresses used to be stored in the IPv4 protocol as a structured 32-bit field. However, every device connected to the net needs a unique address, and already in 1980 it was clear that 32 bits would eventually not be enough. The new protocol IPv6 extends the addresses to 128 bits, and after about 30 years of standardization and meetings, it is now gradually being implemented by Internet Service Providers (ISPs). The network is organized as a structured network of networks of networks. This is the strength of the self-regulating Internet, which can recover from node failures after events like 9/11 (2001), Hurricane Katrina (2005), or the earthquake and tsunami at Fukushima (2011). The dot-com bubble of 2000 had a financial impact, hence also an impact on employment, and therefore indirectly influenced the evolution. A final threat that is mentioned is that politicians, afraid of digitized revolutions, have started discussing restrictions on the freedom of the Internet, known as 'Internet governance', as demonstrated e.g. in Syria and China. Phenomena like WikiLeaks and Edward Snowden's revelations about the U.S. intelligence program intercepting private Internet activity are not discussed by Carpenter. But this is more politics than engineering, and Carpenter is an engineer and a self-declared geek; I can add that he is a talented storyteller too.
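
To appreciate the scale of the change from 32-bit to 128-bit addresses, a couple of lines of Python (again my illustration, not the book's) suffice:

```python
import ipaddress

# IPv4 stores an address in 32 bits, IPv6 in 128 bits; the difference in
# the number of available addresses is astronomical.
print(2 ** 32)    # 4294967296 addresses: too few for today's devices
print(2 ** 128)   # 340282366920938463463374607431768211456

# Python's standard library parses both formats:
a4 = ipaddress.ip_address("192.0.2.1")     # documentation-range IPv4 address
a6 = ipaddress.ip_address("2001:db8::1")   # documentation-range IPv6 address
print(a4.version, a6.version)              # prints: 4 6
```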

Some might find the content unfocused or maybe a bit chaotic, but that is how the Internet came about, and Carpenter tells about real people. These people are not only internet engineers: they sometimes take difficult decisions, they have to solve problems with the tools available to them, they may struggle with language problems, they sometimes have to move to different continents, they have families, etc. Most of the text is about the technical development, but it is never overcomplicated. All terms are explained in a simple way. The text is regularly interspersed with photographs, and with inserts giving further information or short side-excursions. There are a lot of abbreviations for networks, technical terms and organizations, all defined the first time they are used; if you are not so familiar with this world, it may take a bit of effort to keep track of them all, but if needed one can look them up in the index at the end.

Reviewer: 
A. Bultheel
Affiliation: 
KU Leuven

Towing Icebergs, Falling Dominoes, and Other Adventures in Applied Mathematics

Author(s): 
Robert B. Banks
Publisher: 
Princeton University Press
Year: 
2013
ISBN: 
978-0-691-15818-1 (pbk)
Price (tentative): 
16.95 £
Short description: 

This is the fifth printing and the first paperback edition of the successful original first published in 1998. It is an extensive collection of applied mathematics models that solve serious and less serious problems: the trajectory of a golf ball, the spreading of an epidemic, the reduction of the national deficit, or indeed the towing of an iceberg, with all the numerical details included. Here and there, some assignments and problems are left for the reader to work out.

URL for publisher, author, or book: 
press.princeton.edu/titles/9990.html
MSC main category: 
00 General
MSC category: 
00A69
Other MSC categories: 
00A79, 92D25
Review: 

Robert B. Banks (1922-2002) has written two marvelous books illustrating what applied mathematics really is about. The present one was the first to appear in 1998 and his Slicing Pizzas, Racing Turtles, and Further Adventures in Applied Mathematics was a sequel that was published in 1999. This version is the first edition in paperback.

In 24 chapters the reader is bombarded with a fireworks display of models and solutions for serious and amusing problems. The opening paragraph is typical, giving all the data about the meteor that hit the earth some 50,000 years ago near Flagstaff (AZ). It leads into a chapter on different systems of units, which is useful for the rest of the book.

Although they are not in any particular order, one can recognize some recurrent themes among the applications: things large and small falling from the sky (a meteor, parachutes, raindrops, etc.), and later also the trajectories of basketballs, baseballs, water jets, and ski jumpers. Other applications are related to growth models (population, epidemic spread, the national deficit, the height of people, running world records, etc.). Some chapters deal with wave phenomena (traffic, water waves, and falling dominoes), and others with statistics (Monte Carlo simulation) or curves (in architecture, jumping ropes and Darrieus wind turbines).

But this enumeration is far from complete. There are two chapters completely working out the economics of towing icebergs from the Antarctic to North and South America, Africa, and Australia. This includes the computation of the energy needed, the optimal route to be followed, the thickness of the cables required, the melting process, etc. And there are models for many other phenomena that I have not mentioned.

The models are sometimes derived, but on many occasions they are just given in the form of a differential equation (delay differential equations and integro-differential equations appear as well). It is indicated how to obtain solutions (often analytic, sometimes numerical), but intermediate steps are left for the reader to check. In several places suggestions for assignments or extra problems to work out are included. Historical comments and suggestions for further reading are often summarized. Hence teachers may find here inspiration for (if not ready-made examples of) exercises to give to their students.
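
To give a flavour of the kind of model Banks treats, here is a minimal Python sketch (my own, with a made-up drag coefficient, not an example taken from the book) of an object falling with quadratic air resistance, solved both numerically and analytically:

```python
import math

# A toy version of the kind of model Banks treats: an object falling with
# quadratic air resistance, dv/dt = g - k*v**2. The drag coefficient k is
# a made-up illustrative value, not a number from the book.
g = 9.81   # gravitational acceleration (m/s^2)
k = 0.25   # drag coefficient (1/m), hypothetical

def velocity_numeric(t_end, dt=1e-4):
    """Integrate dv/dt = g - k*v**2 from v(0) = 0 with Euler's method."""
    v, t = 0.0, 0.0
    while t < t_end:
        v += (g - k * v * v) * dt
        t += dt
    return v

# The analytic solution is v(t) = v_t * tanh(g*t/v_t), where v_t = sqrt(g/k)
# is the terminal velocity; the numerical result should match it closely.
v_t = math.sqrt(g / k)
assert abs(velocity_numeric(2.0) - v_t * math.tanh(g * 2.0 / v_t)) < 1e-2
```

Checking that a crude numerical scheme reproduces the closed-form solution is precisely the sort of exercise the book leaves to the reader.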

The book stands out because the examples are all treated as real-life examples with real data, taking into account all the complications that are usually left out of academic examples: the earth is not a perfect sphere, a baseball is rough because of its stitches and is thrown with spin, there is air resistance, and the resistance differs with height, etc. Even though there are a lot of formulas and numbers, the reading is pleasant and smooth. It may be much harder if one wants to work out the details and/or the exercises for oneself.

The chapters can be used independently; there are some forward and backward references, but these are not essential. One does, however, need some knowledge of differential equations (usually linear and first order, but sometimes going beyond that), and integrals are clearly needed (even elliptic integrals are used).

The edition is still the same as the original one, which means the references are the older ones and have not been updated. Robert B. Banks passed away some ten years ago. Otherwise, given the enthusiasm displayed in this book, I would have expected an update of the models for economic evolution, taking into account the banking problems of 2008 and the aftermath of the economic crisis that we are still living in, or perhaps data about the tsunami that hit Japan in 2011, with the nuclear disaster at Fukushima as a consequence, or the impact and fall-out of the eruption of the Eyjafjallajökull volcano in 2010. Perhaps someday someone will add a third volume to these wonderful collections of applied problems.

Reviewer: 
A. Bultheel
Affiliation: 
KU Leuven

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers

Author(s): 
John MacCormick
Publisher: 
Princeton University Press
Year: 
2013
ISBN: 
978-0-691-15819-8 (pbk), 978-0-691-14714-7 (hbk)
Price (tentative): 
11.95 £ (paper), 19.95 £ (cloth)
Short description: 

An elementary introduction to a selection of algorithms that most people use almost daily, often without knowing it: how search engines work (the indexing system of AltaVista and the PageRank system of Google); public key cryptography; error-correcting codes; pattern recognition; data compression; database consistency and reliability; and digital signatures. To conclude, some considerations about the limitations of computers and decidability are given.

URL for publisher, author, or book: 
press.princeton.edu/titles/9528.html
MSC main category: 
68 Computer science
MSC category: 
68-01
Other MSC categories: 
68M15, 68P15, 68P25, 68P30, 68T05, 68T10
Review: 

Today nearly everybody is able to handle a computer, yet many have no idea what computer science is about, just as some think that if you are a mathematician you must be good at mental arithmetic, and vice versa. So if the grandson is a computer whiz who is very good at solving grandma's computer problems, she thinks he should become a computer scientist. To judge whether this is true, you should know what computer science really is about. This book gives some insight.

A disrespectful and somewhat inaccurate alternative title for this book could be Computer science for dummies, and MacCormick is very good at explaining to the grandmas what computer scientists have done for them, even if they didn't care much before. Everything is kept at a very elementary level: no mathematics, no unexplained technical terms, and all concepts are introduced by analogy with a non-technical everyday-life situation. The text reads as if you are listening to MacCormick giving a Christmas lecture.

MacCormick has chosen nine algorithms (although nine-ish is more appropriate, since the precise number nine is disputable) to convey his message. Why these algorithms? This is explained in the introduction: algorithms that everybody uses, that solve concrete problems, and that realize some abstract ideas. I believe these arguments are a bit shaky, because in my view they apply to many other algorithms as well. I suspect the criteria were written ad hoc, as introductions most often are. Of course he also has to explain what an algorithm is and what makes an algorithm `great'. The latter is simple: each gave MacCormick a personal `aha' experience when the key idea was revealed (which he calls a `trick', i.e. the thing-that-does-the-trick). That does explain the selection, because there is no arguing with personal preference. But whatever the reason, the selected algorithms solve problems that everyone is familiar with while most people have never thought about the mechanisms behind them, and MacCormick is good at explaining the basics in a very accessible way.

So here is what has been selected. Chapters 2 and 3 deal with search engines for the WWW: first the indexing of web pages, which made AltaVista great, and then the ranking mechanism used by Google. Mind you, although PageRank is basically solving a huge eigenvalue problem, nothing of that sort is ever mentioned; the basics of hyperlinks, authorities and the random surfer are neatly explained with a toy problem in which people refer to a page giving a recipe for scrambled eggs. This kind of remark also applies to the other chapters. For example, the next one explains public key cryptography, and although there is pretty hard mathematics behind it, involving prime numbers, elliptic curves, etc., the idea is explained by making mixtures of paint public, which cannot be unmixed: two people can agree on a common colour while each keeps one ingredient private, without other observers being able to find out what their common colour is. Paints become numbers only at a later stage of the chapter. Other mechanisms are introduced similarly in the subsequent chapters: error-correcting codes, pattern recognition (e.g., automatic reading of ZIP codes) via decision trees and neural nets, and data compression (encoding and JPEG are discussed, but words like `Fourier' or `wavelet' are carefully avoided). A somewhat unexpected subject for me is the one on databases. This explains how databases are protected against crashes or unfinished operations, and how replicas are kept synchronized, avoiding inconsistencies at all cost. Digital signatures, on the other hand, are a more obvious choice.
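
The paint-mixing story corresponds to what cryptographers know as Diffie-Hellman key exchange. A toy Python sketch (my own illustration, with deliberately tiny numbers; real systems use enormous primes) shows how the "mixing" works once paints have become numbers:

```python
# Toy Diffie-Hellman-style exchange: the public "base colour" is the pair
# (g, p), each party's secret ingredient is a private exponent, and mixing
# is modular exponentiation: easy to do, hard to undo. The numbers below
# are deliberately tiny for illustration.
p = 23   # public prime
g = 5    # public generator

alice_secret = 6     # Alice's private ingredient
bob_secret = 15      # Bob's private ingredient

alice_public = pow(g, alice_secret, p)   # Alice's public "mixture"
bob_public = pow(g, bob_secret, p)       # Bob's public "mixture"

# Each side combines the other's public mixture with its own secret:
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)
assert alice_key == bob_key   # both arrive at the same shared secret
```

An eavesdropper sees only p, g and the two public mixtures; recovering the secrets from these is the (believed hard) discrete logarithm problem, just as unmixing paint is hard.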

All these algorithms are tacitly used on a daily basis by millions of people whenever they use a keyboard or a touch screen, often without realizing what is happening on the other side of the screen. The next chapter, however, is somewhat off the main track. What MacCormick tries to do is explain the undecidability problem: the impossibility of constructing an algorithm that always leads to a correct yes-no answer. He starts by convincing the reader that any program on a computer can be run on any file, and strongly encourages the reader to try this out. I hope that the experiments of the grandmas following this advice will not damage their computers too much, so that their grandsons can restore the original operational mode. Anyway, the author hypothetically constructs a sequence of programs that take programs as input and give a yes-no answer (along the lines of an explanation of proof by contradiction). He can thus lead the reader to the conclusion that it is impossible to write a crash-detecting program, since it yields a contradiction when it takes itself as input. This brings the reader who manages to finish the chapter up to the halting problem, the Church-Turing thesis and the related philosophical considerations.
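
The self-referential construction can even be sketched in Python. The "detector" below is a hypothetical stand-in of my own (here a deliberately naive one); the point is only to illustrate why any proposed detector can be fooled by a program that asks the detector about itself and then does the opposite:

```python
# A sketch of the diagonal argument. Suppose someone hands us a crash
# detector: detector(prog, inp) claims to predict whether prog(inp)
# finishes normally. We build a "contrarian" program that asks the
# detector about itself and then does the opposite.

def make_contrarian(detector):
    def contrarian(prog):
        if detector(prog, prog):   # detector predicts prog(prog) finishes...
            raise RuntimeError("crash on purpose")   # ...so crash instead
        return "finished"          # detector predicted a crash: finish
    return contrarian

# Any concrete detector is fooled. Take a naive one that always predicts
# a crash (returns False):
naive_detector = lambda prog, inp: False
contrarian = make_contrarian(naive_detector)

# The detector predicts that contrarian(contrarian) crashes, yet it
# finishes normally, contradicting the prediction:
assert naive_detector(contrarian, contrarian) is False
assert contrarian(contrarian) == "finished"
```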

Whether you agree with the choice of algorithms or not, they form a good set of samples of what made some ICT applications a success. The world would be different if they had not been there. MacCormick did an excellent job in explaining the basic ingredients without requiring any programming skills. Anyone who is used to a computer will be able to grasp the ideas. His afterthoughts about what the great algorithms of the next generation or century will be are a bit thin, but nobody really knows what the future will bring. If we ever get quantum computing to work, we shall have to rethink almost everything. This brings us back to the title of the book. The algorithms that were discussed have certainly influenced current computing, but whether they will change the future is not certain.

Thus be warned: no mathematics, no Computer Science with capitals, but easy reading for everyone from 9 to 99. If you are a computer scientist yourself, you might find ideas about how to explain things, or you might find this book an excellent present for grandma, so that you don't have to do the explaining yourself. You could hardly do better.

Reviewer: 
A. Bultheel
Affiliation: 
KU Leuven

Polyhedral and algebraic methods in computational geometry

Author(s): 
Michael Joswig, Thorsten Theobald
Publisher: 
Springer
Year: 
2013
ISBN: 
978-1-4471-4816-6
Price (tentative): 
€41.64
Short description: 

In this book the authors discuss a selection of linear and non-linear topics in computational geometry. The first part of the book is devoted to linear computational geometry problems, while the second part focuses on non-linear computational geometry techniques, with Groebner bases as the main tool. In the third part a selection of applications is given.

URL for publisher, author, or book: 
http://www.springer.com/mathematics/geometry/book/978-1-4471-4816-6
MSC main category: 
68 Computer science
MSC category: 
68Q25
Other MSC categories: 
52B55, 13P10
Review: 

In this book the authors discuss a selection of linear and non-linear topics in computational geometry. The first part, devoted to linear computational geometry, starts with an introduction to projective geometry and proceeds to study polytopes, linear programming problems, convex hulls, Voronoi diagrams and Delaunay triangulations. The software program polymake is used to illustrate and visualize the concepts that are discussed.

The second part of the book focuses on non-linear computational geometry techniques, the main tool being Groebner bases for solving systems of polynomial equations. Examples are provided using the software programs Maple and Singular.

Finally, the third part of the book is devoted to a selection of applications, including the reconstruction of curves using Delaunay triangulations as a tool and an application of Groebner bases to geolocation using GPS satellites.

The book's audience consists of mathematicians interested in applications of geometry and algebra, as well as computer scientists and engineers with a good mathematical background.

Reviewer: 
Antonio Valdés Morales
Affiliation: 
Departamento de Geometría y Topología, Universidad Complutense de Madrid

Symmetric Markov Processes, Time Change, and Boundary Theory

Author(s): 
Zhen-Qing Chen, Masatoshi Fukushima
Publisher: 
Princeton University Press
Year: 
2012
ISBN: 
978-0-691-13605-9
Short description: 

This work gives a comprehensive exposition of the theory of symmetric Markov processes and Dirichlet forms and their applications. In addition, the last chapters are devoted to some recent results on the topic, collected in book form for the first time.

MSC main category: 
60 Probability theory and stochastic processes
Review: 

The title of this monograph, written by two world-renowned experts in the field, gives a clear idea of the contents of the book, devoted to Markov processes. As the reader will immediately understand from the Preface, the work can be divided into two parts. The first one (chapters 1, 2, 3 and 4) contains the definitions of the theory of symmetric Markov processes and Dirichlet forms, together with many examples and properties. The approach is both probabilistic and analytic, and the exposition is rigorous and complete. This part is closely related to other books written by the second author (in collaboration with others), for example Dirichlet Forms and Symmetric Markov Processes (de Gruyter Stud. Math. 19, Berlin, 1994).

The second part (chapters 5, 6 and 7) contains some recent results on time changes and boundary theory, topics which are presented here for the first time in book form. In particular, with respect to time changes, a characterization of time-changed Markov processes in terms of Douglas integrals is presented. The boundary theory, on the other hand, studies the problem of extending a Markov process from its original space to a larger one in which the original space is an open subset and such that the new process spends zero time in the extension.

The presentation is comprehensive and self-contained, though it requires solid previous knowledge of theoretical probability together with some notions of measure theory and functional analysis. The volume can serve as a textbook for graduate students or as a reference for researchers in the field.

Reviewer: 
Marco CASTRILLON LOPEZ
Affiliation: 
Universidad Complutense de Madrid, Spain