The after-life of Pi

In the latest feat where supercomputing and mathematics intersect, Swiss researchers have calculated the mathematical constant Pi to a record 62.8 trillion figures

CIOL Bureau

Pratima H

Are mathematicians and supercomputers strange friends or lost souls waiting to be united? We are learning more about that with every passing year, and the latest record broken for Pi is a reminder of just that.

"Infinite is a meaningless word: except – it states - The mind is capable of performing an endless process of addition." Louis Zukofsky was underlining mathematics with philosophy so well when he said that.

Perhaps that is what makes the crossroads of machines and mathematicians so interesting, especially for puzzles that have challenged mathematicians for so long. From the four colour theorem to Diophantine equations, and from the sum of three cubes to cracking more decimals of Pi, many researchers have used supercomputing muscle to lift weights that once slipped through their fingers.

In the latest feat where supercomputing and mathematics intersect, Swiss researchers have calculated the mathematical constant Pi to a new world-record level of exactitude: close to 62.8 trillion figures, computed on a supercomputer in 108 days and nine hours. The milestone is a point of pride for the Graubuenden University of Applied Sciences; the previous world-record Pi calculation stood at about 50 trillion figures. So why is Pi – the ratio of a circle’s circumference to its diameter – so alluring to people who play with supercomputers? Is it because it has, plausibly, no repeating patterns? Why chase trillions of decimals, and do so by pushing hardware and software? Are we using maths more and more in our lives to catch fraud and find patterns to save the world? Is that where we hitch a ride on the super-machines? And, more importantly, what next?
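
A quick back-of-the-envelope calculation, using only the figures reported above, conveys the scale of the run: 62.8 trillion digits over 108 days and nine hours works out to several million digits per second on average (an average only; the rate is not uniform across such a computation). In Python:

# Average throughput implied by the reported figures:
# roughly 62.8 trillion digits in 108 days and 9 hours.
digits = 62.8e12
seconds = 108 * 24 * 3600 + 9 * 3600        # 9,363,600 seconds
print(f"{digits / seconds / 1e6:.1f} million digits per second on average")
# prints roughly 6.7 million digits per second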

We get these answers, and many intellectual questions in return, from Thomas Keller, the project manager of the Pi challenge, and Professor Heiko Roelke, head of DAViS. Incidentally, DAViS is part of the University of Applied Sciences of the Grisons, where it consolidates expertise in high-performance computing (HPC) and HPC infrastructure under a mandate from the canton of Grisons. The DAViS Competency Centre also covers data mining, machine learning and simulation.

The team was running this computation to prepare its infrastructure for future use in research and development; it currently runs computations in RNA analysis, fluid dynamics and deep learning, and calculating Pi is just a by-product of that preparation. Let’s calculate the ‘why’ in this interview.

Why did you choose the Pi challenge – what is the mathematical or computational context/complexity that makes it a good candidate for supercomputing? Why do computing brains take the Pi challenge – and not something else – like the Tribonacci constant?

Keller: We chose Pi because that's the constant most people can relate to. The Pi calculation was an ideal candidate because our hardware is (accidentally) well suited to this type of algorithm.
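
An aside on "this type of algorithm": the article does not name the team's software, but record-scale Pi computations have generally relied on the rapidly converging Chudnovsky series, in which every further term contributes roughly 14 correct decimal digits. The toy Python sketch below, built on the standard decimal module, shows the shape of that series; it is an editorial illustration, not the team's code, and real runs replace the naive loop with binary splitting and heavily parallelised big-integer arithmetic.

from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Toy Chudnovsky-series computation of Pi to `digits` decimal places."""
    getcontext().prec = digits + 10                  # a few guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6                   # recurrence state for each term
    S = Decimal(L)                                   # the k = 0 term
    for k in range(1, digits // 14 + 2):             # ~14 digits gained per term
        M = M * (K ** 3 - 16 * K) // k ** 3          # exact integer ratio of factorials
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return C / S

print(str(chudnovsky_pi(50))[:52])                   # 3.14159... to about 50 places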

How does this accomplishment compare to some precedents in Pi calculation – like that of Google?

Keller: Google and Timothy Mullican both did a fantastic job in calculating Pi to such incredible precision. We were able to leverage the latest hardware, and our computer architecture just lends itself to calculating Pi. As said before, though, this is a coincidence.

What is noteworthy here – the number of figures cracked or the speed? If you could enlighten us on the ‘why’, that would be great.

Roelke: We think that, for the public, it is primarily the number of figures discovered that is noteworthy. For our research projects, calculation speed and especially fast data transfers are what is essential. The Pi calculation serves as a hardware test, a way to build up know-how and a showcase for the public.

How can this feat be used for future application areas – which ones and how soon?

Keller: Calculating Pi to trillions of digits is meaningless for us too. We can't think of anything useful to do with 62.8 trillion decimal places. Calculating this number was more of a fitness test for our infrastructure. The test was successful, however: we learned a lot and even detected some flaws in our infrastructure, namely in terms of backup size and transfer speed.

Can you share something from your research pipeline – anything interesting in progress or planned next?

Keller: We are currently involved with RNA analysis, helping people find cures for atopic dermatitis. Another line of research is deep translation and, more generally, text processing. No Pi calculation here (he smiles).

Does it intersect anywhere with Benford’s law?

Keller: To my knowledge, Benford’s law holds for collections of numbers from empirical research. I cannot see any connection to Pi.

As far as I know, it is an open question whether the digits of Pi are equally distributed. We did some checks on the digits calculated as an intermediate result and the distribution looks rather equal – which is far from a proof, and the check on the complete dataset is still to be done.
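
The kind of spot check described here is easy to reproduce at a far smaller scale. The Python sketch below is an editorial illustration rather than the team's tooling: it uses the mpmath library to generate digits of Pi and tallies how often each digit appears. Equidistribution (normality) of Pi's digits remains unproven, so such a tally is evidence, not proof; and, as Keller notes, Benford's law concerns leading digits of empirical datasets rather than the expansion of a single constant.

from collections import Counter
from mpmath import mp, nstr

N = 100_000                                  # decimal places to inspect
mp.dps = N + 5                               # working precision with guard digits
pi_str = nstr(+mp.pi, N + 1)                 # "3." followed by N digits
digits = pi_str.replace("3.", "", 1)[:N]

counts = Counter(digits)
total = len(digits)
for d in "0123456789":
    print(f"digit {d}: {counts[d]:6d}  ({counts[d] / total:.3%})")   # each hovers near 10%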

Do we see Blockchain or Quantum Computers ever replacing or complementing what supercomputers are doing?

Keller: I have my doubts about blockchains here, but quantum computers are progressing fast. Maybe we will see far better algorithms in the near future. However, I doubt there will be an improvement for a sequential computation such as ours. But maybe there is a quantum Pi formula to be found?

Do breakthroughs in Mathematics research help accelerate and elevate supercomputing or is it vice-versa?

Roelke: This works in both directions. Better algorithms make better use of the available hardware and may even stimulate new architectures. Some recent breakthroughs in mathematics make heavy use of computationally searching for examples and counterexamples.

Can better models and algorithms derived from mathematics beat the supercomputer performance spikes that come from hardware innovation or Moore’s law?

Roelke: As said before, we are no experts in this field. Often it is more of a co-evolution in which both hardware and software play their part.

What role or shifts did supercomputing exhibit during the pandemic? What, in your reckoning, should be observed in terms of supercomputing’s contribution?

Keller: I do not think that the pandemic changed much for the field of supercomputing. Maybe the field of computation has received a bit more attention due to the trend towards digitalisation. For the life sciences in general, there is a lot to be expected, for example in deep learning applications for chemistry and medicine.
