Almost every day we hear news about the latest advances in computer technologies. Such snippets take any number of forms: a company launching a new service based on artificial intelligence (AI), a promising new medical treatment stemming from the analysis of data from a large cohort of patients, a new smartphone application that tells us what is happening at home, and so on...
We take all these things for granted but they arise from the progress made in various computing fields. The workings of Moore's law and the creativity of computer architects bring greater computing power with every passing decade.
Data volumes are soaring thanks to the so-called internet of things. Algorithmic techniques are advancing by leaps and bounds and are playing a prominent role in all AI disciplines.
Computer and data technologies play an ever greater role in science. It is getting harder and harder to imagine a research field in which computation, simulation, big data and AI do not play key roles.
From engineering to medicine, from climate sciences to astrophysics, the great scientific discoveries in recent years have almost always been spurred on by the use of highly advanced computer techniques, which have become a complementary tool in pioneering research.
Is this all a nine-day wonder or is ICT here to stay? All the signs are that the trend will continue.
On the one hand, increased sensorisation and communications will produce more data sent at ever faster rates. On the other hand, advances in computer sciences will lead to more powerful chips and hardware. The combination of these two elements will further boost the presence of these technologies in the scientific and industrial fields.
We are facing a technological ‘Big Bang’ that is rippling through every field and will continue to do so for decades to come. But I do not want to delve into philosophical abstractions such as the future of work and of human interaction, ‘singularities’ and so forth. Rather, I want to focus on something much more down-to-earth and important in the short term: what geopolitical consequences will these ever-present technologies have in the near future?
Some of our American colleagues like to spout the phrase “He who does not compute does not compete.” Applied to both companies and countries, this pearl of wisdom tells us that access to these technologies offers great competitive advantages.
Keenly aware of this, the United States, China and Japan have long drawn up their own roadmaps to win the race to get the world's most powerful computers (‘supercomputers’). In all three cases, their plans are realistic, precise and well-endowed with public R&D funds. Above all, they are all based on the ability to build these supercomputers with home-grown technology.
Why is this last point so relevant? It is not just about having access to machines that will soon be able to perform a million million million mathematical operations per second (that is, a one followed by 18 zeroes).
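As an illustrative back-of-the-envelope check (not from the article, and the 3 GHz single-core comparison is my own assumption), the figure quoted above works out as follows:

```python
# The article's "a million million million operations per second".
ops_per_second = (10**6) ** 3
assert ops_per_second == 10**18  # a one followed by 18 zeroes

# For scale (illustrative assumption): a single 3 GHz core executing
# one operation per cycle would need about a decade to match
# one second of work on such a machine.
single_core_ops_per_second = 3 * 10**9
seconds_needed = ops_per_second / single_core_ops_per_second
years_needed = seconds_needed / (365 * 24 * 3600)
print(round(years_needed, 1))  # roughly 10.6 years
```

This is only a rough sketch to convey the scale; real supercomputer performance depends on the workload and is measured with standard benchmarks rather than raw cycle counts.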
It is also about having sufficient computing capacity to guarantee a country's technological sovereignty. This is vital for ensuring its scientists have the ability to compete on a global scale, and that its industries have access to the latest chip models for sectors such as the automotive and manufacturing industries. What would happen to the European automobile or aeronautical industry if trade wars suddenly deprived them of the latest Intel, AMD or Nvidia chips?
This real threat has led the European Commission to take steps to ensure the EU is not left behind in this great strategic technology race. Together with all the member states, Brussels has launched an ambitious commitment: the European High-Performance Computing (EuroHPC) initiative. For the first time, it has funds that put it in the same league as other world powers. The project seeks to provide our scientists with the best computational capabilities, supported by technology developed in Europe, to guarantee we keep abreast of our competitors.
More good news is that Barcelona will play a big role in this initiative, which will be key to ensuring European science and industry thrive over the next few decades.
The commitment to parallel computing made at the UPC during the 1980s and 90s led to the birth of the Barcelona Supercomputing Centre (BSC) in 2004. Today, the BSC employs over 600 people across more than 40 research groups and hosts the MareNostrum4 supercomputer, sited in the Torre Girona chapel. It is the largest supercomputing centre in Europe, and is already preparing to host the new MareNostrum5 and to lead the development of European computing technology based on open architectures.
Only time will tell how we fare in this race, but one thing is clear: no European country can meet this challenge on its own. At a time when the so-called ‘European Project’ is plagued by doubts, supercomputing shows us one of the greatest virtues of the European Union: working together means we can meet challenges that would overwhelm us were we to act separately.