High Performance Computing has primarily been applied within scientific research, the financial world and weather forecasting. The financial services industry was the first commercial industry to adopt HPC. Nowadays you will find the most advanced HPC systems trading stocks in nanoseconds and generating massive amounts of money for their owners.
Banks have used HPC for pricing exotic financial instruments, optimizing their portfolios of mortgage-backed securities and managing firm-wide global credit and debt risks. But there are more use cases for HPC in combination with Smart Data:
Fraud Detection
Fraud detection is extremely important for the financial industry. Millions of dollars could be saved if suspicious transactions are stopped before they complete. To do this, millions of transactions across the globe must be analysed in real time. The challenge is that algorithms need to find unknown patterns in that data which could indicate a fraudulent transaction.
This requires analysing not only millions of transactions, but also many other data sources that provide the right context for each transaction. Whether a certain credit card transaction is valid, for example, depends not only on the transaction itself, but also on its location and timing.
One company that deals with these volumes is PayPal. On a daily basis, it handles over 10 million logins and 13 million transactions, with 300 variables calculated per event to spot a potentially fraudulent transaction. Thanks to High Performance Data Analytics, PayPal saved over $700 million in its first year of production by catching fraudulent transactions it would not have detected previously.
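The kind of contextual check described above can be illustrated with a minimal sketch. This is not PayPal's system: the variables, thresholds and scoring are all hypothetical, and a real deployment computes hundreds of such variables per event across millions of transactions. The idea is simply that a transaction is scored against the cardholder's own history, so that an unusual amount, country or hour each raises suspicion.

```python
# Minimal sketch of contextual fraud scoring (all variables and
# thresholds are hypothetical, for illustration only).
from statistics import mean, stdev

def risk_score(txn, history):
    """Score a transaction against the cardholder's recent history."""
    score = 0
    amounts = [t["amount"] for t in history]
    # Flag amounts far above the cardholder's usual spending.
    if len(amounts) > 1 and txn["amount"] > mean(amounts) + 3 * stdev(amounts):
        score += 1
    # Flag a country never seen before in the history.
    if txn["country"] not in {t["country"] for t in history}:
        score += 1
    # Flag transactions in the middle of the night (local time).
    if txn["hour"] < 5:
        score += 1
    return score

history = [
    {"amount": 40, "country": "NL", "hour": 14},
    {"amount": 55, "country": "NL", "hour": 10},
    {"amount": 35, "country": "NL", "hour": 18},
]
suspicious = {"amount": 900, "country": "BR", "hour": 3}
print(risk_score(suspicious, history))  # prints 3 -> hold for review
```

Each check is independent, which is what makes the workload suited to HPC: scoring millions of concurrent transactions is an embarrassingly parallel problem.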
Personalized Medicines and Drug Discovery
I already mentioned the possibilities of personalized medicines and how they require HPC. Although personalized medicines might still be quite far away, the pharmaceutical industry is already using High Performance Computing for drug design and discovery.
Drug design and discovery is a tedious process. It can easily take several years or longer before a drug hits the market, due to rigorous testing in labs, first on animals and later on humans, before it is made available to the masses.
High Performance Computing in combination with Big Data enables the pharmaceutical industry to identify, for example, the right candidate compounds for a certain drug target among millions of possibilities. This can be done through simulation analysis, testing a plethora of variations on thousands of different virtual patients. Drug discovery can thus be shortened by multiple years, eventually saving a lot of lives.
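A toy sketch of this screening idea, under loud assumptions: the scoring function below is an invented stand-in (real pipelines compute binding energies via docking or molecular dynamics simulations), and the compound "features" are made up. What the sketch does capture is the shape of the workload: score each candidate independently against a target and keep the best ones, one simulation job per compound across a cluster.

```python
# Toy virtual-screening sketch; binding_score is a hypothetical
# stand-in for a real docking or molecular dynamics simulation.

def binding_score(compound, target):
    """Toy score: count features the compound shares with the target site."""
    return len(set(compound["features"]) & set(target["site_features"]))

def screen(compounds, target, top_n=2):
    """Rank compounds by score; on a cluster, one independent job each."""
    scored = [(binding_score(c, target), c["name"]) for c in compounds]
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_n]]

target = {"site_features": {"h_donor", "aromatic", "charged"}}
library = [
    {"name": "cmpd-001", "features": {"aromatic"}},
    {"name": "cmpd-002", "features": {"h_donor", "aromatic", "charged"}},
    {"name": "cmpd-003", "features": {"h_donor", "charged"}},
]
print(screen(library, target))  # best-matching candidates go to the lab
```

With millions of compounds instead of three, the ranking step is trivial but the per-compound simulations are not, which is where the HPC capacity goes.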
Smart Energy Grids
Although truly smart energy grids might still be a long way off, multiple energy companies are already experimenting with the possibilities. The potential of a truly smart energy grid is tremendous: reduced energy consumption, better and safer electrical grids, and a cleaner environment. Thanks to HPC, smart energy grids are becoming a reality.
A smart energy grid deals with massive amounts of data from a wide variety of sources, all located in a highly dispersed network. Imagine that every household had a truly smart energy meter that measured, at every single moment, how much energy it uses, from which devices, for how long and at what price. Multiplied by millions of households, we are easily talking about exabytes of data that need to be analysed in real time, for instance to determine whether someone is allowed to charge an electric car at that moment. That can only be done with an HPC infrastructure and a High Performance Data Analytics environment.
Manufacturing Simulation Analysis
From the beginning, HPC has been involved in the modelling and simulation of complex physical and quasi-physical systems. Modelling and simulation analysis enable an organization to gain a better understanding of a product without the need to actually test it in real life. Thanks to this approach, for example, Tesla was able to have the early edition of the Tesla Roadster pass dozens of tests without crashing dozens of cars (as is traditionally done in the automotive industry).
What Tesla did while developing the Tesla Roadster has since become common practice for the company. In other industries, too, simulation analysis can save companies a lot of money when developing new products. Thousands of iterations can quickly be tested, taking into account hundreds of different variables. Based on the outcome of each simulation, the eventual product is improved. High Performance Computing is required to perform these simulations, at least if you want to get them done relatively quickly.
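The iterate-and-improve loop above amounts to a parameter sweep, sketched below with a deliberately toy objective: the `simulate` function is a placeholder (real runs are physics simulations such as crash or structural analyses), and the design variables are invented. Every combination is evaluated and the best design feeds the next iteration; since each evaluation is independent, the combinations can be spread across a cluster.

```python
# Toy design sweep; "simulate" is a hypothetical stand-in for a
# physics simulation, and the parameters are invented.
from itertools import product

def simulate(thickness_mm, rib_count):
    """Placeholder objective: trade strength against added weight."""
    strength = thickness_mm * 10 + rib_count * 5
    weight_penalty = thickness_mm * 4 + rib_count * 2
    return strength - weight_penalty

def sweep(thicknesses, rib_counts):
    """Evaluate every combination; return the best (score, params) pair."""
    results = (
        (simulate(t, r), (t, r)) for t, r in product(thicknesses, rib_counts)
    )
    return max(results)

best_score, best_params = sweep([1.0, 2.0, 3.0], [2, 4, 6])
print(best_params)  # design chosen for the next iteration
```

In practice each `simulate` call is hours of compute rather than microseconds, so evaluating thousands of combinations in useful time is precisely what demands an HPC cluster.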
Of course, these four industries are just the beginning of what is possible. Basically, any organization that wants to do serious business with Big Data could use HPC and HPDA.
Other industries that already use High Performance Computing include, among others, the space industry, engineering (oil and gas), defence, publishing, governments (think of the NSA) and of course academia. In the years to come, this list will probably become a lot longer.
Author: Mark van Rijmenam
Mark van Rijmenam is Founder of Datafloq. He is a highly sought-after international public speaker, a Big Data strategist and author of the best-selling book Think Bigger – Developing a Successful Big Data Strategy for Your Business. He has been named a global top 10 Big Data influencer. He is also pursuing a PhD at the University of Technology, Sydney, on how organizations should deal with Big Data, Blockchain and (Human) AI.