SCIENTIFIC COMPUTING WORLD
scientists and engineers
DEEP LEARNING, DEEP IMPACT?
Artificial intelligence and HPC development
www.scientific-computing.com
LEADER
SCIENTIFIC COMPUTING WORLD
Scientific Computing World is published by Europa Science Ltd, 4 Signet Court, Cambridge, CB5 8LA | ISSN 1744-8026
Tel: +44 (0) 1223 211170 | Fax: +44 (0) 1223 213385
Web: www.researchinformation.info | Subscribe for free at www.scientific-computing.com/subscribe
Managing director: Warren Clark
Cover: Immersion Imagery/Shutterstock.com
Emigsville, PA 17318-0437.
“The impact of AI in physics-based simulations”
Federico Carminati, project coordinator, CERN

…environment that supports traditional simulation-based research, as well as emerging data science and machine learning approaches in preparation for Aurora, the first US exascale supercomputer.

The field stagnated for a number of years due to a lack of computational power, coupled with promises made by many who entered the field but did not fully understand the limitations of the technology. As a result of many broken …

…of the word ‘learning’ for ANNs, although both are commonly used. For example, it has been known since the 1980s that ANNs do not ‘learn’ or ‘think’, but rather fit multidimensional surfaces which are used for inferencing.

[Caption] DOE objective: drive integration of simulation, data analytics and machine learning
…in his ISC18 talk on NLAFET that, ‘Linear algebra is both fundamental and ubiquitous in computational science and its vast application areas’. So many HPC applications rely heavily on BLAS that even a small performance increment can translate into huge savings in runtime across the aggregate HPC community.

The parallel Numerical Linear Algebra for Future Extreme Scale Systems (NLAFET) project is a high-profile example where methods and hardware popularised for AI applications – including DAGs (directed acyclic graphs) and reduced-precision hardware – are used to map BLAS operations to various hardware platforms, including CPUs, GPUs and FPGAs. Similarly, Alexander Heinecke (research scientist, Intel Labs) has created the open-source libXSMM library that can speed up batched small linear algebra operations, including techniques that use reduced-precision arithmetic operators while still preserving accuracy.

The advent of optimised reduced-precision hardware for AI has brought the question of ‘how much numerical precision is enough?’ to the attention of computer scientists, and is in some sense motivating the development of new numerical approaches, such as the efforts at King Abdullah University of Science and Technology (KAUST) to build an enhanced ecosystem of numerical tools. In particular, the HiCMA linear algebra library can operate on billion-by-billion matrices in workstations containing only gigabytes of memory.

Data flow architectures
New ideas, numerical methods, and programmable hardware devices such as FPGAs open the door to data flow architectures. Cloud providers such as Microsoft Azure are already using persistent neural networks for inference. In this case, the ANN is implemented directly on the FPGA – there is no program counter or executable as on a conventional CPU or GPU. Instead, data ‘flows’ through the computational elements on the FPGA to produce the desired output. The result is high performance, low latency and low power consumption.

While there is speculation about the adoption of non-von Neumann data flow architectures in HPC and exascale supercomputers, it is clear that scientists are currently laying the groundwork for the use of data flow architectures. This is a natural merging of the use of DAGs in projects such as NLAFET and programmable hardware such as FPGAs. As the ALCF posted, ‘Because current innovation is driven by collaboration among colleagues and between disciplines, the potential for discovery through the pragmatic application of new computational resources, coupled with unrestrained data flow, staggers the imagination.’

From new hardware to new approaches, the true impact of deep and machine learning on HPC is yet to be seen. In short, the biggest impact is not technological, but rather a change in mindset that is stimulating innovation and new approaches to decades-old technologies and problems. Thus, we are at the start of a point of inflection, brought about by the popularity of AI hardware, that a host of bright and innovative scientists are exploiting to bring about the convergence of AI and HPC. The ramifications are difficult to predict, but will be extraordinarily fun to see.

A fully referenced version of this feature is available online.

Rob Farber was a pioneer in the field of neural networks while on staff as a scientist in the theoretical division at Los Alamos National Laboratory. He works with companies and national laboratories as a consultant, and also teaches about HPC and AI technology worldwide. Rob can be reached at info@techenablement.com
HIGH PERFORMANCE COMPUTING
Exploring machine learning with FPGAs
BILL JENKINS, OF INTEL PROGRAMMABLE SYSTEMS GROUP, LOOKS AT THE ROLE OF FPGA TECHNOLOGY IN MACHINE LEARNING
The OpenMP API is an application programming interface (API) that gives parallel programmers a simple and flexible interface for developing parallel applications. The OpenMP community has made a multitude of requests since the OpenMP language introduced version 4.5 in 2015, and as a result, OpenMP 5.0 adds many new features that will be useful for highly parallel and complex applications. With version 5.0, the OpenMP API now covers the whole hardware spectrum, from embedded systems and accelerator devices to multicore and shared-memory systems. Vendors have started releasing reference implementations of parts of the standard, and user courses will soon be given at OpenMP workshops and major conferences. The OpenMP specification version 5.0 was released at SC18.

OpenMP users work in a wide range of fields, from automotive and aeronautics to biotech, automation, robotics and financial analysis. There were user requests to bring OpenMP to the embedded system space and the accelerator space. Also, there was an urgent need to bring OpenMP to the latest levels of the C, C++ and Fortran standards, and to have a standard interface for debugging and performance tools. Finally, there were user requests for improved portability and usability.

The OpenMP ARB, a group of major computer hardware and software vendors, as well as users, jointly developed version 5.0 of the OpenMP specification to fulfil these requests. In addition to several minor and major improvements, the updated specification includes the following features:

Full support for accelerator devices. With the latest additions, the OpenMP specification has full support for accelerator devices. These include mechanisms to require unified shared memory between the host system and coprocessor devices, the ability to use device-specific function implementations, better control of implicit data mapping, and the ability to override device offload at runtime. Reverse offload, implicit function generation, and the ability to easily copy object-oriented data structures are also supported.

Improved debugging and performance analysis. Two new tools interfaces enable intuitive debugging and support for deeper performance analysis.

Support for the latest versions of C, C++ and Fortran. The OpenMP API now supports important features of Fortran 2008, C11, C++14 and C++17.

Multilevel memory systems. Memory allocation mechanisms are available that place data in different types of memory, such as high-bandwidth memory. New OpenMP features also make it easier to deal with the NUMAness of modern HPC systems.

Greater memory flexibility. Support for acquire/release semantics to optimise low-level memory synchronisation.

Enhanced portability. The declare variant directive and a new metadirective enable performance portability through compile-time adaptation of OpenMP pragmas.

The specifications can be downloaded from the OpenMP website, and it is possible to participate in the discussions on the OpenMP Forum.

Implementations
The major vendors have implemented parts of the OpenMP 5.0 specification in their compiler products. GNU is the furthest along with its implementation in GCC, and plans to include quite a few features in the next release, GCC 9. In addition, the debugging and performance tools of the vendors are being extended with OpenMP 5.0 features. More information can be found on the Resources page of the OpenMP website, where you can also find links to OpenMP benchmarks and OpenMP research projects.

The OpenMP YouTube channel is a great place to find education videos, from basic entry-level material to advanced videos treating the different OpenMP 5.0 features. On the OpenMP website you will find links to tutorials. Advanced OpenMP courses treating elements of OpenMP 5.0 are being given at OpenMP workshops and the SC conferences (SC18 and SC19). The OpenMP workshops where courses on OpenMP 5.0 can be followed are the International Workshop on OpenMP (IWOMP), the OpenMP Users’ Conference (OpenMPCon) and the UK OpenMP Users’ Conference. Now that OpenMP 5.0 has been released, the basic OpenMP courses given at universities and other venues will be updated. In addition, a guide with OpenMP examples is available for download from the OpenMP website.
Ansys has developed a free benchmark tool that allows users to test their own simulation model on a small HPC cluster to demonstrate the benefits of scaling up an organisation’s computing infrastructure. This tool is aimed at driving adoption of HPC resources through HPC appliances and cloud-based solutions that Ansys offers in collaboration with its partners. This can benefit organisations that require HPC but do not have the time or inclination to set up, configure and operate an in-house cluster.

‘I personally have never come across an engineer yet who does not want more compute power. But many of them want to see proof that using more CPU cores on a more powerful machine is worth the investment,’ said Wim Slagter, director of HPC and cloud alliances at Ansys.

‘Engineers need to convince their boss, and possibly the purchaser of the organisation, and they can now get that proof for free through our benchmark program,’ said Slagter. ‘It was designed as an easy way for an engineer to see proof that their own Ansys model can be significantly sped up by using more CPU cores.’

Slagter noted that the concept came from a survey of more than 1,800 Ansys users. From the survey Ansys found that ‘customers said how often they are constrained by turnaround limitations. It was striking to me that a really large percentage of the respondents limit the size or amount of detail for nearly every simulation model,’ added Slagter.

According to the survey, 40 per cent of respondents limit detail in simulation models due to time constraints. The survey also reflects that, in many cases, limiting the size or amount of detail can result in lower-fidelity results less useful to respondents’ design experiments.

‘Engineers are running bigger models, bigger in terms of size and complexity, and they also have to run an increasing number of design variants to ensure product integrity and robustness. But customers and engineers are compute bound and constrained by their compute capacity. That is why we have established this free benchmark programme. Customers wonder about the performance of their simulation model. No matter how many HPC benchmarks we produce, engineers still want to know the performance of their own model,’ Slagter stated.

While limiting a model’s size or scale can help to reduce the simulation time, engineering organisations must compete in highly competitive markets which require a careful balance between performance, innovation and time-to-market for new products or components. The benchmark tool allows users to look at their own workloads to see how their current projects could be accelerated through the application of HPC resources, either on premise or in the cloud.

If engineers have to limit the size of a model, ‘then they are almost wasting their time because they have to change and re-mesh the model in order to be able to squeeze it onto the machine or to get … is not what they want. They clearly want to solve their engineering problem and not unnecessarily to spend time adapting the model to suit the computing resources available.’
AI advances healthcare research
SOPHIA KTORI EXPLORES THE ROLE OF AI AND DEEP LEARNING IN HEALTHCARE – IN THE FIRST OF A TWO-PART SERIES
…parameters, run the workflows, and then capture and analyse the images. ‘But using Imagence, one single biologist can now run the complete workflow from image production to the interpretation of results,’ Dr Steigele says.

‘It saves time, money, resources, and a multidisciplinary workforce. Labs can now effectively scale up throughput and data output massively; and whereas classical approaches to HCS require major computing clusters to analyse huge sets of images, we use some essentially very primitive hardware setups,’ he adds.

Imagence is the result of collaborations between Genedata and leaders in the biopharmaceutical industry. It was the basis for a Genedata project with AstraZeneca, Deep Learning for Phenotypic Image Analysis, which won the 2018 Bio-IT World Best Practices Award. ‘We wanted to remove the burden of classical analysis, in which humans have to think about, recognise and then … example of high-content screening). It’s the expertise of the biologist that drives the whole process, but the underlying complexity is hidden.’

Efficient generation of training data is key. As Genedata explains, the process involves generation of interactive, human-readable maps from images, in which similar-responding cells present in the images co-cluster and thereby allow a very fast exploration of the phenotypic space and the collection of training data for all phenotype classes of interest; a process typically applied during assay development and via deep learning, which is capable of detecting even very subtle differences. It’s then over to the biologist, who has the expertise to understand which phenotypes are important for the assay in development. Relevant training data are then curated in just a very few hours to train networks for production application on quantities of screening data (typically 50,000 to 2.5 million tested substances per screening campaign).

Advancing healthcare
The ability to apply deep learning techniques to typically workforce-intensive tasks in discovery and preclinical R&D will have important knock-on effects for personalised medicine and other areas of clinical practice, Dr Steigele believes. Large-scale assays using low-volume clinical samples, coupled with more consistent data extraction and analysis, will help to identify which, and how, drugs impact on subsets of patients, and make it faster and more cost-effective to apply technologies such as HCS for the fast-developing field of personalised medicine.

Come out of the lab and into the clinic and healthcare environment, and artificial intelligence is being harnessed to aid and speed diagnosis at the patient bedside. Transformative AI is harnessing artificial intelligence and analytical techniques developed at CERN and Cambridge University to generate predictive monitoring tools that they hope will ultimately save patients’ lives.

‘We believe that predictive analytics hold the key to transforming healthcare,’ states Dr Marek Sirendi, CEO and co-founder at Transformative AI. ‘We aim to transform the emergency medicine paradigm from rapid response to personalised preparation and prevention.’

The firm’s first product, designed for use in hospitals, analyses data from monitors to warn doctors in advance that the patient may be likely to suffer deadly cardiac arrhythmias. The algorithm detects tiny changes in physiology that are predictive of sudden cardiac death, triggering an alarm that gives doctors the opportunity to prepare for, and potentially prevent, the episode, explains Dr Sirendi.

Perhaps surprisingly, the Transformative AI algorithm used for predictive monitoring in hospital settings is based on decision algorithms developed and used at CERN’s Large Hadron Collider, which detect exotic proton-proton collision events in real time. ‘The algorithm employs a number of state-of-the-art deep learning models along with other machine learning frameworks. We present it with examples, and ask it to learn to recognise the cases of interest.’

This basic AI approach can thus be used in multiple fields, from business analytics to physics, or even predictive medicine. ‘AI is best thought of as an optimiser and a mathematical learning machine,’ Dr Sirendi continues. ‘It can streamline many processes encountered in the modern workplace, which can boost the productivity of existing business activities. But AI can also reveal unexpected insights from large, messy datasets, which opens up entirely new product categories. Our AI is designed to accomplish both tasks. It will make alarms more actionable and clinically relevant, while identifying something the human eye is unable to spot – subtle changes in human physiology that precede the onset of sudden cardiac arrest.’

And that predictive capacity can save lives, the company believes. Although doctors can give lifesaving CPR and defibrillation to patients already in cardiac arrest, the arrest occurs suddenly and there is no warning, so treatment is started … before they begin, giving doctors a chance to proactively manage this life-threatening condition.’

Importantly, the algorithm also reduces the number of false alarms. ‘Nurses are frustrated by an abundance of alarms, rather than a lack of them,’ Dr Sirendi adds. ‘We cut out 54 per cent of irrelevant alarms while making the remaining more clinically relevant. The key is to make technology smarter, so that people (the nurses, doctors, technicians) can rely on it to a greater extent.’

Ultimately, the decision on whether to act on a machine-derived prediction is up to the clinician, who will also have blood test and potentially other clinical data and results to inform their course of action. ‘We’re working with a number of hospitals, cardiologists and electrophysiologists,’ Dr Sirendi notes.

Engaging clinicians and healthcare providers will be key if AI is to be accepted into mainstream healthcare, the company believes. ‘To get healthcare providers excited about integrating AI into healthcare, new tools shouldn’t just replicate tasks that human healthcare workers are capable of. Rather, AI tools should provide novel insights that elevate the standard of clinical care.’
Data ecosystems in the cloud
Faisal Mushtaq explains the role of cloud informatics in overcoming the
challenges associated with modern pharmaceutical R&D
…compliance quickly and cost-effectively. Against this backdrop, biotech and pharmaceutical organisations need an integrated, open and secure informatics platform to utilise their growing volumes of R&D data in the most efficient way. However, for many organisations this isn’t the reality. While large numbers of biotech and pharmaceutical firms have made the leap from paper-based workflows to laboratory information management systems (LIMS) and electronic laboratory notebooks (ELNs), for a significant proportion these tools are not used in a joined-up, integrated way.

Instead, they’re employed as point solutions or assigned to specific workflows or departments – resulting in poor visibility over the full R&D pipeline, and severely limiting organisational output. Most importantly, when new technologies are brought on-line, these fragmented systems are often unable to cope. So, what’s the solution?

“The sustained growth in the volume of information generated by modern R&D workflows presents a challenge for biotech and pharmaceutical companies, in terms of organising and utilising these vast datasets”

Cloud-based informatics: The solution to expanding R&D pipelines
Cloud-based laboratory informatics platforms bring together R&D data, creating a single connected digital ecosystem for drug discovery, development and manufacture. In doing so, these systems make end-to-end pipeline data fully searchable, sharable, accessible and actionable, freeing organisations from the technical challenges associated with fragmented approaches. And because these tools are deployed through cloud-based architecture, they offer much greater flexibility and scalability compared to traditional in-house set-ups. In short, they offer the perfect solution to the challenge of managing an expanding information pipeline.

What’s more, because cloud-based platforms are developed and maintained by independent software vendors, many boast innovative workflow support tools that would be unfeasible to develop in-house. For instance, some of the latest platforms incorporate artificial intelligence (AI) functionality, empowering organisations with highly effective reporting and trending capabilities. AI algorithms can, for example, analyse complex unstructured biological data in real time using machine learning, natural language processing and text analytics, enabling faster and smarter decision-making. Additionally, AI frees experienced scientists from some of the more routine and time-consuming data analysis responsibilities, so they can put their expertise to better use driving innovation within the R&D pipeline.

Cloud-based solutions also offer enhanced data sharing functionality, an increasingly important requirement for modern pharmaceutical data and analytics platforms. The last decade has witnessed a tangible shift towards more collaborative working practices, as stakeholders recognise that many of today’s most pressing healthcare challenges require knowledge and expertise from across the industry. Equally, growth in the contract research and manufacturing sector means that organisational partnerships are becoming the new norm.

To thrive, biotech and pharmaceutical companies need platform-based approaches that make securely sharing large datasets as simple as sending an email. Cloud-based platforms make data retrieval and distribution fast and straightforward, providing a solid framework on which to build successful partnerships. By giving authorised users real-time access to pipeline data – from genomics datasets through to chromatography method parameters – these platforms can streamline workflows, improve communication and accelerate R&D innovation at the click of a button.

Managing the R&D pipelines of tomorrow
Developing, implementing and validating new IT infrastructure in-house can often be complex, expensive and resource-intensive. In contrast, the latest solutions make migrating to a cloud-based informatics platform simple and straightforward. In particular, systems based on modular frameworks, such as Thermo Fisher Platform for Science software, offer additional flexibility and are capable of seamlessly integrating data from existing LIMS and ELN systems in a way that is consistent with an organisation’s growth needs. Even when a complete system re-design is needed, data can be carried over in full from the previous framework. Alternatively, if partial upgrades to an existing platform are required to bring additional capabilities on-stream, new features can be added on without fundamental IT infrastructure having to be replaced.

One of the most important features of cloud-based platforms is the ability to scale and adapt to users’ needs. Some providers offer application libraries that allow laboratories to install new features, tools and interfaces as their needs evolve and their pipelines grow. Because these pre-configured modular applications are designed to comply with the latest industry best practice and regulatory requirements, they are ready to be used alongside existing LIMS and ELNs from the moment they are installed. By supporting organisations through this flexible approach, cloud-based platform solutions can help laboratories tackle the R&D challenges that are most relevant to them, in the most cost-effective and scalable way.

Biotech and pharmaceutical companies are under continued pressure to find informatics solutions to manage the mountains of multi-dimensional data generated by their R&D workflows. To maintain the competitive advantage moving forwards, these platforms must make searching, sharing and manipulating large datasets quick and efficient, and above all, they must be capable of evolving with the rapidly changing drug development landscape. As a result, many future-savvy organisations are implementing cloud-based informatics platforms to create a single, integrated digital ecosystem for their R&D data and analytics. These modern tools for data and analytics are bringing together R&D streams and overcoming the limitations associated with fragmented approaches, helping organisations to achieve greater pipeline oversight, boost efficiency and drive faster, more effective decision-making.

Faisal Mushtaq is the Vice President and General Manager of the Digital Science business unit at Thermo Fisher Scientific. Faisal began his career developing software solutions for healthcare providers. In recent years, Faisal has transitioned to executive management roles at firms that deliver focussed, software-as-a-service solutions.
Decentralised systems
buck data-sharing trend
car model application. This was done network to bypass the need for
ADRIAN GIORDANI REPORTS using a complex machine learning network middlemen between permitted parties
ON THE USE OF BLOCKCHAIN to enable a four-door sedan Porsche within the network, saving time. It
IN THE AUTOMOTIVE Panamera to act as the software client algorithmically decides on the contents
INDUSTRY independently of backend database of the next ‘block’ in the blockchain
support, with overall control managed with a proprietary-created consensus
Until the first half of 2018, automotive manufacturers struggled to integrate a smartphone-based system into their new car models. Communication between an external network and a car would only work if the manufacturer's central database was always online. However, systems that increase trust and remove the concept of third parties within a network, known as distributed ledger technologies (DLTs), are growing at a fast pace. This is disrupting centralised data management approaches, at both a commercial and an academic level.

In July at SEMICON West in San Francisco, US, the international exhibition for the global microelectronics industries, analysts touted that the automotive sector – particularly self-driving cars, in addition to artificial intelligence and high-powered computing – is driving the next boom in semiconductors. Car chips, for example, use the same amount of silicon as five to 15 iPhones.

In addition to a rise in computational demand, players in the decentralised data movement are working in partnerships to solve problems and explore opportunities. This year XAIN, based near Berlin in Germany, solved trust between a database of its automotive client Porsche and its use by the driver through a blockchain application, a type of distributed ledger technology. XAIN successfully tested its technology to remotely open and close the Panamera up to six times faster than before (1.6 seconds).

'However, we need to differentiate here that other automotive manufacturers were previously integrating a blockchain wallet into a car, which only stores private [cryptographic] keys so that the car can, for example, pursue payments,' said Leif-Nissen Lundbæk, CEO, chairman and co-founder of XAIN. 'In our case, we integrated a full client in a car that also verifies all communication, so that the blockchain is not part of the backend system, but actually the car itself is the blockchain.'

His work focuses on AI algorithms for distributed cryptographic systems. In fact, XAIN originated as a spin-off from his academic work at Oxford University and Imperial College London. His firm's mission is an eXpandable Artificial Intelligence Network (XAIN), using blockchain systems as an open or permission-based system that enables connectivity to internet-connected devices. These include the electronic control units (ECUs) in cars that control electrical systems or subsystems. XAIN cooperates with semiconductor companies and integrates them with automotive microcontrollers from other vendors.

Smart contracts are used by the network through a consensus mechanism called 'Proof of Kernel Work' (PoKW). Only specific users or drivers are given rights to the car for opening or locking the doors or starting the engine. XAIN's network runs on the Ethereum blockchain, an open-source distributed computing platform. Its uses range from decentralised video streaming services to helping distribute food vouchers and mobile banking in third world countries.

'Here security and incentivisation are really the key points. The most powerful techniques are open consensus methods, which protect from changes while also allowing for incentivising all sorts of processes, ranging from the consensus itself to standardisation processes,' said Lundbæk.

The key layer of XAIN's novel approach is the distributed ledger technology, or DLT, protocol. It enables a special form of electronic data processing and storage. Importantly, a DLT is a database that exists across several locations, nodes
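The access-control idea described here, where only specific users or drivers are granted rights to unlock the doors or start the engine, can be sketched in a few lines. This is a hypothetical illustration, not XAIN's PoKW protocol: the whitelist, keys and command names are invented for the example, and a real system would use public-key signatures verified against the ledger rather than shared HMAC secrets.

```python
import hashlib
import hmac

# Hypothetical per-driver secret keys registered on the ledger (invented data).
AUTHORISED_KEYS = {
    "driver-alice": b"alice-secret-key",
}

PERMITTED_COMMANDS = {"unlock_doors", "lock_doors", "start_engine"}

def sign(key: bytes, command: str) -> str:
    """Sign a command with the driver's key (stand-in for a real signature)."""
    return hmac.new(key, command.encode(), hashlib.sha256).hexdigest()

def authorise(driver_id: str, command: str, signature: str) -> bool:
    """Grant the command only to whitelisted drivers with a valid signature."""
    key = AUTHORISED_KEYS.get(driver_id)
    if key is None or command not in PERMITTED_COMMANDS:
        return False
    return hmac.compare_digest(sign(key, command), signature)

sig = sign(AUTHORISED_KEYS["driver-alice"], "unlock_doors")
print(authorise("driver-alice", "unlock_doors", sig))  # True
print(authorise("driver-bob", "unlock_doors", sig))    # False: not whitelisted
```

The point of the sketch is only the shape of the check: every command is verified against rights recorded for that identity, rather than trusting whatever a central server happens to say.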
or participants. A blockchain is just one specific type of DLT. Not all DLTs use a chain of blocks, for example, and not all distributed databases are DLTs – the trust boundaries are different.

A DLT delegates access to functions and data over multiple parties through secure cryptographic principles. This platform works in tandem with Porsche's traditional centralised servers, but the novel network becomes part of the backend with better trust and security, and gives network users a new level of access between 'read' and 'write' permissions and more.

DLTs are most famously known for underpinning the digital currency Bitcoin. In 2018, Bitcoin mining represented roughly 0.6 per cent of global energy demand – equivalent to Argentina's consumption. Even though Bitcoin mining consumes lots of energy, a DLT network controlled algorithmically does not necessarily have to consume more energy per node.

'First of all, I think blockchain is outdated. Blockchain became a buzzword; we even used it in our company name,' said Robert Küfner, advisor to Advanced Blockchain, a German publicly listed company focusing on the design, development and deployment of DLTs.

'Using more energy in order to solve a problem is simply wrong,' said Küfner. 'All of this will be obsolete because code can replace the work that currently requires lots of energy. This goes back to what we are doing in Advanced Blockchain.

'If we look at distributed ledger technology, blockchain is just one... the majority of people and start-ups are focussing on Fintech, everything around finance and money. But there are a few industries with a higher potential that will disrupt sooner than others; those industries that involve a middle party have a problem with the decentralised movement,' said Küfner.

The automotive sector is one such area. A DLT can enable better oversight of odometer fraud for car buyers, such as clocking or busting miles – the illegal practice of rolling back odometers to make it appear that vehicles have travelled lower distances, costing millions.

'The vehicle identification number, the time stamp and actual mileage of the vehicle will be uploaded and stored to a ledger,' said Küfner, 'which cannot be compromised. By doing so, you have a digital twin logbook of the car.'

Opening up to more reproducible scientific research
A distributed approach to data also opens up possibilities for academia. Researchers at the University of California San Diego, US, were awarded a three-year National Science Foundation grant of $818,000 to design and develop an infrastructure of open-source distributed ledger technologies to enable researchers to efficiently share information about their scientific data, while preserving the original information. The hope is that researchers will be able to efficiently access and securely verify data, according to the SDSC press release.

Known as the Open Science Chain, this is not a supercomputer-related project, according to Subhashini Sivagnanam, principal investigator for the grant and a scientific computing specialist with SDSC's data-enabled scientific computing division. This infrastructure will be integrated with actual scientific datasets. The Open Science Chain aims to increase productivity, security and reproducibility. As the datasets change over time, new information will be appended to the chain.

'My first impressions are that it may help part of the problem – validating and verifying the data,' said Les Hatton, emeritus professor of forensic software engineering at Kingston University, London, UK. Hatton, who is not involved in this research, said: 'However, nothing is said about the software which analyses that data. Full computational reproducibility depends on the whole package: data, analysis software, glue software and testing software.'

However, the large-scale challenge of reproducing scientific data is bigger than just one technical approach, Hatton states.
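Both tamper-evident ledgers described here, Küfner's 'digital twin logbook' of vehicle identification number, timestamp and mileage, and the Open Science Chain's record of changing datasets, rest on the same simple structure: each appended record carries the hash of its predecessor, so altering any earlier entry invalidates everything after it. A minimal, hypothetical sketch (not either project's actual implementation; the VIN and figures are invented):

```python
import hashlib
import json

def add_record(chain, record):
    """Append a record whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True) + prev_hash
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; any rolled-back mileage breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True) + prev_hash
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
add_record(log, {"vin": "WP0-EXAMPLE-0001", "time": "2018-06-01", "km": 12000})
add_record(log, {"vin": "WP0-EXAMPLE-0001", "time": "2018-12-01", "km": 19500})
print(verify(log))            # True
log[0]["record"]["km"] = 900  # 'clocking' the odometer...
print(verify(log))            # False: tampering is detected
```

A distributed ledger adds consensus and replication on top of this, but the detection of a rolled-back odometer or a silently edited dataset comes from the hash chaining alone.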
Enhanced security
Advocates of DLTs say that they enable better security between parties in the network by decreasing the chances of a malicious outside attack. For example, an automotive manufacturer stops being a single entry point; but DLTs are not a panacea and must not be used in isolation.

'It would be wrong to use a blockchain to store data, for reasons of scalability and post-quantum security, so it is used rather as a synchronisation proof mechanism and user registry,' said Lundbæk.

These architectures can be used with traditional centralised or decentralised databases to enhance resilience to hacks. However, any executable program running within that network could also be a security risk open to hackers.

'A lot of the most impactful advances are algorithmic in nature, instead of a matter of scale,' said André Platzer, an associate professor in the computer science department at Carnegie Mellon University, Pennsylvania, US. His work focusses on the general principles for designing motion or other physical processes in cyber-physical systems, such as surgical robots, aircraft or self-driving car applications.

'The most subtle but impactful challenge in all these systems is the identification of which actions can safely be taken under what circumstance and why,' said Platzer. 'For example, when should an aircraft climb and when should it descend instead; or, when can a car accelerate or coast, or even brake.'

Platzer and his colleagues develop methods that enforce safety in reinforcement learning algorithms. For example, they use programming principles with an automated pipeline approach called 'VeriPhy' (verified controller executables from verified cyber-physical system models). It provides a safe interaction between the code and actual physics to generate executables that perform in exactly the same way that the original models and algorithms were supposed to execute. Platzer says that this is a more disciplined approach to address security concerns.

'Privacy is another matter and, indeed, it's not clear if the world is better off favouring privacy over exchange of information – in case only the latter can prevent collisions of cars. Privacy clearly is something to be thought about carefully,' said Platzer.

Privacy in a networked world
The growth of automation and distributed data exchanges heightens concerns over personal data threats and control. The European Union's General Data Protection Regulation (GDPR) came into effect in May 2018 for the 28 member states, including businesses. The GDPR's main goals are stronger rules on data protection, so individuals have more control over their personal data and businesses have a level playing field for processing private data.

The use of cryptography within DLTs can also support GDPR principles. All personal data is encrypted and can be stored in individual file storage.

'A GDPR-compliant process – specific to a given use case – can then be defined, individualised and built into the user interface by mapping out or combining processes for consent, access, update and erasure policy,' said Lundbæk. 'The user can then decide whether or not to grant access to this data to the manufacturer or other parties.'

Now XAIN is testing other applications with Porsche vehicles. These include real-time notifications for drivers about third-party car access, granting remote access to a parked car for secure delivery of packages, and unlocking or locking a car with a blockchain-powered offline connection, with no server connection. Porsche benefits from increased trust in vehicle data, audited information for reports, and local data access for predictive maintenance.

XAIN is now working on integrations in the cars of manufacturers representing roughly 39 per cent of the world's vehicle production, said Lundbæk. 'We are working with Infineon, a leading microcontroller manufacturer, to embed our protocol on their devices to make our solution easier to adapt in vehicles, and more secure using the trusted microcontroller environment as an accelerator.'

Based on these early adopters, could a DLT approach make centralised approaches go the way of the minidisc or the eight-track tape?

'Traditional databases will likely continue to be the standard and have their role until these protocols are tested at production level,' said Lundbæk. 'This process has just started and it is still very challenging for centralised businesses to think this way, but it is mostly the case that they fear that their business will end if they don't change.'

According to Küfner, the rise of DLTs compares with the TCP/IP protocol that underpins email exchanges. In the 90s, when people tried to explain to others how to create and use an email address, there was little knowledge available. Küfner sees DLTs moving much faster.

As it progresses, blockchain suffers from issues such as block response times and block sizes. Advanced Blockchain is focussed on implementing another type of DLT that is practical and scalable, called a directed acyclic graph (DAG).

'These issues are not sustainable and blockchain will be replaced by a better version called DAG,' said Küfner. 'If you have a fully functional DAG, imagine all the energy that you can save. So DLT will be fantastic for climate change, because it will reduce the amount of energy that is consumed by all the processes that are currently running.'

Large organisations or companies, like large cruise ships or tankers, are floating on an ocean of possibilities. Some are actively directing their paths straight into the currents of decentralised data management streams more than others. The market is ambitious.

'There is no one in DLT who is a professional and has proven themselves to know what they are doing, because the industry is so young,' Küfner said. 'In fact, I would not call this an industry, rather a movement, because it affects human interactions; that is why it is not a business. It is the beginning of a change to socio-economic behaviour and will evolve in the near future.'
Intelligent Optimisation

GEMMA CHURCH LOOKS AT THE ROLE OF OPTIMISATION SOFTWARE IN MODERN ENGINEERING PRACTICES

Optimisation is everywhere. It's prevalent in our day-to-day decisions as we choose the fastest route to work, the cheapest product or service for our needs, or the healthiest snack (or, maybe, the tastiest) when hunger arises.

It's also inherent to optimise the designs, processes and services that shape our lives through a range of evolving simulation techniques and applications. As such, optimisation is finding a new place in the additive manufacturing market and is helping manufacturers meet rising demands for lighter and more energy-efficient products by optimising mass distributions.

Bjorn Sjodin, VP of product management at Comsol, explained: '3D printing, or additive manufacturing, is probably the current strongest trend within the optimisation space. Here, shape and topology optimisation are important methods, and they result in new shapes and designs that the human mind could never envision without the help of computational tools. It is an exciting area that we are very active in, with respect to customer interactions and implementation of new tools for the future.'

Consequently, topology optimisation is finding a new lease of life in the 3D printing and additive manufacturing markets. Jeffrey Brennan, chief marketing officer at Altair, explained: 'Topology optimisation, with its generation of efficient, non-traditional, organic-like designs, serves as a perfect complement for 3D printing, given the manufacturing flexibility that it offers. But it goes beyond that to include the design and optimisation of even complex lattice structures using different optimisation disciplines.'

However, tolerances and precision in the aerospace industry (where optimisation is an established tool) are astoundingly tight because of the high degree of regulation and safety required. As a result, errors and deformation from thermal cycling that occur on the micron scale during manufacturing processes can render a part unusable.

The Manufacturing Technology Centre (MTC) is working to optimise the achieved precision in its additive processes and overcome this challenge. Using Comsol Multiphysics modelling and simulation apps, MTC has created an app that predicts the deformation of parts and allows designers to build deformations directly into their designs.

The implications of such apps are wider reaching, according to Sjodin, who added: 'The field of simulation apps also opens up the door for a variety of optimisation opportunities, such as supporting a sales team, or product optimisation with non-simulation experts, to name a few.'

Digital twins
Optimisation is now helping companies create digital twins with the integration of machine learning techniques. Brennan explained: 'As things mature, optimisation algorithms will combine with learning algorithms to improve product design.

'For example, if we have a product out in the field, such as a wind turbine, data will be gathered on its performance in various conditions. This puts a load on that device, which starts to make the baseline design struggle for its normal longevity of product duration, particularly during adverse conditions.'

Brennan added: 'That data can be used the next time a structure needs to be placed in that same place, under the same conditions. That new structure will then withstand that environment better because of the information provided to the digital twin version with data from the field. That's a form of optimisation that takes place over years – after the product leaves the manufacturing plant and is in service. Product improvement still hasn't stopped, because the data is taken from the environment of true operation to inform the next generation of that product.'

The latest Altair 365 and Altair Inspire platform images
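The 'optimisation that takes place over years' which Brennan describes can be illustrated with a toy calculation: loads measured on a deployed product feed back into the sizing of the next design. All figures and the simple mean-plus-three-sigma margin rule below are invented for illustration and are not Altair's method.

```python
import statistics

# Toy digital-twin feedback loop: all numbers are invented for illustration.
baseline_design_load = 1000.0  # load (kN) the original design assumed
field_loads = [920.0, 1010.0, 1180.0, 870.0, 1240.0, 990.0, 1105.0]

# Fit a simple normal model to the loads measured in service...
mu = statistics.mean(field_loads)
sigma = statistics.stdev(field_loads)

# ...and size the next generation for mean + 3 sigma, a common margin rule.
next_design_load = mu + 3.0 * sigma

print(f"field mean {mu:.0f} kN, spread {sigma:.0f} kN")
print(f"baseline {baseline_design_load:.0f} kN "
      f"-> next design {next_design_load:.0f} kN")
```

Here the field data reveals peaks the baseline never anticipated, so the next design is sized higher; in practice this feedback runs through full simulation models rather than a one-line statistic, but the loop is the same.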
Optimised, varying-density lattice geometry created using Ansys Mechanical, showing displacement
Engineering expertise
Esteco has been working in the optimisation field for nearly two decades, with a focus on real-life engineering problems. It promotes an approach based on 'optimisation-driven design', where users can advance their design process with intelligent algorithms which utilise machine learning techniques. These algorithms 'understand' each specific problem and find optimal designs in less time, at lower cost and using fewer computational resources than traditional techniques.

Matteo Nicolich, Volta product manager at Esteco, explained: 'When building workflows composed of complex software chains, having a tool that automates all the repetitive work needed to interface one software with the other, or to interface the output of one software with the input of the following one, is essential.'

Shape optimisation analysis of a mounting bracket created using Comsol Multiphysics software
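The workflow automation Nicolich describes, wiring the output of one simulation tool into the input of the next, is easy to sketch. The pipeline runner below is hypothetical: the step names and the toy geometry and stress formulas are invented, and a real chain would wrap CAD, meshing and solver executables rather than Python functions.

```python
# Hypothetical workflow runner: each step's output dict feeds the next step.
def generate_geometry(params):
    # Invented stand-in for a CAD/geometry tool.
    return {**params, "volume_m3": params["length_m"] * params["width_m"] * 0.1}

def run_solver(params):
    # Invented stand-in for a physics solver.
    return {**params, "max_stress_mpa": 50.0 / params["volume_m3"]}

def postprocess(params):
    # Invented pass/fail check on the solver output.
    return {**params, "passed": params["max_stress_mpa"] < 200.0}

def run_chain(steps, params):
    """Automate the repetitive hand-offs between the tools in the chain."""
    for step in steps:
        params = step(params)
        print(f"{step.__name__}: {params}")
    return params

result = run_chain([generate_geometry, run_solver, postprocess],
                   {"length_m": 2.0, "width_m": 1.5})
```

Once the hand-offs are automated like this, an optimiser can call the whole chain as a single function of the design parameters, which is what makes workflow automation a prerequisite for design optimisation.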
Esteco also focuses on parametric optimisation. 'This type of optimisation – based on the management of 'free', user-defined parameters – allows us to apply the same techniques to a significantly more vast spectrum of problems compared to topological optimisation, which remains limited to geometry problems,' Nicolich added.

Esteco's optimisation tools have been used as part of the GasVessel project in Cyprus, where researchers are developing a prototype tank containment system, which will be installed on a vessel and used to transport compressed natural gas (CNG). The Volta platform has incorporated a deterministic simulation to generate results as cost per cubic unit.

Athos Kleanthous, commercial analyst at Cyprus Hydrocarbons Company (CHC), explained: 'More specifically, we are running a simulation of a vessel, which lifts a cargo from point A and has to travel a certain number of nautical miles to point B, where it should discharge the cargo.'

The simulation needs a number of variables, which will be used by Volta, either as range limits or specific predetermined sets. Kleanthous said: 'Volta has incorporated those variables into easily editable number sets. Volta has given us the opportunity to run optimisation and sensitivity analysis for the various scenarios our vessel was tested on.'

The cargo capacity, the vessel's speed, distance travelled, and supply and demand variables have all been optimised using the Volta platform. CHC is working with Esteco to add new features or change specific processes in the simulation algorithm. It also expects to integrate Volta with its other projects.

Kleanthous said: 'Volta has helped identify the smallest vessel capacity possible for a specific route, the least number of ships employed in the specific supply chain, and the optimal speed to achieve on-time deliveries of the cargo in transit. The results Volta has returned have shed light for the development team on the directions to follow, in order to deliver a viable project.

'Besides the algorithm and the results itself, Volta is a web-based tool, which is easily accessible through any internet browser, having inputs in a simple format, and also offers an extremely simple user interface,' Kleanthous added.

Autonomous future
Esteco is now focusing on its set of smart optimisation algorithms, which come with the possibility to run in 'autonomous' mode. 'We provide our customers with algorithms that are not only able to deploy multiple strategies at once to the same engineering problem, but are also able to learn from the problem itself and adapt accordingly.

'This is not about optimisation for dummies; it is about obtaining viable and accurate insights with little time or information about the product at hand.'

However, such automation opens up another can of worms: management of the huge swathes of data that result. 'Together with the management of simulation processes, there is the big issue of managing the data and all the work in progress that is related to the construction and creation of process automation, and all the data that are generated upon the execution of these processes,' Nicolich added.

Esteco has developed a Simulation Data Management tool, which allows users to keep track of all the information related to a simulation process, so they can cycle back along the history to see how a single piece of information was created, by which data and which model.

Nicolich said: 'With the Engineering Data Intelligence tool, we want the set of functionalities to understand and mine information in this potentially huge amount of engineering data, and extract information on how to predict the next
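Parametric optimisation of the kind described for the GasVessel study, with variables given as range limits and an objective of cost per cubic unit, can be sketched with a simple search. The cost model, ranges, delivery deadline and constants below are invented for illustration and have no relation to CHC's actual simulation; real tools like Volta deploy far smarter strategies than a grid sweep.

```python
import itertools

# Invented toy cost model: cost per cubic metre of gas delivered falls with
# larger vessels (economies of scale) but rises with speed (fuel burn).
def cost_per_m3(capacity_m3, speed_kn, distance_nm=500.0, deadline_h=36.0):
    voyage_h = distance_nm / speed_kn
    if voyage_h > deadline_h:                  # misses the delivery window
        return float("inf")
    fuel_cost = 0.05 * speed_kn**2 * voyage_h  # fuel grows roughly as speed^2
    fixed_cost = 2000.0 + 0.5 * capacity_m3    # build/charter share
    return (fuel_cost + fixed_cost) / capacity_m3

# Variables expressed as range limits, swept on a coarse grid.
capacities = range(2000, 10001, 1000)   # cargo capacity, m3
speeds = range(10, 21)                  # vessel speed, knots

best = min(itertools.product(capacities, speeds),
           key=lambda cs: cost_per_m3(*cs))
print(f"best capacity {best[0]} m3 at {best[1]} kn, "
      f"cost {cost_per_m3(*best):.3f} per m3")
```

Even this toy version shows the trade-off the article describes: the on-time constraint rules out the slowest (cheapest-per-mile) speeds, and the optimiser settles on the largest feasible vessel at the lowest speed that still meets the delivery window.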