Keynote Speakers


CHRISTOPHE CLARAMUNT

Shanghai Maritime University (China) & Naval Academy Research Institute (France)

"Big Data, Spatial Data and Social Networks over the Web"

Abstract: Nowadays, the Web and other information and exchange media such as Twitter, Snapchat and many others provide novel opportunities for the analysis of human interactions and behaviors. This talk will introduce several examples of recent work that treat these information sources as big data systems offering new avenues for social and economic studies. I will first introduce a graph-based computational modelling approach that derives the main structural, temporal and spatial properties of an implicit research community exhibited by a series of conferences over the Web. Next, I will present a study that explores large crowd behavior at the regional scale using a large geo-tagged Twitter dataset. The main idea behind this computational study is to explore human spatio-temporal patterns and moods at the regional scale in Japan: patterns are analyzed in space and time, and emotions are categorized using a sentiment-based dictionary approach. Finally, I will discuss some of the many opportunities left for further research.
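
As a toy illustration of the dictionary-based sentiment step mentioned in the abstract, the sketch below scores a few hypothetical geo-tagged tweets against a tiny sentiment lexicon and aggregates them by region and hour. The lexicon, field names and sample data are assumptions for exposition only and do not come from the study itself.

from collections import defaultdict
from datetime import datetime

# Toy sentiment lexicon (hypothetical; the study uses a full sentiment dictionary).
LEXICON = {"happy": 1.0, "great": 0.8, "sad": -0.9, "angry": -1.0, "tired": -0.4}

def score(text):
    """Average lexicon score over the words that appear in the dictionary."""
    hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def aggregate_by_region_and_hour(tweets):
    """Group geo-tagged tweets into (region, hour) cells and average their mood."""
    cells = defaultdict(list)
    for t in tweets:
        cells[(t["region"], t["timestamp"].hour)].append(score(t["text"]))
    return {cell: sum(s) / len(s) for cell, s in cells.items()}

if __name__ == "__main__":
    # Hypothetical in-memory sample standing in for a large geo-tagged dataset.
    sample = [
        {"region": "Kanto", "timestamp": datetime(2018, 7, 1, 8), "text": "so tired this morning"},
        {"region": "Kanto", "timestamp": datetime(2018, 7, 1, 8), "text": "great sunny day, happy"},
        {"region": "Kansai", "timestamp": datetime(2018, 7, 1, 20), "text": "sad about the match"},
    ]
    for (region, hour), mood in aggregate_by_region_and_hour(sample).items():
        print(f"{region} @ {hour:02d}h -> mean sentiment {mood:+.2f}")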

Short biography: Professor Christophe Claramunt is currently the chair of the Naval Academy Research Institute in France. He was previously a senior lecturer in computing at Nottingham Trent University and a senior researcher at the Swiss Federal Institute of Technology in Lausanne. His research is oriented towards theoretical, computational and multidisciplinary aspects of geographical information systems. Over the past few years he has been regularly involved in EU-funded projects such as the H2020 project datAcron "Big Data Analytics for Time Critical Mobility Forecasting". Amongst other affiliations, he is a research fellow at the Research Center for Social Informatics of Kwansei University in Japan, the Centre for Planning Studies at Laval University, the Laboratory for Geographical Information Science at the Chinese University of Hong Kong and the Logistics Engineering Department at Shanghai Maritime University.
More at: http://christophe.claramunt.free.fr/ or https://sites.google.com/site/christopheclaramunt/


ADRIAN HOPGOOD

University of Portsmouth, UK

“Practical Artificial Intelligence with Big Data”

Abstract: Big data are important for delivering practical artificial intelligence, but they are not the complete picture. A wide range of techniques has emerged from the field of artificial intelligence, including rules, frames, model-based reasoning, case-based reasoning, Bayesian updating, fuzzy logic, multiagent systems, swarm intelligence, genetic algorithms, deep learning, and neural networks. They are all ingenious and useful in narrow contexts. It will be argued in this presentation that a truly intelligent system needs to draw on a variety of these approaches within a hybrid system. Five distinct ways to enhance or complement one technique with another will be identified. Several practical examples will be presented, ranging from medical diagnosis to the control of specialised manufacturing processes.
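
As a hedged, illustrative example of one such combination (not taken from the talk), the sketch below lets an explicit rule base settle the clear-cut diagnostic cases and defers ambiguous readings to a stand-in for a learned classifier. All field names, thresholds and weights are hypothetical.

def rule_based_screen(reading):
    """Explicit rules catch the clear-cut cases; return None when unsure."""
    if reading["temperature"] > 90.0:
        return "fault"        # rule: overheating is always a fault
    if reading["vibration"] < 0.1 and reading["temperature"] < 60.0:
        return "normal"       # rule: low vibration and low temperature is fine
    return None               # ambiguous: defer to the learned model

def learned_model(reading):
    """Stand-in for a trained classifier (fixed weights used for illustration)."""
    score = 0.04 * reading["temperature"] + 3.0 * reading["vibration"] - 3.5
    return "fault" if score > 0.0 else "normal"

def hybrid_diagnose(reading):
    """Rules first, learner as fallback: one of several ways to hybridize techniques."""
    verdict = rule_based_screen(reading)
    return verdict if verdict is not None else learned_model(reading)

if __name__ == "__main__":
    for r in [{"temperature": 95.0, "vibration": 0.2},
              {"temperature": 55.0, "vibration": 0.05},
              {"temperature": 72.0, "vibration": 0.6}]:
        print(r, "->", hybrid_diagnose(r))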

Short biography: Adrian Hopgood is Full Professor of Intelligent Systems and Director of Future & Emerging Technologies at the University of Portsmouth in the UK. He is also a visiting professor at the Open University and at Sheffield Hallam University. He is a Chartered Engineer, Fellow of the BCS (the Chartered Institute for IT), and a committee member for the BCS Specialist Group on Artificial Intelligence. 
Professor Hopgood has extensive experience in both academia and industry. He has worked at the level of Dean and Pro Vice-Chancellor in four universities in the UK and overseas, and has enjoyed scientific roles with Systems Designers (now part of Hewlett-Packard) and the Telstra Research Laboratories in Australia. 
His main research interests are in artificial intelligence and its practical applications. He has supervised 19 PhD projects to completion and published more than 100 research articles. His text book "Intelligent Systems for Engineers and Scientists” has been published in three editions and is ranked as a bestseller.
More at: https://researchportal.port.ac.uk/portal/en/persons/adrian-hopgood(81562870-679c-4459-8963-86ef6670a4b0).html

Industry Speaker


MAX HOFFMANN

RWTH Aachen University, Germany

Short biography: Dr. Max Hoffmann has been a scientific researcher at the Institute of Information Management in Mechanical Engineering at RWTH Aachen University since 2012. From 2012 to 2017 he focused on consulting for various industrial partners as part of the research group "Production Technology" as well as on his Ph.D. thesis (Dr.-Ing.). Since 2016 he has been the leader of the "Industrial Big Data" research group, which focuses on the requirements of modern manufacturing with regard to digitization. Prior to joining the institute, Max Hoffmann studied Mechanical Engineering with an emphasis on Process Engineering at RWTH Aachen University until 2010. In parallel to his first consultancy activities in IT, he obtained an additional degree in general economic science in 2012 with a Master of Business Administration (MBA).
In his doctoral work as well as in his current research, Max Hoffmann focuses on topics related to digitization in the manufacturing industries. This covers the information-technological process chain from the acquisition of data in the field using semantic interfaces and IoT technologies (OPC UA, MQTT, …), through the integration of information using highly scalable technologies (Big Data), to the creation of valuable insights for the production process by means of data-driven approaches (Machine Learning). A distinctive focus of the Industrial Big Data group is the research of novel concepts for the processing and storage of huge data sets using "Data Lake" approaches. These concepts allow for a holistic usage of data from the field together with information from higher-level systems of production planning and control (ERP, MES, …).
Besides "Industrial Big Data" topics, his current research activities also cover fields such as semantic technologies, ontologies and the application of (Industrial) Internet of Things technologies in the production context. Additional research activities focus on multi-agent system technologies in manufacturing. In this context, Max Hoffmann is part of the expert group "Agent systems in automation technology", a member of the technical committee "Agent systems" of the VDI/VDE-Gesellschaft Mess- und Automatisierungstechnik (GMA) and an author of the working group's standard.
More at: https://cybernetics-lab.de/en/employees/max.hoffmann
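
As a minimal sketch of the acquisition end of such a process chain, the snippet below subscribes to a hypothetical MQTT topic hierarchy with the paho-mqtt client and appends raw messages to a simple "data lake" landing file. The broker address, topic names and file layout are assumptions, and the OPC UA and scalable storage layers are not shown.

import json
import paho.mqtt.client as mqtt   # assumption: the paho-mqtt package is installed

RAW_DATA_FILE = "sensor_raw.jsonl"   # hypothetical landing zone of the data lake

def on_message(client, userdata, msg):
    """Append every raw sensor message, untouched, to the landing file."""
    record = {"topic": msg.topic, "payload": msg.payload.decode("utf-8", "replace")}
    with open(RAW_DATA_FILE, "a") as f:
        f.write(json.dumps(record) + "\n")

# paho-mqtt 1.x constructor; version 2.x additionally expects a CallbackAPIVersion argument.
client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.org", 1883)   # hypothetical broker address
client.subscribe("plant/+/sensors/#")        # hypothetical topic hierarchy
client.loop_forever()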

Invited Speaker

Gabriele Mencagli

University of Pisa, Italy

Short biography: Gabriele Mencagli is an Assistant Professor at the Computer Science Department of the University of Pisa, Italy. He received his PhD from the same university in 2012. His research is oriented towards novel parallel paradigms for data stream processing, autonomic computing and high-performance computing in general. He has been involved in several Italian and European research projects (e.g., the FP7 REPARA and H2020 RePhrase projects) and in industrial collaborations (e.g., with Autodesk). He has published more than 50 peer-reviewed papers in international conferences and journals. He is a member of the editorial boards of Future Generation Computer Systems (Elsevier) and Cluster Computing (Springer).

Title: Efficient Big Data Streaming on Modern Scale-Up Servers

Abstract: An ever-growing number of devices are capable of sensing the world by producing huge flows of information (data streams) about users and the environment. A large set of applications needs efficient processing techniques to extract insights and complex knowledge from such a massive and transient data deluge. Furthermore, to keep the processing in real time, existing systems must expose parallel features so that the algorithms, and the way the processing is performed, can adapt to unpredictable and time-varying input rates and workloads.

This talk will provide a critical review of traditional Stream Processing Systems targeting large-scale distributed platforms (e.g., Apache Flink, Storm and Spark Streaming). Most of them are still inadequate to exploit the computational power provided by modern scale-up servers equipped with several multi-core CPUs and co-processors like GPUs and FPGAs. This talk will clarify the reasons for such inefficiencies, which pave the way for the design of the WindFlow library, a new C++ framework for stream processing on scale-up servers built on top of the RISC-like parallel building blocks provided by the FastFlow parallel programming environment. The talk will show experimental results on some real-world applications and will conclude with a discussion of future research problems and open directions in this field.
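
To make the idea of a pipeline of replicated operators on a single scale-up machine concrete, here is a deliberately simplified sketch using Python threads and queues. It is a conceptual illustration only: it does not reflect the actual C++ APIs of WindFlow or FastFlow, and Python threads do not provide true multi-core parallelism for CPU-bound work.

import threading
import queue

END = object()  # sentinel marking end-of-stream

def source(out_q, n=20):
    """Emit a synthetic stream of tuples."""
    for i in range(n):
        out_q.put(i)
    out_q.put(END)

def map_stage(in_q, out_q):
    """A stateless operator; several replicas can drain the same input queue."""
    while True:
        item = in_q.get()
        if item is END:
            in_q.put(END)      # let sibling replicas see the sentinel too
            out_q.put(END)
            break
        out_q.put(item * item)

def sink(in_q, replicas):
    """Collect results; stop after every map replica has signalled completion."""
    seen_end = 0
    while seen_end < replicas:
        item = in_q.get()
        if item is END:
            seen_end += 1
            continue
        print("result:", item)

if __name__ == "__main__":
    q1, q2 = queue.Queue(), queue.Queue()
    replicas = 4               # degree of parallelism of the map operator
    threads = [threading.Thread(target=source, args=(q1,))]
    threads += [threading.Thread(target=map_stage, args=(q1, q2)) for _ in range(replicas)]
    threads += [threading.Thread(target=sink, args=(q2, replicas))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()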