Jiannong Cao: Computation Partitioning for Mobile Cloud Applications

Abstract: Mobile Cloud Computing (MCC) offers great opportunities for the mobile service industry, allowing mobile devices to utilize the elastic resources offered by the cloud. Using MCC technologies, developers can create advanced applications on mobile devices, such as multimedia and augmented reality applications, which far exceed the capabilities of the devices themselves. However, MCC faces challenges, among which the computation partitioning problem studies how to optimally divide an application into modules and decide which modules should be offloaded to the cloud to improve system and application performance. There are various research issues. From the mobile side, the partitioning of applications should change dynamically with the user's mobile environment, which may vary due to the user's mobility. From the cloud side, computation partitioning should support multi-user applications so as to gain profit by realizing economies of scale. In this talk, I will first present the state of the art of research in computation partitioning in terms of application modeling, profiling, optimization, and distributed execution. I will then describe how to develop a systematic approach to support computation partitioning along three different dimensions, namely application, user, and environment. I will also present our work on addressing the challenging issues in these three respective dimensions.
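As a minimal illustration of the kind of decision the computation partitioning problem involves (a simplified sketch, not Prof. Cao's actual formulation), a module can be offloaded when its estimated cloud execution time plus the time to transfer its data is smaller than its local execution time; all names and cost estimates below are hypothetical:

```python
# Hypothetical per-module offloading decision with simple additive
# cost estimates; the function name and parameters are illustrative.

def should_offload(local_time, cloud_time, data_bytes, bandwidth):
    """Offload a module if running it in the cloud (including the time
    to transfer its input/output data) beats running it locally."""
    transfer_time = data_bytes / bandwidth  # seconds to move the data
    return cloud_time + transfer_time < local_time

# Example: a module taking 2.0 s locally, 0.5 s in the cloud, moving
# 1 MB over a 1 MB/s link (1.0 s transfer): 0.5 + 1.0 < 2.0, so offload.
print(should_offload(2.0, 0.5, 1_000_000, 1_000_000))  # True
```

In practice such decisions must be re-evaluated at runtime, since the bandwidth term changes with the user's mobility, which is why the abstract stresses dynamic partitioning.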

Bio: Dr. Cao is currently a chair professor and head of the Department of Computing at Hong Kong Polytechnic University, Hung Hom, Hong Kong. His research interests include parallel and distributed computing, computer networks, mobile and pervasive computing, fault tolerance, and middleware. He has co-authored 3 books, co-edited 9 books, and published over 300 papers in major international journals and conference proceedings. He is a fellow of IEEE, a senior member of the China Computer Federation, and a member of ACM. He was the Chair of the Technical Committee on Distributed Computing of the IEEE Computer Society from 2012 to 2014. Dr. Cao has served as an associate editor and a member of the editorial boards of many international journals, including ACM Transactions on Sensor Networks, IEEE Transactions on Computers, IEEE Transactions on Parallel and Distributed Systems, IEEE Network, Pervasive and Mobile Computing Journal, and Peer-to-Peer Networking and Applications. He has also served as a chair and member of organizing / program committees for many international conferences, including PERCOM, INFOCOM, ICDCS, IPDPS, ICPP, RTSS, DSN, ICNP, SRDS, MASS, PRDC, ICC, GLOBECOM, and WCNC. Dr. Cao received the BSc degree in computer science from Nanjing University, Nanjing, China, and the MSc and Ph.D. degrees in computer science from Washington State University, Pullman, WA, USA.

Geoffrey Fox: Classification of Big Data Applications and Convergence of HPC and Cloud Technology

Abstract: We discuss the nature and requirements of many big data applications in terms of Ogres that describe their important general characteristics. We develop ways of categorizing applications by features or facets that are useful in understanding suitable software and hardware approaches, identifying six broad paradigms. This enables the study of benchmarks and helps us understand when high performance computing (HPC) is useful. We propose the adoption of DevOps-motivated scripts to support hosting of applications on many different infrastructures, such as OpenStack, Docker, OpenNebula, commercial clouds, and HPC supercomputers.

Bio: Professor Fox is a distinguished professor of Informatics and Computing, and Physics at Indiana University, where he is director of the Digital Science Center and Associate Dean for Research and Graduate Studies at the School of Informatics and Computing. He has supervised the Ph.D. theses of 61 students and published over 600 papers in physics and computer science. He currently works on applying computer science to bioinformatics, defense, earthquake and ice-sheet science, particle physics, and chemical informatics. He is principal investigator of FutureGrid, a new facility to enable the development of new approaches to computing. Professor Fox is a Fellow of ACM.

Randy Goebel: What could we expect from a logic of visualization?

Abstract: In the history of formalizing knowledge, logics have provided a framework for debugging the machine representation of information, and have eventually been adapted to help articulate both static and dynamic representations. A significant advantage of applying logics is the clear connection between syntax and semantics, even though the implementation of logical inference systems often poses computational efficiency challenges. In this brief talk, we focus on the potential value of applying logics, and consider the structure of logics of visualization. The primary question is really about constraining the connection between the semantics of data, even big data, and visualizations that support inferences on that data.

Bio: R.G. (Randy) Goebel is currently professor of Computing Science in the Department of Computing Science at the University of Alberta, and principal investigator in the Alberta Innovates Centre for Machine Learning (AICML). He received the B.Sc. (Computer Science), M.Sc. (Computing Science), and Ph.D. (Computer Science) from the Universities of Regina, Alberta, and British Columbia, respectively. Professor Goebel's theoretical work on abduction, hypothetical reasoning, and belief revision is internationally well known, and his recent application of practical belief revision and constraint programming to scheduling, layout, and web mining is now having industrial impact. His recent research is focused on the formalization of visualization, with applications in web mining, optimization, and nanotechnology. Randy has previously held faculty appointments at the University of Waterloo and the University of Tokyo, and is actively involved in academic and industrial collaborative research projects in Canada, Japan, China, and Germany.

Valentina Salapura: Analytics Platforms in the Cloud: Architecture and Applications

Abstract: Two new technologies are revolutionizing information use for decision making in many application domains: cloud computing and big data analytics. Cloud computing provides elastic and virtually unlimited compute resources on demand, whereas analytics can extract actionable insights from data. The IBM Watson system has demonstrated the power of applying analytics to large bodies of knowledge. Analytics applications are distilling insights from a variety of data, mining data to support a broad range of decision making processes.
One emerging application area is medical analytics: to serve this domain, IBM has announced Watson Health, a cloud-based offering to support continuous care and medical decision making. A critical part of making this information accessible is a framework spanning cloud-based processing and mobile information delivery. This talk discusses the architecture and design of analytics platforms that ensure data isolation and security, provide high availability and resilience of applications, and increase the performance of analytics applications in the cloud computing environment.

Bio: Dr. Salapura is an IBM Master Inventor and System Architect at the IBM T.J. Watson Research Center. She leads the development of analytics platforms for cloud computing and is the architect for Resilience and High Availability for IBM Cloud Managed Services. In 2010, Dr. Salapura served as a co-lead for the Global Technical Outlook as part of the IBM Research Strategy and Worldwide Operations team, defining IBM's Internet-of-Things roadmap and strategy. Previously, she was a computer architect on the Power8 processor definition team and on the Blue Gene program since its inception. Before joining the IBM T.J. Watson Research Center in 2000, Dr. Salapura was a faculty member at Technische Universität Wien, where she also received her Ph.D. degree. Dr. Salapura has received several IBM corporate awards for her technical contributions. She is the author of over 80 papers and several book chapters on processor and network architecture, and holds over 120 patents in this area. Dr. Salapura is an ACM Distinguished Speaker and a Fellow of the IEEE. She is a recipient of the 2006 ACM Gordon Bell Prize for Special Achievements for the Blue Gene/L supercomputer and quantum chromodynamics.

Yanchun Zhang: Medical Big Data: Medical Data Mining and Innovative Applications

Abstract: Due to the recent development and maturation of database, data storage, data capturing, patient monitoring, and sensor technologies, huge volumes of medical and health data have been generated at hospitals and medical organizations at unprecedented speed. These data are a very valuable resource for improving health delivery, health care, and decision making, and for enabling better risk analysis and diagnosis. Health care and medical service are now becoming more data-intensive and evidence-based, since electronic health records are used to track individuals' and communities' health information (particularly changes). These trends substantially motivate and advance the emergence and progress of data-centric health data and knowledge management research and practice.
In this talk, we will introduce several innovative data mining techniques and case studies that address the challenges encountered in e-health. These include techniques and developments in data streams, data clustering, correlation analysis, pattern recognition, anomaly detection, and risk prediction in critical care and cardiovascular medicine.

Bio: Yanchun Zhang has been a full Professor and Director of the Centre for Applied Informatics at Victoria University since 2004. He has been a National "Thousand Talents Program" Professor in China since 2010 (currently with Fudan University). Prof. Zhang obtained a PhD degree in Computer Science from The University of Queensland in 1991. His research interests include big data, databases, data mining, the internet, and e-health / health informatics. He has published over 260 research papers in international journals and conference proceedings, including top journals such as ACM Transactions on Computer-Human Interaction (TOCHI), IEEE Transactions on Knowledge and Data Engineering (TKDE), Computational and Mathematical Methods in Medicine, and Computer Methods and Programs in Biomedicine, as well as a dozen books and journal special issues in the related areas.
He is currently leading a multi-disciplinary research group developing innovative technologies and big data analytics for health/medicine and environmental studies. His research has been supported by a number of Australian Research Council (ARC) Linkage Project and Discovery Project grants and NSFC (China) projects. His research work on data mining for patient monitoring and risk prediction has received great attention from the media. The results were reported in The Age, The Sydney Morning Herald, The Canberra Times, WA Today, and Brisbane Times under the title "a few seconds can save lives", and in The Australian as "Surgery made safer with program that predicts patients' vital signs". The work was also reported by a number of Chinese media channels, such as China Daily, Xinhua Net, and China News, among others.
Dr. Zhang is a founding editor and editor-in-chief of the Health Information Science and Systems journal (BioMed Central) and editor-in-chief of the World Wide Web journal (Springer), and also the founding editor of the Web Information Systems Engineering Book Series and the Health Information Science Book Series. He is Chairman of the International Web Information Systems Engineering Society (WISE). He was a member of the Australian Research Council's College of Experts (2008-2010), and serves as an expert panel member at various international funding agencies and government boards, including the National Natural Science Foundation of China (NSFC), the "National 1000 Talents Program" of China, the Royal Society of New Zealand Marsden Fund, and the College of Experts of the Australian Research Council.

Hai Zhuge: Unconventional Mapping from Big Data into Knowledge Space

Abstract: Big data have attracted great attention in science, technology, industry, and society. However, their nature and the fundamental challenges they pose have not been fully recognised. Big data research is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and transformational innovation in technologies. This lecture tries to answer the following questions: What is the nature of big data? What is the relation between big data and knowledge? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery and management? What is the relationship between big data and the scientific paradigm? What is the fundamental challenge of big data computing? Can we find an approach to mapping big data into knowledge space?