Overview of Speakers and Programme for the BTD8 on June 12th, 2015
An overview of the programme can be found here.
Daylen Yang - University of California at Berkeley
Deep Blue was the best chess computer in 1997. Since then, modern chess engines have made significant improvements and are now far stronger than any human grandmaster. This talk gives an overview of how chess engines work, covering search/evaluation and interesting high-level and low-level optimizations. It will also discuss the Fishtest distributed testing framework, a method to measure strength improvement during chess engine development.
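As a toy illustration of the search/evaluation split described above (this is not Stockfish code; the tree and scores are invented), a bare-bones minimax over a hand-built game tree might look like:

```python
# Minimal sketch of the two halves of a chess engine:
# "evaluation" scores a position statically, "search" recurses
# over the game tree, alternating between the two players.

def evaluate(position):
    """Static evaluation: here the leaf is simply a stored score."""
    return position

def minimax(node, maximizing):
    """Search: recurse over the game tree, alternating min and max."""
    if not isinstance(node, list):      # leaf -> evaluate statically
        return evaluate(node)
    children = (minimax(child, not maximizing) for child in node)
    return max(children) if maximizing else min(children)

# A tiny two-ply tree: the maximizing side picks the branch whose
# minimum (the opponent's best reply) is largest.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # -> 3
```

Real engines layer alpha-beta pruning, transposition tables and many heuristics on top of this skeleton.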
Daylen Yang joined Stockfish open source project in 2010. He is the developer of the Stockfish for Mac app and has played various roles for the project team. He is currently studying electrical engineering and computer science at the University of California, Berkeley and is interning at Facebook this summer.
Tord Romstad, stockfishchess.org
Since the earliest days of computer chess, almost all competitive chess programs have used some variant of the alpha-beta algorithm. The alpha-beta algorithm is simple and elegant, always gives the same result as a plain minimax search, and effectively doubles the achievable search depth for a given amount of thinking time. However, the branching factor in chess is so high that even with alpha-beta pruning, a brute-force search will not get very deep in the game tree. The alpha-beta algorithm is also notoriously difficult to parallelize. This talk explains how modern chess programs reduce the tree size much further than what is possible with only alpha-beta pruning, and how to efficiently distribute the search across multiple CPU cores.
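A minimal sketch of alpha-beta pruning itself (illustrative only; real engines add move ordering, transposition tables and the further reductions the talk covers):

```python
# Alpha-beta search: identical result to plain minimax, but branches
# that cannot influence the final choice are cut off.

def alphabeta(node, alpha, beta, maximizing):
    if not isinstance(node, list):      # leaf node
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break                   # beta cutoff: opponent avoids this
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break                   # alpha cutoff
        return value

tree = [[3, 5], [2, 9]]
print(alphabeta(tree, float("-inf"), float("inf"), True))  # -> 3
```

In this tiny tree the 9 in the second branch is never visited: once the opponent can hold that branch to 2, it cannot beat the 3 already guaranteed in the first branch.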
Tord Romstad is an algebraic geometer from Oslo, Norway, and one of the main authors of the open source chess engine Stockfish. He is currently working for the Oslo-based data science startup Sannsyn, and for Play Magnus AS, the company behind the official Magnus Carlsen apps for iOS and Android. The app lets you play against the current World Chess Champion at 19 different ages.
Machine learning and big data are among the biggest technology trends today, and neural networks are experiencing their second renaissance. What is this fuss about? Is it justified? And why is it happening now? This talk will discuss some developments in this field, and show sample applications where machines reach human-level accuracy.
Martin Stumpe is an engineer at Google, focusing on machine learning and computer vision applications. Prior to joining Google, he worked at NASA Ames Research Center in Mountain View, developing signal processing algorithms for NASA’s Kepler Mission, and before that as a researcher at Stanford University, investigating protein folding dynamics using physics-based simulations. He also developed and brought to market video tracking software (AnTracks) that is used in various applications such as microbiology and behavioral ecology. Martin’s background is in physics. He studied at the Universities of Muenster and Goettingen, where he received a Dr. rer. nat. in Physics from the Max Planck Institute for Biophysical Chemistry and the University of Goettingen, Germany.
Professor Dr. Gordon Cheng, chair of the Institute for Cognitive Systems at the Technical University of Munich
It is amazing how easily we can use tools, from a simple screwdriver to the dexterous use of chopsticks: our brain seamlessly makes use of these devices for fairly complex tasks. In this talk I'll touch on ways in which we can fool our brain into embracing even complex robotic systems.
Professor Gordon Cheng holds the Chair of Cognitive Systems and is founder and director of the Institute for Cognitive Systems at the Technical University of Munich. Formerly, he was the Head of the Department of Humanoid Robotics and Computational Neuroscience at ATR Computational Neuroscience Laboratories, Kyoto, Japan. He was the Group Leader for the newly initiated JST International Cooperative Research Project (ICORP), Computational Brain. He has also been designated a Project Leader / Research Expert for the National Institute of Information and Communications Technology (NICT) of Japan, and is involved (as an adviser and as an associated partner) in a number of major European Union projects. He held fellowships from the Center of Excellence (COE) and the Science and Technology Agency (STA) of Japan; both fellowships were taken at the Humanoid Interaction Laboratory, Intelligent Systems Division, at the ElectroTechnical Laboratory (ETL), Japan, where he played a major role in developing a completely integrated humanoid robotics system. He has extensive industrial experience in consultancy as well as the contractual development of large software systems. He was also the chief executive officer of G.T.I. Computing, a company he founded in Australia specializing in networking and transport management systems. Over the past ten years Professor Cheng has been co-inventor of approximately 18 patents and author of approximately 250 technical publications, proceedings, editorials and book chapters. Furthermore, he has successfully worked within and across a number of research domains: during his time as head of the Department of Humanoid Robotics and Computational Neuroscience, he conducted high-impact collaborative research in neuroscience, cognitive science and the like, and co-authored numerous publications in these areas.
Dr. Ian Goodfellow, research scientist at Google
Machine learning algorithms have reached human-level performance on a variety of benchmark tasks. This raises the question of whether these algorithms have also reached human-level 'understanding' of these tasks. By designing inputs specifically to confuse machine learning algorithms, we show that statistical models ranging from logistic regression to deep convolutional networks fail in predictable ways when presented with statistically unusual inputs. Fixing these specific failures allows deep models to attain unprecedented levels of accuracy, but the philosophical question of what it means to understand a task and how to build a machine that does so remains open.
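A toy illustration of the idea (the weights and numbers below are invented, and this is not the talk's actual experimental setup): for a logistic-regression model, a small perturbation along the sign of the gradient, in the spirit of the fast-gradient-sign method, can sharply change a confident prediction:

```python
import math

# Hypothetical trained logistic-regression weights (for illustration only).
w = [2.0, -1.0, 0.5]

def predict(x):
    """Probability of class 1 under the logistic model."""
    z = sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

x = [0.5, -0.5, 1.0]
eps = 0.5
# The gradient of the score w.r.t. the input is just w, so nudging each
# feature a small, bounded amount against sign(w) most lowers the score.
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]

print(round(predict(x), 3))      # confidently class 1
print(round(predict(x_adv), 3))  # barely changed input, score drops sharply
```

The same recipe scales up to deep convolutional networks, where an imperceptible perturbation can flip the predicted class entirely.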
Ian Goodfellow is a research scientist at Google. He earned a PhD in machine learning from Université de Montréal in 2014. His PhD advisors were Yoshua Bengio and Aaron Courville. His studies were funded by the Google PhD Fellowship in Deep Learning. During his PhD studies, he wrote Pylearn2, the open source deep learning research library, and introduced a variety of new deep learning algorithms. Previously, he obtained a BSc and MSc in computer science from Stanford University, where he was one of the earliest members of Andrew Ng's deep learning research group.
Professor Dr. Jürgen Schmidhuber, Co-director of the Swiss Research Institute for Artificial Intelligence IDSIA
In recent years, deep artificial neural networks (including recurrent ones) have won many contests in pattern recognition and machine learning. They are now widely used in industry and academia. In this talk, the field and the latest state-of-the-art results in numerous important applications will be reviewed, and it will also be discussed how artificial intelligence will fundamentally change our civilization.
Since age 15 or so, the main scientific ambition of Prof. Jürgen Schmidhuber (pronounce: You_again Shmidhoobuh) has been to build an optimal scientist through self-improving Artificial Intelligence (AI), then retire. He has pioneered self-improving general problem solvers since 1987, and Deep Learning Neural Networks (NNs) since 1991. The recurrent NNs (RNNs) developed by his research groups at the Swiss AI Lab IDSIA & USI & SUPSI & TU Munich were the first RNNs to win official international contests. They have revolutionized connected handwriting recognition, speech recognition, machine translation, optical character recognition, image caption generation, and are now in use at Google, Microsoft, IBM, Baidu, and many other companies. The first 4 members of DeepMind (sold to Google for over 500M) include 2 former PhD students from his lab. IDSIA's Deep Learners were also the first to win object detection and image segmentation contests, and achieved the world's first superhuman visual classification results, winning nine international competitions in machine learning & pattern recognition (more than any other team). They also were the first to learn control policies directly from high-dimensional sensory input using reinforcement learning. His research group also established the field of mathematically rigorous universal AI and optimal universal problem solvers. His formal theory of creativity & curiosity & fun explains art, science, music, and humor. He also generalized algorithmic information theory and the many-worlds theory of physics, and introduced the concept of Low-Complexity Art, the information age's extreme form of minimal art. Since 2009 he has been a member of the European Academy of Sciences and Arts. He has published 333 peer-reviewed papers, earned seven best paper/best video awards, and is recipient of the 2013 Helmholtz Award of the International Neural Networks Society.
Cortical.io’s Semantic Fingerprinting technology originates in a new, fundamentally different machine learning approach: it is based on a statistics-free processing model that uses similarity as a foundation for intelligence. The Cortical.io Retina converts any kind of text into a numeric representation, a Semantic Fingerprint that encodes meaning explicitly with all contained senses and contexts. The system "understands" the relatedness of two items by simply measuring their overlap. As a result, it is very fast, reliable and easy to implement - a breakthrough technology that leverages the intelligence of the brain to enable the Natural Language Processing of Big Text Data.
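As a loose illustration (this is not Cortical.io's actual encoding; the words and bit positions are invented), one can model a semantic fingerprint as a sparse set of active bit positions and relatedness as the size of the set overlap:

```python
# Toy "semantic fingerprints": each term is a sparse set of active bits.
# Related terms share many active positions; unrelated terms share few.
fingerprints = {
    "dog":   {3, 17, 42, 101, 250},
    "wolf":  {3, 17, 42, 300, 412},
    "piano": {55, 101, 777, 901, 1000},
}

def overlap(a, b):
    """Relatedness as the number of shared active bits."""
    return len(fingerprints[a] & fingerprints[b])

print(overlap("dog", "wolf"))   # -> 3
print(overlap("dog", "piano"))  # -> 1
```

Because comparing two items reduces to a set intersection, the similarity computation is cheap and easy to implement at scale.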
Francisco took an interest in information technology as a medical student specializing in genetics and serology at the University of Vienna. He participated in various research projects at the Vienna Serological Institute and was heavily involved in medical data processing. He took part in such projects as establishing and organizing Austria's dialysis register database and creating a patient documentation system for the university clinic. In the mid-1990s, he worked with Konrad Becker to found Vienna's Institute for New Culture Technologies and Public Netbase - at that time Austria's only free public-access internet server - thus establishing an international competency platform for the critical use of information and communication technologies. In 2005, Francisco founded Matrixware Information Services, a company that developed the first standardised database of patents under the name of Alexandria, where he acted as CEO. He also initiated the foundation of the Information Retrieval Facility in 2006, a not-for-profit research institute, with the goal of reducing the gap between science and industry.
Professor Dr. Immanuel Bloch, Director at the Max Planck Institute of Quantum Optics
More than 30 years ago, Richard Feynman outlined the visionary concept of a quantum simulator for carrying out complex physics calculations. Today, his dream has become a reality in laboratories around the world. All this has become possible using complex experimental setups of thousands of optical elements, allowing atoms to be cooled to nanokelvin temperatures, where they almost come to rest. The atoms can then be trapped and manipulated in arrays of millions of microscopic light traps. Such ‘light crystals’ allow an unprecedented view into the microscopic world of quantum materials and have enabled the most precise atomic clocks to date, which are fundamental to next-generation timing and navigation applications. In my talk, I will give an introduction to how such quantum simulators can be realized at the lowest known man-made temperatures and outline some of their applications, ranging from condensed matter physics through statistical physics to high energy physics with table-top experiments.
Immanuel Bloch is scientific director at the Max-Planck Institute of Quantum Optics in Garching and holds a chair for experimental physics at the Ludwig-Maximilians University (LMU) in Munich. His scientific work is among the most highly cited in the field of quantum physics and has helped to open a new interdisciplinary research field at the interface of atomic physics, quantum optics, quantum information science and solid state physics. For his research, Immanuel Bloch has received several national and international prizes, among them the Gottfried Wilhelm Leibniz Prize of the DFG, the Bundesverdienstorden of Germany, the Philip-Morris Research Prize, the Senior Prize for Fundamental Aspects of Quantum Electronics and Optics of the European Physical Society and the Körber European Science Prize.
Hagen Klauk, Ph.D., Director of the Max Planck Research Group Organic Electronics
Active-Matrix Organic Light-Emitting Diode (AMOLED) displays are among the most exciting developments in consumer electronics in the past decade. 250 million AMOLED displays with a total area of 1.6 square kilometers and a total value of 12 billion Euros were manufactured in 2013, mainly for mobile phones and small tablets. Current trends in AMOLED displays include higher resolution (beyond HD), larger screen size (55-inch diagonal) and the development of flexible backplanes for rollable or foldable displays. The latter will benefit from a technology that allows the thin-film transistors (TFTs) that control the individual display pixels to be fabricated at temperatures compatible with polymeric substrates, i.e., at temperatures below about 150 °C. One possibility is TFTs based on conjugated organic semiconductors.
Hagen Klauk received the Diplom-Ingenieur degree in electrical engineering from Chemnitz University of Technology, Germany, in 1995 and the Ph.D. degree in electrical engineering from the Pennsylvania State University in 1999. From 1999 to 2000 he was a post-doctoral researcher in the group of Prof. Thomas N. Jackson at the Pennsylvania State University. In 2000 he joined the Polymer Electronics group of Infineon Technologies in Erlangen, Germany. Since 2005 he has been head of the Organic Electronics group at the Max Planck Institute for Solid State Research in Stuttgart, Germany.
Professor Dr. Florian Matthes, Chair for Software Engineering for Business Information Systems at the Technical University of Munich
In this talk we report on the latest results of our social software engineering research at TU München. SocioCortex is a web-based platform that provides a novel mix of content and model management concepts and services to support problem-solving processes in organizations. These processes involve stakeholders with different interests and backgrounds who want to use their preferred content representations (tables, hypertexts, images, drawings, maps, 3D models, matrices, mathematical formulas), drawing on diverse content sources and channels. Using examples from industry projects, we illustrate how SocioCortex enables the emergence of data models, access-control models, process models and UI models shaped by the actual problem-solving processes performed on the platform.
Florian Matthes holds the chair Software Engineering for Business Information Systems at Technische Universität München. The current focus of his research is on enterprise architecture management, model-driven web application engineering and social software. Earlier stations of his academic career are the Goethe University Frankfurt (Diploma 1988), the University of Hamburg (PhD 1992), the Digital Systems Research Center (now HP SRC Classic) in Palo Alto, USA (Researcher 1992-1993), and the Technical University Hamburg-Harburg (Associate Professor 1997-2002). He is the head of the software architecture working group of the Gesellschaft für Informatik, a member of the advisory board of the Ernst Denert-Stiftung für Software Engineering, and organizer of several workshops and conferences. He is co-founder and chairman of CoreMedia (1996) and infoAsset (1999), co-founder of further small software and service university spin-offs, and scientific advisor of UnternehmerTUM, the center of innovation and business creation at TU München.
Nicolai Josuttis, open source developer
For many years Enigmail has been one of the standard tools for email encryption with Mozilla Thunderbird. In the post-Snowden era the relevance and the focus of Enigmail have completely changed. What was formerly a tool for techies has become extremely important for the implementation of the fundamental right to privacy. Usability on the one hand and compatibility with other email tools on the other hand are the most crucial features. This talk illustrates the current stage of development and gives some insight into the depths of open source development in the age of mass surveillance.
Nicolai Josuttis has been known for many years as a writer, speaker and independent consultant for C++, SOA and the technical management of large projects. Since the beginning of 2014 he has been a contributor to Enigmail, the email encryption add-on for Mozilla Thunderbird.
The USBaddies, TNG
In the summer and autumn of last year, badUSB devices were briefly the cause of a major security scare (see e.g. badUSB: “Wenn USB-Geräte böse werden” | heise Security). To blame are not only the missing security mechanisms of the USB protocol, but also users' carelessness, or rather their curiosity: maybe there is something interesting on that USB stick someone found? A business report, or pictures of the attractive neighbor? With microcontroller development boards and a suitable USB stick, any reasonably versed developer can program USB devices that change their behavior almost arbitrarily: keyboard, mouse, printer, storage device and so on. Even a freely programmable keyboard that additionally pursues its own agenda allows a lot of mischief on a victim's computer. We demonstrate, for example, an attack on a standard Windows 7 PC in which administrative rights are gained by means of a primed device, selective exceptions are added to the firewall, and a remote console is set up and started. Through this console we then display the contents of the hard disk, which in a real attack could just as well be exfiltrated. There are now collections of ready-made scripts for various development boards available online that make amusing pranks, but also real attacks, feasible. Essentially every computer and every operating system is vulnerable. The declared aim of the talk, besides a very brief presentation of the technical basics, is to conduct several demonstrations on victim systems with primed devices, in order to raise awareness of the risks of handling (foreign) USB devices all too carelessly.
Tamás Lengyel, Senior Security Researcher at Novetta, and Thomas Kittel, PhD student at the chair for IT security at Technical University of Munich
New methods and approaches for securing cloud environments are becoming increasingly critical now that the cloud is being widely adopted by the business sector. Cloud systems inherently rely on hardware virtualization to enable sharing of hardware resources, and despite the fact that virtualization itself is not inherently insecure, nearly two thirds of all virtual systems are less secure than the physical systems they replace. This curious state arises primarily because traditional host security strategies are not well integrated into virtual environments: as an example, typical antivirus scans are a critical component of layered defense-in-depth, but they rapidly exhaust available CPU and memory when protecting a large number of virtual machines. Virtualization nevertheless also offers a unique opportunity: the ability to peer into a running operating system from an outside perspective, known as introspection. It is possible to observe the memory, storage, CPUs, processes, and kernel of a running virtual machine from a safe vantage point. More interestingly, it is also possible to alter the behavior of all of these components to help protect virtual systems. Over the last few years our team has worked on implementing these features for the open-source Xen hypervisor through the LibVMI library. Our work enables third-party security tools to take advantage of the extra layer of protection presented by the hypervisor, both on x86 and ARM systems. While these features present a significant improvement in cloud security tools, active introspection is a double-edged sword, as it highlights the implicit trust placed into the hands of cloud providers.
Tamás works as Senior Security Researcher at Novetta, and has previously worked as Security Researcher at TU Munich's Chair for IT Security. Tamas' focus is on virtualization security for cloud and mobile devices. He is currently finishing his PhD at the University of Connecticut on the topic of Malware Collection and Analysis via Hardware Virtualization. He is an avid open-source developer, contributing to projects such as the Xen Project Hypervisor, LibVMI and the Linux kernel.
Thomas is currently doing his PhD at the Chair for IT Security at the Technical University of Munich. His research interests are mainly in the field of Virtual Machine Introspection and VMI-based operating system security.
Tom Green, Solutions Architect at Couchbase
Version 4.0 is the new release of the NoSQL document database from Couchbase. It is a major step forward in data management for mission-critical, real-time interactive applications. This talk will give an overview of use cases suited to NoSQL, an insight into the technical architecture of a clustered NoSQL database, and an introduction to the new data access capabilities of the SQL for Documents query language.
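To give a flavour of the SQL-for-Documents style of query (the bucket and field names below are invented for illustration, not taken from the talk):

```sql
-- Hypothetical query over a bucket of JSON customer documents:
SELECT c.name, c.city
FROM customers c
WHERE c.city = "Munich"
ORDER BY c.name;
```

The point of the language is that familiar SQL-shaped queries run against schemaless JSON documents rather than fixed relational tables.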
Tom Green is a Solutions Architect working at Couchbase, providers of high performance NoSQL technology for mission critical systems. Tom works closely with customers on building prototypes, proof of concepts, data modelling and performance tuning. Prior to joining Couchbase Tom worked at Intel in the Microprocessor Research Lab, and at IBM in Systems Storage Development.
Open source software is free, but in order to preserve the freedom of open source and keep this model working, open source licenses also impose obligations on users and re-distributors. Today the number of open source licenses runs into the hundreds, and in most cases an open source package is not covered by a single license but by multiple licenses. Many different obligations therefore exist, and some of them may apply when using a piece of open source software. For commercial use, it is essential to assess the license conditions - a kind of clearing task. Software tools support this assessment. One solution is the open source project FOSSology. In the presentation we will explain the main use case, why people should use FOSSology, and how the latest features improve clearing with respect to efficiency and effectiveness.
At Siemens Corporate Technology, Michael works as a project manager, software architect, trainer and consultant for distributed systems, server applications and their development with open source software. He has more than 12 years of experience in professional software development. He prefers Java to C#, Xcode to Eclipse, and has liked Macs for 27 years. Michael received Dipl.-Ing. and Dr.-Ing. degrees, both from TU Berlin, and is a certified software architect.
Andreas works as a Senior Consultant at TNG. His main focus is agile software development and project management with an emphasis on sustainable design. He likes to work in Java, Python, PHP and C++ and in his free time he is frequently contributing to various open source projects. Andreas is originally an experimental physicist and studied at the Universities of Bayreuth and later Constance where he received his PhD degree.
Robert Pintarelli, Principal Consultant at TNG
A conversation the other day, after a release (this one or a similar one):
Chief: "So, what's new?"
Team: "All the same, except ..."
Chief: "What?!? What have you been doing for the last few weeks?"
Team: "... the website is now much faster, filling the product index takes only seconds, and the search engine finally finds partial words..."
Chief: "But how can that be? We were online the whole time, right?!?"
Team: "Of course!"
The open source "full-text search and analytics engine" Elasticsearch doesn't have to fear commercial competition, because in some cases it is even superior - and not only on price. This can be seen in the example of a hybris-based web shop that was converted from a specialized, commercial product index to Elasticsearch. This talk describes how it came to this, why Elasticsearch was selected, and what steps were necessary for the uninterrupted migration.
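One common way Elasticsearch handles partial-word matching is an edge_ngram tokenizer, which indexes every prefix of each word; a sketch of such index settings (names are invented, and details vary by Elasticsearch version) might look like:

```json
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "partial_words": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 15,
          "token_chars": ["letter", "digit"]
        }
      },
      "analyzer": {
        "product_index": {
          "type": "custom",
          "tokenizer": "partial_words",
          "filter": ["lowercase"]
        }
      }
    }
  }
}
```

With this analyzer applied to product fields at index time, a query for "shir" matches documents containing "shirt".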
"Simply delivering good software - right from the first deployment": that is Robert Pintarelli's goal when developing new functionality for customers. For nearly eight years, Robert has been assisting TNG's customers in creating software solutions that work. In his current project - an online fashion shop based on hybris - he is engaged in modernizing rather outdated software in preparation for the customer's future internationalization.
Thomas Skowron, open source developer
OpenStreetMap is progressively becoming the largest supplier of geospatial data for applications of all kinds. Thousands of volunteers add data to the database every day. It provides not only static, prefabricated maps, but a wealth of data that is available with almost no usage restrictions, creating new opportunities to better understand our world. OpenStreetMap is not just a database, though, but an entire ecosystem of people, software, and companies that supply and further develop the tools for processing, improving, and visualizing the dataset. The mapping goes beyond simply recording roads: it also registers shops, house numbers, collection times, public transport connections, footpaths and much more, in order to make this information available to the public. Errors exist in every database, but in OpenStreetMap anyone - even you - may fix them. This talk gives an insight into eleven years of project history and the current state, and provides some pointers on how to get involved.
Thomas Skowron is a software engineer of location-based services and geospatial applications, who lives in Dresden and is involved in projects such as Wikimedia Commons and OpenStreetMap. In his spare time he is committed to the availability of open data and produces the podcast "Mikrowelle" (microwave) on technology and spatial data. As a programmer, he has designed and developed software for engineers, websites, interactive eBooks and map applications and works on server-side software that reaches millions of users.
Although Behavior Driven Development has existed for over 10 years, the methodology hasn't yet become very popular in the Java world. One reason for this is the existing BDD tools for Java, which are cumbersome for developers to use and require a lot of maintenance. The speaker wants to change this with JGiven and provide Java developers with a framework that they like to use, while at the same time satisfying business departments with instructive reports. JGiven scenarios are written in the usual Given-When-Then form with an embedded Java DSL. This allows developers to use all IDE features such as auto-completion and refactoring tools. The resulting scenarios are already quite readable on their own, but JGiven can additionally generate reports in different formats that can be used for collaboration with domain experts. Through a modular concept, new scenarios can easily be assembled from parts of other scenarios. This speeds up the creation of new scenarios and avoids test code duplication. Since neither Groovy nor Scala is needed and JGiven is compatible with JUnit and TestNG, JGiven can be applied immediately in Java projects and easily integrated into existing test infrastructures. In this presentation, the speaker will give an introduction to JGiven and, in a short live coding session, show how quickly and easily BDD scenarios can be written with JGiven.
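JGiven itself is a Java DSL; as a language-neutral sketch of the Given-When-Then staging pattern it embeds (the class and step names below are invented, not JGiven's API), consider:

```python
# Each step method mutates scenario state and returns self,
# which is what makes the fluent Given-When-Then chaining work.

class CoffeeMachineScenario:
    def given_a_machine_with_coffees(self, n):
        self.coffees = n
        return self

    def when_i_order_a_coffee(self):
        self.served = self.coffees > 0
        if self.served:
            self.coffees -= 1
        return self

    def then_a_coffee_is_served(self):
        assert self.served
        return self

scenario = CoffeeMachineScenario()
scenario.given_a_machine_with_coffees(2) \
        .when_i_order_a_coffee() \
        .then_a_coffee_is_served()
print(scenario.coffees)  # -> 1
```

In JGiven the same chain is plain Java, so IDE auto-completion and refactoring apply to every step, and the framework renders the executed chain as a human-readable report.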
Dr. Jan Schäfer is a Senior Consultant at TNG Technology Consulting GmbH and has been developing Java enterprise applications there for the last 4 years. He holds a doctoral degree in computer science and has been at home in the Java world for more than 15 years. Recently, he discovered his passion for Behavior Driven Development and released JGiven, a new BDD framework for Java.
Modern software relies on tests to avoid bugs instead of proving code correct because traditional formal methods are prohibitively expensive. This talk illustrates a more practical approach to formal methods that combines equational reasoning in Haskell with introductory category theory to cheaply verify high-level properties.
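Where the talk uses Haskell, a tiny flavour of the approach can be given in Python: the map-fusion law below holds for every f, g and xs, and equational reasoning lets one prove it once and for all, whereas the snippet merely spot-checks it on one example:

```python
# Equational law (map fusion):
#   map f . map g  ==  map (f . g)
# i.e. mapping g then f over a list equals mapping their composition.

def compose(f, g):
    return lambda x: f(g(x))

f = lambda x: x + 1
g = lambda x: x * 2
xs = [1, 2, 3]

lhs = list(map(f, map(g, xs)))
rhs = list(map(compose(f, g), xs))
print(lhs, rhs)  # -> [3, 5, 7] [3, 5, 7]
```

The appeal of the equational style is that such laws, once proven, let you rewrite programs into cheaper equivalent forms with the same confidence as algebra.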
Gabriel Gonzalez is a Software Engineer at Twitter who is a Scala programmer by day and a Haskell programmer by night. He primarily focuses on API design and equational reasoning, which he blogs about at haskellforall.com.
Dr. Antonius Weinzierl, Knowledge-based Systems Group at the Technical University of Vienna
NP-complete problems often occur at the heart of important application domains like logistics, scheduling, or satisfiability checking. Today, no fast and correct algorithm is known for any NP-complete problem; in the worst case, they all require exponential time. If satisfiability checking with 3 variables takes 1 second, exponential time then means that 13 variables require 1024 seconds, and 35 variables require 4 billion seconds or 126 years. Despite this, modern satisfiability checkers deal with 500,000 variables in less than 10 minutes. This talk reveals some key techniques that enable such performance.
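The exponential blow-up comes from brute-force enumeration of all 2^n assignments, as in this naive satisfiability checker (modern solvers avoid this with unit propagation, clause learning and clever heuristics; the example formula is invented):

```python
from itertools import product

def brute_force_sat(n_vars, clauses):
    """Try all 2^n assignments; each literal is +v or -v for variable v
    (1-indexed). Returns a satisfying assignment or None."""
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
print(brute_force_sat(3, clauses))  # -> (False, True, False)
```

Doubling the work for every added variable is exactly the 1 second / 1024 seconds / 126 years progression from the abstract.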
Antonius Weinzierl is a researcher at the Vienna University of Technology (TU Wien) where he received his PhD in 2014. His current research focus is on logic programming where satisfiability checking techniques are used intensely. In 2009 he started his work at TU Wien investigating inconsistency management for knowledge-exchange systems. In 2009 he received his diploma degree in computer science from LMU Munich.
Sebastian Blessing and Sylvan Clebsch, CEO and CTO at causality.io
Concurrency is here to stay. Your phone is multi-core, your workstation is many-core, and your data centre is a super-computer. As an industry, we need to be able to write code that scales when our hardware scales, we need to be able to guarantee that our code doesn't have data-races or deadlocks, and we need to be able to express all this simply and concisely. Pony is a new open source programming language that aims to do all of this. It's an actor-model language (like Erlang), it's object-oriented with extensive functional features (like Scala), it compiles to fast native machine code without a VM (like C/C++), and it has an innovative type system that guarantees your program will have no data-races, won't deadlock, and will never have a null-pointer or other runtime exception. We'll be discussing some of the fun properties of both the language and the runtime library, and we'll talk about where we're going, and how to get involved.
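As a rough sketch of the actor model that Pony builds in at the language level (with none of Pony's compile-time guarantees), a mailbox-per-actor design can be imitated in Python:

```python
import queue
import threading

class Counter:
    """A toy actor: private state, a mailbox, and one thread that
    processes messages strictly one at a time, so no locks are needed."""

    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def send(self, msg):
        """Asynchronous message send: never blocks on the actor's work."""
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                return
            self.count += msg     # safe: only this thread touches count

counter = Counter()
for _ in range(1000):
    counter.send(1)
counter.send("stop")
counter.thread.join()
print(counter.count)  # -> 1000
```

Pony's contribution is making this pattern the language's native execution model and having the type system prove, at compile time, that no two actors can race on the same mutable data.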
Sebastian Blessing did his MSc at Imperial College London on extending Pony to distributed clusters. He has worked on high-performance data analytics systems, in domains such as oncology and ERP, at places like SAP and IBM. He is the CEO of Causality.
Sylvan Clebsch was kicked out of a lot of schools before becoming a serial entrepreneur in the 90s, working on embedded OSes, secure systems, VOIP, physical simulation, and graphics engines, before accidentally becoming an Executive Director in IT at a major investment bank - which he has now left. He is the CTO of Causality, the company that develops Pony.
Eric Weikl, Principal Consultant, TNG
It seems that microservices are increasingly being hailed as the panacea of architecture. However, many organizations are completely unprepared for the technical, operational, and organizational consequences of this approach. Deploying and running microservice-based systems will bring new challenges to the devops table. Some of these challenges can be alleviated by the recent container movement (of course, not without introducing some new ones). In this talk, we'll take a look at using Docker for deploying microservice architectures as well as the options provided by (or missing from) the current ecosystem.
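As a flavour of what the container side of this looks like, here is a minimal Dockerfile for a single microservice (the service, file names and port are purely illustrative):

```dockerfile
# Build a self-contained image for one microservice.
FROM python:3-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies baked into the image
COPY . .

EXPOSE 8080                           # the service's only published contract
CMD ["python", "service.py"]
```

Each service ships as its own image, so teams can build, version, and deploy independently, e.g. `docker build -t orders-service .` followed by `docker run -p 8080:8080 orders-service` (names hypothetical). The orchestration of many such containers is exactly where the ecosystem gaps discussed in the talk appear.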
Eric Weikl has been creating software with passion at TNG for over 10 years. He supports clients with developing, integrating and deploying mission-critical systems. His favorite place is where the platonic ideals of programming and dirty IT reality collide.
Kjetil Hustveit and Simen Gan Schweder, sannsyn.com
Sannsyn is a small Norwegian startup that develops a recommendation platform. Having developed our own toolbox from scratch, we see that several of the challenges our customers want solved require us to develop entirely new sets of algorithms. Collecting and processing large amounts of data about people also raises important privacy issues, which we have both a legal and an ethical obligation to solve.
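Sannsyn's production algorithms are their own, but the classic starting point for a recommendation engine, item-based collaborative filtering, can be sketched as follows (toy data and function names invented for illustration):

```python
from collections import defaultdict
from math import sqrt

# user -> set of items they interacted with (toy data)
interactions = {
    "alice": {"book_a", "book_b", "book_c"},
    "bob":   {"book_b", "book_c", "book_d"},
    "carol": {"book_a", "book_c"},
}

def item_similarity(interactions):
    """Cosine similarity between items, based on co-occurring users."""
    users_per_item = defaultdict(set)
    for user, items in interactions.items():
        for item in items:
            users_per_item[item].add(user)
    sim = defaultdict(dict)
    for a in users_per_item:
        for b in users_per_item:
            if a != b:
                shared = len(users_per_item[a] & users_per_item[b])
                denom = sqrt(len(users_per_item[a]) * len(users_per_item[b]))
                sim[a][b] = shared / denom
    return sim

def recommend(user, interactions, sim, n=2):
    """Score unseen items by their similarity to the user's items."""
    seen = interactions[user]
    scores = defaultdict(float)
    for item in seen:
        for other, s in sim[item].items():
            if other not in seen:
                scores[other] += s
    return sorted(scores, key=scores.get, reverse=True)[:n]

sim = item_similarity(interactions)
print(recommend("carol", interactions, sim))  # ['book_b', 'book_d']
```

Real deployments add ratings, recency weighting, and scale considerations, and, as the abstract notes, must treat the underlying personal data with care.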
Kjetil Hustveit is a software engineer and co-founder of Sannsyn AS. He has been developing software for the last 15 years, both as an employee of various companies and as an independent consultant. He has worked on projects ranging from web applications and app development to equipment for analyzing digital TV network streams. Now he is happiest when let loose to conjure up new ideas.
Simen Gan Schweder (b. 1970) holds a Master's degree in Statistics with a specialization in data modeling and recommender systems. He has worked as a college teacher on subjects such as Algorithms, Data Structures and Application Development. He also has a long history as a software engineer in industries such as printing and web shops. In the nineties he wrote the now-defunct movie recommender eSmak, which was used by the Norwegian movie distributor Oslo Kinomatografer.
Anastasia Kazakova, Product Marketing Manager for CLion, JetBrains
A C/C++ fan, Anastasia Kazakova (@anastasiak2512) has spent 8 years creating real-time *nix-based systems and pushing them to production. She has a passion for networking algorithms and embedded programming and believes in good tooling. With all her love and passion for C++, she has finally joined the JetBrains team, where she now works as a Product Marketing Manager for CLion and AppCode.
The record system is one of Haskell's most long-standing and controversial problems. The community has long reached a consensus that the current state of affairs is unsatisfactory; however, every proposed solution has spawned debates that continue to this day. These debates have long been the reason nothing actually gets implemented, despite concrete proposals being presented periodically since at least 1999. Fortunately, recent developments in GHC have made it possible to reimplement records as a library. Such a library, the ideas behind it, and its practical usage are the subject of this talk.
Nikita Volkov is a software engineer with 4 years of Haskell practice, preceded by a decade of experience with mainstream OO languages. He is employed by the Norwegian company Sannsyn AS, which provides a recommendation engine as a service as well as general IT consulting. After joining the Haskell community, Nikita became an active contributor, releasing such notable open-source projects as stm-containers, hasql and record. The last of these is the subject of this talk.
In this talk you'll hear how the start-up Jolla managed to build a mobile OS and a mobile phone with just 100 employees. Staffed mostly with ex-Nokia developers, we managed to build, launch and ship a non-Android mobile device with our own mobile operating system in far less time than established vendors have ever managed. Beyond the story, I'll go into detail on how a modern mobile operating system is built up - not only from a technical point of view, but also how you make a great product, sell it, and work with your customers.
Carsten Munk is Chief Research Engineer at Jolla, strongly passionate about open source and was previously involved with the MeeGo project by Intel and Nokia. Jolla was born in 2011 out of the passion of its founders towards open innovation in the mobile space.
Richard Mortier, Systems Research Group at the Cambridge University Computer Lab
The Mirage OS is an open-source library operating system that compiles code written in the OCaml functional language into a variety of hardware backends, most notably specialized unikernels that run directly on the Xen hypervisor. Mirage is particularly useful for building safe, reliable OS components such as storage or networked daemons. Instead of having to manage complex deployments such as a LAMP stack (with the associated security headaches), Mirage offers the opportunity to "compile your own cloud" from a set of protocol libraries. For example, the Mirage website is provided using DNS and HTTP servers running as distinct cloud-hosted unikernels, coordinated via the Irminsule storage stack which uses Git as its communications protocol. Deployment relies on the Travis continuous integration tool to commit the entire compiled images to (e.g.) GitHub, from which they can be automatically deployed. In this talk I will describe the architecture of Mirage, present some benchmark results comparing the performance of our unikernels to traditional applications such as Apache, BIND and OpenSSH, run through the deployment workflow benefits it brings, and present our latest results using Mirage for low-latency deployments to low-power small form-factor ARM boards.
Richard Mortier is a member of faculty in the Systems Research Group at the Cambridge University Computer Lab. Past work includes Internet routing, distributed system performance analysis, network management, aesthetic designable machine-readable codes, and home networking. He works at the intersection of systems and networking with human-computer interaction, and is currently focused on how to build user-centric systems infrastructure that enables people to better support themselves in a ubiquitous computing world through Human-Data Interaction.
Mattias Petter Johansson, web developer at Spotify
The Spotify desktop app is built out of a large number of small, self-contained web apps called Spotlets. In this talk, I'll tell you how they allow us to divide ownership between teams, and how they allow us to remotely live-replace parts of the app in the running clients of millions of users. I will also tell you the story behind how Spotlets originally came to be, how they've evolved, and some of the surprising effects they've had on us.
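The Spotlet mechanics are Spotify-internal, but the general pattern behind live-replacing a self-contained component in a running host can be sketched like this (the registry API and names are invented for illustration):

```python
class AppRegistry:
    """Host that looks components up by name on every call,
    so a component can be swapped while the host keeps running."""
    def __init__(self):
        self._apps = {}

    def register(self, name, render):
        self._apps[name] = render      # deploy or live-replace in one step

    def render(self, name):
        return self._apps[name]()      # late binding: always the latest version

registry = AppRegistry()
registry.register("search", lambda: "search v1")
print(registry.render("search"))       # search v1

# A remote update arrives: swap the component without restarting the host.
registry.register("search", lambda: "search v2")
print(registry.render("search"))       # search v2
```

The key design choice is that the host never caches a direct reference to a component, which is what makes replacement in millions of running clients possible without a restart.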
What if your navigation system could project route information directly into your field of vision, or facial recognition technology could help you identify faces? Thanks to the rapid development of virtual reality glasses in recent years, this vision may soon become reality. The maker team “Red Pill” from TNG Technology Consulting has developed applications that augment reality with useful additional information using the “Oculus Rift DK2” VR glasses. Contrary to the typical use case of such displays, the user doesn’t enter a virtual world but keeps his normal field of vision. This is achieved by extending the VR glasses with two additional cameras. The stereoscopic image obtained this way can, for instance, be enriched with facial recognition and identification data. In this talk the TNG consultants will introduce you to the world of augmented reality. You will get an insight into the whole processing pipeline, from reading out the camera images to rendering the data on the VR display. Along the way they will shed light on the history of this not-so-new technology and point out further areas of application.
Judging by classic economic indicators such as gross domestic product and the unemployment rate, Bavaria is far ahead in the national comparison. We all like living in Bavaria - but is Bavaria really well prepared for the future? The picture changes considerably if, instead of the classic indicators, one takes future-oriented indicators into account, such as income distribution, start-up rate, integration index and educational mobility. From McKinsey’s perspective, the world is changing radically: structural breaks such as “the end of work as we know it”, the emergence of a “volatile world”, the scarcity of resources (budgets) and the development of “disruptive technologies” will lead us to a completely different and rather unknown world. Regarding the impact of these breaks, two opposing interpretations can be found: on the one side a world characterized by cleavage, on the other a world as a stage for participation. New playing fields may address these zones of conflict: we gathered a total of 15 initial ideas (e.g. digital education, an integrated health care system, energy efficiency and independence, a revival of the Bavarian identity). When discussing the future of Bavaria it became clear that there isn’t “a Bavaria”, but several regions that should work in synergy rather than compete.
Johannes Elsner holds a law degree from the Ludwig Maximilian University of Munich and completed the General Course of London School of Economics in 2000. In 2006 he joined McKinsey in Munich as Associate, where he became a partner in 2012. His previous projects include the development of a growth strategy for the business with institutional investors for a leading Asset Manager, the definition of a growth strategy for a large consumer finance company taking mobile banking into account, and the restructuring of the business in Italy for a large investment company.
David Anderson, Anderson Associates
Kanban has been around for 10 years already. We know it provides direct benefits for service delivery. We know that evolutionary change is a better way to adapt organizations to market uncertainties and changing market conditions. However, Kanban adoption has mostly been inside-out, starting with small teams and in the middle of service delivery workflows. Many Kanban implementations are started locally and provide only local benefits. Kanban was always intended as an end-to-end enterprise solution. Enterprise Services Planning (ESP) is the future of Kanban. ESP is an outside-in approach that looks at strategy, fitness for purpose and alignment of capability with strategy, enabling better business results from the use of Kanban as a service delivery method and an enabler of evolutionary change. ESP includes guidance on portfolio management, capacity planning, scheduling, forecasting, and risk. ESP is supported by new software tools that facilitate decision making and process improvement, helping managers do their jobs by informing them better and making recommendations based on an understanding of business risks. ESP offers AI for your technology business. It's the future of how modern 21st-century businesses will be managed!
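As a flavour of the kind of quantitative forecasting ESP tooling can automate, here is a toy Monte Carlo delivery forecast built from historical weekly throughput (all numbers invented for illustration; real ESP tools are far richer):

```python
import random

random.seed(1)  # reproducible toy run

weekly_throughput = [4, 6, 3, 7, 5, 4, 6]   # items finished per week, historical
backlog = 50                                 # items still to deliver

def weeks_to_finish(backlog, history):
    """One simulated future: resample past weekly throughput until done."""
    weeks, remaining = 0, backlog
    while remaining > 0:
        remaining -= random.choice(history)
        weeks += 1
    return weeks

runs = sorted(weeks_to_finish(backlog, weekly_throughput) for _ in range(10_000))
p50 = runs[len(runs) // 2]
p85 = runs[int(len(runs) * 0.85)]
print(f"50% chance of finishing within {p50} weeks, 85% within {p85} weeks")
```

Forecasting from observed throughput rather than from estimates is one of the probabilistic techniques the ESP guidance draws on; percentile answers ("85% within N weeks") map directly onto risk-based commitments.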
David J. Anderson leads a management consulting firm focused on improving the performance of technology companies. He has many years of management experience leading teams on agile software development projects. David was a founder of the agile movement through his involvement in the creation of Feature Driven Development. He was also a founder of the APLN, a non-profit dedicated to improving management and leadership in technology companies. Recently David has been focusing his attention on business agility and enterprise-scale agile software transitions through a synergy of the CMMI model for organizational maturity with Agile and Lean methods.