Credit: Google News
Worried that you’re not on trend with one of the hottest innovations in tech? It’s time to get ready for ML Conference 2019, where renowned pioneers of the software industry will gather from June 17-19 in Munich to celebrate everything machine learning.
ML Conference is designed to help you understand your data, optimize your models, and enhance your business. With three days of sessions, workshops, and exciting opportunities to meet and interact with both international experts and other members of the field, ML Conference is the place to go for everything machine learning.
Right now, you can take advantage of our Special Discount sale before March 7 and save up to €480!
Here are some of the cool things we’ve got in store for you! Our schedule is still a work in progress, but we’ve already confirmed these amazing sessions!
ML Conference 2019 sessions
Productionizing machine learning models: Lessons learned in the Hadoop ecosystem – Steffen Bunzel and Simon Weiss
The deployment of machine learning models can be challenging, especially in the context of distributed systems. Although Python is the dominant language among data scientists, it can create friction when integrating with JVM-based tools such as Spark or managing application dependencies on clusters of heterogeneous machines. Many data scientists developing on such systems struggle with the subtleties of these challenges.
This presentation will share lessons learned working on large-scale Hadoop clusters and examine the most promising approaches to alleviate common issues. In particular, we will discuss our experience with leveraging containerization to tackle the dependency management challenge from a data scientist’s point of view.
Honey bee conservation using deep learning – Thiago da Silva Alves and Jean Metz
Honey bee colony assessment is usually carried out via the laborious manual task of counting and classifying comb cells. Beekeepers perform this task many times throughout the year to assess the colony’s strength and to track its development. As you can imagine, this is an extremely time-consuming and error-prone task. We will share our experience developing DeepBee, a tool for automatic honey bee colony assessment.
DeepBee is a tool that encapsulates an image classification pipeline, combining classical image processing methods with state-of-the-art deep neural networks (DNNs) for image segmentation and classification. To arrive at the final solution, we compared 13 distinct DNN architectures and chose the best model based on several metrics. We discuss the steps taken from image collection to delivery of the final solution, highlighting the mistakes we made along the way, the hurdles we overcame, and the lessons learned.
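The model-selection step described above can be sketched in a few lines: rank candidate architectures by a validation metric and keep the winner. The architecture names and scores below are purely illustrative, not the actual DeepBee candidates or results.

```python
# Hypothetical model-selection step: compare candidate DNN architectures
# on validation metrics and keep the best, as the DeepBee authors describe
# doing across 13 architectures. All names and numbers are illustrative.

candidates = {
    "MobileNetV2": {"f1": 0.91, "accuracy": 0.93},
    "ResNet50":    {"f1": 0.94, "accuracy": 0.95},
    "InceptionV3": {"f1": 0.92, "accuracy": 0.94},
}

def select_best(results, metric="f1"):
    """Return the architecture name with the highest score on `metric`."""
    return max(results, key=lambda name: results[name][metric])

best = select_best(candidates)
print(best)  # ResNet50
```

In practice the comparison would use held-out validation data and several metrics at once, but the selection logic stays this simple.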
Towards meaningful AI – Imola Fodor
Electrolux is one of the global leaders in household appliances, shaping living for the better. With a history of continuous improvement through various technologies, we have recently adopted AI. But as an established, hundred-year-old company, how did we create a start-up-like ecosystem? By using an agile approach and the newest technologies to develop a more sustainable operation of our appliances. In this session, we present our way of working on meaningful AI solutions.
Predicting New York City taxi demand: Spatio-temporal time series forecasting – Fabian Hertwig
Time series forecasting has always been an important field in machine learning and statistics, as it helps us to make decisions about the future. A special field is spatio-temporal forecasting, where predictions are not only made on the temporal dimension, but also on a regional dimension.
In this session, we will present a demonstration project that predicts taxi demand in Manhattan, NYC for the next hour. We’ll cover some of the basic principles of time series forecasting and compare different models suited to the spatio-temporal use case. To that end, we will take a closer look at models such as long short-term memory networks and temporal convolutional networks, and show that they decrease the prediction error by 40% compared with a simple baseline model that predicts the same demand as in the previous hour.