Logistics

Package Delivery Monitoring and Analysis with MongoDB, Apache Flink and Gelly

In recent years, more and more devices and data sources have emerged in business environments. Web 2.0 and the Internet of Things make it feasible to track all kinds of information over time, in particular fine-grained user activities and sensor data. This real-time data is essential for fast decision making in operational monitoring and controlling tasks.

In package delivery, for example, monitoring and controlling expensive or sensitive goods is essential for complying with service level agreements. Imagine you want to ship organs from Hamburg to Munich: you have to ensure that the temperature inside the package never exceeds a defined value, e.g. 5 degrees Celsius.

On the first day, the data will be explained in detail and imported into MongoDB. Students will learn how to work with a NoSQL database, especially how to store and query this data from the MongoDB command line tool.
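
To give a rough idea of what storing and querying such sensor readings can look like, here is a minimal sketch using the MongoDB Java driver. The database name (logistics), collection name (sensorReadings), and field names are assumptions for illustration only; in the course itself, the equivalent insertOne and find operations are run interactively from the MongoDB command line tool.

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.MongoDatabase;
    import org.bson.Document;

    import java.util.Date;

    import static com.mongodb.client.model.Filters.gt;

    public class SensorReadingExample {
        public static void main(String[] args) {
            try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
                MongoDatabase db = client.getDatabase("logistics");                      // assumed database name
                MongoCollection<Document> readings = db.getCollection("sensorReadings"); // assumed collection name

                // Store one temperature reading for a package (field names are illustrative).
                readings.insertOne(new Document("packageId", "P-42")
                        .append("temperature", 4.2)
                        .append("recordedAt", new Date()));

                // Query all readings that exceed the 5 degree Celsius threshold
                // from the organ-shipping example above.
                for (Document doc : readings.find(gt("temperature", 5.0))) {
                    System.out.println(doc.toJson());
                }
            }
        }
    }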

On day two we start processing these real-world data sets with the data processing technologies of Apache Flink and its streaming API. You will learn how to use this framework for distributed data processing and information extraction from large datasets. Furthermore, you will get an introduction to data streaming engines and how to use them to detect and handle complex business events in large-scale data streams.
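
As a flavor of the streaming part, the sketch below filters a stream of temperature readings against the 5 degree Celsius threshold from the example above. The socket source, the comma-separated line format, and all names are assumptions for illustration, not the actual course exercises.

    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class TemperatureAlertJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Hypothetical source: lines of "packageId,temperature" arriving on a local socket.
            DataStream<String> raw = env.socketTextStream("localhost", 9999);

            DataStream<Tuple2<String, Double>> readings = raw
                    .map(line -> {
                        String[] parts = line.split(",");
                        return Tuple2.of(parts[0].trim(), Double.parseDouble(parts[1].trim()));
                    })
                    // Lambdas lose generic type information, so declare it explicitly.
                    .returns(Types.TUPLE(Types.STRING, Types.DOUBLE));

            // Emit every reading above the 5 degree Celsius threshold as a simple alert.
            readings.filter(reading -> reading.f1 > 5.0)
                    .print();

            env.execute("Temperature alert sketch");
        }
    }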

On the last of the three days we deal with graph analysis algorithms to extract further information from the data sets. We use Gelly, the graph library of Apache Flink, to gain deeper insights into the logistics data set.
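
As a taste of what a Gelly job can look like, the sketch below builds a tiny graph of delivery hubs connected by shipments and runs Gelly's connected components library algorithm on it. The hub ids and edges are made up for illustration and are not part of the course data set.

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.graph.Edge;
    import org.apache.flink.graph.Graph;
    import org.apache.flink.graph.Vertex;
    import org.apache.flink.graph.library.ConnectedComponents;
    import org.apache.flink.types.NullValue;

    public class HubGraphExample {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Made-up edges: an edge means at least one shipment traveled between two hubs.
            DataSet<Edge<Long, NullValue>> shipments = env.fromElements(
                    new Edge<>(1L, 2L, NullValue.getInstance()),
                    new Edge<>(2L, 3L, NullValue.getInstance()),
                    new Edge<>(4L, 5L, NullValue.getInstance()));

            // Initialize every vertex value with its own id; connected components
            // then propagates the smallest id through each component.
            Graph<Long, Long, NullValue> hubGraph = Graph.fromDataSet(
                    shipments,
                    new MapFunction<Long, Long>() {
                        @Override
                        public Long map(Long vertexId) {
                            return vertexId;
                        }
                    },
                    env);

            // Hubs 1-3 end up in one component, hubs 4-5 in another.
            DataSet<Vertex<Long, Long>> components =
                    hubGraph.run(new ConnectedComponents<Long, Long, NullValue>(10));

            components.print();
        }
    }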

