Workshops & Tutorials

Monday, July 25th

  • Workshops
    • Name: Workshop on Advanced Tools, Programming Languages, and Platforms for Implementing and Evaluating Algorithms for Distributed Systems (ApPLIED)
      Organizers: Elad Michael Schiller, Chryssis Georgiou
      Website: https://www.cse.chalmers.se/~elad/ApPLIED2022/
      Brief Description: Designers of advanced systems who wish to implement and evaluate distributed algorithms in practical settings often face challenging questions about turning a design into a working prototype. This workshop brings together designers and practitioners of distributed systems from both academia and industry to share their points of view and experiences. Its goal is to act as a bridge between the traditional PODC community, which has a more analytical focus, and researchers taking more applied approaches and building large-scale distributed systems.
    • Name: Principles of Distributed Learning (PODL)
      Organizers: Rachid Guerraoui, Nirupam Gupta, Rafaël Pinot
      Website: https://dcl.epfl.ch/site/podc2022
      Brief Description: The workshop, as the name suggests, focuses on the important topic of distributed machine learning (ML) – a modern computing framework that enables the training of large, complex models (e.g., deep convolutional neural networks) over massive amounts of high-dimensional data (e.g., images or videos). Distributed ML promises several advantages over its centralized counterpart, such as computing flexibility, expedited learning, and data privacy. Nevertheless, owing to its distributed nature, distributed ML suffers from several issues, including (but not limited to) asynchrony, system failures, heterogeneous sampling, and consistency problems. The purpose of our workshop is to gather researchers who address these challenges in distributed ML and to facilitate fruitful collaboration between the distributed computing and machine learning communities.

Friday, July 29th

  • Workshops
    • Name: Distributed Algorithms on Realistic Network Models (DARe)
      Organizers: Laurent Feuilloley, Yannic Maus, Alexandre Vigny
      Website: https://podc-dare.github.io/
      Brief Description: The PODC community focuses on theoretical models and tools for understanding distributed computing. A standing challenge is to develop novel, powerful mathematical frameworks that also have impact outside theoretical computer science. The goal of this workshop is to tackle this challenge in the area of distributed computing on networks of diverse origins. In recent decades this area has flourished on the theoretical side, and it is now timely to evaluate the practical relevance of these results and to identify new directions for future research originating from practice.
  • Tutorials
    • Name: Dispersion of Mobile Robots
      Organizers: Anisur Rahaman Molla, William K. Moses Jr.
      Website: https://sites.google.com/view/dispersion-mobilerobots-podc22/
      Brief Description: In this tutorial, we provide an extensive survey of work on the dispersion of mobile robots, a problem introduced by Augustine and Moses Jr. [ICDCN 2018]. The dispersion problem for k robots, initially placed arbitrarily on the nodes of an n-node graph, requires the robots to autonomously move so as to reach a configuration with at most ⌈k/n⌉ robots on each node. The metrics typically used to gauge solutions are the time until dispersion is achieved and the memory required per robot. Although the problem was introduced only recently, much work has been done on it, both in its original setting and in extensions to new settings. We will provide an overview of techniques and results to date, as well as possible directions for future work. Our presentation will be in two parts: in the first, we will present the foundations of the dispersion problem and some fundamental results; in the second, we will discuss various extensions to different settings and recent developments. (A short sketch illustrating the dispersion condition appears after this list.)
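
To make the dispersion condition above concrete, here is a minimal Python sketch (illustrative only, not taken from the tutorial; the function name and representation of robot positions are our own choices). It checks whether a placement of k robots on an n-node graph is dispersed, i.e., whether every node hosts at most ⌈k/n⌉ robots.

    import math
    from collections import Counter

    def is_dispersed(positions, n):
        """Check the dispersion condition for k robots on an n-node graph.

        positions: one entry per robot, giving the node the robot occupies
                   (nodes labeled 0..n-1).
        Returns True iff no node hosts more than ceil(k/n) robots.
        """
        k = len(positions)
        cap = math.ceil(k / n)  # the ceil(k/n) bound from the problem definition
        counts = Counter(positions)  # robots per occupied node
        return all(c <= cap for c in counts.values())

    # Example: 5 robots on a 3-node graph, so at most ceil(5/3) = 2 per node.
    print(is_dispersed([0, 0, 1, 1, 2], n=3))  # True  (node loads 2, 2, 1)
    print(is_dispersed([0, 0, 0, 1, 2], n=3))  # False (node 0 hosts 3 > 2)

Note that this only verifies a final configuration; the algorithmic substance surveyed in the tutorial lies in how the robots reach such a configuration autonomously, and at what cost in time and per-robot memory.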