Special Session on Distributed AI across Edge-Cloud Continuum

The rapid proliferation of intelligent applications across the Internet of Things (IoT), cyber-physical systems, smart cities, autonomous systems, and next-generation networks has accelerated the shift of artificial intelligence from centralized cloud infrastructures toward the network edge. Edge and Distributed AI has emerged as a key paradigm for low-latency, privacy-aware, energy-efficient, and scalable intelligence that leverages collaborative learning and inference across the edge–cloud continuum. This Special Session aims to bring together researchers and practitioners to present recent advances in learning paradigms, system architectures, hardware–software co-design, and real-world deployments for Edge and Distributed AI. In line with the scope of IEEE ICCE 2026, the session particularly emphasizes system-level challenges and innovations at the intersection of AI, networking, and computing infrastructures, covering both foundational techniques and emerging applications enabled by edge intelligence.

Topics of interest include, but are not limited to:

Paradigms and Models

  • Federated Learning and Privacy-Preserving Distributed Learning
  • Edge-Cloud AI Security
  • Collaborative, Split, and Hybrid Learning across Edge–Cloud Continuum
  • Lightweight and Efficient AI Models for Edge Intelligence
  • Large Language Models (LLMs) and Foundation Models at the Edge
  • Edge and Hybrid Inference across Edge-Cloud Continuum

System Architecture and Infrastructure

  • Architectures for Large-Scale Edge and Distributed AI Systems
  • Edge–Cloud–AI Continuum: Architectures, Orchestration, and Scheduling
  • Infrastructure and Platforms for Distributed AI
  • AI-as-a-Service (AIaaS) Architectures for Edge and Cloud Environments
  • Edge AI over 5G/6G and Beyond

Hardware, Software, and Co-Design

  • Emerging Software and Hardware Co-Design for Edge AI
  • Acceleration Techniques for Edge and Distributed AI (GPU, NPU, FPGA, ASIC)
  • Runtime Systems and Middleware for Distributed AI Execution

Performance, Energy, and Sustainability

  • Energy-Efficient and Sustainable AI for Edge and Cloud
  • Resource-Aware AI: Computation, Communication, and Memory Optimization

Use Cases and Applications

  • Edge AI for IoT, Smart Cities, Healthcare, Industrial Automation, and Robotics
  • Real-Time AI Inference in Latency-Critical Applications

Evaluation and Experiments

  • Datasets, Benchmarks, and Testbeds for Edge and Distributed AI
  • Experimental Platforms and Real-World Deployments of Edge AI

Organizing Committee:

  • Hoang Dinh, University of Technology Sydney, Australia
  • Nguyen Huu Thanh, Hanoi University of Science and Technology, Vietnam
  • Tran Thi Thanh Hai, Hanoi University of Science and Technology, Vietnam
  • Huynh Nguyen, University of Liverpool, UK
  • Nhien-An Le-Khac, University College Dublin, Ireland
  • Hung Nguyen, Adelaide University, Australia

Submission Guidelines:

Submissions should adhere to IEEE formatting guidelines and be submitted through the conference submission system. Full papers must be original, unpublished, and not currently under review by another conference or journal. Detailed submission instructions, including format templates and submission portal information, can be found on the conference website at https://www.ieee-icce.org/p/submission.

Paper Submission Deadline: March 1st, 2026.

Submission Link:

https://edas.info/newPaper.php?c=34278&track=136256

Contact information:

For further information or inquiries about the Special Session on Distributed AI across the Edge-Cloud Continuum, please contact the special session chair, Dr. Nguyen Huu Thanh (thanh.nguyenhuu@hust.edu.vn), for general matters.