Keynotes

Keynote #1: Brain-Computer Interface in Augmented Reality and the Metaverse

Chin-Teng Lin
University of Technology Sydney, Australia

Abstract: A Brain-Computer Interface (BCI) enhances the human brain's ability to communicate and interact with the environment directly. BCI plays an important role in the study of natural cognition, that is, the brain and behavior at work. Human cognitive functions such as action planning, intention, preference, perception, attention, situational awareness, and decision-making are omnipresent in our daily activities. BCI has been regarded as a disruptive technology for next-generation human-computer interfaces in wearable computers and devices. In addition, BCI technology has many potential real-life impacts, both in daily-life applications that augment human performance and in daily-care applications for elderly and patient healthcare, in the real world and in virtual worlds. This talk will focus on applications of BCI technology in AR-based brain-robot interfaces, BCI-based assistive glasses for the blind, biofeedback for chronic pain mitigation, and BCI-based human-machine cooperation. The potential applications of BCI in the coming Metaverse will also be introduced.

Biography: Chin-Teng Lin received the B.S. degree from National Chiao-Tung University (NCTU), Taiwan, in 1986, and the Master's and Ph.D. degrees in electrical engineering from Purdue University, West Lafayette, Indiana, U.S.A., in 1989 and 1992, respectively. He is currently a Distinguished Professor, Director of the UTS Human-centric AI Center, Co-Director of the Australian AI Institute, and Director of the CIBCI Lab, FEIT, UTS. He also served as International Faculty of the University of California at San Diego (UCSD) from 2012 to 2020 and held an Honorary Professorship at the University of Nottingham from 2014 to 2021.

Prof. Lin’s research focuses on machine-intelligent systems and brain-computer interfaces, including algorithm development and system design. He has published over 460 journal papers (H-index 99 on Google Scholar) and is the co-author of Neural Fuzzy Systems (Prentice-Hall) and author of Neural Fuzzy Control Systems with Structure and Parameter Learning (World Scientific). Dr. Lin served as Editor-in-Chief of the IEEE Transactions on Fuzzy Systems from 2011 to 2016 and has served on the Boards of Governors of the IEEE Circuits and Systems Society, the IEEE Systems, Man, and Cybernetics Society, and the IEEE Computational Intelligence Society. He is the Chair of the 2022-2023 CIS Awards Committee. Dr. Lin is an IEEE Fellow and received the IEEE Fuzzy Pioneer Award in 2017. He received the UTS Chancellor’s Medal for Research Excellence in 2015.

Keynote #2: Low-Complexity Hardware Solutions for Baseband Algorithms in Massive MIMO Systems

Mojtaba Mahdavi
Ericsson Research, Sweden

Abstract: As mobile communication systems have evolved through five generations, we have witnessed an exponential increase in data rates, network capacity, and computational demands—trends that are expected to accelerate with the emergence of 6G. This growth is driven by an explosion in connected devices, vast data volumes, and the increasing need for ultra-high throughput, low latency, and enhanced reliability. Moreover, emerging applications such as virtual reality (VR), autonomous vehicles, smart cities, and e-health are imposing even stricter performance requirements for future wireless networks.

Central to the 5G standard is the adoption of massive multiple-input multiple-output (MIMO) technology, which equips base stations with a large number of antennas to serve multiple users simultaneously on the same frequency-time resources. While massive MIMO significantly enhances spectral efficiency and network performance, it also introduces substantial computational complexity and memory overhead, primarily due to the need to process and store high-dimensional matrices such as channel state information (CSI). Meeting these challenges is becoming increasingly difficult for conventional digital baseband processors, which face limitations from the slowing of Moore’s Law, constrained memory bandwidth, and energy-intensive data transfers between memory and compute units. These issues hinder the scalability and efficiency of next-generation 6G systems.
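To make the scale of the CSI burden concrete, the following back-of-the-envelope sketch (not taken from the talk; all parameters are illustrative assumptions) estimates the memory footprint of one CSI snapshot and the rough complex-multiply count of a standard zero-forcing combiner update:

```python
# Illustrative sketch: why CSI storage and processing dominate in massive
# MIMO baseband. Assumed parameters (hypothetical, not from the keynote):
# 64 base-station antennas, 8 single-antenna users, 1200 subcarriers.

def csi_memory_bytes(m_antennas, k_users, n_subcarriers, bytes_per_cplx=8):
    """Memory to store one CSI snapshot: an M x K complex matrix per subcarrier
    (8 bytes per complex sample assumes two 32-bit fixed/float components)."""
    return m_antennas * k_users * n_subcarriers * bytes_per_cplx

def zf_flops_per_subcarrier(m, k):
    """Rough complex-multiply count for a zero-forcing combiner
    W = (H^H H)^-1 H^H: the Gram matrix H^H H costs ~M*K^2, the K x K
    inversion ~K^3, and the final product another ~M*K^2."""
    return m * k * k + k ** 3 + m * k * k

M, K, SC = 64, 8, 1200
mem = csi_memory_bytes(M, K, SC)            # bytes for one CSI snapshot
ops = zf_flops_per_subcarrier(M, K) * SC    # one combiner update, all subcarriers

print(f"CSI snapshot: {mem / 1e6:.1f} MB")          # ~4.9 MB per snapshot
print(f"ZF update: {ops / 1e6:.1f} M complex muls")  # ~10.4 M complex multiplies
```

Since the combiner must be refreshed every time the channel changes, these matrices are repeatedly shuttled between memory and compute units, which is precisely the data-movement bottleneck that in-memory computing targets.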

This keynote will explore hardware-efficient and low-complexity approaches for implementing baseband signal processing algorithms, focusing on novel computing paradigms that move beyond traditional architectures. In particular, it will highlight the potential of in-memory computing (IMC) as a promising solution to alleviate memory bottlenecks and reduce power consumption. Using practical examples from baseband processing, we will demonstrate how IMC-based architectures can help reshape the future of wireless signal processing and pave the way for realizing the full capabilities of 5G and 6G networks.

Biography: Mojtaba Mahdavi received his M.Sc. degree in Electrical Engineering from Sharif University of Technology, Tehran, Iran, in 2010, and his Ph.D. in Electrical and Information Technology (EIT) from Lund University, Sweden, in 2021. His Ph.D. research focused on baseband processing for 5G and beyond, specifically on algorithms, VLSI architectures, and co-design for next-generation wireless communication systems. In 2020, he was a Visiting Researcher at the Division of Microelectronic Systems Design at the University of Kaiserslautern, Germany. In 2021, Mojtaba joined Ericsson Research in Lund, Sweden, where he currently works as a Senior Researcher in the Device Platform Research group. He has authored several patent applications, as well as journal and conference papers, with a particular emphasis on the hardware implementation of algorithms and architectures for wireless communication systems.