FAU’s Federated Learning AI Model Presented at Top AI Conference

By Gisele Galoustian | 2/16/2026

Study Snapshot: Researchers at FAU’s College of Engineering and Computer Science have developed a novel solution to key challenges in federated learning. Their system, called the personalized federated dual-branch framework (pFedDB), redefines how shared and local knowledge are managed. Rather than forcing all participants to rely on a single global model, pFedDB splits each model into two parts: a shared component, trained collaboratively, and a private component, retained exclusively by each participant. The private component remains untouched, preserving specialized local knowledge, while only the shared portion is exchanged. This approach reduces communication costs by roughly 30% and improves overall efficiency.

The findings were published in the Proceedings of the AAAI Conference on Artificial Intelligence in a paper titled “Decoupling Shared and Personalized Knowledge: A Dual-Branch Federated Learning Framework for Multi-Domain with Non-IID Data,” presented at the AAAI-26 Conference, which had an acceptance rate of 17.6%.

Artificial intelligence is transforming industries from health care to finance, and training powerful AI models often requires large amounts of data. Sharing that data, however, is not always possible: patient records, financial transactions and personal device information are sensitive, private and often restricted by law.

Federated learning offers a solution: it allows multiple organizations or devices to train a single AI model together without ever sharing their raw data. Instead, each participant keeps its data locally and sends only the updates its own model has learned. This approach lets organizations collaborate while maintaining privacy and meeting regulatory requirements.
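To make that idea concrete, the sketch below shows a minimal, generic federated-averaging loop; it is not the FAU team's code, and the simulated client data, model shape and learning rate are illustrative assumptions. Each client trains on data that never leaves it, and only model weights are sent back for averaging.

```python
# A minimal sketch of a generic federated-averaging loop (not the authors'
# implementation). Client data, model shape and learning rate are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Each client holds its own private data; the raw data never leaves the client.
clients = [
    {"X": rng.normal(size=(100, 5)), "y": rng.normal(size=100)}
    for _ in range(3)
]

def local_update(weights, X, y, lr=0.01, steps=10):
    """One client's local training: plain gradient descent on squared error."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w  # only the updated weights are shared, never X or y

global_w = np.zeros(5)
for _ in range(20):
    # Each client trains locally; the server then averages the returned weights.
    local_ws = [local_update(global_w, c["X"], c["y"]) for c in clients]
    global_w = np.mean(local_ws, axis=0)

print("Aggregated global weights:", np.round(global_w, 3))
```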

Even with federated learning, challenges remain. Different organizations often have very different data, such as hospitals serving unique patient populations, vehicles operating in different traffic environments, or devices capturing images under varied conditions. Training a single shared model can cause it to forget important local knowledge or be confused by conflicting updates, a problem known as negative transfer. Another issue, catastrophic forgetting, occurs when local models are overwritten by global updates and must relearn their own data repeatedly. These problems slow progress and reduce the reliability of shared models.

Researchers from the College of Engineering and Computer Science at Florida Atlantic University have developed a new solution to these challenges. Their system, called the personalized federated dual-branch framework, or pFedDB, changes how federated learning manages shared and local knowledge. Instead of forcing all participants to rely on a single model, pFedDB divides each model into two components. One component is shared and trained collaboratively, while the other remains private to each participant. The private component is never overwritten, preserving specialized local knowledge. At the same time, only the shared portion is exchanged, reducing communication costs by about 30% and improving efficiency.
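The hypothetical PyTorch-style sketch below illustrates what such a dual-branch split could look like in practice: a shared branch whose parameters are averaged across participants, and a private branch that never leaves the client. The layer sizes, module names and averaging rule are assumptions for illustration, not the published pFedDB architecture.

```python
# A hypothetical sketch of the dual-branch idea -- not the released pFedDB code.
# Layer sizes, module names and the aggregation rule are illustrative assumptions.
import copy
import torch
import torch.nn as nn

class DualBranchModel(nn.Module):
    def __init__(self, in_dim=32, hidden=16, n_classes=4):
        super().__init__()
        # Shared branch: trained collaboratively and exchanged with the server.
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Private branch: stays on the client and is never overwritten.
        self.private = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Head combines both branches for the client's own predictions.
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        return self.head(torch.cat([self.shared(x), self.private(x)], dim=1))

def aggregate_shared(client_models):
    """Average only the shared-branch parameters across clients."""
    avg = copy.deepcopy(client_models[0].shared.state_dict())
    for key in avg:
        avg[key] = torch.stack(
            [m.shared.state_dict()[key].float() for m in client_models]
        ).mean(dim=0)
    # Broadcast the averaged shared branch; private branches are untouched.
    for m in client_models:
        m.shared.load_state_dict(avg)

clients = [DualBranchModel() for _ in range(3)]
aggregate_shared(clients)  # only the shared parameters ever leave a client
```

Because only the shared branch is transmitted, a split like this also shows where the communication savings come from: roughly half of each model's parameters in this toy example never cross the network.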

The research results were published in the Proceedings of the AAAI Conference on Artificial Intelligence, and the paper, titled “Decoupling Shared and Personalized Knowledge: A Dual-Branch Federated Learning Framework for Multi-Domain with Non-IID Data,” was presented at the AAAI-26 Conference on Artificial Intelligence, which had an acceptance rate of 17.6%.

The work was conducted in the research groups of Zhen Ni, Ph.D., and Xiangnan Zhong, Ph.D., associate professors in FAU's Department of Electrical Engineering and Computer Science, who led the research design, methodology and technical contributions. Co-author Yiran Pang, a Ph.D. student in the department, contributed to the implementation and experimental evaluation as part of his doctoral research under their supervision.

The researchers also redesigned the training process to improve performance. Each participant first trains a local expert model to capture the unique characteristics of its data. Once that foundation is established, federated learning begins, with the shared portion of the model complementing rather than replacing local expertise.
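The ordering described above might look like the schematic sketch below, in which hypothetical train_private, train_shared and average_shared helpers stand in for the real local-training and aggregation steps. It captures only the two-stage schedule, not the authors' exact procedure.

```python
# A schematic sketch of a two-stage schedule: local "expert" training first,
# then federated rounds that touch only the shared branch. The helper names
# (train_private, train_shared, average_shared) are hypothetical placeholders.
from typing import Callable, List

def run_training(clients: List[dict],
                 train_private: Callable[[dict], None],
                 train_shared: Callable[[dict], None],
                 average_shared: Callable[[List[dict]], None],
                 warmup_epochs: int = 5,
                 federated_rounds: int = 50) -> None:
    # Stage 1: each client first builds a local expert on its own data,
    # so specialized knowledge exists before any collaboration begins.
    for client in clients:
        for _ in range(warmup_epochs):
            train_private(client)

    # Stage 2: federated rounds update only the shared branch, which
    # complements (rather than replaces) the preserved local expertise.
    for _ in range(federated_rounds):
        for client in clients:
            train_shared(client)
        average_shared(clients)  # server step: aggregate shared weights only

if __name__ == "__main__":
    # Trivial demo with counting stubs, just to show the call order.
    log = []
    dummy = [{"id": 0}, {"id": 1}]
    run_training(
        dummy,
        train_private=lambda c: log.append(("private", c["id"])),
        train_shared=lambda c: log.append(("shared", c["id"])),
        average_shared=lambda cs: log.append(("aggregate", len(cs))),
        warmup_epochs=1,
        federated_rounds=2,
    )
    print(log)
```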

“This dual-branch approach prevents catastrophic forgetting and avoids negative transfer,” said Zhong. “Tests, including chest X-ray analysis, showed the system improves accuracy for every participant while reducing the data that must be shared between devices.”

Through this work, FAU researchers demonstrate that collaborative AI can be more practical, privacy-preserving and adaptable, with potential applications in health care, finance, mobile technology and intelligent transportation systems.

“The key innovation in our research is how we separate shared knowledge from personalized knowledge in federated learning,” said Ni. “By allowing each participant to retain its own expertise while still collaborating on general patterns, we address problems that have long limited real-world AI deployment, especially when data varies significantly across locations or devices.”

FAU’s College of Engineering and Computer Science is internationally renowned for its innovative research and education in the areas of AI, big data and computer science.  

“This work reflects the strength of faculty-led research at FAU and the impact it can have on real-world applications,” said Stella Batalama, Ph.D., dean of the FAU College of Engineering and Computer Science. “Advances like this make AI systems more reliable, more efficient and more privacy-conscious across critical sectors.”

-FAU-
