
Infrastructure Systems


REU Scholar: Koy Torres

REU Scholar Home Institution: Lehman College, City University of New York (CUNY)

REU Scholar: David Sanchez

REU Scholar Home Institution: Lehman College, City University of New York (CUNY)

REU Mentor: Yufei Tang, Ph.D.

Project: Federated Learning for Autonomous Driving


Federated learning (FL), also known as collaborative learning, is a machine learning technique that trains an algorithm across multiple independent sessions, each using its own local dataset. FL can improve device security and is an effective method for training a foundational machine learning model. In this project, we studied the application of FL to autonomous driving (AD). Previous research has demonstrated that training foundation models with traditional methods raises challenges in accuracy, security, and transparency: a centralized cloud approach causes performance bottlenecks and data-privacy concerns, and reliance on a central server creates a single point of failure. FL can advance autonomous driving by improving model stability and handling mistrust among participants. In this research, a peer-to-peer network is proposed to address the issues of the centralized learning technique. Two datasets were used: the CARLA dataset, a simulated dataset with 73,235 samples distributed over 11 sequences of scenes under different lighting and weather conditions, and the GAZEBO dataset, also simulated, generated with a mobile robot and the simulator's built-in scenes. We conclude that federated learning can efficiently train a new foundational model: our results show better accuracy than other training techniques while reducing cycle times by up to 64%.
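The core idea of federated learning described above can be sketched with federated averaging (FedAvg): each client trains on its own local data, and a coordinator averages the resulting models weighted by local sample counts. The sketch below is a minimal illustration with a linear model and synthetic data, not the project's actual peer-to-peer network or CARLA/GAZEBO training code; all function names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training step: linear regression via gradient descent.
    Each client sees only its own (X, y); raw data never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fed_avg(client_data, rounds=10, dim=2):
    """Coordinator loop: broadcast global weights, collect local updates,
    and average them weighted by each client's sample count (FedAvg)."""
    global_w = np.zeros(dim)
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in client_data]
        sizes = np.array([len(y) for _, y in client_data], dtype=float)
        global_w = np.average(updates, axis=0, weights=sizes)
    return global_w
```

In a fully peer-to-peer variant, as the abstract proposes, the averaging step would be performed among neighboring nodes rather than by a central coordinator, removing the single point of failure.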

InETech Intern: Raul Mendy

InETech Intern Home Institution: Florida Atlantic University

InETech Intern: Aira Torres

InETech Intern Home Institution: Florida Atlantic University

REU Scholar: Joseph Accurso

REU Scholar Home Institution: Benedictine College

REU Mentor/Intern Advisor: Yufei Tang, Ph.D.

Project: GPT-like Attention Mechanisms for Power Transformer Condition Monitoring and Prognostics


This research explores the application of Large-Scale Foundation (LSF) models in power transformer prognostic health management (PHM). Leveraging the capabilities of transformer-based language models, the proposed approach aims to enhance the predictive capabilities of PHM systems for power transformers. By integrating the attention mechanisms of the deep learning transformer architecture with structured power transformer PHM data, the system gains the ability to efficiently analyze diverse data sources, including textual descriptions, sensor data, and discrete sinusoidal waveforms. The advanced natural language processing techniques of the transformer architecture effectively interpret transformer-related data and extract valuable insights for detected fault classification and remaining useful life estimation. This study also evaluates this solution against traditional machine learning techniques such as polynomial regression and support vector classification, showcasing the potential advantages that the transformer architecture provides in improving power transformer condition monitoring and contributing to proactive and efficient maintenance strategies.
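The attention mechanism at the heart of the transformer architecture can be sketched as scaled dot-product attention: each query attends to all keys, and the values are combined according to the resulting similarity weights. This is a generic, self-contained illustration of the mechanism itself, not the project's PHM model or its input encodings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row-wise max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.
    Each output row is a weighted mix of value rows, with weights given
    by how strongly each query matches each key."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # query-key similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights
```

It is this ability to weight heterogeneous inputs by learned relevance that the abstract leverages to fuse textual descriptions, sensor readings, and waveform features into a single representation.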

InETech Intern: Carter Nichols

InETech Intern Home Institution: Florida Atlantic University

InETech Intern: Elisa Weinberg

InETech Intern Home Institution: Florida Atlantic University

Intern Advisor: Yufei Tang, Ph.D.

Project: Hardware-in-the-Loop Simulation of Ocean Current Turbines for Grid Integration


Increased interest in renewable energy production has created demand for novel methods of electricity generation. With high potential for low-cost power generation in locations otherwise isolated from the grid, marine hydrokinetic turbines could help meet this growing power demand. The large potential for electricity generation from ocean currents has gained attention as a viable renewable energy source that can be harnessed by ocean current turbines (OCTs). In this project, a 3-HP OCT dynamometer is customized and hardware-in-the-loop power (HILP) simulations are executed in the OPAL-RT environment to observe the real-time performance of the dynamometer.

First, the turbine mechanical torque output is evaluated under realistic conditions simulated using Acoustic Doppler Current Profiler (ADCP) readings collected in the Gulf Stream. Both the collected ADCP data and a constant simulated flow speed of 1.5 m/s are modified to create a realistic environment by accounting for significant wave height, water turbulence intensity, and the depth of the rotor center. In the second case, the OCT system undergoes a water flow speed change at the turbine input: a step change from 1.5 m/s to 2.5 m/s, with power generation profiles recorded from the dynamometer. The results demonstrate the power stability of the dynamometer, and this research contributes to demonstrating the effectiveness of OPAL-RT for HILP emulation of marine energy.
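The step change from 1.5 m/s to 2.5 m/s is a demanding test because hydrokinetic power scales with the cube of flow speed, P = ½ρACₚv³. The sketch below illustrates that scaling; the seawater density, rotor radius, and power coefficient are illustrative assumptions, not the actual parameters of the 3-HP dynamometer.

```python
import math

def turbine_power(v, rho=1025.0, radius=0.5, cp=0.4):
    """Ideal hydrokinetic turbine power: P = 0.5 * rho * A * Cp * v^3.
    rho (seawater density, kg/m^3), rotor radius (m), and power
    coefficient Cp are illustrative values for this sketch."""
    area = math.pi * radius ** 2          # swept rotor area
    return 0.5 * rho * area * cp * v ** 3  # power in watts

# The 1.5 -> 2.5 m/s step multiplies power by (2.5/1.5)^3, roughly 4.6x,
# regardless of the specific rho, area, or Cp chosen above.
ratio = turbine_power(2.5) / turbine_power(1.5)
```

This cubic sensitivity is why observing a stable power profile across the step is meaningful evidence of the dynamometer's controllability.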