Mission

The mission of the Department of Computer Engineering and Sciences is to prepare computing, engineering, and systems students for success and leadership in the conception, design, management, implementation, and operation of solutions to complex engineering problems, and to expand knowledge and understanding of computing and engineering through research, scholarship, and service.

Electrical and Computer Engineering

Microgravity Simulator



Team Leader(s)
Alexander Montano

Team Member(s)
Aruna Dookeran, Elias Orellana, Aiden Smart

Faculty Advisor
Dr. Andrew G. Palmer

Secondary Faculty Advisor
Dr. Edward L. Caraway



Project Summary
This project involves the design and development of an automated two-axis microgravity simulator (clinostat) for biological research. The system uses continuous dual-axis rotation to simulate microgravity conditions on Earth. It integrates mechanical, electrical, and software components, including stepper motors, slip rings, LED lighting, and a Raspberry Pi-controlled touchscreen interface.


Project Objective
The objective of this project is to design and build an automated microgravity simulator capable of continuous dual-axis rotation. The system aims to provide consistent lighting, real-time user control, and reduced manual intervention through an integrated touchscreen interface.

Manufacturing Design Methods
The system was designed using Fusion 360 and fabricated primarily through 3D printing. A dual-axis frame was developed to support continuous rotation using two NEMA 17 stepper motors. Slip rings were incorporated to allow power transfer during rotation, while electrical components such as the motor driver HAT, MOSFET, and power supplies were integrated into a compact enclosure. A Raspberry Pi was used for system control and user interaction.
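As a rough illustration of the motor control described above (not the team's actual firmware), the step-pulse timing for the NEMA 17 steppers can be derived from the target rotation rate. The function and parameter names below are hypothetical; a NEMA 17 typically has 200 full steps per revolution.

```python
def step_interval(rpm: float, steps_per_rev: int = 200, microstepping: int = 1) -> float:
    """Seconds to wait between step pulses for a target rotation speed.

    A NEMA 17 stepper typically has 200 full steps per revolution
    (1.8 degrees per step); microstepping multiplies that count.
    """
    if rpm <= 0:
        raise ValueError("rpm must be positive")
    steps_per_second = (rpm / 60.0) * steps_per_rev * microstepping
    return 1.0 / steps_per_second

# Clinostats rotate slowly; at 2 RPM with 1/16 microstepping,
# pulses are about 9.4 ms apart.
```

A control loop on the Raspberry Pi would sleep for this interval between pulses sent to the motor driver HAT; the actual driver interface is omitted so the sketch stays hardware-independent.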

Specification
Dual-axis rotation system
Two NEMA 17 stepper motors
24V LED lighting with PWM control
Raspberry Pi-based control system
Touchscreen interface for user interaction
Slip rings for continuous rotation
Separate 12V (motors) and 24V (LEDs) power systems
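The 24V LED channel is dimmed via PWM through a MOSFET driven by the Raspberry Pi. A minimal sketch of mapping a touchscreen brightness setting to a PWM duty cycle (names are hypothetical, and the GPIO calls are omitted so the sketch stays hardware-independent):

```python
def brightness_to_duty(percent: float, min_duty: float = 0.0, max_duty: float = 1.0) -> float:
    """Map a user-facing brightness percentage to a PWM duty cycle in [0, 1].

    Clamps out-of-range input so a touchscreen slider can never drive
    the MOSFET gate outside its valid duty range.
    """
    percent = max(0.0, min(100.0, percent))
    return min_duty + (percent / 100.0) * (max_duty - min_duty)
```

On the Pi itself, the returned duty cycle would be passed to a PWM output (for example, a gpiozero `PWMLED`-style object) each time the slider changes.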

Analysis
The system successfully integrates mechanical rotation, electrical control, and user interface design into a single platform. The use of slip rings allows continuous operation without wire interference, while the touchscreen interface improves usability by enabling real-time adjustments. Initial testing demonstrates stable LED control and system responsiveness, with ongoing work focused on motor performance and full system integration.

Future Works
Future work includes completing system integration, performing extended testing, and optimizing performance for long-duration operation. Additional improvements may include enhanced automation features, improved thermal management, and further refinement of the user interface.

Other Information
This project provides an accessible and cost-effective platform for simulating microgravity conditions on Earth. The system is designed to support continued development and future enhancements for expanded research applications.

Acknowledgement
The team would like to thank Dr. Palmer for his guidance and project direction, as well as Dr. Caraway and TA Elis for their support throughout the senior design process. Additional thanks to Florida Tech and the OEC lab for providing resources for fabrication and testing.




CALI - Cobot Autonomous Living Interface



Team Leader(s)
Nicholas Santamaria

Team Member(s)
Heber Lopez, Berke Dogan

Faculty Advisor
Dr. Edward L. Caraway




Project Summary
The CALI (Computer-Assisted Living Interface) system is a low-cost, assistive robotic feeding solution designed to improve independence for individuals with motor impairments. Built on the AR4 6-DOF robotic arm platform, the system integrates computer vision, depth sensing, and custom control software to detect food items, track user position, and autonomously guide a utensil for feeding. A neural network processes visual data in real time, enabling accurate food localization, while a depth camera provides spatial awareness for safe and precise motion. The system is controlled through a custom user interface and implemented using ROS 2 (Robot Operating System 2) for modular, scalable operation. By combining accessible hardware with intelligent software, CALI demonstrates a practical and affordable approach to assistive robotics, with strong potential for further development in healthcare and at-home support applications.


Project Objective
The objective of the CALI project is to design and develop a low-cost, assistive robotic feeding system that enables individuals with motor impairments to eat independently and safely. This system aims to integrate computer vision, depth sensing, and robotic manipulation to detect food, track user position, and execute precise, autonomous movements using a 6-DOF robotic arm. A key goal is to create a modular and scalable platform controlled through a custom user interface and built on ROS 2, allowing for real-time operation and future expansion. Ultimately, the project seeks to demonstrate that advanced assistive technology can be both accessible and effective, providing a practical foundation for further development in healthcare and at-home support environments.

Manufacturing Design Methods
The CALI system was developed using an iterative design process that combined mechanical fabrication, electronics integration, and software testing to create a functional assistive feeding prototype. The project was built around the AR4 robotic arm platform, with custom-designed components modeled in CAD and fabricated primarily through 3D printing to support rapid prototyping, low cost, and easy modification. Multiple utensil-mount concepts and support components were designed, tested, and refined to improve fit, usability, and overall appearance, including custom housings and attachments tailored to the system’s needs. On the electrical and control side, the design integrated depth-sensing hardware, embedded control components, and a custom interface to connect sensing, processing, and robotic motion into a single system. Throughout development, subsystems were repeatedly evaluated and adjusted based on testing results, allowing the team to improve reliability, functionality, and manufacturability while maintaining a modular design approach.
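The depth camera's role in food localization can be illustrated with the standard pinhole back-projection that turns a detected pixel plus its depth reading into a 3-D point in the camera frame. This is a generic sketch, not the team's code, and the intrinsics in the example are made-up values.

```python
def deproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Back-project pixel (u, v) with depth (meters) to a camera-frame XYZ point.

    Uses the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    The resulting point can be handed to the arm's motion planner as a target.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example with hypothetical intrinsics for a 640x480 depth stream:
# a detection at the image center, 0.5 m away, maps to (0.0, 0.0, 0.5).
```

In a ROS 2 pipeline like CALI's, a perception node would publish such points for the planning node, keeping sensing and actuation modular.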

Specification
The CALI system is designed as a 6-degree-of-freedom (6-DOF) assistive robotic platform built on the AR4 robotic arm, capable of precise and repeatable motion suitable for feeding tasks. The system operates using stepper motor-driven joints with encoder feedback for improved positioning accuracy, achieving sub-centimeter end-effector precision within its working envelope. A depth-sensing camera provides real-time spatial data, enabling accurate food localization and user tracking within a typical operating range of approximately 0.2 to 1.0 meters. The control architecture is implemented using ROS 2, allowing modular communication between perception, planning, and actuation nodes. The vision system utilizes a trained neural network for object detection, running on a standard computing platform capable of real-time inference. Custom 3D-printed PLA components are used for the utensil mount and protective enclosures, ensuring lightweight and cost-effective fabrication. The system is powered by standard DC power supplies and interfaces with a custom-built user interface for manual control, calibration, and automated operation modes.

Analysis
The CALI system demonstrates that a low-cost, modular assistive robotic platform can effectively integrate computer vision, depth sensing, and robotic control to perform feeding tasks with reasonable accuracy and reliability. By leveraging ROS 2, the system achieves real-time communication between perception and actuation, validating the feasibility of combining modern software frameworks with accessible hardware like the AR4 arm. While testing showed strong performance in detecting food and guiding motion, limitations such as sensitivity to lighting conditions, system latency, and the inherent constraints of a non-human-centered robotic platform highlight areas for improvement. Overall, the project confirms the potential for affordable assistive robotics while identifying key opportunities for enhancing robustness, safety, and user adaptability.

Future Works
Future work for the CALI system will focus on improving reliability, safety, and user adaptability to move closer to real-world deployment. Enhancements to the computer vision pipeline, including more robust neural network training and expanded datasets, will improve accuracy across varying lighting conditions and a wider range of food types. Reducing system latency through optimized processing and more efficient communication within ROS 2 will enable smoother and more responsive motion. Mechanical upgrades to the AR4 platform, such as improved safety features, softer end-effectors, and more ergonomic utensil designs, will make the system better suited for direct human interaction. Additional developments may include user-specific calibration profiles, voice or gesture-based controls, and expanded autonomy for tasks beyond feeding. Ultimately, future iterations aim to refine the system into a more robust, intuitive, and clinically viable assistive technology.


Acknowledgement
The team would like to thank the Machine Learning Team (Aruna Dookeran, Michael Yanke, Kari Voelstad Bogen, Levent Kahveci), as well as Dr. Caraway and TA Elis for their support throughout the senior design process.




Computer Science and Software Engineering

FITARNA



Team Leader(s)
Jacob Hall-Burns

Team Member(s)
Vincenzo Barager, Dathan Dixon, Jacob Hall-Burns, Ethan Wadley

Faculty Advisor
Eraldo Ribeiro

Secondary Faculty Advisor
Philip Chan



Project Summary
FIT AR Navigation App (FITARNA) is an indoor augmented reality navigation system designed to help students and visitors navigate complex campus buildings such as the Evans Library. The project uses AR-based spatial mapping, indoor localization, and real-time route visualization to provide an intuitive wayfinding experience where traditional GPS systems fail. By combining Unity, Vuforia Area Targets, AR Foundation, and custom pathfinding, the system overlays directional guidance and navigation markers directly onto the real-world environment.

Problem Statement
Large academic buildings can be difficult for new students and visitors to navigate. Existing outdoor navigation systems, such as GPS, do not work reliably indoors due to signal attenuation and limited floor-level accuracy. As a result, users often struggle to find specific rooms, study areas, help desks, or library wings efficiently.

Project Objective
The objective of FITARNA is to create an indoor AR wayfinding application that provides accurate, real-time navigation without GPS. The system aims to achieve high-precision indoor localization, display intuitive 3D guidance overlays, and allow users to search for points of interest within a campus building.

Manufacturing Design Methods
The project was developed by first scanning the Evans Library using Vuforia Area Targets to build a high-fidelity digital representation of the environment. Unity's AR Foundation was integrated with the Vuforia Engine to support augmented reality features and robust area recognition. A custom NavMesh was implemented in Unity for shortest-path route calculation, and a spatial UI was designed to project AR markers and breadcrumb-style directional indicators into the user's physical surroundings.

Specification
The system is designed to support indoor navigation in complex academic environments with sub-meter localization accuracy. It includes searchable campus points of interest such as library wings, study rooms, and help desks. The software stack includes Unity, AR Foundation, Vuforia Engine, ARCore/ARKit support, and a custom Unity NavMesh-based routing system.

Analysis
The project demonstrates that augmented reality can improve indoor wayfinding by bridging digital maps and physical spaces more naturally than conventional navigation tools. Using spatial mapping and area targets allows the system to maintain persistent tracking and deliver more precise indoor guidance than GPS. The custom route calculation and 3D overlays create a more intuitive experience for users navigating complex buildings.

Future Works
Future work could include expanding the system to additional campus buildings, improving route adaptability in changing indoor conditions, enhancing the searchable point-of-interest database, and refining localization accuracy and user interface responsiveness. Additional features such as accessibility-aware routing or multi-floor route optimization could also strengthen the system.

Acknowledgement
This project was completed by Vincenzo Barager, Dathan Dixon, Jacob Hall-Burns, and Ethan Wadley, with faculty advising from Eraldo Ribeiro in the Department of Computer Science at Florida Institute of Technology.

Other Information
FITARNA focuses on solving indoor navigation challenges in environments where GPS is unreliable. Its main innovation is the use of AR-based spatial understanding and real-time visual guidance to create a seamless indoor wayfinding experience for campus users.
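The shortest-path routing FITARNA performs on its Unity NavMesh can be approximated, outside Unity, as a graph search over walkable waypoints. This Python sketch uses Dijkstra's algorithm over a hypothetical adjacency-dict graph; it is an analogy for the NavMesh path query, not the project's C# code.

```python
import heapq

def shortest_path(graph: dict, start: str, goal: str) -> list:
    """Dijkstra's algorithm over a {node: {neighbor: distance}} graph.

    Returns the list of waypoints from start to goal, analogous to
    the corner points a NavMesh path query yields; empty if unreachable.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    if goal not in dist:
        return []
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

In the app, the resulting waypoint chain is what gets rendered as breadcrumb-style AR markers along the corridor.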












Panther Shuttle App



Team Leader(s)
Joseph Hilte

Team Member(s)
Joseph Hilte, Tony Arrington, Jonathan Suo, Chase Monigle

Faculty Advisor
Khaled Slhoub

Secondary Faculty Advisor
Philip Chan



Project Summary
We developed a mobile Android application to improve the on-campus shuttle experience for students, drivers, and managers. The application supports three user roles. Students can view the live shuttle location, check the daily shuttle schedule, receive driver notifications, and save favorite stops and times. Drivers can view the route, see the next scheduled stop, estimate how many students may be waiting at a stop based on favorite-stop data, and send notifications to students. Managers can add, edit, and remove shuttle stops on the map and create or update the shuttle schedule for each day of the week. These updates are shared with the student and driver sides of the app through Firebase so that all users see the most current route information. The goal of the project is to provide a more organized shuttle system and encourage greater student use of campus transportation.


Project Objective
The objective of this project was to create an Android-based shuttle application that provides real-time and schedule-based information for students while also giving drivers and managers tools to manage communication, stops, and routes. The app is intended to improve convenience, reduce uncertainty, and make campus shuttle transportation more efficient and easier to use.

Manufacturing Design Methods
The application was designed and developed using Android Studio with Kotlin for the front-end and Firebase for backend services such as authentication, Firestore database storage, and live data synchronization. Google Maps was integrated to display shuttle location and stop markers visually. The system was divided into three main interfaces based on user role: student, driver, and manager. A modular design approach was used so that scheduling, stops, and notifications could be managed centrally and reflected across all user views. Testing was performed throughout development to verify navigation, Firebase connectivity, schedule updates, and notification behavior.
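The driver-side estimate of how many students may be waiting can be sketched as a count of favorite-stop records matching the next stop within a time window. The record layout and names below are hypothetical, not the app's actual Firestore schema, and the real app is written in Kotlin; Python is used here only for illustration.

```python
from datetime import time

def estimate_waiting(favorites: list, stop_id: str,
                     arrival: time, window_min: int = 15) -> int:
    """Count students whose saved favorite matches this stop near the arrival time.

    `favorites` is a list of dicts like
    {"student": ..., "stop_id": ..., "time": datetime.time}.
    """
    arrival_minutes = arrival.hour * 60 + arrival.minute
    count = 0
    for fav in favorites:
        if fav["stop_id"] != stop_id:
            continue
        fav_minutes = fav["time"].hour * 60 + fav["time"].minute
        if abs(fav_minutes - arrival_minutes) <= window_min:
            count += 1
    return count
```

In the app, the favorites collection would be read from Cloud Firestore and the count shown on the driver's next-stop view.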

Specification
Platform: Android mobile devices
Programming Language: Kotlin/XML
Development Environment: Android Studio
Backend Services: Firebase Authentication and Cloud Firestore
Mapping Service: Google Maps API
User Roles: Student, Driver, Manager
Student Features: live map, daily schedule, favorite stops, notifications, estimated student count at next stop
Driver Features: live location sharing, route view, next-stop view, stop-based notifications
Manager Features: add/edit/delete stops, update daily schedules, manage route data across the app

Analysis
The project demonstrates how a simple mobile system can improve communication and visibility in the campus shuttle environment. By allowing managers to control the official stop locations and schedule, the app reduces inconsistencies across users. Firebase integration enables real-time updates so that schedule and stop changes are reflected without manually updating each device. The use of favorite-stop data also adds a predictive element by estimating student demand at upcoming stops. Overall, the design supports better decision-making for drivers and more reliable information for students.

Future Works
Future improvements could include automatic background notifications even when the app is fully closed, more advanced delay prediction using live shuttle movement, manager analytics dashboards for stop demand trends, and stronger role-based security to limit certain actions to approved driver or manager accounts only. Additional features such as route history, accessibility settings, and support for multiple shuttle routes could also be added in future versions. We also hope to make the application available on iPhone.


Acknowledgement
We would like to acknowledge our instructor, advisor, and all others who provided feedback and support during the design and development of this project. We also acknowledge the use of Android Studio, Firebase, and Google Maps as essential tools that made this project possible.




Wallee.



Team Leader(s)
Emma Bahr

Team Member(s)
Emma Bahr, Kyle Gibson, Joshua Cajuste, and Matteo Caruso

Faculty Advisor
Dr. Siddartha Bhattacharyya

Secondary Faculty Advisor
Dr. Phillip Chan



Project Summary
We developed a mobile cross-platform personal finance application called Wallee to help users better understand and manage their money in one place. The app connects securely to users’ bank accounts through the Plaid banking API to automatically import and update transactions in real time. Users can view an overview of their finances on a home dashboard, explore spending breakdowns and trends, and track recent transactions categorized automatically. The application also includes a dynamic budgeting system that adjusts based on spending behavior, along with savings goals that allow users to set targets and monitor their progress over time. In addition, Wallee includes an AI-powered chat assistant (Wallo) that provides personalized financial insights, answers questions about spending patterns, and helps users make informed budgeting decisions. The goal of the project is to simplify personal financial management and give users clearer, more actionable insight into their financial health.


Project Objective
The objective of Wallee is to address the limitations of existing personal finance tools by providing an adaptive, intelligent, and user-centered financial management system specifically designed for variable-income users. The application aims to replace static budgeting models with an automated, paycheck-aware system that continuously recalibrates budgets based on real-time income changes. It also seeks to improve the reliability of financial guidance by using a two-layer AI architecture that combines generative AI with verified financial logic to ensure that all recommendations are consistent with the user’s actual financial data. In addition, Wallee introduces a dynamic financial health scoring system to encourage better financial habits through continuous feedback and engagement. Finally, the project focuses on delivering a clean, cognitively accessible interface that transforms complex financial data into clear, actionable insights, ultimately bridging the gap between raw transaction data and meaningful financial decision-making.
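The dynamic financial health scoring mentioned above could, in spirit, look like the toy function below. The formula is invented purely for illustration; Wallee's actual scoring model is not described in this summary.

```python
def health_score(income: float, spending: float, savings_rate_target: float = 0.20) -> int:
    """Toy financial-health score on a 0-100 scale.

    Scores how close the period's savings rate comes to a target rate.
    The weighting here is invented for illustration and is not
    Wallee's actual scoring model.
    """
    if income <= 0:
        return 0
    savings_rate = max(0.0, (income - spending) / income)
    return round(min(savings_rate / savings_rate_target, 1.0) * 100)
```

A score recomputed after every transaction sync gives the kind of continuous feedback loop the objective describes.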

Manufacturing Design Methods
The manufacturing and design methods for Wallee follow an iterative, user-centered development approach focused on modular architecture, secure data handling, and scalable system integration. The system is designed using a layered architecture consisting of a Flutter-based cross-platform frontend, a Node.js (NestJS) backend for core application logic, and in-app advanced analytics and financial computation. Secure financial data integration is achieved through the Plaid API, which enables real-time transaction syncing via webhooks and structured ingestion into a Supabase database. On the design side, the application follows a component-driven UI approach in Flutter, emphasizing clarity, minimal cognitive load, and accessibility for variable-income users. Key features such as the dashboard, budgeting system, goals tracker, and transaction views are developed as reusable modules to ensure consistency and maintainability. The financial logic layer is separated from the UI to ensure that budgeting recalculations, income detection, and health scoring remain accurate and independently testable. The AI system is implemented as a two-layer model: Wallee Zero, which handles deterministic financial calculations and rule-based validation, and Wallo, which serves as the user-facing conversational interface that retrieves only verified insights from the underlying logic layer. Development follows an agile workflow with continuous testing and refinement based on usability feedback, ensuring that both financial accuracy and user experience are maintained throughout the system.
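A paycheck-aware budget recalibration of the kind described above might be sketched as proportional scaling of category budgets when detected income changes. Proportional scaling is an assumption for illustration, and the category names are hypothetical.

```python
def recalibrate(budgets: dict, old_income: float, new_income: float) -> dict:
    """Scale category budgets proportionally when income changes.

    Keeps each category's share of income constant, so a user whose
    paycheck drops sees every budget envelope shrink by the same ratio.
    """
    if old_income <= 0:
        raise ValueError("old_income must be positive")
    ratio = new_income / old_income
    return {cat: round(amount * ratio, 2) for cat, amount in budgets.items()}
```

In the layered design above, this kind of deterministic rule would live in the Wallee Zero logic layer, with Wallo only reporting its verified output.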



Future Works
Future work for Wallee will focus on expanding predictive and automation capabilities to make the system more proactive and personalized for users. This includes adding forecasting features that estimate future income, spending, and savings based on historical financial patterns, as well as improving the AI system to support more advanced financial guidance such as tax estimation, debt management strategies, and long-term planning while maintaining verified, logic-based outputs. Additional improvements include smarter automation for detecting recurring bills, managing subscriptions, and refining transaction categorization over time through adaptive learning. The platform could also be expanded to support more financial institutions beyond the current integration, along with enhanced gamification of financial health scoring to increase user engagement. Finally, future development will focus on improving scalability, performance, and customization options to support a growing user base and provide a more tailored financial management experience.


Acknowledgement
We would like to acknowledge the support and contributions of everyone who helped make the Wallee project possible. We extend our gratitude to our faculty advisor for their guidance, feedback, and encouragement throughout the development process, as well as for providing valuable insight into system design and implementation. We also thank the developers and maintainers of the technologies used in this project, including Flutter, Node.js, Supabase, and the Plaid banking API, which were essential in building a secure and scalable financial platform. Finally, we appreciate the support from peers and reviewers who provided feedback during testing and helped improve the usability, functionality, and overall design of the application.