SMART Manufacturing Testbed Development
Students: Tyler Toner, Abigail Rafter, Kavan Shah
Smart Manufacturing aims to improve the robustness of a manufacturing system to disruptions in resource availability and product demand through enhanced agility and flexibility with minimal human intervention. Our testbed supports such research efforts by facilitating both large-scale data aggregation and system-level coordination with highly heterogeneous assets. At its core, the testbed is enabled by a commercial-grade central controller capable of low-latency communication with traditional manufacturing machines.
An interface with the Robot Operating System (ROS) allows the use of research-grade planning and control algorithms for each robot, as well as integration of distributed vision sensing throughout the testbed. All machines are configured for safe human interaction. Human-robot collaboration is enabled by modern safety monitoring rather than legacy solutions like fencing. This system is designed for efficient integration of digital twins to monitor and predict the behavior of the system or manufactured parts in order to make intelligent control decisions.
Figures: Testbed Assets; Testbed Networking
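As an illustration of the ROS interface described above, the sketch below shows a minimal ROS 1 (rospy) node that relays part poses published by a distributed vision sensor to a robot planner's goal topic. The node name, topic names, and message type are hypothetical examples, not the testbed's actual configuration.

```python
# Minimal sketch (hypothetical node/topic names): a ROS 1 node that relays
# object poses from a distributed vision sensor to a robot's planner.
import rospy
from geometry_msgs.msg import PoseStamped

def on_part_pose(msg):
    # Forward the detected part pose to the planner's goal topic.
    goal_pub.publish(msg)

rospy.init_node("vision_to_planner_bridge")
goal_pub = rospy.Publisher("/robot1/grasp_goal", PoseStamped, queue_size=1)
rospy.Subscriber("/camera1/part_pose", PoseStamped, on_part_pose)
rospy.spin()
```

In the same way, additional bridge nodes can expose data from traditional manufacturing machines on ROS topics so that research-grade planners and digital twins can consume them.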
Pose Estimation for Robust Manipulation
Students: Hojun Lee, Tyler Toner
Reliable object pose estimation is a prerequisite for robust robotic manipulation in a smart manufacturing setting. This project develops a multi-sensor aided deep pose tracking framework that provides robust, continuously updated object pose estimates for manipulation tasks [1].
[1] Lee, H., Toner, T., Tilbury, D.M., and Barton, K., 2022. "Multi-sensor aided deep pose tracking." IFAC-PapersOnLine, 55(37), pp. 326-332.
Figure: Multi-Sensor Deep Pose Tracking Framework
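As a generic illustration of multi-sensor fusion, and not the specific deep tracking method of [1], the sketch below combines position estimates from two sensors by inverse-covariance weighting; the sensor labels and covariance values are made up for the example.

```python
# Generic multi-sensor fusion sketch (not the specific method of [1]):
# combine per-sensor 3D position estimates by inverse-covariance weighting.
import numpy as np

def fuse_positions(estimates, covariances):
    """Fuse N position estimates (3-vectors) with their 3x3 covariances."""
    info = sum(np.linalg.inv(C) for C in covariances)               # total information
    weighted = sum(np.linalg.inv(C) @ p for p, C in zip(estimates, covariances))
    fused_cov = np.linalg.inv(info)
    return fused_cov @ weighted, fused_cov

# Example: two sensors report slightly different part positions (values invented).
p_cam = np.array([0.52, 0.10, 0.30]); C_cam = np.diag([1e-4, 1e-4, 4e-4])
p_depth = np.array([0.53, 0.11, 0.29]); C_depth = np.diag([4e-4, 4e-4, 1e-4])
p_fused, C_fused = fuse_positions([p_cam, p_depth], [C_cam, C_depth])
print(p_fused)
```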
Iterative Feedback Tuning for Probabilistically Safe Mobile Manipulation with Temporary Sensing
Student: Tyler Toner
Deployed sensing in a smart manufacturing system would enable robots to cope rapidly with task variations. However, most new tasks assigned to an industrial manipulator are repetitive in nature, even if they are periodically updated. In this work, we consider a mobile sensor that provides temporary vision feedback to a mobile manipulator that has no vision of its own. The sensor supplies the pose of the part to be grasped as well as a raw point cloud of nearby obstacles. A controller is designed to drive the manipulator to the part while avoiding obstacles. To tune the controller for a particular scenario, we develop an iterative feedback tuning approach that halts once a probabilistic safety measure, computed from an uncertainty model of the mobile manipulator's base pose, is satisfied.
[1] Toner, T., Tilbury, D.M., and Barton, K., 2022. "Probabilistically Safe Mobile Manipulation in an Unmodeled Environment with Automated Feedback Tuning." In 2022 American Control Conference (ACC), pp. 1214-1221. IEEE.
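The sketch below illustrates the overall tuning idea on a toy 2D point robot; it is a simplified stand-in, not the controller or iterative feedback tuning update of [1]. A candidate gain is adjusted until a Monte Carlo estimate of the safety probability, taken over sampled base-pose errors, exceeds a threshold.

```python
# Simplified sketch of the tuning loop (toy 2D point-robot stand-in, not the
# paper's controller or exact IFT update): increase an obstacle-repulsion gain
# until a Monte Carlo estimate of the safety probability, computed over sampled
# base-pose errors, exceeds a threshold.
import numpy as np

rng = np.random.default_rng(0)
GOAL = np.array([1.0, 0.0])                      # part location in the nominal base frame
OBSTACLE, RADIUS = np.array([0.5, 0.05]), 0.15   # circular obstacle and required clearance

def rollout(base_error, k_att, k_rep, steps=200, dt=0.05):
    """Simulate the toy controller from a perturbed start; return True if clearance holds."""
    x = base_error.copy()             # start offset by the sampled base-pose error
    for _ in range(steps):
        away = x - OBSTACLE
        dist = np.linalg.norm(away)
        if dist < RADIUS:             # clearance violated -> unsafe rollout
            return False
        u = k_att * (GOAL - x) + k_rep * away / dist**3   # attractive + repulsive field
        x = x + dt * u
    return True

def estimate_safety(k_att, k_rep, sigma=0.05, n=500):
    """Probability of a safe rollout under Gaussian base-pose uncertainty."""
    errors = rng.normal(scale=sigma, size=(n, 2))
    return np.mean([rollout(e, k_att, k_rep) for e in errors])

# Iterative tuning: raise the repulsion gain until the safety estimate clears 0.95.
k_att, k_rep = 1.0, 0.001
while (p_safe := estimate_safety(k_att, k_rep)) < 0.95:
    k_rep *= 2.0
print(f"tuned k_rep = {k_rep:.4f}, estimated safety = {p_safe:.3f}")
```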