Integrating robots and facility sensors cuts development time
Cogniteam, an NVIDIA partner and the creator of Nimbus, a platform used globally to develop, deploy, manage, and remotely update robots, has announced a new capability: developing and managing a robot together with its IoT environment on a single platform. The company says this not only cuts time to prototype by 60%, but also yields more reliable Internet of Robotic Things (IoRT) devices, built from Nimbus' library of proven, field-deployed software components.
Sectors that increasingly rely on autonomous robots, such as healthcare facilities, hotels, and other sensitive environments, must contend with operational challenges caused by external factors: a person stepping into a robot's path, or an unexpectedly high-traffic area. To address this, software and hardware engineers build communication protocols and standards that both IoT and IoRT devices can understand, but not on the same platform. Developing on independent platforms invites errors from misaligned software versions or unforeseen interactions between the separately built systems.
While this may be an inconvenience for many, it could be catastrophic for autonomous robots operating in mission-critical roles. In response, Cogniteam has extended its Nimbus Robotic Operating System to manage the full lifecycle of standalone IoRT devices, IoRT fleets, and IoT devices. The offered software components will now include packages for global IoT leaders, along with their communication protocols, all in the drag-and-drop format that Nimbus is known for.
“Successful robot deployments demand a single development platform for both the machine and facility integrations,” said Dr. Yehuda Elmaliah, Co-Founder & CEO of Cogniteam.
“Bringing both robotic and environmental IoT device development under one roof opens exciting new opportunities for beginner through advanced developers. We are seeing large-scale adoption in medical labs and other sensitive facilities.” Programmed correctly, facility sensors can work alongside the robot, detecting the autonomous device as it approaches and clearing a path for it to reach its destination.
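To make the sensor-and-robot coordination concrete, here is a minimal sketch of the pattern described above: a facility sensor and a robot exchanging messages over a shared broker so the robot can reroute around a busy zone. All names here (`Bus`, `Robot`, the `facility/occupancy` topic) are hypothetical illustrations, not the Nimbus API; a real deployment would use a broker such as MQTT or ROS 2 topics.

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish/subscribe bus standing in for an
    IoT/IoRT message broker (e.g. MQTT in a real deployment)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subs[topic]:
            handler(payload)

class Robot:
    """Hypothetical robot that reroutes when a facility sensor
    reports that a zone on its planned route is busy."""
    def __init__(self, bus, route):
        self.route = list(route)
        bus.subscribe("facility/occupancy", self.on_occupancy)

    def on_occupancy(self, payload):
        # Drop a crowded zone from the plan and detour around it.
        if payload["busy"] and payload["zone"] in self.route:
            self.route.remove(payload["zone"])

bus = Bus()
robot = Robot(bus, route=["lobby", "corridor-b", "lab-3"])

# A people-counter sensor reports that corridor-b is crowded:
bus.publish("facility/occupancy", {"zone": "corridor-b", "busy": True})
print(robot.route)  # → ['lobby', 'lab-3']
```

The point of a single platform is that both sides of this exchange, the sensor's publisher and the robot's subscriber, are developed and tested against the same message definitions rather than on separate toolchains.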
Supporting the robotics revolution, Cogniteam has advocated for greater robot autonomy, helping lead the Human Robotics Interaction Consortium. “This is a major step in integrating robots into shared human spaces,” said Dr. Eliahu Khalastchi, research scientist at Cogniteam. “We’re training robots to read social cues and act in a more predictable and natural manner when on the street or in a facility, completing their missions while naturally interacting with humans.”