Autonomous Transports Provide Safe, Contactless Delivery of Essentials

With object detection, collision avoidance, and mapping technology, intelligent robotic transport solutions are being put to work, providing efficient, contactless delivery services for daily essentials, such as food to hospital workers treating COVID-19 patients.

Image credit: Pudu Technology

While autonomous cars and flying taxis are still in early testing, smaller self-driving machines with navigation capabilities are already being put to work. Drones and robots are stepping in to provide automated, contactless transport and to perform repetitive tasks in many markets, including retail and healthcare. And we may be seeing even more of them soon: Amazon’s new Prime Air drone delivery fleet got the green light from the Federal Aviation Administration (FAA) in late August, opening the door for the company to expand unmanned package delivery.

Autonomous robotic solutions use IoT, cameras, and sensors to provide several key capabilities, including object detection, collision avoidance, path planning, and simultaneous localization and mapping (SLAM). Their functionality ranges from sterilization and cleaning to routine deliveries, which have become increasingly important during the pandemic for safe transport of food, medications, mail, and other essentials. A minimal sketch of one of these capabilities, path planning, appears below.
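To make the path-planning idea concrete, here is a minimal, illustrative sketch: a breadth-first search over a 2D occupancy grid, the kind of map a SLAM pipeline produces. This is not any vendor's actual algorithm, just the general technique under simple assumptions (a 4-connected grid where 0 means free space and 1 means obstacle).

```python
# Illustrative only: breadth-first path planning on an occupancy grid.
from collections import deque

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None.
    grid: 2D list where 0 = free space and 1 = obstacle."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk back through predecessors to reconstruct the path.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                frontier.append((nr, nc))
    return None  # goal unreachable

# Example: a small map with one wall; the planner routes around it.
occupancy = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
print(plan_path(occupancy, (0, 0), (2, 0)))
```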

Contactless Food Delivery

One company that has developed a highly automated food delivery robot is Pudu Technology, a developer and system integrator based in Shenzhen, China. Its PuduBot delivery robots have a touchscreen that allows a food expeditor to assign each of the robot's four trays to a specific table or destination.

Once a table is selected, the robot navigates to it and tells the person receiving the delivery which trays hold their items, for example, that their plates are on the third and fourth trays, so they can take them directly. Once the items are removed, a button press on the touchscreen sends the robot on to its next stop or back home. A rough sketch of this workflow follows.
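The sketch below models the tray-to-table assignments and the resulting sequence of stops. The class and method names are hypothetical, written only to illustrate the workflow described above, and are not drawn from Pudu's software.

```python
# Hypothetical sketch of the tray-assignment workflow; names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DeliveryRun:
    # Maps tray number (1-4) to a destination, e.g. {3: "Table 12"}.
    tray_assignments: dict = field(default_factory=dict)

    def assign(self, tray: int, table: str) -> None:
        if not 1 <= tray <= 4:
            raise ValueError("the robot carries four trays")
        self.tray_assignments[tray] = table

    def stops(self):
        """Yield each destination with the trays the recipient should unload."""
        by_table = {}
        for tray, table in sorted(self.tray_assignments.items()):
            by_table.setdefault(table, []).append(tray)
        for table, trays in by_table.items():
            yield table, trays

run = DeliveryRun()
run.assign(3, "Table 12")
run.assign(4, "Table 12")
for table, trays in run.stops():
    print(f"Navigating to {table}: please take items from trays {trays}")
# A button press on the touchscreen would then advance to the next stop.
```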

The robots navigate using a proprietary PuduSLAM algorithm that fuses data from multiple sensors: lidar, an upward-facing vision camera, and two Intel® RealSense™ Depth Cameras mounted low on the robot. The depth cameras are crucial for avoiding collisions and adapting to a changing environment.
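PuduSLAM itself is proprietary, but the general idea of using depth data for collision avoidance can be sketched simply: threshold the central region of a depth frame and stop (or replan) if enough pixels report an object closer than a safe distance. The function and parameter names below are assumptions for illustration, not Pudu's implementation.

```python
# Illustrative only: flag nearby obstacles from a depth frame.
import numpy as np

def obstacle_ahead(depth_m: np.ndarray, stop_distance: float = 0.5,
                   min_pixels: int = 200) -> bool:
    """depth_m: HxW array of depths in meters (0 = no reading).
    Returns True if enough pixels in the central band are closer than
    stop_distance, i.e. something is blocking the robot's path."""
    h, w = depth_m.shape
    band = depth_m[h // 3: 2 * h // 3, w // 4: 3 * w // 4]  # central region
    valid = band > 0                                         # ignore dropouts
    close = np.logical_and(valid, band < stop_distance)
    return int(close.sum()) >= min_pixels

# Synthetic frame: mostly 2 m away, with a small patch at 0.4 m.
frame = np.full((480, 640), 2.0)
frame[220:250, 300:330] = 0.4
print(obstacle_ahead(frame))  # True -> slow down or replan
```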

Delivery robot

Image credit: Pudu Technology

The Pudu robots have been deployed in a number of hospitals and hotels in China where patients are quarantined due to COVID-19. Having the robots deliver food and other supplies to quarantined patients and their care providers limits human-to-human contact, which both reduces the potential spread of the disease and spares the people who would normally perform that task from repeated high-risk encounters.

IoT Sensors for Navigation Intelligence

Another automated delivery solution launched late last year at the University of California San Diego (UCSD). This one is designed not for food but for another routine daily essential: the US mail. Engineers and computer scientists teamed up to put self-driving carts on campus roads, delivering mail across a campus that is home to more than 65,000 students, staff, and faculty.

A few months after the service began, the campus was largely closed to students because of the COVID-19 pandemic; the engineering team dispersed and the carts were parked. Fortunately, the team was able to use the pause to rapidly evolve the project, making significant improvements to its algorithms and self-driving technologies. The carts will be ready to roll again when the university begins its Return to Learn initiative this fall.

In the launch phase, the team ran its self-driving delivery carts along an hour-long mail route from the university's main parking structure to the campus mail center. The university uses Polaris Ranger carts equipped with multiple IoT sensors from Illinois-based AutonomouStuff, as well as dozens of cameras, and the carts operate and navigate autonomously. The navigation intelligence was built with an open source software package combined with algorithms created by the project's leader, Henrik Christensen, director of the Contextual Robotics Institute at UCSD.

The algorithms embedded in the software allow the carts to share the road with cars and people. The current algorithms use data collected by the researchers to predict the direction and speed of pedestrians over the next five seconds, as well as when adjacent vehicles are likely to stop moving.
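As a simplified illustration of that kind of short-horizon prediction, the sketch below extrapolates a pedestrian's recent track at constant velocity over a five-second window. The UCSD team's actual models are not published here, so treat this purely as a conceptual example with assumed function and parameter names.

```python
# Illustrative only: constant-velocity prediction over a five-second horizon.
import numpy as np

def predict_track(positions: np.ndarray, dt: float, horizon_s: float = 5.0):
    """positions: Nx2 array of recent (x, y) observations sampled every dt
    seconds. Returns predicted (x, y) points at each dt step up to horizon_s."""
    velocity = (positions[-1] - positions[0]) / (dt * (len(positions) - 1))
    steps = int(horizon_s / dt)
    times = dt * np.arange(1, steps + 1)
    return positions[-1] + np.outer(times, velocity)

# Pedestrian observed walking roughly along +x at ~1.2 m/s, sampled at 10 Hz.
track = np.array([[0.00, 0.00], [0.12, 0.01], [0.24, 0.02], [0.36, 0.02]])
future = predict_track(track, dt=0.1)
print(future[-1])  # estimated position five seconds from now
```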

According to Christensen, the larger purpose of this project is to figure out how to best utilize automation for last mile logistics and contactless deliveries on city streets, where robotic and self-driving machinery has to share space with vehicles, bikes, and pedestrians.

Because of the project’s early success, the long-term goal is for the self-driving carts to make 11 more stops, picking up mail and packages in each department. In an upcoming revision, the team intends to equip the mail delivery carts with radar technology along with wireless vehicle-to-vehicle communications.
