Warehouse automation projects that blend voice- and vision-based picking with robotics are on the rise as systems integrators and technology developers seek ways to help customers maximize their labor resources and speed productivity on the warehouse floor. Tying these technologies together can deliver the ultimate in efficiency, experts say: Robots handle heavy-lifting tasks such as conveyance through the building, and pickers become faster and more accurate thanks to voice- and vision-based wearables with data-capture capabilities. In this strategy, human workers rely on tools such as smart glasses, ring scanners, wrist-mounted computers, and wireless headsets to direct their movements.
Eric Harty, senior director of strategic initiatives at supply chain tech firm Zebra Technologies, says more warehouse operators are looking for ways to connect these technologies in order to optimize workflows throughout their facilities.
“This [form of] integration has been fairly steady,” he explains. “What you see now is more people adopting robotics in general—and that generally triggers some level of modification to improve current workflows.”
More than 80% of warehouse managers agree they will rely more on automation in the future—especially in the form of autonomous mobile robots (AMRs) for picking and materials movement, according to Zebra’s most recent Warehousing Vision Study. Integrating other technologies that can streamline increasingly complex workflows goes hand-in-hand with those efforts, Harty adds.
Zebra has been accelerating its focus on integration since acquiring AMR developer Fetch Robotics in 2021. Harty points to the company’s cloud-based FetchCore software system as a case in point. The fleet and workflow management solution allows warehouse operators to integrate AMRs with scanners, tablets, and other mobile technologies. The software connects the data-capture devices and the AMRs, allowing operators to program workflows that blend the two technologies.
“The scanners and mobile computers are used to trigger workflows,” Harty explains. “For example, a worker picks a pallet, scans the bar code or clicks on a screen, and that signals the mobile robot to pick up the [pallet].”
He uses a recent customer application to illustrate the point: A distribution customer now uses Fetch AMRs to automate the delivery of pallet orders to its shipping department, a process previously done manually with forklifts. A worker builds the order on the pallet, drops the completed pallet at the pallet transfer station, and scans it into the company’s warehouse management system (WMS). That scan signals an AMR to take over material transport: the robot travels to the pallet transfer station, picks up the pallet, and delivers it to the designated shipping lane.
“They used to have people doing [the conveyance] and dropping it off,” Harty explains. “[The integration] frees up those people to do more picking.”
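The scan-to-dispatch handoff Harty describes can be sketched as a simple event handler. This is a minimal illustration, not the actual FetchCore API: the `AMRFleet` class, `ScanEvent` fields, and method names here are all hypothetical placeholders for whatever the fleet-management software exposes.

```python
from dataclasses import dataclass

@dataclass
class ScanEvent:
    pallet_id: str
    station: str        # where the completed pallet was dropped
    shipping_lane: str  # destination assigned by the WMS

class AMRFleet:
    """Stand-in for a fleet manager such as FetchCore (hypothetical API)."""
    def __init__(self):
        self.task_queue = []

    def dispatch_transport(self, pickup: str, dropoff: str, pallet_id: str):
        # Enqueue a conveyance task for the next available robot.
        task = {"pallet": pallet_id, "pickup": pickup, "dropoff": dropoff}
        self.task_queue.append(task)
        return task

def on_pallet_scanned(event: ScanEvent, fleet: AMRFleet):
    """The bar-code scan is the trigger: once the pallet is recorded,
    conveyance is handed over to a mobile robot."""
    return fleet.dispatch_transport(
        pickup=event.station,
        dropoff=event.shipping_lane,
        pallet_id=event.pallet_id,
    )

fleet = AMRFleet()
task = on_pallet_scanned(ScanEvent("P-1001", "transfer-1", "lane-4"), fleet)
```

The point of the pattern is that the worker's existing data-capture step doubles as the robot's work order, so no extra dispatching step is needed.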
The project is improving productivity, although Harty says he can’t yet share results because it’s still in the testing phase. He adds that Zebra is seeing increased interest in similar solutions—especially those that blend voice-directed picking with robotics.
“We’re seeing more interest, and we’re working with partners to build that out,” he explains.
Leaders at systems integrator Numina Group are seeing growing demand for mixing voice-activated picking and robotic solutions as well, and company president Dan Hanrahan says they’ve been making steady progress on those innovations over the past few years. The goal is to improve operations by freeing workers to focus on the value-added tasks associated with order picking while automating the conveyance function with robots.
“We’ve looked at the AMRs, in a broad sense, as an automation component, much like a typical systems integrator would do with a conveyor system,” Hanrahan explains. “We ask, what advantages can AMRs [offer] in moving materials more efficiently?”
And then they connect the dots. Numina Group uses its proprietary warehouse execution system (WES) to tie the AMRs to the customer’s WMS or enterprise resource planning (ERP) system for case or pallet picking, integrating wearable technology for piece picking. In a typical solution, the WES connects to the customer’s WMS or ERP, gleans the day’s order information, and then dispatches work orders to AMRs, which pick up a designated pallet or case and bring it to a predetermined pick zone. Pickers—outfitted with wrist-mounted mobile computers and wireless headsets—meet the AMRs at the designated zone after receiving a voice command telling them where to go.
“Basically, we have the AMRs on a bus route—a variable bus route—based on pick stops in different zones in the warehouse,” Hanrahan explains. “And like when you are waiting for an Uber, we’re using voice commands telling the picker, ‘It’s time for you to meet the AMR at this location.’”
The worker then performs the necessary picking tasks at that location. When the AMR has finished its route, it picks up a batch of finished orders from its last stop and delivers them to the packing area.
“The goal is to get the people moving simultaneously with the AMRs so the AMRs are not waiting at stops,” Hanrahan adds. “We want them to work in a continuous-travel mode.”
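Hanrahan's "bus route" can be sketched as a small dispatch loop: group the day's pick lines by zone, visit only the zones with work, and prompt each zone's picker by voice so worker and robot arrive together. This is a simplified assumption of how such a WES might schedule work, not Numina's actual software; all function and field names are illustrative.

```python
from collections import defaultdict

def build_bus_route(order_lines):
    """Group the day's pick lines by warehouse zone -- each zone with
    work becomes a stop on the AMR's variable 'bus route'."""
    stops = defaultdict(list)
    for line in order_lines:
        stops[line["zone"]].append(line["sku"])
    # Visit zones in order so the robot travels continuously, skipping
    # zones with nothing to pick.
    return sorted(stops.items())

def dispatch(route, announce):
    """Prompt each zone's picker (by voice) to meet the AMR, so the
    robot is not left waiting at a stop."""
    for zone, skus in route:
        announce(f"Meet the AMR in zone {zone} to pick {len(skus)} item(s).")

route = build_bus_route([
    {"sku": "A-100", "zone": "Z2"},
    {"sku": "B-200", "zone": "Z1"},
    {"sku": "C-300", "zone": "Z2"},
])
# route == [("Z1", ["B-200"]), ("Z2", ["A-100", "C-300"])]
dispatch(route, announce=print)
```

The "Uber" analogy maps to the `announce` call: the voice prompt is timed so the picker is walking toward the stop while the AMR is in transit.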
Several customers are using the AMR/voice solution to optimize picking and scale for growth, including an online retailer based in Northern Illinois that Hanrahan says has reduced labor costs by more than 50%, and a paint supply company in Ohio that has cut forklift driving time by 30%, freeing drivers to do more picking tasks.
“Keep the [workers] within the picking area; have them perform the value tasks—that’s what we’re really being challenged with,” Hanrahan explains, emphasizing the wide variety of technologies that can make that happen. “There are a lot more choices available to [warehouse] operations today.”
Another choice for blended warehouse automation: pick-by-vision. In this process, pickers wear smart glasses—such as those developed by supply chain tech firm Picavi—that direct them through the picking process via a visual interface. Equipped with a bar-code scanner for data capture, the glasses allow for hands-free operation, speeding the picking process and improving accuracy. When used with goods-to-person robotics, vision systems can help optimize piece picking while also alleviating stress and strain on workers, developers say. Picavi and Japanese robotics developer SoftBank Robotics have launched a pilot project to illustrate those benefits at SoftBank’s innovation lab, an 11,000-square-foot demonstration facility in Ichikawa City, Japan.
The companies have paired Picavi’s smart glasses with an automated storage and retrieval system (AS/RS) from automation specialist AutoStore. Goods are stored in the AutoStore system, which combines product bins, robots, picking and putaway stations, a storage and retrieval “grid,” and a software-based controller to move inventory in and out of storage for automated fulfillment. Workers at the AutoStore picking stations use Picavi smart glasses for multi-order picking out of containers as well as for more complex multi-order put applications that would typically require a more expensive and less flexible pick-to-light system. Both systems connect directly to the facility’s WMS.
“The AutoStore [system] automatically provides the containers with the right goods. The employees receive all the information necessary to pick the right goods in the right quantity via the user interface of the smart glasses, and to acknowledge changes in the inventory on the software side,” Picavi said in a statement describing the pilot project. “The container is then picked up again by the AutoStore system and placed in storage.”
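The goods-to-person loop described in the statement can be outlined in a few lines: the AS/RS presents each bin, the glasses show what to pick, the worker's acknowledgment updates inventory, and the bin returns to storage. This is a deliberately simplified sketch under assumed names; neither Picavi nor AutoStore publishes this interface in the article.

```python
def goods_to_person_cycle(bin_queue, inventory, confirm_pick):
    """One goods-to-person cycle (hypothetical, simplified): for each
    presented bin, the smart glasses display the pick, the worker's
    confirmation acknowledges the inventory change on the software
    side, and the bin goes back into the grid."""
    for bin_ in bin_queue:
        sku, qty = bin_["sku"], bin_["pick_qty"]
        picked = confirm_pick(sku, qty)   # worker acknowledges via glasses
        if picked:
            inventory[sku] -= qty         # record the inventory change
        # the AS/RS then places the bin back into storage (not modeled)
    return inventory

inv = goods_to_person_cycle(
    [{"sku": "A-100", "pick_qty": 2}, {"sku": "B-200", "pick_qty": 1}],
    {"A-100": 10, "B-200": 5},
    confirm_pick=lambda sku, qty: True,
)
```

The hands-free confirmation step is what replaces the pick-to-light hardware: the acknowledgment happens on the glasses rather than on fixed displays at the station.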
The demo is yet another example of using goods-to-person robotics for the heavy lifting while fine-tuning the picking process with additional technology.
Companies of all shapes and sizes can develop similar projects—provided they focus on a particular task or workflow and remain open to a range of solutions, according to Zebra’s Harty.
“I suggest starting with a specific use case or workflow you want to automate,” he explains. “Think through what that looks like, what your current operation is, and work with [partners] who can map out that workflow and figure out the solution for you. Be specific, but also be flexible. What I’ve found in my personal experience … [is that those] that say they just want to work with robots don’t know [what they want to accomplish]. Those that say, ‘I want a robot to do XYZ’ will get a robot to do XYZ.”
Copyright ©2023. All Rights Reserved.