
THOUGHT LEADERS

I think … therefore I might be a material handling robot: interview with Ted Stinson

The days when robots were limited to a set of preprogrammed responses are over. Today’s AI-enabled bots are able to analyze, adapt, and even learn from each other, says Ted Stinson of AI software startup Covariant.

Ted Stinson

Artificial intelligence is set to explode across the distribution center, according to Ted Stinson, chief operating officer of Covariant, a Berkeley, California-based startup specializing in innovative software applications. We’ve heard similar predictions for years, but Stinson is confident the time is ripe—and he’s betting his career on it. Stinson left his job as partner with the venture capital firm Amplify Partners 18 months ago to join Covariant, which he had become acquainted with while helping the company develop its first business plan. After getting to know the Covariant team, he says, he decided it was time to chart a new career course and “be part of building an extraordinary company in an industry that will be at the forefront of the next wave of industrialization.”

He may be right. In its short life, Covariant has already made a splash in the supply chain industry, having forged impressive partnerships with Knapp and ABB to integrate its artificial intelligence (AI) software with their robotics systems.


But what exactly is artificial intelligence? How does it work and compare with the way humans think and act? DC Velocity Editorial Director David Maloney recently spoke with Stinson to learn more about this promising, yet complex, technology.

Q: How do you define artificial intelligence, or AI, as it applies to supply chain applications?

A: The concept of artificial intelligence is decades old. At one level, it represents the idea that software systems are capable of adapting and interacting in the way that you and I do as humans. AI has now evolved to the point where it is achieving that goal. It is based on something called deep learning, where for the first time, systems have the ability to consume and analyze extraordinary amounts of data using a new, “neural network” approach to artificial intelligence. That has allowed people to achieve artificial intelligence breakthroughs that simply weren’t possible prior to the last couple of years.
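
As a loose illustration of what “learning from data” means, the toy Python sketch below (purely illustrative, not drawn from Covariant’s software) trains a single artificial neuron to fit example data. The rule it ends up with is never written into the code; it emerges from the examples.

```python
# Toy sketch: a single artificial neuron adjusts its weights to fit
# example data, instead of being explicitly programmed with a rule.
def train_neuron(examples, lr=0.1, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in examples:
            pred = w * x + b      # the neuron's current guess
            err = pred - target
            w -= lr * err * x     # nudge the weight to shrink the error
            b -= lr * err         # nudge the bias the same way
    return w, b

# The data follows y = 2x + 1, but that rule appears nowhere in the code.
examples = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train_neuron(examples)
```

Real deep-learning systems stack millions of such units, but the principle is the same: behavior comes from data, not from hand-written rules.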

Q: You mention deep learning. And then there is reinforcement learning. Can you explain what they are?

A: The idea of deep learning and reinforcement learning is to learn from experience and to reinforce a behavior through failure or success. These techniques can also be applied to software, having it learn to adapt through trial and error. If you step back and think about that, that’s a huge contrast to the way software has worked historically, where every behavior, every interaction of the software had to be preprogrammed. You would have to specifically write into your code an action you wanted the software program or a robot to take. Now, the nature of these programs is that they enable the software or robot to take a try at something, and then the act of trying is analyzed and adapted to enable the software to learn from success and learn from failure.
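
To make the trial-and-error idea concrete, here is a toy Python sketch (illustrative only; it is not Covariant’s system): a simulated robot tries different grasp angles, records which ones succeed, and gradually reinforces the best one. The angles and success rates are invented for the example.

```python
import random

ACTIONS = [0, 45, 90, 135]          # hypothetical grasp angles (degrees)
values = {a: 0.0 for a in ACTIONS}  # estimated success rate per angle
counts = {a: 0 for a in ACTIONS}

def simulate_pick(angle):
    """Stand-in for the real world: 90 degrees succeeds most often."""
    return random.random() < {0: 0.2, 45: 0.5, 90: 0.9, 135: 0.3}[angle]

random.seed(0)
for trial in range(2000):
    # Explore occasionally; otherwise exploit the best-known action.
    if random.random() < 0.1:
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: values[x])
    reward = 1.0 if simulate_pick(a) else 0.0
    counts[a] += 1
    values[a] += (reward - values[a]) / counts[a]  # running-average update

best = max(ACTIONS, key=lambda x: values[x])
```

Nothing in the loop says which angle is right; the program discovers it by trying, failing, and reinforcing success, which is the contrast with preprogrammed behavior described above.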

Q: The artificial intelligence that you’re developing is going into robotics systems, such as picking robots. Why did you choose this area for your initial deployments?

A: The Covariant ambition is to build a universal AI. We call it the Covariant Brain. It is meant to be essentially the cognitive system for a robot—the “brain” that gives it the ability to see and reason and act on the world around it. We chose to focus on warehouses initially because the logistics market offers such a great opportunity to deploy these capabilities to automate jobs that are tedious and, thus, hard to fill. Things like order picking in an e-commerce or grocery or apparel warehouse are great examples. These jobs are clearly repetitive in nature, but at the same time, every pick has a degree of variability and change, which is what makes warehouse environments so challenging.

Q: What are some of the products robots have difficulty recognizing or handling?

A: A robot system essentially has cameras as its eyes. Seeing things in boxes that come in basic colors is relatively easy. They were among the first things that were “solvable,” but what’s beyond that? What about objects with subtle variations in their shapes, things that are flexible like apparel? A human can look inside a bin or tote containing different types of white T-shirts and pretty easily identify one white T-shirt from another. But to a robot, each T-shirt looks a little different when they are all folded and placed in the bin slightly differently. All those subtle variations represent a change—small, but fundamental.

Reflections from objects are also challenging. Even the way the objects get stacked and placed in the tote can present problems. A human can move around to get a better view of the objects in a box. But with a robot, getting a clearer view of what’s inside the box presents a fundamental challenge.

Q: So, we’ve talked about the recognition problems. What about the problems robots have in picking a variety of different objects?

A: Before the advent of modern AI, it simply was not possible for a robot to handle many different types of objects. Think of an apparel company, for example, that changes its products every season, so you have tens of thousands of items that change on a three- to four-month basis.

Beyond the changing product mix, there’s the challenge of gathering objects that are difficult to pick up. One of our partners, ABB, is one of the world’s largest robot manufacturers. When it set out to find a partner for AI about a year ago, it developed a series of 26 different tests that essentially mimic real-world conditions. One of the tests was picking rubber ducks. As you can imagine, there are relatively few ways that a typical robotic end effector can pick up a rubber duck successfully. So, the ability to understand the process—to learn from trying to pick up a rubber duck—is an example of something that is very simple for you and me to do, but extraordinarily hard for a robot, and it requires the ability to try, learn, and adapt its behavior.

Q: I would imagine too that apparel can be very difficult to pick, especially if you have a vacuum end effector that may not be able to grab cloth easily.

A: It is, to be sure. Picking apparel is one of the challenges we’ve put a lot of effort into solving. When you pick up a piece of clothing, it changes shape, so it’s different from, say, a box. In order to pick apparel successfully, the robot needs to understand that when it picks up an object, that object is going to change shape, and that the new shape, whatever it is, will then have a very material impact on how the robot completes its task.

Q: Just as you’d find with a claw machine at a game arcade, objects in a box may also be too close to the sides for the robot to grab them. In that case, would the robot have to figure out how to manipulate or move the object to gain better access?

A: Yes. A key aspect of how we approach our systems is that there is literally infinite variability in how objects can be sitting in a bin. So, you have to have a system that can both deal with all of the conceivable ways these objects can be positioned and adapt on the fly. We have developed a couple of different proprietary techniques. There are ways in which, for new scenes, new types of environments, the robot can adapt to that environment. It goes back to the core capability of trying, learning, and adapting its behavior until it succeeds.

Q: Could you address how a robot must also learn to adjust its approach based on the product, such as when picking a fragile item?

A: The example of how much pressure a robot should exert in picking up an object is something that could only be solved in a data-driven way. When it comes to the amount of pressure that could theoretically be applied, the possibilities are literally infinite. There is no way you could write a traditional software program that accounts for every possible level of pressure. You have to have a system that can learn and adapt in order to be able to assess the right amount of pressure to apply, just as for a human, there are an incredible number of instinctual judgments that have to be made in handling an object. It is through this deep-learning based approach that we are able to essentially mimic that sort of judgment and adaptability on the fly.
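
A hypothetical Python sketch of that idea: rather than hard-coding a pressure value, the routine below narrows in on a workable grip through trial feedback. The item model and pressure units are invented for illustration.

```python
def adapt_pressure(feedback, low=0.0, high=10.0, trials=20):
    """Home in on a workable grip pressure via trial and error.
    `feedback(p)` reports 'slip', 'crush', or 'ok' for pressure p."""
    for _ in range(trials):
        p = (low + high) / 2
        result = feedback(p)
        if result == "ok":
            return p
        elif result == "slip":   # grip too weak: raise the floor
            low = p
        else:                    # 'crush': grip too strong: lower the ceiling
            high = p
    return (low + high) / 2

# Imaginary fragile item that tolerates pressure between 2.0 and 3.5 units.
def fragile_item(p):
    if p < 2.0:
        return "slip"
    if p > 3.5:
        return "crush"
    return "ok"

p = adapt_pressure(fragile_item)
```

The safe range is never written into the gripping logic; the system infers it from outcomes, which is the data-driven judgment described above in miniature.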

Q: What goes into training the robots to handle products they haven’t encountered before? Do you have to physically manipulate a robot to train it?

A: This is actually part of the secret sauce, which is that the system does this on its own. It is able, through trial and error, to learn and adapt to things it has never seen before and do so without any human intervention or human manipulation.

The initial work of adapting the AI to a customer’s operation is built into the preliminary process that we and our partners go through with customers. The Covariant Brain essentially comes out of the box “smart,” so implementations can be surprisingly fast. Most of the time, it is quick enough that it has no material impact on an operation’s throughput.

In the very beginning, the system might encounter new things that take an extra pick or two before it learns and adapts. But it generalizes these learnings and adapts relatively quickly, so that before long, it is, for the most part, operating at the performance levels that you’re looking for in the system. That accelerated learning process is one of the key benefits unlocked by modern AI systems.

Q: So basically, when you deploy a system, you’re actually using the knowledge from other systems that have been deployed before it? The robots can draw on the experience of previous robots?

A: Yes. This is really an important concept in envisioning operations—the idea that each robot deployed after the first one learns from those around it. So collectively over time, they accelerate the learning amongst themselves. That is one of the key value propositions. So, you could have a decanting robot that is learning from the experiences of an order picking robot. That ability to share learnings is unlocked by this underlying modern AI deep-learning based approach.
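
One simplified way to picture that shared learning, in Python (the item types and strategies are made up for the example): each robot logs its pick outcomes locally, and the fleet pools those logs so a new robot starts from the combined experience.

```python
from collections import Counter

def merge_experience(logs):
    """Combine per-robot success counts into one shared table."""
    shared = Counter()
    for log in logs:
        shared.update(log)
    return shared

# Each robot's log: (item type, pick strategy) -> number of successes.
robot_a = Counter({("t-shirt", "pinch"): 40, ("t-shirt", "suction"): 5})
robot_b = Counter({("t-shirt", "pinch"): 35, ("box", "suction"): 60})

shared = merge_experience([robot_a, robot_b])

def best_strategy(shared, item):
    """A brand-new robot picks the strategy with the most fleet-wide successes."""
    candidates = {s: n for (i, s), n in shared.items() if i == item}
    return max(candidates, key=candidates.get)
```

In this sketch, a robot that has never touched a T-shirt still benefits from every T-shirt pick the rest of the fleet has attempted.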

Q: Is this learning done in real time?

A: As you find with people, learning happens at different levels. There is some learning that happens in the moment and other learning that happens upon what I call “reflection.” Both are aspects of how the systems improve over time.

Q: You have industry partnerships with Knapp and ABB. Could you talk about how you’re working with these companies to deploy AI robotic solutions?

A: Covariant is an AI company. Our expertise is in building software. We partner with companies that have strong domain knowledge and strong robotics capabilities. Knapp and ABB are examples of both. Knapp has been one of the pioneers in the robotics field. It was first to market with an order picking solution. It has a long history of investing to bring this capability to market, and it recognized the limitations of traditional software when it comes to solving robotic picking problems.

Knapp concluded that we had developed something that was essentially the key to unlocking the adaptability and learning capability of robotics systems used in material handling applications. We entered into a partnership last year and have now brought two different use cases to market, with several more on the horizon. Overall, I am really excited about the partnership.

Q: Could you elaborate a little on those deployments and the kinds of products being handled?

A: Obeta is a German hardware supplier, and we’re seeking to introduce robotic systems into its operations. The system that is deployed at Obeta is the first system that we and Knapp have publicly announced. What’s notable about this deployment is that it’s a system that has achieved autonomy, which we think is a really important notion.

At the end of the day, what you want as an operator is the ability to have a robotic system that performs at the same level as your traditional manual processes. No asterisks, no exceptions. It just works in the same way. Hopefully, over time, it actually works better. That for us has been the benchmark.

Q: You are obviously looking at use cases for these technologies. Are there particular applications you’re looking at?

A: Within materials handling, we’re just getting started. We and Knapp have brought to market the order picking solution. We have half a dozen or more use cases that we and Knapp and other partners are in the process of formalizing. Our goal is to show supply chain and material handling leaders that we will deliver a roadmap of use cases and stations providing substantial coverage across a modern warehouse operation. One thrust of what we are looking to do is expand within that environment.

Q: So, this might involve other types of robotic systems beyond the robotic arm?

A: Absolutely. We haven’t talked a lot about that publicly, but we tend to think of a warehouse itself as one big robot. The conveyance systems and various mechanical devices are all systems that ultimately can be optimized through AI in terms of throughput, density, and performance. Our vision is to be able to take and leverage the underlying models and the Covariant Brain to unlock those gains over time.

Q: Obviously, we’re going through some very unusual times right now with the Covid-19 crisis. I would guess that automation is top of mind for many people because of the need for social distancing and eliminating human touches. Has that changed how you go to market?

A: Our focus from a commercial perspective through Covid-19 is to be there with our partners and for our customers to help them figure out where to get started. The question I get from every supply chain leader I talk to is not “Should I deploy robotics,” but “How do I deploy it and where do I start?” Our energy has been trying to guide our prospective customers through the choices and develop a framework that lets them start in the right place and then develop a roadmap for where to go next.

Q: Have we reached the point where the technology has finally caught up with the promise of AI?

A: The technology is ready. It is here today, and at the same time, it is still at the first stage of the journey. But we’re certainly at the point where I would encourage folks to invest the time to understand the different offerings. Now is the time in the evolution of these technologies for supply chain leaders to be looking at them closely and figuring out how they might use them to enhance their operations. I really encourage folks to look past the marketing and the demos in controlled environments. Ultimately, we need to be solving the challenges in the real world. I am really optimistic but also want to encourage people to be smart shoppers as they go through and look at these different capabilities.
