MIT robot sorts trash and recycling material by simply touching it

Researchers from MIT and Yale University have devised a way for robots to pick out paper, plastic, and metal as a means of sorting recycling.

You might presume a trash-sorting robot would rely on computer vision to tell different materials apart. Machine vision is indeed the primary technology behind AMP Robotics' system, used at a Denver, Colorado recycling facility, as well as TrashBot and Oscar, AI-powered sorting trash cans sold for use in homes and commercial offices.

But the RoCycle system instead relies solely on sensors and soft robotics, identifying and sorting paper, plastic, and metal through touch alone.

MIT professor Daniela Rus said in a statement that "computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance."

Amid the broader push for sustainability, durable versions of previously disposable items are increasingly common and visually indistinguishable from their disposable counterparts, such as plasticware that looks metallic.

Researchers next plan to incorporate a camera and computer vision in conjunction with RoCycle’s sense of touch to improve its accuracy.

RoCycle can be attached to any robotic arm. Its gripping appendage is made from an auxetic material, one that grows wider when pulled on. The auxetics also allow the robotic hand to conform to an object's surface and to be formed into twisted strands.

A strain sensor first estimates an object's size, and pressure sensors measure how much force is needed to grasp it. Together, these readings are used to determine what kind of material the robotic arm has picked up.
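The size-plus-pressure idea can be illustrated with a toy classifier. To be clear, the thresholds, units, and the "stiffness" heuristic below are illustrative assumptions for this sketch, not the method from the MIT/Yale paper:

```python
# Hypothetical sketch of touch-only material classification.
# Threshold values and the stiffness heuristic are invented for
# illustration; they do not come from the RoCycle research.

def classify_material(size_cm: float, grip_pressure_kpa: float) -> str:
    """Guess a material class from a strain-based size estimate and the
    grip pressure needed to hold the object without it slipping."""
    stiffness = grip_pressure_kpa / size_cm  # crude rigidity proxy
    if stiffness > 8.0:
        return "metal"    # rigid: barely deforms under grip pressure
    elif stiffness > 2.0:
        return "plastic"  # semi-rigid
    else:
        return "paper"    # compliant: deforms easily

# Illustrative readings for three same-sized objects:
print(classify_material(6.0, 60.0))  # rigid can      -> "metal"
print(classify_material(6.0, 24.0))  # plastic bottle -> "plastic"
print(classify_material(6.0, 6.0))   # paper cup      -> "paper"
```

The point of the heuristic is that rigidity, not appearance, separates the three classes: a metal can and a metallic-looking plastic cup may be visually identical but feel very different under pressure.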

The recycling robot's accuracy leaves much to be desired

Relying on a purely optical object-sorting process introduces inaccuracy because material type is not a visual property, but a tactile one, researchers note in the paper “Automated Recycling Separation Enabled by Soft Robotic Material Classification.”

Unfortunately, we haven't yet reached the stage where a recycling bot can take over the recycling business. RoCycle is 85% accurate when objects are stationary, but only 63% accurate on a simulated conveyor belt.

This also doesn’t factor in the complexities of sorting recycling in real life. What if, for example, you were to put your soft drink cans back into a cardboard box? While there are refinements underway, the next big leap is likely to involve a planned combination of the touch system with camera-based computer vision.

If the technology matures, the impact could be significant. While it could represent another instance of automation displacing jobs, it could also free waste workers for safer, more pleasant tasks. It might also reduce costs for cities and, crucially, reduce the amount of recycling that winds up in landfills.

The research, supported by Amazon, JD, the Toyota Research Institute, and the National Science Foundation (NSF), will be presented later this month at the IEEE International Conference on Soft Robotics in Seoul, South Korea.

Proper sorting and disposal of trash and recycling has become an even bigger priority since 2017, when China declared it would no longer accept imports of plastic recyclables.