
How Eka Robotics Is Building Robots That Feel Their Way Through Tasks

Eka Robotics has developed a force-aware system that combines vision and touch feedback to help robots handle diverse manipulation tasks without task-specific programming. The company was founded by alumni of MIT and Google DeepMind.

Martin Holloway · Published 2w ago · 6 min read · Based on 4 sources

Eka Robotics has developed a new system that lets robots use both sight and touch to handle a wide variety of objects and tasks without being programmed for each one. The Cambridge, Massachusetts company, founded by MIT professor Pulkit Agrawal and former Google DeepMind researcher Tuomas Haarnoja, is treating force feedback — the physical pressure and resistance a robot feels — as central to how robots understand and interact with the world.

The team includes researchers from MIT, Berkeley, Harvard, CMU, and other top institutions, as well as former engineers from DeepMind and Boston Dynamics. Their approach relies on training robots to learn from their own experience: they collect large amounts of real-world robot data, use machine learning to find patterns, and test what they learn in simulation before moving to real hardware.

Why Force Matters More Than You Might Think

Most robot arms today control movement by tracking a target position — imagine a crane trying to place objects in exact spots. They have basic collision detection but don't really "feel" what they're handling. Eka's system inverts this. It treats the forces the robot senses — the push-back from an object, the resistance of a thread, the weight shifting in a gripper — as the primary way the robot understands what's happening.
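The position-versus-force distinction can be sketched in a few lines. The admittance-style controller below is a hypothetical illustration, not Eka's actual control stack; the gains, stiffness, and compliance values are invented for the example. A position-only controller drives straight to its target even if an object is in the way, while a force-aware one yields when it feels push-back.

```python
# Sketch: pure position control vs. a force-aware (admittance-style) step.
# All names and numbers are illustrative assumptions, not Eka's system.

def position_step(current, target, gain=0.5):
    # Pure position control: move toward the target, ignoring contact forces.
    return current + gain * (target - current)

def admittance_step(current, target, measured_force, gain=0.5, compliance=0.002):
    # Force-aware control: the sensed force pushes the commanded position
    # back, so the arm yields instead of pressing harder into the object.
    yielded_target = target - compliance * measured_force
    return current + gain * (yielded_target - current)

def contact_force(pos, obstacle=1.0, stiffness=500.0):
    # A rigid object at position 1.0 resists like a stiff spring.
    return stiffness * max(0.0, pos - obstacle)

pos_rigid = pos_soft = 0.9
for _ in range(50):
    pos_rigid = position_step(pos_rigid, target=1.1)
    pos_soft = admittance_step(pos_soft, target=1.1,
                               measured_force=contact_force(pos_soft))

print(f"position-only controller presses to {pos_rigid:.3f}")
print(f"force-aware controller settles near {pos_soft:.3f}")
```

The position-only controller ends up 0.1 past the obstacle's surface, oblivious to the resistance; the force-aware one backs off and settles just barely into contact.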

This addresses a real problem. Most robot arms on the market today cannot screw in a light bulb, a task that requires feeling the thread catch and adjusting pressure in real time. Eka's robots can handle tasks ranging from sorting chicken nuggets to screwing in light bulbs, according to company demonstrations.

Agrawal frames it this way: "Forces are the language of the physical world." In other words, instead of relying on vision alone, robots should learn to "speak" through the feedback they feel when interacting with objects.

Learning by Doing, Not by Programming

Rather than manually programming rules for how much force to apply to different objects or how to adjust grip pressure, Eka's robots learn these things through trial and interaction. The system combines large datasets of real robotic actions with machine learning that lets robots discover effective strategies on their own.

Before being deployed on physical hardware, the robots train in simulated environments. This reduces the need for endless expensive real-world experiments and avoids damaging equipment during learning.
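The sim-first workflow can be illustrated with a toy learner. The snippet below is a hedged sketch, not Eka's training pipeline: a "simulator" rewards grip forces near an unknown sweet spot, and the learner improves a single parameter by trial and error before any real hardware would be touched.

```python
import random

def simulate_grasp(grip_force, sweet_spot=4.0):
    # Toy simulator: too little force drops the object, too much crushes it.
    # Reward peaks at the (unknown to the learner) sweet spot.
    return -(grip_force - sweet_spot) ** 2

def train_in_sim(trials=2000, seed=0):
    # Simple hill-climbing: perturb the current best grip force and keep
    # any change the simulator scores higher. Cheap and safe, because every
    # failed grasp happens in simulation rather than on real equipment.
    rng = random.Random(seed)
    best_force, best_reward = 0.0, simulate_grasp(0.0)
    for _ in range(trials):
        candidate = best_force + rng.uniform(-0.5, 0.5)
        reward = simulate_grasp(candidate)
        if reward > best_reward:
            best_force, best_reward = candidate, reward
    return best_force

learned = train_in_sim()
print(f"learned grip force: {learned:.2f} (sweet spot is 4.0)")
```

Real systems replace the one-parameter search with reinforcement learning over high-dimensional policies, but the economics are the same: thousands of failures cost nothing in simulation.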

The software also aims to learn rules that apply broadly. Instead of programming a robot separately for each new object or environment, the system looks for patterns that generalize — manipulation strategies that work across different scenarios without retraining from scratch.

What This Could Mean in Practice

The force-aware approach does address a genuine headache in factory automation. Over decades covering manufacturing, I've consistently heard from plant managers and engineers that deploying industrial robots requires months of customization for each new product line or component type. If general-purpose robots could adapt to new objects and tasks with minimal setup, that would translate to real operational savings.

The broader pattern here echoes what we saw when deep learning first transformed computer vision. Systems trained on large image datasets suddenly could recognize objects in new photos without humans having to hand-craft features for each scenario. The question now is whether the same principle works for physical manipulation, where safety and reliability have higher stakes than they do for image recognition.

Real Challenges Remain

The robotics industry has struggled with generalization for a long time. Today's industrial robots work beautifully in tightly controlled factories with known objects and predictable variations. The challenge for general-purpose systems is the sheer variety: thousands of object shapes and textures, different environments, different task requirements. That complexity explodes quickly.

Adding force sensing makes the technical problem harder in both the physical hardware and the software. Force sensors can pick up noise and drift over time, require calibration, and demand real-time processing that challenges even modern computers — especially when combined with vision analysis and motion planning.
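Two of those chores, removing a calibration offset and smoothing sample noise, are easy to sketch. The values below are invented for illustration; real force/torque sensors involve much more (temperature drift, per-axis cross-talk, dynamic loads).

```python
# Sketch of routine force-sensing housekeeping: bias calibration plus
# exponential-moving-average smoothing. Numbers are made up for the example.

def calibrate_bias(idle_samples):
    # Average readings taken with no load to estimate the sensor's offset.
    return sum(idle_samples) / len(idle_samples)

def smooth(samples, bias, alpha=0.2):
    # EMA filter: each new reading nudges the estimate by a fraction alpha,
    # damping the high-frequency noise in raw sensor output.
    estimate = samples[0] - bias
    out = []
    for s in samples:
        estimate = (1 - alpha) * estimate + alpha * (s - bias)
        out.append(estimate)
    return out

idle = [0.11, 0.09, 0.10, 0.12, 0.08]        # no-load readings (newtons)
bias = calibrate_bias(idle)                   # estimated constant offset
noisy = [5.3, 4.8, 5.1, 5.2, 4.9, 5.0, 5.1]   # readings under a ~5 N load
print(f"bias: {bias:.2f} N, smoothed last reading: {smooth(noisy, bias)[-1]:.2f} N")
```

The trade-off is latency: heavier smoothing (smaller alpha) means cleaner readings but a slower response, which matters when the controller must react to a thread catching within milliseconds.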

The timing is worth noting. Interest in AI-powered robotics has surged following breakthroughs in large language models and systems that combine multiple types of information (text, images, sound). Whether those lessons will transfer to robotic control is still an open question.

It's important to separate what Eka has shown from what it claims. The company speaks about pushing robots "beyond human limits" and making them accessible to "everyone," which are aspirational goals. The technical work — learning to manipulate objects through force feedback — appears promising based on demonstrations. But the jump from controlled tests to superhuman performance and widespread availability involves challenges beyond the core technology itself.

Wider Implications

If Eka's approach proves reliable in real-world, uncontrolled settings, it could influence how the entire robotics industry develops. Manufacturing plants, warehouses, and service businesses all depend on manipulation tasks that could benefit from robots that adapt to new objects without extensive reprogramming.

The founding team's strong academic connections suggest the company will maintain both research depth and commercial focus. Universities often provide access to cutting-edge work and student talent, which can speed development.

For the broader robotics field, Eka is a test case: can an AI-first approach overcome the traditional obstacles that have slowed robotic deployment in the real world — reliability, safety, and cost-effectiveness? The emphasis on force as a primary sense is a different bet from that of most other teams, which have focused heavily on vision.

The real proof will come when Eka moves beyond carefully controlled demonstrations in a lab. History shows that this transition from research prototype to reliable commercial product in general-purpose robotics has been consistently hard. That doesn't mean it's impossible; it just means the company's progress in rough, uncontrolled environments will be the genuine test.