Eka Robotics Unveils Force-Aware VFA Model for General-Purpose Robot Manipulation
Eka Robotics has developed a Vision-Force-Action (VFA) model that combines visual perception with force feedback to enable general-purpose robot manipulation across objects, tasks, and environments. The Cambridge, Massachusetts-based company, co-founded by MIT professor Pulkit Agrawal and former Google DeepMind robotics researcher Tuomas Haarnoja, is building what they call "intelligence for the physical world in its native language: force."
The Eka Robotics team draws from alumni of MIT, Berkeley, Harvard, CMU, Boston University, University of Pennsylvania, DeepMind, Boston Dynamics, and Microsoft. Their technical approach centers on self-supervised learning, large-scale robotic data collection, reinforcement learning, and sim-to-real transfer technologies.
The Force-Centric Architecture
Traditional robot control systems treat force as a secondary consideration, typically relying on position-based control with basic collision detection. Eka's VFA model inverts this hierarchy by treating force as a primary sensory modality alongside vision. The architecture enables what the company describes as a single model that generalizes across diverse manipulation scenarios without task-specific programming.
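To make the "force as a first-class input" idea concrete, here is a minimal toy sketch of a vision-force-action policy in which a force-torque reading is embedded and fused with visual features *before* the action head, rather than being consulted only as a post-hoc safety check. Everything here is hypothetical (the dimensions, encoders, and function names are illustrative), not Eka's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Toy encoder: a single linear layer with tanh nonlinearity."""
    return np.tanh(W @ x)

# Hypothetical dimensions: a 64-dim visual feature vector, a 6-axis
# force-torque (wrench) reading, and a 7-DoF arm action.
W_vision = rng.normal(scale=0.1, size=(32, 64))
W_force  = rng.normal(scale=0.1, size=(32, 6))
W_action = rng.normal(scale=0.1, size=(7, 64))

def vfa_policy(image_features, wrench):
    """Fuse vision and force embeddings, then decode an action.

    Treating force as a primary modality means its embedding is
    concatenated with vision upstream of the action head, so contact
    information can shape the action itself.
    """
    z_v = encode(image_features, W_vision)
    z_f = encode(wrench, W_force)
    z = np.concatenate([z_v, z_f])   # 64-dim fused state
    return W_action @ z              # 7-dim joint-velocity action

action = vfa_policy(rng.normal(size=64), rng.normal(size=6))
print(action.shape)  # (7,)
```

The design point is the concatenation step: a vision-only policy would compute `W_action @ z_v` alone and have no channel through which contact forces could alter the chosen motion.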
The technical gap this addresses is substantial. Most robot arms on the market today cannot screw in a light bulb, a task requiring delicate force modulation and real-time adaptation to thread engagement. Eka's robots can perform tasks ranging from sorting chicken nuggets to screwing in light bulbs, according to company demonstrations.
The force-first approach reflects Agrawal's framing of the VFA model: "forces are the language of the physical world." This positions tactile feedback not as an auxiliary sensor input but as the fundamental communication protocol between robot and environment.
Self-Learning Through Physics Mastery
The company's robots master physics through self-learning rather than explicit programming of physical laws or task parameters. This represents a departure from traditional robotic control systems that require extensive manual tuning of force thresholds, compliance parameters, and motion primitives for each new task or object type.
Eka's approach leverages large-scale robotic data collection to train models that can infer appropriate force application, grip modulation, and contact dynamics from visual and tactile observations. The self-supervised learning framework allows robots to discover manipulation strategies through interaction rather than human demonstration or rule-based programming.
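The "discover strategies through interaction" idea can be illustrated with a deliberately tiny trial-and-error example: a robot that learns an appropriate grip force for an unknown object purely from grasp success or failure, with no hand-coded force threshold. The thresholds and reward here are invented for illustration; real systems learn from far richer observations.

```python
import random

rng = random.Random(0)

# Toy environment: an unknown object slips below 4 N of grip force
# and cracks above 8 N; the robot observes only success/failure.
def grasp_succeeds(force):
    return 4.0 <= force <= 8.0

forces = [2.0, 4.0, 6.0, 8.0, 10.0]
values = {f: 0.0 for f in forces}   # running success-rate estimates
counts = {f: 0 for f in forces}

for trial in range(500):
    # epsilon-greedy: mostly exploit the best-known force, sometimes explore
    if rng.random() < 0.1:
        f = rng.choice(forces)
    else:
        f = max(forces, key=lambda k: values[k])
    reward = 1.0 if grasp_succeeds(f) else 0.0
    counts[f] += 1
    values[f] += (reward - values[f]) / counts[f]  # incremental mean

best = max(forces, key=lambda k: values[k])
```

After 500 interactions the learner settles on a force inside the viable band, having never been told where that band is, which is the essence of the self-supervised framing.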
The sim-to-real pipeline enables initial training in simulation environments before deployment on physical hardware. This addresses the data-hunger problem in robotics, where physical training time is expensive and potentially damaging to equipment.
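A standard ingredient in sim-to-real pipelines is domain randomization: sampling physics and sensor parameters per training episode so the policy never overfits to one simulator configuration. The sketch below shows the idea with invented parameter ranges; it is a generic technique, not a description of Eka's specific pipeline.

```python
import random

def randomized_sim_params(rng):
    """Sample per-episode physics parameters (domain randomization).

    Training across a distribution of frictions, masses, and sensor
    imperfections in simulation encourages policies that transfer to
    real hardware whose true parameters fall inside that range.
    Ranges below are illustrative placeholders.
    """
    return {
        "friction":        rng.uniform(0.3, 1.2),
        "object_mass_kg":  rng.uniform(0.05, 2.0),
        "force_noise_std": rng.uniform(0.0, 0.5),   # N, sensor noise
        "latency_ms":      rng.uniform(0.0, 30.0),  # control delay
    }

rng = random.Random(0)
params = randomized_sim_params(rng)  # fresh draw each episode
```

Each simulated episode then runs under a different `params` draw, so the learned policy must succeed across the whole distribution rather than under a single calibrated physics model.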
Safety and Generalization Claims
Eka positions its robots as inherently safe and collaborative, designed to operate in unstructured environments without extensive safety barriers or workspace isolation. The force-aware control system theoretically enables more nuanced interaction with humans and fragile objects compared to position-controlled systems that can apply excessive force when encountering unexpected resistance.
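The safety contrast with position control can be made concrete with a one-dimensional impedance controller: commanded force grows with tracking error only up to a hard cap, so an unexpected obstacle sees a bounded contact force instead of a runaway correction. This is a textbook compliance scheme, shown here only to illustrate the general principle; the gains are arbitrary.

```python
import numpy as np

def impedance_step(x, v, x_des, stiffness=200.0, damping=30.0, f_max=15.0):
    """One step of a 1-D impedance controller with a force cap.

    A stiff position controller would keep increasing force with
    position error; clipping at f_max bounds the contact force when
    the arm meets unexpected resistance (e.g., a human hand).
    """
    f = stiffness * (x_des - x) - damping * v
    return float(np.clip(f, -f_max, f_max))

# A 0.5 m tracking error would demand 100 N under pure stiffness
# control; the cap limits the commanded force to 15 N.
print(impedance_step(x=0.0, v=0.0, x_des=0.5))  # 15.0
```

Position-controlled systems without such a compliance layer are the ones that "can apply excessive force when encountering unexpected resistance," as noted above.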
The generalization claims extend beyond task variety to environmental adaptability. The company states that its robots can operate across different environments without recalibration or retraining, suggesting the VFA model learns environment-invariant manipulation policies.
Looking at the broader context here, we have seen this pattern before when deep learning first enabled computer vision systems to generalize across image datasets without manual feature engineering. The key question for robotics is whether the same generalization principles that worked for perceptual tasks will transfer to the sensorimotor domain, where physical constraints and safety requirements are more stringent.
The force-centric approach does address a real limitation in current industrial robotics. During my years covering factory automation, I consistently heard from manufacturers that robot deployment required extensive customization for each new product line or component variant. General-purpose manipulation that adapts to new objects and tasks without reprogramming would represent a significant operational improvement.
Technical Challenges and Market Context
The robotics industry has long struggled with the generalization problem. Successful industrial robots typically excel in highly controlled environments with known objects and predictable variations. The challenge for general-purpose systems lies in handling the combinatorial explosion of object properties, environmental conditions, and task requirements.
Force sensing and control add complexity to both hardware and software systems. Force sensors introduce noise, drift, and calibration requirements that vision-only systems avoid. The real-time processing demands for force feedback control can stress computational resources, particularly when combined with vision processing and planning algorithms.
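Two of the problems named above, noise and drift, are commonly handled with smoothing and periodic re-zeroing. Below is a minimal sketch assuming an exponential moving average for noise and a tare step for slow bias drift; real force-torque pipelines do considerably more (temperature compensation, per-axis calibration matrices).

```python
def make_force_filter(alpha=0.2):
    """Exponential moving average for a noisy force signal.

    alpha trades responsiveness against smoothing; tare() re-zeroes
    the reading against the current raw value to compensate for slow
    sensor bias drift.
    """
    state = {"y": 0.0, "bias": 0.0}

    def tare(raw):
        state["bias"] = raw  # treat the current reading as zero force

    def filt(raw):
        x = raw - state["bias"]
        state["y"] = alpha * x + (1 - alpha) * state["y"]
        return state["y"]

    return filt, tare

filt, tare = make_force_filter()
tare(1.5)                   # sensor reads 1.5 N at rest: re-zero it
print(round(filt(1.5), 3))  # 0.0  (drift removed)
```

The cost hinted at in the text is visible here: smoothing adds lag (larger effective `alpha` windows delay contact detection), which is exactly the kind of real-time tension that force feedback control imposes on the software stack.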
The market timing coincides with renewed interest in AI-powered robotics following advances in foundation models and multimodal learning. Whether the lessons from large language models will transfer to robotic control remains an open technical question, but the capital and talent flowing into the space suggest sustained development effort.
Worth flagging: Eka's claims about pushing robots "beyond human limits" and making them accessible to "everyone" represent aspirational goals rather than demonstrated capabilities. The current evidence points to promising technical progress in force-aware manipulation, but the leap to superhuman performance and mass accessibility involves challenges beyond the core technical architecture.
The company's testing facilities in Cambridge position them within the broader Boston-area robotics ecosystem, which includes established players like Boston Dynamics and a dense network of academic research programs. This geographic concentration of robotics expertise often accelerates technology transfer and talent mobility between institutions and companies.
Industry Implications
The force-aware approach could influence broader robotics development if the generalization claims prove robust across real-world deployment scenarios. Manufacturing, logistics, and service robotics all involve manipulation tasks that would benefit from reduced programming overhead and improved adaptability to object and environmental variations.
The academic pedigree of the founding team and advisory network suggests sustained research investment alongside commercial development. University collaborations often provide access to cutting-edge research and graduate student talent that can accelerate technical progress.
For the robotics industry, Eka represents another test case for whether AI-first approaches can overcome the traditional challenges of robotic deployment: reliability, safety, and economic viability. The emphasis on force as a primary sensory modality offers a differentiated technical approach that could complement or compete with vision-heavy systems currently dominating development attention.
The ultimate measure will be deployment performance in uncontrolled environments where the physics of manipulation encounters the messy realities of real-world variation. The transition from controlled demonstrations to robust commercial applications has historically proven challenging for general-purpose robotics systems, making Eka's progress worth monitoring as they scale beyond laboratory conditions.