Broadcom and CAMB.AI Move AI Translation On-Chip for Instant, Private Communication
On-device AI chips unlock instant, private, offline voice translation, dissolving language barriers and making global communication native.
November 11, 2025

A landmark collaboration is set to fundamentally reshape the landscape of digital communication, moving powerful artificial intelligence capabilities from the cloud directly into the hardware of consumer devices. Voice AI startup CAMB.AI and semiconductor giant Broadcom have partnered to embed CAMB.AI's sophisticated real-time translation models directly onto Broadcom's neural processing unit (NPU) chipsets.[1] This integration of advanced software with specialized silicon is engineered to bring seamless, instantaneous voice translation in over 150 languages to a new generation of devices, operating independently of an internet connection.[2][1] The move signals a pivotal shift toward on-device AI, promising to eliminate latency, bolster user privacy, and make multilingual communication a native, rather than a cloud-dependent, feature of our digital lives.
The partnership represents a strategic convergence of two highly specialized technological forces. CAMB.AI has established itself as a leader in AI-powered localization, developing proprietary models like MARS that are capable of not just translating, but also preserving the nuances, emotions, and distinct vocal characteristics of the original speaker.[3][4] This focus on hyper-realistic, context-aware translation has attracted major partners in the sports and entertainment industries, including IMAX and Major League Soccer, for applications like real-time multilingual dubbing of live events.[5][3] On the other side of the collaboration is Broadcom, a dominant force in the design of custom chips that power the world's leading technology products.[6] While competitors focus on general-purpose GPUs, Broadcom has carved out a critical niche by creating application-specific integrated circuits (ASICs) and other custom silicon for hyperscale companies like Google, Meta, and OpenAI.[7][8][9] These chips are tailored for specific AI tasks, offering superior energy efficiency and performance compared to more generalized hardware.[7] The new collaboration will see CAMB.AI's generative voice model run directly on Broadcom's purpose-built NPUs, a fusion of intelligent software and optimized hardware designed to bring complex AI tasks to the edge.
The decision to process AI translation directly on a chip marks a significant departure from the prevailing cloud-based model and unlocks a host of transformative benefits. The most immediate advantage is the elimination of latency.[1] Cloud-based translation requires sending audio data to a remote server for processing and then back to the device, a round trip that, while fast, introduces a noticeable delay that can disrupt the natural flow of conversation. By performing the translation on the device itself, the process becomes virtually instantaneous, enabling truly real-time communication.[10] Furthermore, this on-chip approach profoundly enhances privacy and security.[1] With sensitive voice data processed locally, it never has to be transmitted to an external server, mitigating concerns about data interception and misuse. This is a crucial selling point in an era of heightened consumer awareness around data privacy. The integration also enables robust offline functionality, allowing translation to work seamlessly in areas with poor or no internet connectivity, a game-changer for international travelers, emergency responders, and users in underserved regions.[11] Finally, it offers significant cost efficiencies by reducing the reliance on expensive and energy-intensive cloud data centers for both content providers and end-users.[1][12]
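The latency argument above can be made concrete with some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not measured values from either company: even if on-device inference were somewhat slower than a data-center GPU, removing the network round trip dominates the total.

```python
# Illustrative latency comparison between cloud-based and on-device
# translation. All millisecond figures are hypothetical assumptions
# chosen purely to show the arithmetic.

def cloud_latency_ms(uplink_ms: float, server_inference_ms: float,
                     downlink_ms: float) -> float:
    """Cloud path: audio travels to a remote server, is translated there,
    and the result travels back to the device."""
    return uplink_ms + server_inference_ms + downlink_ms

def on_device_latency_ms(npu_inference_ms: float) -> float:
    """On-device path: the NPU runs the model locally; no network hop."""
    return npu_inference_ms

# Assumed figures: 60 ms each way on a decent mobile connection,
# 40 ms of server-side inference vs 50 ms on a less powerful local NPU.
cloud = cloud_latency_ms(uplink_ms=60, server_inference_ms=40, downlink_ms=60)
local = on_device_latency_ms(npu_inference_ms=50)

print(f"cloud: {cloud:.0f} ms, on-device: {local:.0f} ms")
# Under these assumptions the local path wins (50 ms vs 160 ms) even
# though its raw inference step is slower than the server's.
```

The point of the sketch is structural, not numeric: the network round trip is a fixed tax the cloud path always pays, while the on-device path pays only for inference.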
The implications of this on-device voice AI integration extend far beyond simple translation apps, promising to redefine how users interact with technology and consume content globally. By building this capability directly into the core hardware, manufacturers can create devices that are inherently multilingual. This could fundamentally alter industries ranging from consumer electronics to automotive, where partners are already exploring in-vehicle AI translation for communication between drivers and passengers and for interpreting foreign road signs.[13] For the media and entertainment sectors, the ability to perform high-quality dubbing and localization at the device level could democratize content distribution, allowing creators to reach global audiences effortlessly and affordably.[4] It opens the door for a new ecosystem of applications where language is no longer a barrier, from interactive educational tools to more inclusive and accessible customer service platforms. This shift could accelerate a future where interacting with someone in a different language is as simple and natural as speaking into your own device.
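For readers curious about the shape of such a system, the flow described above — speech in one language, speech out in another, entirely on the device — is conventionally a three-stage pipeline: recognition, translation, and synthesis. The sketch below stubs each stage with hypothetical placeholder functions (none of these are CAMB.AI or Broadcom APIs); a real implementation would run NPU-accelerated models at each step.

```python
# Highly simplified sketch of an on-device speech translation pipeline.
# All three stages are hypothetical stubs that illustrate the data flow;
# a real system would run local, NPU-accelerated models instead.

def recognize_speech(audio: bytes) -> str:
    # Placeholder: a real ASR model would transcribe the audio locally.
    return "where is the train station"

def translate_text(text: str, target_lang: str) -> str:
    # Placeholder: a tiny hard-coded lookup standing in for an
    # on-device translation model.
    toy_dictionary = {
        ("where is the train station", "fr"): "où est la gare",
    }
    return toy_dictionary.get((text, target_lang), text)

def synthesize_speech(text: str) -> bytes:
    # Placeholder: a real TTS stage would render audio, ideally
    # preserving the original speaker's vocal characteristics.
    return text.encode("utf-8")

def translate_utterance(audio: bytes, target_lang: str) -> bytes:
    """Run the full pipeline locally, with no network calls anywhere."""
    text = recognize_speech(audio)
    translated = translate_text(text, target_lang)
    return synthesize_speech(translated)

result = translate_utterance(b"\x00\x01", target_lang="fr")
print(result.decode("utf-8"))  # où est la gare
```

Because every stage runs locally, the pipeline keeps working with no connectivity and never transmits the user's voice off the device — the privacy and offline properties discussed earlier fall out of the architecture itself.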
In conclusion, the partnership between CAMB.AI and Broadcom is more than a simple technological integration; it is a strategic move that heralds the next phase of artificial intelligence. By bringing one of AI's most complex and valuable applications—real-time, nuanced voice translation—directly to the silicon level, the collaboration addresses the core challenges of latency, privacy, and connectivity that have limited cloud-based solutions. It empowers device manufacturers to offer powerful new features and provides users with a more seamless and secure way to communicate across linguistic divides. As this on-chip AI technology proliferates, it has the potential to dissolve language barriers on a global scale, making instant and authentic multilingual communication an everyday reality built into the very fabric of our devices.