Arm expands Flexible Access for AI licensing at the edge

  • Arm extends Flexible Access to cover its AI-focused Armv9 platform at the edge and reduce barriers to entry.
  • Access to IP, tools, and training at minimal cost; license payments upon design completion; more than 300 partners and 400 designs.
  • Armv9 integrates Cortex-A320 and Ethos-U85 to run models with up to 1 billion parameters with advanced security.
  • Market Boost: Qualcomm adopts v9; competes with Nvidia and Intel; shares rise ~4% after the announcement.

Arm's AI Licensing Program at the Edge

Arm has taken a further step in its AI strategy by extending Flexible Access to include its edge-oriented Armv9 platform. The company is looking to attract manufacturers and startups developing on-device AI, lowering barriers to entry and shortening design cycles.

With this expansion, Arm promises easier access and a cost-effective way to create artificial intelligence solutions that run locally. The initiative, as its management emphasizes, aims to make the company's technology more accessible to the entire ecosystem.

What's changing in Flexible Access


The program allows companies to use chip design tools, IP and training at minimal or no upfront cost, making it easy to experiment, iterate, and validate prototypes without large initial outlays. License fees are paid at the end of the design process, when the project reaches design closure and moves to manufacturing.

This model has served as a springboard for ecosystem partners such as Raspberry Pi, Hailo and SiMa.ai, and the program currently has more than 300 member companies. According to Arm, around 400 chip designs have already been completed under this umbrella.

Armv9 for AI at the Edge: Architecture and Capabilities

Introduced in early 2025, the Armv9 edge platform incorporates the Arm Cortex-A320 CPU and the Arm Ethos-U85 NPU as its main components. The combination is designed to run models with up to a billion parameters on the device, while strengthening security with advanced technologies.
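To give a rough sense of what "a billion parameters on the device" implies, the sketch below does back-of-the-envelope arithmetic on weight storage under common quantization formats. These figures are illustrative assumptions, not Arm specifications; real on-device requirements also include activations and runtime overhead.

```python
# Approximate weight storage for a 1-billion-parameter model under
# common numeric formats. Illustrative arithmetic only: activations,
# caches and runtime overhead are not included.

PARAMS = 1_000_000_000  # "up to 1 billion parameters"

BYTES_PER_PARAM = {
    "fp32": 4.0,   # full precision
    "fp16": 2.0,   # half precision
    "int8": 1.0,   # 8-bit quantized
    "int4": 0.5,   # 4-bit quantized
}

def weight_footprint_gib(params: int, fmt: str) -> float:
    """Approximate weight storage in GiB for the given numeric format."""
    return params * BYTES_PER_PARAM[fmt] / (1024 ** 3)

for fmt in BYTES_PER_PARAM:
    print(f"{fmt}: {weight_footprint_gib(PARAMS, fmt):.2f} GiB")
```

Under these assumptions, an int8-quantized billion-parameter model needs on the order of 1 GiB for weights alone, which is why aggressive quantization matters for edge-class memory budgets.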

With this hardware base, developers can build AI applications at the edge with improvements in performance per watt and in data protection: inference stays local, so sensitive information never has to be sent to the cloud.

Use cases: from vision to multimodal interaction

Arm highlights that this proposal opens the door to smart cameras, connected homes, industrial automation and more advanced human-machine interfaces. Thanks to local execution, interactions across vision, voice and gestures become more natural and lower in latency.

Keeping the focus on the device helps machines perceive and respond more immediately, reducing dependence on connectivity and improving privacy by processing data where it is generated.

Competition and market context

The program expansion strengthens Arm's positioning in an increasingly crowded edge AI market, where it competes with Nvidia, Intel and new specialized players such as SiFive. Sectors such as autonomous vehicles and retail are adopting edge AI for real-time analysis and decision-making.

From inventory management to collaborative robotics, these are workloads that benefit from local processing for reasons of latency, operating cost and confidentiality, and areas where Arm seeks to consolidate its technological footprint.

Market figures and signals

Arm claims that more than 300 companies are part of Flexible Access and that some 400 designs are ready for manufacturing under this scheme. It has also emerged that Qualcomm has migrated its flagship chips to the v9 architecture, which reinforces the appeal of the proposal.

Following the announcement, Arm shares advanced around 4%, to $172.23, and so far this year they have accumulated a gain of 34.2%, a reaction that reflects the market's interest in on-device AI and flexible access to intellectual property.

With the inclusion of Armv9 in Flexible Access, Arm democratizes access to its IP and accelerates the arrival of AI products at the edge: less friction to enter, more options to iterate, and an ecosystem that already shows traction in partners, designs and real applications.