Invention Grant
- Patent Title: Attention free transformer
- Application No.: US17308033
- Application Date: 2021-05-04
- Publication No.: US12271791B2
- Publication Date: 2025-04-08
- Inventors: Shuangfei Zhai, Walter A. Talbott, Nitish Srivastava, Chen Huang, Hanlin Goh, Joshua M. Susskind
- Applicant: Apple Inc.
- Applicant Address: Cupertino, CA, US
- Assignee: Apple Inc.
- Current Assignee: Apple Inc.
- Current Assignee Address: Cupertino, CA, US
- Agency: BakerHostetler
- Main IPC: G06N20/00
- IPC: G06N20/00 ; G06F17/16 ; G06F40/58 ; G06N5/04 ; G06T3/4053

Abstract:
Attention-free transformers are disclosed. Various implementations of attention-free transformers include a gating and pooling operation that allows them to produce results comparable to or better than those of a standard attention-based transformer, with improved efficiency and reduced computational complexity in both space and time.
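The abstract's "gating and pooling operation" is described in the inventors' companion paper ("An Attention Free Transformer", Zhai et al.): a sigmoid-gated query is multiplied elementwise by a softmax(key)-weighted pooling of the values, avoiding the quadratic query-key attention matrix. The sketch below illustrates the simplest (position-bias-free) variant of that idea in NumPy; the function name `aft_simple` and the shapes are illustrative assumptions, not the patented claims themselves.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def aft_simple(Q, K, V):
    """Gating-and-pooling sketch (assumed AFT-simple variant).

    Q, K, V: arrays of shape (T, d) for a sequence of length T.
    Returns an array of shape (T, d) in O(T * d) time, versus
    O(T^2 * d) for standard dot-product attention.
    """
    # Softmax over the sequence dimension, per feature channel
    # (subtract the max for numerical stability).
    weights = np.exp(K - K.max(axis=0, keepdims=True))
    weights /= weights.sum(axis=0, keepdims=True)
    # Pool the values with those weights into one context vector.
    pooled = (weights * V).sum(axis=0, keepdims=True)  # shape (1, d)
    # Gate the pooled context with the sigmoid of each query.
    return sigmoid(Q) * pooled                         # shape (T, d)

T, d = 5, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))
Y = aft_simple(Q, K, V)
print(Y.shape)
```

Because no T-by-T attention matrix is ever formed, memory and time scale linearly in sequence length, which matches the abstract's claim of reduced space and time complexity.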
Public/Granted literature
- US20220108212A1 ATTENTION FREE TRANSFORMER, Publication Date: 2022-04-07