DATA MODEL INFRASTRUCTURE AS A SERVICE

    Publication Number: US20230102486A1

    Publication Date: 2023-03-30

    Application Number: US17486563

    Application Date: 2021-09-27

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a data model infrastructure is implemented as a service rather than as an always-running server. Specifically, one of the technical issues with past implementations is that the data models are deployed onto a server that is intended to be “always running”, even if there are no requests to the server; the idle server still consumes memory and processing power. While an always-running server may be useful for commonly used applications, for applications that are used infrequently (e.g., 10 times a month), it means that memory and processing power are wasted. Thus, by implementing the data model infrastructure as a service rather than an always-running server, the service can be launched only when actually needed, saving both memory and processing power.

    Data model infrastructure as a service

    Publication Number: US11614925B1

    Publication Date: 2023-03-28

    Application Number: US17486563

    Application Date: 2021-09-27

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a data model infrastructure is implemented as a service rather than as an always-running server. Specifically, one of the technical issues with past implementations is that the data models are deployed onto a server that is intended to be “always running”, even if there are no requests to the server; the idle server still consumes memory and processing power. While an always-running server may be useful for commonly used applications, for applications that are used infrequently (e.g., 10 times a month), it means that memory and processing power are wasted. Thus, by implementing the data model infrastructure as a service rather than an always-running server, the service can be launched only when actually needed, saving both memory and processing power.
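
    To illustrate the on-demand idea in the two entries above, the following is a minimal, hypothetical TypeScript sketch (the names loadDataModel and handleRequest are invented for illustration, not taken from the patent): the data model is loaded only when an invocation arrives, rather than being held in memory by an always-running server.

        // Hypothetical sketch: load the data model on demand inside a short-lived invocation.
        interface DataModel {
          entities: Record<string, { fields: string[] }>;
        }

        // Invented loader; a real deployment would read the deployed model artifacts on demand.
        async function loadDataModel(modelId: string): Promise<DataModel> {
          return { entities: { Books: { fields: ["ID", "title", "author"] } } };
        }

        // Entry point of a single service invocation: load, answer, exit.
        export async function handleRequest(modelId: string, entity: string): Promise<string[]> {
          const model = await loadDataModel(modelId); // memory is used only while this runs
          const definition = model.entities[entity];
          if (!definition) throw new Error(`Unknown entity: ${entity}`);
          return definition.fields; // no server keeps running afterwards
        }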

    TEXT-GENERATED INSTRUCTION OBJECTS USING LARGE LANGUAGE MODEL

    Publication Number: US20250086171A1

    Publication Date: 2025-03-13

    Application Number: US18367778

    Application Date: 2023-09-13

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a solution is provided that automatically adds a system message to natural language text provided by a user to generate a prompt to a Large Language Model (LLM), which then automatically generates code in a declarative language format, the code corresponding to the natural language text. Furthermore, retrieval-augmented generation may be utilized to overcome the maximum number of contextual tokens permitted as input to an LLM. More particularly, the system message may be designed to include an instruction to the LLM to generate search calls, in a specified format, for one or more entity definitions from a database. The search calls may then be performed on the database via a similarity search to obtain the relevant information, which can then be passed back into the LLM for the generation of the code.
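
    As a rough sketch of the flow described in this abstract (the LLM interface, the SEARCH call syntax, and the similaritySearch function below are invented for illustration, not taken from the patent), a system message instructs the model to ask for entity definitions, which are resolved via similarity search and passed back:

        // Hypothetical message shape for an injected LLM call; no specific vendor API is assumed.
        type Llm = (messages: { role: "system" | "user"; content: string }[]) => Promise<string>;

        const SYSTEM_MESSAGE =
          "Translate the user's request into a declarative data model. " +
          'If you need an entity definition, reply only with SEARCH("<entity name>").';

        // Invented similarity search over stored entity definitions.
        async function similaritySearch(query: string): Promise<string> {
          return `entity ${query} { key ID : UUID; name : String; }`;
        }

        export async function generateDeclarativeCode(llm: Llm, userText: string): Promise<string> {
          const messages = [
            { role: "system" as const, content: SYSTEM_MESSAGE },
            { role: "user" as const, content: userText },
          ];
          let reply = await llm(messages);
          // If the model asked for context, resolve the search call and prompt again.
          const search = reply.match(/^SEARCH\("(.+)"\)$/);
          if (search) {
            const definition = await similaritySearch(search[1]);
            messages.push({ role: "user" as const, content: `Definition: ${definition}` });
            reply = await llm(messages);
          }
          return reply; // declarative code corresponding to the natural language text
        }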

    Request handling in a multi-protocol cloud environment

    Publication Number: US12001395B2

    Publication Date: 2024-06-04

    Application Number: US17962884

    Application Date: 2022-10-10

    Applicant: SAP SE

    Inventor: David Kunz

    CPC classification number: G06F16/144 G06F16/116

    Abstract: Various examples are directed to systems and methods for operating an application for use with an enterprise database system. A common format process may receive, from a user device, a first request directed to the enterprise database system, convert the first request into a common protocol, and send a first common protocol request to the application logic code. The application logic code may generate a second request in the common protocol and send the second request to the common format process. The common format process may convert the second request from the common protocol to a database query protocol to generate at least one database query and send the at least one database query to the enterprise database system.
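
    A simplified sketch of that flow, with invented types and a plain URL path standing in for a protocol-specific request, might look as follows (an illustration of the normalize-then-translate idea, not the patented implementation):

        // Hypothetical common protocol shared by all inbound request formats.
        interface CommonRequest {
          operation: "read" | "write";
          entity: string;
          filter?: Record<string, string>;
        }

        // Convert a protocol-specific inbound request (here: a plain URL path) into the common protocol.
        export function toCommon(path: string): CommonRequest {
          const [, entity, id] = path.split("/");
          return { operation: "read", entity, filter: id ? { ID: id } : undefined };
        }

        // Application logic sees only the common protocol and may derive follow-up requests.
        export function applicationLogic(req: CommonRequest): CommonRequest {
          return { ...req }; // pass-through in this sketch
        }

        // Convert the common protocol into a database query.
        export function toDatabaseQuery(req: CommonRequest): string {
          const where = req.filter
            ? ` WHERE ${Object.entries(req.filter).map(([k, v]) => `${k} = '${v}'`).join(" AND ")}`
            : "";
          return `SELECT * FROM ${req.entity}${where}`;
        }

        // Example: toDatabaseQuery(applicationLogic(toCommon("/Books/42")))
        //   -> "SELECT * FROM Books WHERE ID = '42'"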

    Cloud application programming model

    Publication Number: US11204818B1

    Publication Date: 2021-12-21

    Application Number: US17160587

    Application Date: 2021-01-28

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: Methods, systems, and computer-readable storage media for: receiving, by an application programming framework within the cloud platform, a first request from an application; determining, by a generic event handler of the application programming framework, to handle the first request; transmitting, by the generic event handler, a second request to a sidecar that is executed within the cloud platform, the sidecar processing the second request to communicate with an unsupported resource and provide a first result comprising data from the unsupported resource; receiving, by the generic event handler and from the sidecar, the first result; and transmitting, from the generic event handler, a first response to the application, the first response comprising at least a portion of the data of the first result.
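
    The delegation step can be pictured with the following hypothetical TypeScript sketch (the sidecar address, resource names, and types are invented): requests for natively supported resources are handled inside the framework, and everything else is forwarded to the sidecar, whose result is relayed back to the application.

        interface FrameworkRequest { resource: string; payload: unknown; }
        interface FrameworkResponse { data: unknown; }

        // Resources the framework handles natively in this sketch.
        const SUPPORTED = new Set(["db", "messaging"]);

        export async function genericEventHandler(req: FrameworkRequest): Promise<FrameworkResponse> {
          if (SUPPORTED.has(req.resource)) {
            return { data: `handled ${req.resource} natively` };
          }
          // Unsupported resource: delegate to the sidecar over a local HTTP endpoint (invented address).
          const res = await fetch("http://localhost:8081/proxy", {
            method: "POST",
            headers: { "content-type": "application/json" },
            body: JSON.stringify(req),
          });
          return { data: await res.json() };
        }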

    FINE-TUNABLE DISTILLED INTERMEDIATE REPRESENTATION FOR GENERATIVE ARTIFICIAL INTELLIGENCE

    Publication Number: US20250117577A1

    Publication Date: 2025-04-10

    Application Number: US18378005

    Application Date: 2023-10-09

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, rather than using a large language model (LLM) to directly generate the desired computer code, an intermediate representation is generated by the LLM. The LLM is used to generate the portion of the computer code that cannot be computed programmatically (which may be called the “creative” part for purposes of the present disclosure). The intermediate representation can then be fed into a separate programmatic component that compiles it into compilable computer code. The intermediate representation may also be fine-tuned before compilation; this fine-tuning may involve, for example, sanitizing the intermediate representation, enhancing the intermediate representation, and formatting the intermediate representation, as well as modifying the intermediate representation based on a feature set.
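
    A minimal sketch of that split, assuming a small JSON intermediate representation (the IR shape, the allowed type list, and the target syntax below are invented for illustration): the LLM supplies only the IR, and deterministic code sanitizes and compiles it.

        // Hypothetical intermediate representation produced by the LLM.
        interface IntermediateRepresentation {
          entity: string;
          fields: { name: string; type: string }[];
        }

        // Sanitize / fine-tune the IR before compilation (e.g., drop fields with unknown types).
        function sanitize(ir: IntermediateRepresentation): IntermediateRepresentation {
          const allowed = new Set(["String", "Integer", "Boolean"]);
          return { ...ir, fields: ir.fields.filter((f) => allowed.has(f.type)) };
        }

        // Deterministic compiler from the IR to declarative code; no LLM involved in this step.
        function compile(ir: IntermediateRepresentation): string {
          const body = ir.fields.map((f) => `  ${f.name} : ${f.type};`).join("\n");
          return `entity ${ir.entity} {\n${body}\n}`;
        }

        // The "creative" part is just the IR string returned by the LLM.
        export function irToCode(llmOutput: string): string {
          const ir = JSON.parse(llmOutput) as IntermediateRepresentation;
          return compile(sanitize(ir));
        }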

    Concurrent outbox table for reliable messaging

    Publication Number: US12236294B2

    Publication Date: 2025-02-25

    Application Number: US18073678

    Application Date: 2022-12-02

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a solution is provided in which an outbox table is added to each container in a database. Rather than performing an emit using the application instance alone (in response to a notification from the database that the underlying database action has been performed), or relying on an outside outbox table processor, an outbox table processor is integrated into the microservice application. When a database action is performed by the microservice application, that database action is written into the outbox table in the corresponding container in the database. Furthermore, whenever the outbox table processor determines that the state of the database has changed in a way that leads to an emit, it reads the actions in the outbox table, issues an emit to notify one or more external systems of the actions, and deletes those actions from the outbox table.
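
    The pattern can be sketched as follows, assuming a hypothetical minimal transaction interface (the table names and SQL are illustrative, not from the patent): the business write and the outbox entry share one transaction, and an in-application processor later emits and deletes the entries.

        // Hypothetical minimal transaction interface.
        interface Tx {
          run(sql: string, params?: unknown[]): Promise<void>;
          all(sql: string): Promise<{ id: number; event: string; payload: string }[]>;
        }

        // Write the business data and the outbox entry atomically, in the same transaction.
        export async function createOrder(tx: Tx, orderId: string): Promise<void> {
          await tx.run("INSERT INTO orders (id) VALUES (?)", [orderId]);
          await tx.run("INSERT INTO outbox (event, payload) VALUES (?, ?)", [
            "OrderCreated",
            JSON.stringify({ orderId }),
          ]);
        }

        // Outbox processor integrated into the application: read, emit, then delete.
        export async function drainOutbox(
          tx: Tx,
          emit: (event: string, payload: string) => Promise<void>,
        ): Promise<void> {
          for (const row of await tx.all("SELECT id, event, payload FROM outbox ORDER BY id")) {
            await emit(row.event, row.payload); // notify external systems
            await tx.run("DELETE FROM outbox WHERE id = ?", [row.id]); // remove only after the emit succeeds
          }
        }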

    COMPOSABLE PROCESSING FRAMEWORK BASED ON WEB ASSEMBLY COMPONENT MODEL

    Publication Number: US20240303050A1

    Publication Date: 2024-09-12

    Application Number: US18179850

    Application Date: 2023-03-07

    Applicant: SAP SE

    Inventor: David Kunz

    CPC classification number: G06F8/41 G06F8/31

    Abstract: In an example embodiment, a common, composable abstraction is provided that allows components to work efficiently across programming languages and services without the need to write glue code. Application developers can concentrate on the application logic itself. The functionality of services can be developed by framework developers only once, using the programming language of their choice.
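
    One way to picture such an abstraction is the following hypothetical sketch, in which every component, whatever language it was originally written in, is exposed to the host through one uniform handler interface that can be composed without per-language glue code (the interface and the compose helper are invented for illustration):

        // Hypothetical language-neutral handler interface exposed by every component.
        interface Component {
          handle(request: Uint8Array): Promise<Uint8Array>;
        }

        // Compose components into a pipeline; the output of one becomes the input of the next.
        export function compose(...components: Component[]): Component {
          return {
            async handle(request) {
              let data = request;
              for (const component of components) {
                data = await component.handle(data);
              }
              return data;
            },
          };
        }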

    TOKEN OPTIMIZATION THROUGH MINIMIZED PAYLOAD FOR LARGE LANGUAGE MODELS

    Publication Number: US20250156635A1

    Publication Date: 2025-05-15

    Application Number: US18388686

    Application Date: 2023-11-10

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a solution is provided that utilizes an intermediate file format for either an input prompt to or output from a large language model (LLM), or both. This intermediate file format is minimizable, meaning that tokens contained in a file of that format can be stripped out or otherwise removed without changing the semantic meaning of the file. The input prompt can be created in or converted to this intermediate file format and then minimized prior to being sent to the LLM for text generation. Furthermore, a system message included with the input prompt may instruct the LLM to generate text in the intermediate file format, in minimized form. A specialized parser may then be included to parse the minimized output produced by the LLM.
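
    As an illustration of the minimization idea (the line-based key/value format below is invented, not the format used in the patent), blank lines, comments, and surrounding whitespace carry no meaning and can be stripped before the prompt is sent, and a small parser reads the minimized form back:

        // Strip tokens that carry no semantic meaning in this invented format.
        export function minimize(text: string): string {
          return text
            .split("\n")
            .map((line) => line.trim())
            .filter((line) => line.length > 0 && !line.startsWith("#")) // drop blanks and comments
            .join("\n");
        }

        // Parse the minimized form (also accepts non-minimized input, since it minimizes first).
        export function parseMinimized(text: string): Record<string, string> {
          const result: Record<string, string> = {};
          for (const line of minimize(text).split("\n")) {
            const idx = line.indexOf(":");
            if (idx < 0) continue; // skip lines without a key/value separator
            result[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
          }
          return result;
        }

        // minimize("# a comment\n  title :  Books  \n") -> "title :  Books" (fewer tokens, same meaning)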

    Composable processing framework based on web assembly component model

    Publication Number: US12260193B2

    Publication Date: 2025-03-25

    Application Number: US18179850

    Application Date: 2023-03-07

    Applicant: SAP SE

    Inventor: David Kunz

    Abstract: In an example embodiment, a common, composable abstraction is provided that allows components to work efficiently across programming languages and services without the need to write glue code. Application developers can concentrate on the application logic itself. The functionality of services can be developed by framework developers only once, using the programming language of their choice.
