G06F5/14

ENGINE ARCHITECTURE FOR PROCESSING FINITE AUTOMATA

An engine architecture for processing finite automata includes a hyper non-deterministic automata (HNA) processor specialized for non-deterministic finite automata (NFA) processing. The HNA processor includes a plurality of super-clusters and an HNA scheduler. Each super-cluster includes a plurality of clusters. Each cluster of the plurality of clusters includes a plurality of HNA processing units (HPUs). A corresponding plurality of HPUs of a corresponding plurality of clusters of at least one selected super-cluster is available as a resource pool of HPUs to the HNA scheduler for assignment of at least one HNA instruction to enable acceleration of a match of at least one regular expression pattern in an input stream received from a network.
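The scheduling idea in this abstract can be sketched in a few lines: the HPUs of the clusters in a selected super-cluster form one flat resource pool, and the scheduler assigns each HNA instruction to any free HPU in that pool. All names here (`Hpu`, `HnaScheduler`, the result format) are illustrative assumptions, not taken from the patent.

```python
class Hpu:
    """One HNA processing unit (illustrative stand-in)."""
    def __init__(self, ident):
        self.ident = ident
        self.busy = False

    def run(self, instruction):
        self.busy = True
        # A real HPU would walk NFA states against the input stream here.
        result = f"hpu{self.ident}:{instruction}"
        self.busy = False
        return result


class HnaScheduler:
    def __init__(self, super_cluster):
        # Flatten the clusters of the selected super-cluster into one pool.
        self.pool = [hpu for cluster in super_cluster for hpu in cluster]

    def assign(self, instruction):
        # Assign the HNA instruction to the first free HPU in the pool.
        for hpu in self.pool:
            if not hpu.busy:
                return hpu.run(instruction)
        raise RuntimeError("no free HPU in pool")


# Two clusters of two HPUs each make up the pool.
sched = HnaScheduler([[Hpu(0), Hpu(1)], [Hpu(2), Hpu(3)]])
print(sched.assign("match /ab+c/"))
```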

Data processing apparatus and data processing method
09853919 · 2017-12-26

A data processing apparatus includes a shared buffer; an issuing unit that issues a write address for writing incoming data to the shared buffer; a receiving unit that receives a returned read address for data read from the shared buffer; a monitoring buffer that holds information indicating the use status of each address of the shared buffer; and a monitoring unit that monitors write-address issuance and returned-read-address reception, changes the information for a write address from an unused state to a used state when that write address is issued, and changes the information for a returned read address from a used state to an unused state when that read address is received. The monitoring unit determines that an address of the shared buffer is overlapping when the information for a write address already indicates a used state at the time the write address is issued.
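The monitoring scheme above reduces to a per-address used/unused record: issuing a write address flips its entry to used, receiving the returned read address flips it back, and a write address whose entry is already used signals an overlap. This is a minimal sketch under those assumptions; the class and method names are illustrative.

```python
class AddressMonitor:
    """Tracks use status per shared-buffer address (illustrative sketch)."""
    def __init__(self, size):
        self.used = [False] * size  # the monitoring buffer

    def issue_write(self, addr):
        # Overlap: the write address is already marked as in use.
        overlapping = self.used[addr]
        self.used[addr] = True      # unused -> used on issuance
        return overlapping

    def receive_returned_read(self, addr):
        self.used[addr] = False     # used -> unused on return


mon = AddressMonitor(8)
print(mon.issue_write(3))           # first use of address 3: no overlap
print(mon.issue_write(3))           # address 3 still in use: overlap
mon.receive_returned_read(3)
print(mon.issue_write(3))           # address 3 was freed: no overlap
```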

Method and apparatus for handling incoming data frames
09851941 · 2017-12-26

A method and apparatus for handling incoming data frames within a network interface controller. The network interface controller comprises at least one controller component operably coupled to at least one memory element. The at least one controller component is arranged to identify a next available buffer pointer from a pool of buffer pointers stored within a first area of memory within the at least one memory element, receive an indication that a start of a data frame has been received via a network interface, and allocate the identified next available buffer pointer to the data frame.
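The pointer-pool behavior described above can be sketched simply: free buffer pointers sit in one memory region, and when the start of a frame is signaled, the next available pointer is popped from the pool and bound to that frame. The names and the pointer values here are illustrative assumptions, not the controller's actual layout.

```python
from collections import deque


class FramePointerAllocator:
    def __init__(self, pointers):
        self.free_pool = deque(pointers)   # first memory area: free buffer pointers
        self.allocations = {}              # frame id -> allocated buffer pointer

    def on_frame_start(self, frame_id):
        # Start-of-frame indication: take the next available pointer.
        if not self.free_pool:
            raise RuntimeError("pointer pool exhausted")
        ptr = self.free_pool.popleft()
        self.allocations[frame_id] = ptr
        return ptr

    def on_frame_done(self, frame_id):
        # Return the pointer to the pool once the frame is consumed.
        self.free_pool.append(self.allocations.pop(frame_id))


alloc = FramePointerAllocator([0x100, 0x200, 0x300])
print(hex(alloc.on_frame_start("frame-A")))   # 0x100
print(hex(alloc.on_frame_start("frame-B")))   # 0x200
```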

INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING DEVICE
20170366309 · 2017-12-21

This information processing system inputs and outputs data normally even when a serial communication bus is extended over network communication. The information processing system is provided with: a device; a device control unit for controlling the device; a device interface unit which interfaces with the device control unit; an information processing device provided with an application interface unit which interfaces with an application; a channel establishment unit which connects, via a communication unit, the application interface unit and the device interface unit and establishes a control channel and a data channel between the application and the device; and an error suppression unit which suppresses errors in data transfer over the channels established by the channel establishment unit.
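One plausible reading of the error-suppression unit above is a bounded-retry wrapper around the communication unit, so that transient failures on the network-extended serial link do not surface to the application. Everything in this sketch, including the retry strategy, is an illustrative assumption rather than the patented mechanism.

```python
class Channel:
    """A channel over a communication unit with retry-based error suppression."""
    def __init__(self, send, retries=3):
        self.send = send          # the underlying communication unit
        self.retries = retries    # bound on retransmissions

    def transfer(self, data):
        # Suppress transient transfer errors by retrying up to the bound.
        for _ in range(self.retries):
            if self.send(data):
                return True
        return False


attempts = []

def flaky_link(data):
    # Simulated link: the first attempt fails, the second succeeds.
    attempts.append(data)
    return len(attempts) >= 2


ch = Channel(flaky_link)
print(ch.transfer(b"sensor-read"))   # True after one retry
```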

PACKET DESCRIPTOR STORAGE IN PACKET MEMORY WITH CACHE
20170353403 · 2017-12-07

A first memory device stores (i) a head part of a FIFO queue structured as a linked list (LL) of LL elements arranged in the order in which the LL elements were added to the FIFO queue and (ii) a tail part of the FIFO queue. A second memory device stores a middle part of the FIFO queue, the middle part comprising LL elements following, in the order, the head part and preceding, in the order, the tail part. A queue controller retrieves LL elements in the head part from the first memory device, moves LL elements in the middle part from the second memory device to the head part in the first memory device before the head part becomes empty, and updates LL parameters corresponding to the moved LL elements to indicate that storage of the moved LL elements has changed from the second memory device to the first memory device.
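The split-FIFO structure above can be sketched with three deques standing in for the two memory devices: head and tail in the fast first memory, the middle in the second, with the controller refilling the head from the middle before it drains. The capacity threshold and all names are illustrative assumptions.

```python
from collections import deque


class SplitFifo:
    def __init__(self, head_capacity=2):
        self.head = deque()    # first memory device
        self.middle = deque()  # second memory device
        self.tail = deque()    # first memory device
        self.head_capacity = head_capacity

    def enqueue(self, elem):
        self.tail.append(elem)
        # Spill tail elements into the middle so the tail stays small.
        while len(self.tail) > self.head_capacity:
            self.middle.append(self.tail.popleft())
        self._refill_head()

    def dequeue(self):
        self._refill_head()
        return self.head.popleft()

    def _refill_head(self):
        # Move middle (then tail) elements up before the head empties.
        while len(self.head) < self.head_capacity and (self.middle or self.tail):
            src = self.middle if self.middle else self.tail
            self.head.append(src.popleft())


q = SplitFifo()
for n in range(5):
    q.enqueue(n)
print([q.dequeue() for _ in range(5)])   # [0, 1, 2, 3, 4]
```

Elements come out in insertion order even though they migrate between the three parts, which is the FIFO invariant the queue controller must preserve.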

DISPLAY CONTROLLER

The disclosure describes a display controller and a method that includes monitoring a fill-level of a first in, first out (FIFO) block in the display controller, generating a regulation signal that depends on the fill-level of the FIFO block, and regulating, based on the regulation signal, access to a system interconnect by a master unit other than the display controller.
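The regulation loop described above can be sketched as follows: the display controller watches its FIFO fill-level and asserts a regulation signal while the level sits below a threshold, throttling other masters on the system interconnect so the display refill wins. The watermark value and all names are illustrative assumptions.

```python
class DisplayFifoRegulator:
    def __init__(self, capacity, low_watermark):
        self.capacity = capacity
        self.low_watermark = low_watermark
        self.fill_level = 0

    def regulation_signal(self):
        # Assert regulation while the FIFO risks underrunning.
        return self.fill_level < self.low_watermark

    def other_master_may_access(self):
        # Other masters are held off the interconnect while regulated.
        return not self.regulation_signal()


reg = DisplayFifoRegulator(capacity=64, low_watermark=16)
reg.fill_level = 8
print(reg.other_master_may_access())   # False: interconnect throttled
reg.fill_level = 40
print(reg.other_master_may_access())   # True: FIFO comfortably full
```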
