G06F12/125

BIT LEVEL SHARDING OF SENSITIVE DATA FOR INCREASED SECURITY

Techniques for obfuscating and/or de-obfuscating data using bit-level shard masks are disclosed. Shard masks are generated that shard a block of data, at bit-level granularity, into a number of shards for distribution among a number of storage arrays. The shard masks are applied to the block of data to generate the shards, which are then distributed among the storage arrays for storage.
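The mask-based sharding above can be sketched as follows. The abstract does not specify how masks are generated, so the random bit-to-shard assignment here (and all function names) are illustrative assumptions: each bit position is given to exactly one shard, so the masks are disjoint and together cover the whole block.

```python
import secrets


def make_shard_masks(num_bits: int, num_shards: int) -> list[int]:
    """Assign each bit position to a randomly chosen shard, yielding
    one bit mask per shard. The masks are pairwise disjoint and
    together cover every bit of the block."""
    masks = [0] * num_shards
    for bit in range(num_bits):
        masks[secrets.randbelow(num_shards)] |= 1 << bit
    return masks


def shard(block: int, masks: list[int]) -> list[int]:
    """Apply each mask to the block; each shard retains only the
    bits its mask selects (all other bits are zeroed)."""
    return [block & m for m in masks]


def reassemble(shards: list[int]) -> int:
    """OR disjoint shards back together to recover the original block."""
    out = 0
    for s in shards:
        out |= s
    return out
```

Because no single shard (or storage array) holds a recognizable subset of the data, an attacker would need every shard plus the masks to reconstruct the block.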

SEMICONDUCTOR MEMORY DEVICE INCLUDING UNIT PAGE BUFFER BLOCKS HAVING FOUR PAGE BUFFER PAIRS

A unit page buffer block includes first to fourth page buffer pairs. Each of the page buffer pairs includes a common column decoder block; and an upper page buffer stage and a lower page buffer stage electrically and commonly connected to the common column decoder block. Each of the upper page buffer stages includes an upper selection block; an upper latch block; and an upper cache block. Each of the lower page buffer stages includes a lower selection block; a lower latch block; and a lower cache block. Each of the upper selection blocks includes first to fourth sub-selection blocks. Each of the upper and lower latch blocks includes first to twelfth sub-latch blocks. Each of the upper and lower cache blocks includes first to twelfth sub-cache blocks. Each of the common column decoder blocks includes first to third sub-common column decoder blocks arranged in a row direction.

PHYSICAL PAGE TRACKING FOR HANDLING OVERCOMMITTED MEMORY IN A VIRTUALIZED ENVIRONMENT
20200409576 · 2020-12-31

A system for computer memory management includes a memory pool table, each memory pool in the table representing memory pages related by common attributes, and each memory pool associated with a virtual machine index; a per-page tracking table, each entry in the per-page tracking table to relate a memory page with virtual machine indices of the memory pool table; and processing circuitry to: scan each entry in the per-page tracking table and, for each entry: determine an amount of memory released if the memory page related with the entry is swapped; and aggregate the amount of memory for the respective virtual machine related with the memory page related with the entry in the per-page tracking table, to produce a per-virtual machine memory aggregate using the respective virtual machine index; and output the per-virtual machine memory aggregate for virtual machines related with the memory pages in the per-page tracking table.
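The per-page scan and per-VM aggregation above can be sketched as follows. The entry fields, the 4 KiB page size, and the rule that a swappable page releases one full page are illustrative assumptions; the patent does not fix these details.

```python
from collections import defaultdict
from dataclasses import dataclass

PAGE_SIZE = 4096  # bytes; assumed page size for illustration


@dataclass
class PageEntry:
    """One row of the per-page tracking table: relates a physical
    page to the virtual machine index from the memory pool table."""
    page_id: int
    vm_index: int
    swappable: bool  # whether swapping this page would release memory


def aggregate_releasable(tracking_table: list[PageEntry]) -> dict[int, int]:
    """Scan each tracking-table entry, determine the amount of memory
    released if that page were swapped, and accumulate it under the
    owning VM's index to produce a per-VM memory aggregate."""
    per_vm: dict[int, int] = defaultdict(int)
    for entry in tracking_table:
        released = PAGE_SIZE if entry.swappable else 0
        per_vm[entry.vm_index] += released
    return dict(per_vm)
```

The resulting per-VM aggregate is what the abstract's processing circuitry would output, e.g. to guide which VM to reclaim memory from under overcommitment.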

Storing arrays of data in data processing systems
10552307 · 2020-02-04

In a data processing system that comprises a memory 8 comprising N memory banks 11, a memory controller is configured to store one or more N×N data unit arrays of data in the memory 8 such that each data unit in each row of each N×N data unit array is stored in a different memory bank of the N memory banks 11, and such that each data unit in each column of each N×N data unit array is stored in a different memory bank of the N memory banks 11.
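One standard mapping with the stated property is the skewed scheme below: element (row, col) goes to bank (row + col) mod N, so the N data units of any row, and of any column, land in N different banks. The patent abstract does not specify this particular mapping; it is offered only as a minimal sketch of an assignment satisfying the claim.

```python
def bank_for(row: int, col: int, n: int) -> int:
    """Skewed bank assignment for an N×N data unit array: within any
    single row (col varies) or any single column (row varies), the
    value (row + col) mod n takes all n distinct values, so no two
    data units of a row or column share a memory bank."""
    return (row + col) % n
```

This lets the controller read an entire row or an entire column of the array in one cycle, one data unit per bank, without bank conflicts.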

Method and System for Biological Information Pattern Storage and Readout
20190370188 · 2019-12-05

Provided herein are biological information pattern (BIP) arrays and related methods for reading out information stored in a biological medium. In this manner, digital information encoded in a biomolecular medium can serve as a high-density data storage medium that may be read out and accessed in a label-free manner.

Caching data based on greenhouse gas data
11966336 · 2024-04-23

Some embodiments provide a program that receives a first set of data and a first greenhouse gas emission value. The program stores, in a cache, the first set of data and the first greenhouse gas emission value. The program receives a second set of data and a second greenhouse gas emission value. The program stores, in the cache, the second set of data and the second greenhouse gas emission value. The program receives a third set of data and a third greenhouse gas emission value. The program determines one of the first and second sets of data to remove from the cache based on the first and second greenhouse gas emission values. The program replaces, in the cache, one of the first and second sets of data and the corresponding first or second greenhouse gas emission value with the third set of data and the third greenhouse gas emission value.
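The eviction flow above can be sketched with a toy two-entry cache. The abstract says only that the victim is chosen "based on" the emission values; the choice here to evict the entry with the higher emission value is an assumption, as are the class and method names.

```python
class GreenCache:
    """Toy cache keyed by name, storing (data, ghg_value) pairs.
    When full, inserting a new entry evicts the cached entry with
    the highest greenhouse gas emission value (assumed policy)."""

    def __init__(self, capacity: int = 2):
        self.capacity = capacity
        self.entries: dict[str, tuple[object, float]] = {}

    def put(self, key: str, data: object, ghg_value: float) -> None:
        if key not in self.entries and len(self.entries) >= self.capacity:
            # Determine which cached set of data to remove based on
            # the stored emission values, then replace it.
            victim = max(self.entries, key=lambda k: self.entries[k][1])
            del self.entries[victim]
        self.entries[key] = (data, ghg_value)
```

With capacity two, this reproduces the abstract's scenario: after storing the first and second sets of data, inserting the third replaces whichever of the first two carries the higher emission value.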

Combined Transparent/Non-Transparent Cache
20190171380 · 2019-06-06

In one embodiment, a memory is delineated into transparent and non-transparent portions. The transparent portion may be controlled by a control unit coupled to the memory, along with a corresponding tag memory. The non-transparent portion may be software controlled by directly accessing the non-transparent portion via an input address. In an embodiment, the memory may include a decoder configured to decode the address and select a location in either the transparent or non-transparent portion. Each request may include a non-transparent attribute identifying the request as either transparent or non-transparent. In an embodiment, the size of the transparent portion may be programmable. Based on the non-transparent attribute indicating transparent, the decoder may selectively mask bits of the address based on the size to ensure that the decoder only selects a location in the transparent portion.
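The decoder's address masking can be sketched as below. The layout (transparent portion at the low indices, size a power of two) and the function name are illustrative assumptions, not the disclosed hardware design.

```python
def select_index(address: int, transparent: bool,
                 transparent_size: int, total_size: int) -> int:
    """Decode an input address into a memory location index.

    For a request whose non-transparent attribute indicates
    transparent, high address bits are masked off against the
    programmable transparent-portion size, so the decoder can only
    select a location inside the transparent portion. Non-transparent
    requests address the memory directly.
    """
    assert transparent_size & (transparent_size - 1) == 0  # power of two
    if transparent:
        return address & (transparent_size - 1)
    return address % total_size
```

Masking by `transparent_size - 1` is the usual trick for a power-of-two region: it clears exactly the bits that could reach outside the transparent portion, regardless of the incoming address.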
