Patent classifications
G11C11/41
Shared bit lines for memory cells
Methods and devices including a plurality of memory cells, a first bit line connected to a first column of the memory cells, and a second bit line connected to the first column. The first bit line is shared with a second column of memory cells adjacent to the first column. The second bit line is shared with a third column of memory cells adjacent to the first column on the side opposite the second column.
Semiconductor device comprising memory cells
A semiconductor device that writes data to another memory cell instead of a defective memory cell is provided. The semiconductor device includes a first circuit and a second circuit over the first circuit. The first circuit corresponds to a memory portion and includes a memory cell and a redundant memory cell; the second circuit corresponds to a control portion and includes a third circuit and a fourth circuit. The memory cell and the redundant memory cell are each electrically connected to the third circuit, and the third circuit is electrically connected to the fourth circuit. The fourth circuit sends data to be written to the memory cell or the redundant memory cell to the third circuit. When the memory cell is defective, the third circuit brings the memory cell and the fourth circuit into a non-conduction state and brings the redundant memory cell and the fourth circuit into a conduction state, so the data is sent to the redundant memory cell.
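The remapping behavior described above can be sketched in software. This is an illustrative model only, not the patented circuit: the class name, the defect map, and the address-keyed redundant store are assumptions used to show how a write to a defective cell is transparently redirected.

```python
class RedundantMemory:
    """Toy model of a memory with redundant cells that absorb writes
    aimed at defective addresses (illustrative, not the patented design)."""

    def __init__(self, size, defective=()):
        self.cells = [0] * size            # main memory cells
        self.redundant = {}                # redundant cells, keyed by address
        self.defective = set(defective)    # addresses known to be defective

    def write(self, addr, value):
        # Defective target: the main-cell path is effectively put in a
        # "non-conduction" state and the data goes to the redundant cell.
        if addr in self.defective:
            self.redundant[addr] = value
        else:
            self.cells[addr] = value

    def read(self, addr):
        if addr in self.defective:
            return self.redundant.get(addr, 0)
        return self.cells[addr]

mem = RedundantMemory(8, defective=[3])
mem.write(3, 0xAB)   # transparently lands in the redundant cell
mem.write(1, 0xCD)   # normal cell, written in place
```

The caller never sees the redirection: reads and writes use the same address whether or not the underlying cell is defective.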
SRAM array
SRAM arrays are provided. A SRAM array includes a plurality of SRAM cells and a plurality of well strap cells. Each of the SRAM cells arranged in the same column of the cell array includes a first transistor formed in a first P-type well region of a substrate, a second transistor formed in an N-type well region of the substrate, and a third transistor formed in a second P-type well region of the substrate. Each well strap cell is arranged on one of the columns in the cell array and includes a first P-well strap structure formed on the first P-type well region, a second P-well strap structure formed on the second P-type well region, and an N-well strap structure formed on the N-type well region. The first and second P-well strap structures and the N-well strap structure are separated from the SRAM cells by a dummy area.
Memory devices and methods of manufacturing thereof
A memory cell is disclosed. The memory cell includes a first transistor. The first transistor includes a first conduction channel collectively constituted by one or more first nanostructures spaced apart from one another along a vertical direction. The memory cell includes a second transistor electrically coupled to the first transistor in series. The second transistor includes a second conduction channel collectively constituted by one or more second nanostructures spaced apart from one another along the vertical direction. At least one of the one or more first nanostructures is applied with first stress by a first metal structure extending, along the vertical direction, into a first drain/source region of the first transistor.
Event-based classification of features in a reconfigurable and temporally coded convolutional spiking neural network
Embodiments of the present invention provide a system and method for learning and classifying features to identify objects in images using a temporally coded deep spiking neural network, and a classification method using a reconfigurable spiking neural network device or software comprising configuration logic, a plurality of reconfigurable spiking neurons, and a plurality of synapses. The device or software further comprises a plurality of user-selectable convolution and pooling engines. Each fully connected and convolution engine can learn features, producing a plurality of feature map layers corresponding to a plurality of regions, with each convolution engine obtaining the response of a neuron in its corresponding region. The neurons are modeled as integrate-and-fire neurons with a non-linear time constant, forming individual integrating threshold units with a spike output and eliminating the need for floating-point multiplication and addition.
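The integrate-and-fire idea can be sketched with integer arithmetic only. This is a minimal illustration under stated assumptions, not the patented neuron model: the event format, the shift-based leak, and the reset-to-zero behavior are all choices made here to show how spike accumulation avoids floating-point multiply-accumulate.

```python
def integrate_and_fire(spike_events, weights, threshold, leak_shift=4):
    """Event-driven integer integrate-and-fire neuron (illustrative sketch).

    spike_events: list of (time, synapse_index) input spikes.
    weights: integer synaptic weights.
    Returns the list of output spike times.
    """
    potential = 0
    out_spikes = []
    last_t = 0
    for t, syn in sorted(spike_events):
        # Leak per elapsed time step via a right shift: an exponential-like
        # decay that needs no multiplication.
        for _ in range(t - last_t):
            potential -= potential >> leak_shift
        last_t = t
        potential += weights[syn]          # integer add replaces multiply
        if potential >= threshold:
            out_spikes.append(t)           # emit a spike at this event time
            potential = 0                  # reset after firing
    return out_spikes

spikes = integrate_and_fire([(0, 0), (1, 1), (2, 0)], [5, 7], threshold=10)
```

Because an input spike is binary, each synaptic event contributes its weight by a single integer addition; no product of activation and weight is ever formed.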
Systems and methods for generating a temperature dependent supply voltage
An integrated circuit includes a diode for generating a temperature dependent voltage, a resistor divider for generating divided voltages by dividing the temperature dependent voltage, and a multiplexer circuit for selecting one of the divided voltages as a reference voltage used for setting a supply voltage.
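The signal chain above can be modeled numerically. The diode temperature coefficient, resistor values, and tap count below are assumed illustrative numbers, not values from the patent; the sketch only shows how a divider fans one temperature-dependent voltage into selectable references.

```python
def diode_voltage(temp_c, v25=0.65, tc=-0.002):
    """Diode forward voltage in volts, linearized around 25 degrees C
    (assumed ~-2 mV/C temperature coefficient, typical for silicon)."""
    return v25 + tc * (temp_c - 25)

def divider_taps(v_in, resistors):
    """Tap voltages of a resistor string, listed top to bottom."""
    total = sum(resistors)
    taps = []
    below = total
    for r in resistors:
        below -= r
        taps.append(v_in * below / total)  # voltage across remaining string
    return taps

def select_reference(taps, sel):
    """Multiplexer: pick one divided voltage by select code."""
    return taps[sel]

v = diode_voltage(75)                      # hotter diode -> lower voltage
taps = divider_taps(v, [10e3, 10e3, 10e3, 10e3])
ref = select_reference(taps, 1)            # reference for the supply regulator
```

The select code gives a digital knob over how strongly the supply tracks temperature, since every tap scales the same diode voltage by a fixed ratio.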
Capacitive-based determination of micromirror status
A digital micromirror device includes a plurality of micromirror cells on a semiconductor die. Each cell includes a memory circuit and an electrode selection circuit. At least some of the micromirror cells include a micromirror, and each memory circuit controls a micromirror tilt angle. With a given memory circuit commanding a first tilt angle, a measurement circuit measures a first value indicative of a capacitance between a first electrode and the micromirror and a second value indicative of a capacitance on a second electrode. For a second micromirror tilt angle, the measurement circuit measures a third value indicative of the capacitance between the first electrode and the micromirror and a fourth value indicative of the capacitance on the second electrode. From these measurements, the measurement circuit generates a signal indicating whether the micromirror is stuck at a particular angle or missing.
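One plausible decision rule over the four measurements can be sketched as follows. The function name, threshold, and comparison logic are assumptions for illustration, not the patented circuit: the premise is simply that a landed mirror sits close to one electrode, so the larger capacitance should swap sides between the two commanded angles.

```python
def mirror_status(c1_e1, c1_e2, c2_e1, c2_e2, c_min=1.0):
    """Classify a micromirror from capacitance measurements (arbitrary units).

    c{angle}_e{electrode}: capacitance at commanded angle 1 or 2,
    measured against electrode 1 or 2. Hypothetical decision logic.
    """
    if max(c1_e1, c1_e2, c2_e1, c2_e2) < c_min:
        return "missing"   # no mirror plate above either electrode
    if c1_e1 > c1_e2 and c2_e2 > c2_e1:
        return "ok"        # capacitance follows the tilt command
    return "stuck"         # mirror did not move between the two angles

status_ok = mirror_status(5.0, 1.2, 1.1, 4.8)    # tracks both commands
status_stuck = mirror_status(5.0, 1.2, 5.1, 1.3) # stays near electrode 1
```

Comparing both angles is what separates "stuck" from "ok": a single measurement cannot distinguish a correctly landed mirror from one frozen at that same angle.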
Semiconductor device
A semiconductor device that can reduce power consumption while improving the accuracy of learning and inference is provided. The semiconductor device is connected to data lines PBL and NBL and comprises a product operation memory cell 1 that stores ternary data and performs a product-sum operation between the stored data and input data INP using the data lines PBL and NBL.
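The arithmetic behind a ternary product-sum can be sketched in a few lines. This is a minimal sketch of the computation only, under the assumption that stored weights take values in {-1, 0, +1}; the mapping of signs to the PBL/NBL lines in the comments is likewise an assumption, not the cell design.

```python
def ternary_mac(weights, inputs):
    """Product-sum with ternary weights in {-1, 0, +1} (illustrative).

    Each product reduces to pass, negate, or skip, so the accumulation
    needs no multiplier at all.
    """
    acc = 0
    for w, x in zip(weights, inputs):
        if w == 1:
            acc += x        # positive weight: add (e.g. PBL path)
        elif w == -1:
            acc -= x        # negative weight: subtract (e.g. NBL path)
        # w == 0 contributes nothing, saving switching power
    return acc

result = ternary_mac([1, -1, 0, 1], [3, 2, 9, 4])   # 3 - 2 + 0 + 4 = 5
```

The zero weight is what makes ternary attractive for low power: a zero cell draws no data-line current for its term, unlike a binary {-1, +1} scheme where every term contributes.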