Abstract:
The invention relates to a method for forming a capacitive structure (C) in a metal level (Mn) of an interconnection stack comprising a succession of metal levels and via levels, the method comprising the following steps: (a) forming, in said metal level, at least one conductive track (34) in which a trench is defined; (b) conformally forming an insulating layer (54) on the structure; (c) forming a conductive material (58) in the trench; and (d) planarizing the structure.
Abstract:
There is disclosed an improved artificial neural network (ANN) (120') comprising a conventional ANN (120), a database block (220) and a compare & update circuit (230). The conventional ANN is formed by a plurality of n neurons (130), each neuron having a prototype memory (140) dedicated to storing a prototype and a distance evaluator (150) to evaluate the distance between the input pattern presented to the ANN and the prototype stored therein. The database block comprises three databases: a first database (222) containing all the p prototypes arranged in s slices, each slice capable of storing up to n prototypes; a second database (224) capable of storing the q input patterns to be presented to the ANN (queries); and a third database (226) capable of storing the q distances resulting from the evaluation performed during the recognition/classification phase. The role of the compare & update circuit is to compare each such distance with the distance previously found for the same input pattern (or pre-existing at initialization) and, based upon the result of that comparison, to update or not the previously stored distance.
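The slice-by-slice compare & update scheme described above can be modeled in software. The sketch below is an illustrative behavioral model, not the patented circuit; the L1 distance and function names are assumptions.

```python
# Hypothetical software model of the compare & update scheme: an ANN with
# n physical neurons processes p prototypes arranged in slices of size n,
# keeping a per-query running minimum distance across slices.

def l1_distance(a, b):
    """Manhattan ('city block') distance, one common norm in such ANNs."""
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(prototypes, queries, n):
    """prototypes: list of p vectors; queries: list of q vectors;
    n: number of physical neurons (prototypes loaded per slice)."""
    # Stored distances start at +infinity (pre-existing at initialization).
    best = [float("inf")] * len(queries)
    # Process the prototype database slice by slice.
    for start in range(0, len(prototypes), n):
        slice_protos = prototypes[start:start + n]  # one slice of <= n prototypes
        for i, q in enumerate(queries):
            d = min(l1_distance(q, p) for p in slice_protos)
            # Compare & update: keep the smaller of new and stored distance.
            if d < best[i]:
                best[i] = d
    return best
```

After the last slice, `best` holds, for each query, the minimum distance over all p prototypes, exactly as if the ANN had held them all at once.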
Abstract:
PROBLEM TO BE SOLVED: To provide a method and a circuit for associating a norm with each component of an input pattern presented to an input-space-mapping-algorithm-based artificial neural network (ANN) during the distance evaluation process. SOLUTION: A set of norms, called 'component' norms, is stored in dedicated storage means in the ANN. The ANN is provided with a global memory that stores all the component norms common to all the neurons of the ANN. For each input pattern component, every neuron performs the elementary (or partial) distance calculation with the corresponding prototype component stored in the global memory, using the associated component norm, during the distance evaluation process. A 'distance' norm is then used to combine the elementary distance calculations and determine the final distance between the input pattern and the prototype stored in a neuron. COPYRIGHT: (C)2003,JPO
Abstract:
PROBLEM TO BE SOLVED: To retrieve the minimum/maximum value in a group of numbers. SOLUTION: First, each of the p numbers encoded on q bits is split into K partial values encoded on n bits (q >= K×n), and a parameter k (k = 1 to K) assigns a rank to each partial value of each number, so that K bit slices are formed, each slice consisting of the partial values having the same rank. Each partial value is then encoded on m bits (m > n) using a 'thermometric' encoding technique. Next, the minimum partial value in the first slice (MSB) of the encoded partial values is determined by a parallel search, and all the numbers whose partial value is larger than that minimum are eliminated from the selection. The evaluation process is repeated in the same manner until the last slice (LSB) has been processed; the numbers still selected at the final stage hold the minimum value.
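The slice-wise elimination above can be illustrated in software. This is a sketch of the algorithm only: the thermometric encoding is the hardware's trick for finding a per-slice minimum in parallel, and is replaced here by a direct `min` over the still-selected numbers.

```python
# Illustrative software model of the bit-slice minimum search: scan
# K = q // n slices of n bits from MSB to LSB, deselecting at each
# slice every number whose partial value exceeds the slice minimum.

def bit_slice_min(numbers, q=8, n=2):
    """Find the minimum of `numbers` (each encoded on q bits)."""
    K = q // n
    selected = list(range(len(numbers)))           # all numbers start selected
    for k in range(K):                             # k = 0 is the MSB slice
        shift = (K - 1 - k) * n
        mask = (1 << n) - 1
        partial = {i: (numbers[i] >> shift) & mask for i in selected}
        m = min(partial.values())                  # parallel search in hardware
        # Eliminate every number whose partial value is larger than m.
        selected = [i for i in selected if partial[i] == m]
    # Numbers still selected after the LSB slice hold the minimum value.
    return numbers[selected[0]]
```

A maximum search is the symmetric case: keep the largest partial value per slice instead of the smallest.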
Abstract:
In a neural network comprised of a plurality of neuron circuits, there is disclosed an improved neuron circuit architecture (11) that generates local result signals, e.g. of the fire (F) type, and a local output signal of the distance or category type. The neuron circuit, which is connected to buses transporting input data (e.g. the input category) and control signals, includes the following circuits. A multi-norm distance evaluation circuit (300) calculates the distance D between the input vector (A) and the prototype vector (B) stored in a R/W (weight) memory circuit (250). A distance compare circuit (300) compares the distance D with either the actual influence field (AIF) of the stored prototype vector or its lower limit (MinIF) to generate first and second intermediate signals (LT, LTE). An identification circuit (400) processes said intermediate result signals, the input category signal (CAT), the local category signal (C) and a feedback signal (OR) to generate the local result signals, which represent the response of the neuron circuit to the presentation of an input vector. A minimum distance determination circuit (500) determines the minimum distance Dmin among all the distances calculated by the neuron circuits of the neural network, to generate a local output signal (NOUT) of the distance type; the same processing applies to categories. The feedback signal, which is collectively generated by all the neuron circuits, results from ORing all the local distances/categories. A daisy chain circuit (600) is serially connected to the corresponding daisy chain circuits of the two adjacent neuron circuits to structure the neural network as a chain. Its role is to determine the neuron circuit state: free (in particular, the first free in the chain) or engaged. Finally, a context circuitry (100/150) determines whether or not the neuron circuit participates with the other neuron circuits in the generation of said feedback signal.
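The recognition step of one neuron circuit (distance evaluation, comparison with the influence field, and the fire-type result) can be sketched behaviorally. This is a simplified model under stated assumptions: the L1 norm stands in for the multi-norm evaluator, and the mapping of LT to the fire signal is an illustrative reading, not the full identification logic.

```python
# Hypothetical behavioral model of one neuron's recognition step:
# distance evaluation, compare against AIF / MinIF, fire-type result.

def neuron_response(input_vec, prototype, aif, min_if, category):
    """aif: actual influence field; min_if: its lower limit."""
    # Multi-norm distance evaluation (L1 shown as one possible norm).
    d = sum(abs(a - b) for a, b in zip(input_vec, prototype))
    lt = d < aif           # first intermediate signal: inside influence field
    lte = d <= min_if      # second intermediate signal: at/below lower limit
    fire = lt              # the neuron 'fires' when the input is in its field
    return {"D": d, "LT": lt, "LTE": lte, "F": fire,
            "C": category if fire else None}
```

In the chip, each neuron's D then feeds the minimum distance determination circuit, which finds Dmin across all firing neurons.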
Abstract:
There is disclosed the architecture of a neural semiconductor chip (10), first including a neuron unit (11(#)) comprised of a plurality of neuron circuits (11-1, ...) fed by different buses transporting data (such as the input vector data, set-up parameters, ...) and control signals. Each neuron circuit (11) includes means for generating local result signals (F, ...) of the "fire" type and a local output signal (NOUT) of the distance or category type on respective buses (NR-BUS, NOUT-BUS). An OR circuit (12) performs an OR function on all corresponding local result and output signals to generate respective first global result (R*) and output (OUT*) signals on respective buses (R*-BUS, OUT*-BUS), which are merged in an on-chip common communication bus (COM*-BUS) shared by all neuron circuits of the chip. An additional OR function is then performed between all corresponding first global result and output signals to generate second global result (R**) and output (OUT**) signals, preferably by dotting on an off-chip common communication bus (COM**-BUS) in the driver block (19). This latter bus is shared by all the neural chips connected thereon, so as to build a neural network of the desired size. In the chip, a multiplexer (21) may select either the first or the second global output signal to be reinjected into all neuron circuits of the neural network as a feedback signal via a feedback bus (OR-BUS), depending on whether the chip operates in a single- or multi-chip environment. The feedback signal results from a collective processing of all the local signals.
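The two-level OR combination and the feedback multiplexer described above reduce to a few lines of logic. The sketch below is an assumed behavioral model (function and variable names are illustrative), treating each signal as a boolean.

```python
# Minimal model of the two-level OR combination: an on-chip OR across
# neurons (OUT*), an off-chip OR by dotting across chips (OUT**), and a
# multiplexer choosing which level is fed back to the neurons.

def global_signals(local_per_chip, multi_chip=True):
    """local_per_chip: one list of per-neuron booleans for each chip."""
    # First-level OR: one global signal per chip (OUT*).
    first_level = [any(neurons) for neurons in local_per_chip]
    # Second-level OR by 'dotting' on the off-chip common bus (OUT**).
    second_level = any(first_level)
    # The multiplexer reinjects OUT* or OUT** as the feedback signal,
    # depending on single- or multi-chip operation.
    feedback = second_level if multi_chip else first_level[0]
    return first_level, second_level, feedback
```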
Abstract:
The method and circuits of the present invention aim to associate a norm with each component of an input pattern presented to an input space mapping algorithm based artificial neural network (ANN) during the distance evaluation process. The set of norms, referred to as the "component" norms, is memorized in specific memorization means in the ANN. In a first embodiment, the ANN is provided with a global memory, common to all the neurons of the ANN, that memorizes all the component norms. For each component of the input pattern, all the neurons perform the elementary (or partial) distance calculation with the corresponding prototype components stored therein during the distance evaluation process, using the associated component norm. The elementary distance calculations are then combined using a "distance" norm to determine the final distance between the input pattern and the prototypes stored in the neurons. In another embodiment, the set of component norms is memorized in the neurons themselves, in the prototype memorization means, so that the global memory is no longer physically necessary. This implementation makes it possible to significantly reduce the silicon area consumed when the ANN is integrated in a silicon chip.
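Per-component norms combined by a distance norm can be rendered in software as follows. This is an illustrative sketch: the norm names (`"abs"`, `"match"`, `"L1"`, `"Lsup"`) are assumptions chosen as plausible elementary and combining norms, not identifiers from the patent.

```python
# Possible software rendering of per-component norms: component j has
# its own elementary norm, and a 'distance' norm combines the results.

def component_distance(input_pat, prototype, comp_norms, dist_norm="L1"):
    """comp_norms[j] selects the elementary calculation for component j:
    'abs' for |a - b|, or 'match' for a 0/1 mismatch indicator."""
    elems = []
    for a, b, norm in zip(input_pat, prototype, comp_norms):
        if norm == "abs":
            elems.append(abs(a - b))
        elif norm == "match":
            elems.append(0 if a == b else 1)
        else:
            raise ValueError(f"unknown component norm: {norm}")
    # The 'distance' norm combines the elementary calculations.
    if dist_norm == "L1":
        return sum(elems)
    if dist_norm == "Lsup":
        return max(elems)
    raise ValueError(f"unknown distance norm: {dist_norm}")
```

In the first embodiment `comp_norms` would live in a single global table shared by all neurons; in the second, each neuron would carry its own copy alongside its prototype components.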
Abstract:
The base circuit (30) comprises a self-referenced preamplifier (31) of the differential type connected between first and second supply voltages (VEE1, VC) and a push-pull output buffer stage (32) connected between second and third supply voltages (VC, VEE2). The push-pull output buffer stage (32) comprises a pull-up transistor (TUP) and a pull-down transistor (TDN) connected in series, with the circuit output node (OUT3) coupled therebetween. These transistors are driven by complementary and substantially simultaneous signals (S and its complement) supplied by said preamplifier. Both branches of the preamplifier are tied to a first output node (M). A current source (I) is connected to said first output node. The first branch comprises a logic block (LB) performing the desired logic function of the base circuit, connected through a load resistor (R1) to said second supply voltage (VC). In this instance, the logic block consists of three parallel-connected input NPN transistors (T1, T2, T3) whose emitters are coupled together at said first output node (M) for NOR operation. The second branch comprises a biasing/coupling block (BB) connected to said second supply voltage and coupled both to said first output node (M) and to the base node (B) of said pull-down transistor. In a preferred embodiment, this block consists of a diode-connected transistor (TC) and a resistor (RC) connected in series, with the base node (B) coupled therebetween. This block ensures both the appropriate DC polarization of said nodes (M, B), without the need for external reference voltage generators, and a low-impedance path for fast AC transmission of the output signal (S) from node M to node B when the input transistors of the logic block (LB) are ON. Optionally, the AC transmission can be improved by mounting a capacitor (C) between said first output and base nodes.
An antisaturation block (AB), typically consisting of a Schottky Barrier Diode (SBD), is useful to prevent saturation of the pull-down transistor (TDN) and further speed up the circuit.
Abstract:
A daisy chain circuit (600) is placed in each neuron circuit of a neural network. Each daisy chain circuit is serially connected to those of the two adjacent neuron circuits, so that all the neuron circuits are structured as a chain. Its main role is to distinguish between the two possible states of a neuron circuit, engaged or free, and moreover to identify the first free, or "ready to learn", neuron circuit in the chain. This distinction is based on the respective values of the input (DCI) and output (DCO) signals of the daisy chain circuit. The ready-to-learn neuron circuit is the only neuron circuit of the neural network whose input and output signals are complementary to each other. The circuit is built around a 1-bit register (601) controlled by a store enable signal (ST), which is set active at initialization or during the learning phase when a new neuron circuit must be engaged. The input terminal of the first daisy chain circuit in the chain is connected to a first logic value, so that after initialization it is the ready-to-learn neuron circuit by construction. At initialization, all the registers of the chain are set to a second logic value. In the learning phase, the 1-bit register content of the ready-to-learn neuron circuit is set to said first logic value by the store enable signal; the neuron circuit is then said to be "engaged". The following neuron circuit in the chain then becomes the new ready-to-learn neuron circuit. In addition, the daisy chain circuit is adapted to generate various control signals, e.g. the control signal (RS) that allows the input vector components to be loaded into the weight memory of only the ready-to-learn neuron circuit during the recognition phase.
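The chain behavior described above (DCI of the first stage tied to '1', registers cleared to '0', the ready-to-learn stage being the one with complementary DCI/DCO) can be modeled directly. The class below is a behavioral sketch with illustrative names, not the circuit itself.

```python
# Behavioral model of the daisy chain: each stage holds a 1-bit
# register; DCO is the register value and feeds the next stage's DCI;
# the ready-to-learn stage is the one with DCI = 1 and DCO = 0.

class DaisyChain:
    def __init__(self, n):
        self.reg = [0] * n        # at initialization, all registers = 0

    def signals(self):
        """Return (DCI, DCO) for each stage along the chain."""
        dci, out = 1, []          # first DCI tied to logic '1' by construction
        for r in self.reg:
            out.append((dci, r))
            dci = r               # DCO of stage k feeds DCI of stage k+1
        return out

    def ready_to_learn(self):
        """The only stage whose DCI and DCO are complementary."""
        for k, (dci, dco) in enumerate(self.signals()):
            if dci == 1 and dco == 0:
                return k
        return None               # chain full: every neuron is engaged

    def engage(self):
        """Learning phase: store enable sets the ready stage's register."""
        k = self.ready_to_learn()
        if k is not None:
            self.reg[k] = 1       # this neuron is now 'engaged'
        return k
```

Each call to `engage()` commits the current ready-to-learn neuron and automatically promotes the next one in the chain, which is exactly the propagation the patent relies on.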