diamondback.models package

Submodules

diamondback.models.DiversityModel module

Description

A diversity model realizes the selection and retention of a state, a finite collection of observations extracted from an incident signal, to maximize a minimum distance between any two members of the state according to a specified style, or distance metric.

\[d_{k} = \min(\ d_{u,v}\ )\quad\quad u, v \in [\ 0,\ M\ ),\ u \neq v\]
\[d_{k} \geq d_{n}\qquad \longrightarrow\qquad d_{n} = d_{k}\]

A diversity model is an opportunistic unsupervised learning model which typically improves condition and numerical accuracy, and reduces storage, relative to alternative approaches including a generalized linear inverse.

A style and order are specified, a state array of the specified order is defined, and a stationary dimension is inferred.

Style is in ( ‘Chebyshev’, ‘Euclidean’, ‘Geometric’, ‘Manhattan’ ).

  • ‘Chebyshev’ distance is an L-infinity norm, a maximum absolute difference
    in any dimension.
\[d_{u,v} = \max(\ |\ \vec{x_{u}} - \vec{x_{v}}\ |\ )\]
  • ‘Euclidean’ distance is an L-2 norm, a square root of a sum of squared
    differences in each dimension.
\[d_{u,v} = \matrix{\ \sum_{i=0}^{N}{(\ |\ \vec{x_{u,i}} - \vec{x_{v,i}}\ |\ )^{2}}\ }^{0.5}\]
  • ‘Geometric’ distance is an ordered root of a product of absolute differences
    in each dimension.
\[d_{u,v} = \prod_{i=0}^{N}{(\ |\ \vec{x_{u,i}} - \vec{x_{v,i}}\ |\ )}^{\frac{1}{N}}\]
  • ‘Manhattan’ distance is an L-1 norm, a sum of absolute differences in each
    dimension.
\[d_{u,v} = \sum_{i=0}^{N}{\ (\ |\ \vec{x_{u}} - \vec{x_{v}}\ |\ )\ }\]
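
As a minimal sketch, assuming equal-length numpy observation vectors, the distance styles above and the minimum distance between members of a state might be evaluated as follows. The functions distance and minimum_distance are illustrative only, and are not part of the DiversityModel interface.

import itertools
import numpy

def distance( u, v, style = 'Euclidean' ) -> float :

    # Distance between two observation vectors, by style.  Illustrative only.

    d = numpy.abs( numpy.asarray( u ) - numpy.asarray( v ) )
    if ( style == 'Chebyshev' ) :                                # L-infinity norm, maximum absolute difference.
        return float( numpy.max( d ) )
    if ( style == 'Euclidean' ) :                                # L-2 norm, root of a sum of squared differences.
        return float( numpy.sum( d ** 2 ) ** 0.5 )
    if ( style == 'Geometric' ) :                                # Ordered root of a product of absolute differences.
        return float( numpy.prod( d ) ** ( 1.0 / len( d ) ) )
    return float( numpy.sum( d ) )                               # 'Manhattan', L-1 norm, sum of absolute differences.

def minimum_distance( s, style = 'Euclidean' ) -> float :

    # Minimum distance between any two members of a state s, with shape ( members, dimensions ).

    return min( distance( s[ u ], s[ v ], style ) for u, v in itertools.combinations( range( len( s ) ), 2 ) )

Consistent with the retention condition above, a candidate observation is retained in a state only when the resulting minimum distance is not reduced.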

Example

from diamondback import DiversityModel
import numpy

# Create an instance.

obj = DiversityModel( style = 'Euclidean', order = 4 )

# Model an incident signal and extract a state.

x = numpy.random.rand( 2, 32 )
y = obj.model( x )
s = obj.s

License

BSD-3C. © 2018 - 2022 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, Analytics & AI, 2018-02-08.

class diamondback.models.DiversityModel.DiversityModel(style: str, order: int)

Bases: object

Diversity model.

Initialize.

Arguments :

style : str - in ( ‘Chebyshev’, ‘Euclidean’, ‘Geometric’, ‘Manhattan’ ).

order : int.

property s

Union[ List, numpy.ndarray ] - state.


clear() → None

Clears an instance.

model(x: Union[List, numpy.ndarray]) → numpy.ndarray

Models an incident signal and produces a reference signal.

Arguments :

x : Union[ List, numpy.ndarray ] - incident signal.

Returns :

y : numpy.ndarray - diversity.

diamondback.models.PrincipalComponentModel module

Description

A principal component model analyzes an incident signal to define transformation matrices which consume an incident signal to produce a reference signal, normalized and ordered to define orthogonal axes of descending variance.

A principal component model is an unsupervised learning model which analyzes an incident signal representing a training set to learn a mean vector, standard deviation vector, and a collection of eigenvectors associated with an incident signal.

\[\vec{\mu_{i}} = \matrix{\ \frac{\sum_{n=0}^{N}\vec{x_{i,n}}}{N}}\]
\[\vec{\sigma_{i}} = \matrix{\ \frac{\sum_{n=0}^{N}(\ \vec{x_{i,n}} - \vec{\mu_{i}}\ )^{2}}{N}}^{0.5}\]
\[\Lambda_{n} = eig\matrix{\ cov\matrix{\ \matrix{\frac{\ X_{n}^{T} - \vec{\mu}\ }{\vec{\sigma}}\ }\ }^{T}\ }^{T}\]

An incident signal which is not part of an initial training set is transformed, without modifying a principal component model, by translation, normalization, and rotation to produce a reference signal which is a candidate for dimension reduction. Higher order dimensions may be discarded, reducing the order of the reference signal while preserving significant and often sufficient information.

\[Y_{n} = \Lambda_{n} \ \matrix{\frac{\ X_{n}^{T} - \vec{\mu}\ }{\vec{\sigma}}\ }^{T}\]
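
As a minimal sketch of the expressions above, assuming an incident signal with shape ( dimensions, count ), the mean, deviation, eigenvalue, and eigenvector arrays and a reference signal might be formed with numpy as follows. This is illustrative, and is not the PrincipalComponentModel implementation.

import numpy

x = numpy.random.rand( 3, 32 )                        # Incident signal, ( dimensions, count ).
mu = numpy.mean( x, axis = 1 )                        # Mean vector.
sigma = numpy.std( x, axis = 1 )                      # Standard deviation vector.
z = ( ( x.T - mu ) / sigma ).T                        # Translate and normalize.
w, v = numpy.linalg.eigh( numpy.cov( z ) )            # Eigenvalues and eigenvectors of the covariance.
index = numpy.argsort( w )[ : : -1 ]                  # Order by descending variance.
eigenvalue, eigenvector = w[ index ], v[ :, index ]
y = eigenvector.T @ z                                 # Reference signal, rotated to orthogonal axes.
y = y[ : 2, : ]                                       # Dimension reduction, discarding higher order dimensions.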

Principal component analysis and dimension reduction have application in clustering, classification, pattern recognition, and visualization.

Example

from diamondback import PrincipalComponentModel
import numpy

# Create an instance.

obj = PrincipalComponentModel( )

# Model an incident signal and extract eigenvalue, eigenvector, mean, and deviation arrays.

x = numpy.random.rand( 3, 32 )
y = obj.model( x )
eigenvalue, eigenvector, mean, deviation = obj.eigenvalue, obj.eigenvector, obj.mean, obj.deviation

License

BSD-3C. © 2018 - 2022 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, Analytics & AI, 2019-01-25.

class diamondback.models.PrincipalComponentModel.PrincipalComponentModel

Bases: object

Principal component model.

Initialize.

property deviation

numpy.ndarray.


property eigenvalue

numpy.ndarray.


property eigenvector

numpy.ndarray.


property mean

numpy.ndarray.


clear() → None

Clears an instance.

model(x: Union[List, numpy.ndarray]) → numpy.ndarray

Models an incident signal and produces a reference signal.

Arguments :

x : Union[ List, numpy.ndarray ] - incident signal.

Returns :

y : numpy.ndarray - reference signal.

Module contents

Description

Initialize.

License

BSD-3C. © 2018 - 2022 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, Analytics & AI, 2018-03-22.