This talk is about artificial spin ice (ASI), a model based on a grid of coupled nanomagnets that can be controlled with an external magnetic field. The geometry of that grid can greatly change the kind of behavior one may observe in such systems.
The authors want to use the spin-ice model for reservoir computing. They measure useful quantities such as kernel quality \(K\) (the ability to separate inputs) and generalization capability \(G\) (the degree to which similar inputs yield similar results). Because both are needed for computation according to the authors, they also compute the computational capacity \(Q = K - G\).
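A common way to estimate these quantities (in the style of Legenstein and Maass) is to take \(K\) and \(G\) as the rank of the matrix of reservoir readout states collected over many input sequences. The sketch below assumes that convention; the helper name and array shapes are illustrative, not the authors' exact procedure.

```python
import numpy as np

def capacity_metrics(states_K, states_G):
    """Estimate kernel quality K, generalization G, and capacity Q = K - G.

    states_K : (n_sequences, n_readouts) reservoir states for fully random inputs
    states_G : (n_sequences, n_readouts) states for inputs sharing a common tail

    Both metrics are taken here as the rank of the corresponding state matrix;
    this is an assumed convention, not necessarily the paper's exact method.
    """
    K = np.linalg.matrix_rank(states_K)
    G = np.linalg.matrix_rank(states_G)
    return K, G, K - G
```

A high \(K\) with a low \(G\) (large \(Q\)) indicates a reservoir that separates distinct inputs while mapping similar inputs to similar states.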
Input is a binary stream and is encoded in the following way:
0 is coded as an angle \(\phi_0\) of the magnetic field,
1 is coded with \(\phi_1 = \phi_0 + 90°\).
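This encoding can be written as a one-line map from bits to field angles. The default \(\phi_0 = 0°\) below is an arbitrary placeholder, since the source does not fix its value.

```python
def encode_bit(bit, phi0=0.0):
    """Map a binary input to a field angle in degrees:
    0 -> phi0, 1 -> phi0 + 90.
    phi0 is a free parameter; 0.0 here is an illustrative default."""
    return phi0 + 90.0 * bit
```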
The bit streams are 100 bits long. For measuring \(K\) they use 220 fully random sequences; for \(G\), 220 sequences in which 40 bits are random and the rest are identical across all sequences.
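The two sets of input sequences could be generated as follows. The placement of the random bits at the start of each \(G\)-sequence is an assumption made for illustration; the source only states that 40 of the 100 bits are random.

```python
import numpy as np

def make_sequences(n_seq=220, n_bits=100, n_random=None, seed=0):
    """Generate bit streams for the K/G measurements.

    n_random=None : fully random streams (used for K).
    n_random=40   : only the first n_random bits vary; the remaining
                    bits are a fixed pattern shared by all sequences
                    (used for G). Leading random bits are an assumption.
    """
    rng = np.random.default_rng(seed)
    if n_random is None:
        return rng.integers(0, 2, size=(n_seq, n_bits))
    seqs = np.tile(rng.integers(0, 2, size=n_bits), (n_seq, 1))
    seqs[:, :n_random] = rng.integers(0, 2, size=(n_seq, n_random))
    return seqs
```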
They show good results in terms of computational capacity, as well as the possibility of coarse-graining by taking the mean value of groups of magnets to form larger cells. ASIs therefore appear to be reasonably good computers according to these reservoir-computing metrics, and they show evidence of slowly fading memory. The computational capacity also scales with the number of nodes and is accessible at both the large and small scales.
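The coarse-graining step can be sketched as block-averaging the per-magnet magnetization over cells. This is a minimal illustration of the idea under the assumption of a square array with dimensions divisible by the cell size, not the authors' exact procedure.

```python
import numpy as np

def coarse_grain(mags, cell=2):
    """Average magnetization over cell x cell blocks.

    mags : 2-D array of per-magnet magnetization values; both dimensions
           are assumed divisible by `cell`. Returns the block-mean array,
           i.e. the coarse-grained readout.
    """
    h, w = mags.shape
    return mags.reshape(h // cell, cell, w // cell, cell).mean(axis=(1, 3))
```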