Shift-Equivariant Similarity-Preserving Hypervector Representations of Sequences
Cognitive Computation (IF 5.4), Pub Date: 2024-03-12, DOI: 10.1007/s12559-024-10258-4
Dmitri A. Rachkovskij

Abstract

Hyperdimensional Computing (HDC), also known as Vector-Symbolic Architectures (VSA), is a promising framework for the development of cognitive architectures and artificial intelligence systems, as well as for technical applications and emerging neuromorphic and nanoscale hardware. HDC/VSA operates with hypervectors, i.e., neural-like distributed vector representations of large fixed dimension (usually > 1000). One of the key ingredients of HDC/VSA is the set of methods for encoding various data types (from numeric scalars and vectors to graphs) with hypervectors. In this paper, we propose an approach to forming hypervectors of sequences that provides both equivariance with respect to sequence shifts and preservation of the similarity of sequences with identical elements at nearby positions. Our methods represent the sequence elements by compositional hypervectors and exploit permutations of hypervectors to represent the order of sequence elements. We experimentally explored the proposed representations on a diverse set of tasks with data in the form of symbolic strings. Although we did not use any features here (the hypervector of a sequence was formed only from the hypervectors of its symbols at their positions), the proposed approach performed on a par with methods that exploit various features, such as subsequences. The proposed techniques were designed for the HDC/VSA model known as Sparse Binary Distributed Representations. However, they can be adapted to hypervectors in the formats of other HDC/VSA models, as well as to representing sequences of types other than symbolic strings. Directions for further research are discussed.
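To make the permutation-based encoding the abstract refers to more concrete, below is a minimal, hypothetical Python sketch of the generic scheme that this line of work builds on. It is not the paper's method: it uses dense bipolar hypervectors and cyclic shifts (np.roll) as the permutation, rather than Sparse Binary Distributed Representations, and the names (item, encode, cos) are illustrative. The sketch shows why plain permutation encoding is shift-equivariant (shifting the sequence cyclically shifts its hypervector) yet does not by itself preserve similarity across shifted positions, which is the gap the proposed representations address.

import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimension; the abstract notes dimensions are usually > 1000

# Random bipolar item hypervectors for each symbol (illustrative dense format,
# not the paper's Sparse Binary Distributed Representations).
alphabet = "abc"
item = {ch: rng.choice([-1, 1], size=D) for ch in alphabet}

def encode(seq, offset=0):
    # Encode position i by applying i cyclic shifts to the symbol's hypervector,
    # then bundle (sum) the shifted hypervectors into one sequence hypervector.
    return sum(np.roll(item[ch], offset + i) for i, ch in enumerate(seq))

def cos(u, v):
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

h0 = encode("abc")            # "abc" starting at position 0
h1 = encode("abc", offset=1)  # the same string shifted one position to the right

print(cos(np.roll(h0, 1), h1))  # ~1.0: shifting the sequence permutes its hypervector (equivariance)
print(cos(h0, h1))              # ~0.0: similarity is lost without re-alignment

In this toy version, cos(np.roll(h0, 1), h1) equals 1 exactly, demonstrating equivariance, while cos(h0, h1) is near 0, illustrating the similarity loss under shift that motivates the paper's shift-equivariant, similarity-preserving construction.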



Updated: 2024-03-13