Latent semantic mapping


Latent semantic mapping (LSM) is a data-driven framework for modeling globally meaningful relationships implicit in large volumes of (often textual) data. It is a generalization of latent semantic analysis (LSA). In information retrieval, LSA enables retrieval on the basis of conceptual content, rather than merely matching words between queries and documents.

LSM was derived from earlier work on latent semantic analysis, which has three main characteristics: discrete entities, usually words and documents, are mapped onto continuous vectors; the mapping involves a form of global correlation pattern; and dimensionality reduction is an important part of the analysis process. These are generic properties that have been identified as potentially useful in a variety of different contexts, and this usefulness has encouraged great interest in LSM.
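A minimal sketch of these three characteristics, using a toy term-document count matrix, Python/NumPy, and an illustrative choice of k = 2 dimensions (none of which come from the cited sources), might look like the following:

```python
import numpy as np

# Toy term-document count matrix: rows = words, columns = documents.
terms = ["car", "engine", "wheel", "apple", "fruit"]
counts = np.array([
    [2, 1, 0],   # car
    [1, 1, 0],   # engine
    [1, 0, 0],   # wheel
    [0, 0, 2],   # apple
    [0, 1, 1],   # fruit
], dtype=float)

# The SVD captures a global correlation pattern across the whole
# matrix rather than local word-by-word matches.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)

# Dimensionality reduction: keep only the top-k singular dimensions.
k = 2
term_vectors = U[:, :k] * s[:k]    # words mapped onto continuous vectors
doc_vectors = Vt[:k, :].T * s[:k]  # documents mapped onto continuous vectors

# Retrieval by conceptual content: compare vectors in the reduced
# space by cosine similarity instead of exact word overlap.
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

query = term_vectors[terms.index("car")]
print([round(cosine(query, d), 2) for d in doc_vectors])
```

In this sketch, documents sharing related vocabulary end up close together in the reduced space even when they share no exact terms with the query, which is the behavior the conceptual-retrieval claim above refers to.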

Mac OS X v10.5 and later includes a framework implementing latent semantic mapping.[1]

See also


Notes


References

  • Bellegarda, J.R. (2005). "Latent semantic mapping [information retrieval]". IEEE Signal Processing Magazine. 22 (5): 70–80. Bibcode:2005ISPM...22...70B. doi:10.1109/MSP.2005.1511825. S2CID 17327041.
  • J. Bellegarda (2006). "Latent semantic mapping: Principles and applications". ICASSP 2006. Archived from the original on 2013-08-24. Retrieved 2013-08-24.