Visual mapping of text collections through a fast high precision projection technique

Author(s):
Paulovich, Fernando Vieira ; Nonato, Luis Gustavo ; Minghim, Rosane ; Levkowitz, Haim ; Banissi, E ; Burkhard, RA ; Ursyn, A ; Zhang, JJ ; Bannatyne, MWM ; Maple, C ; Cowell, AJ ; Tian, GY ; Hou, M
Total Authors: 13
Document type: Journal article
Source: INFORMATION VISUALIZATION-BOOK; no volume, 3 pp., 2006-01-01.
Abstract

This paper introduces Least Square Projection (LSP), a fast technique for projecting multi-dimensional data onto lower dimensions, developed and successfully tested in the context of creating text maps based on document content. Current solutions are either based on computationally expensive dimension reduction with no proper guarantee on the outcome, or on faster techniques that require some form of post-processing to recover information lost during the process. LSP is based on least squares approximation, a technique originally employed for surface modeling and reconstruction. Least squares approximations can compute the coordinates of a set of projected points from a reduced number of control points with known geometry. We extend this concept to general data sets. To perform the projection, only a small number of distance calculations is necessary, and no repositioning of the final points is required to obtain satisfactory precision in the final solution. Textual information is a typically difficult data type to handle because of its high intrinsic dimensionality. We employ document corpora as a benchmark to demonstrate the capability of LSP to group and separate documents by their content with high precision. (AU)
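The idea sketched in the abstract can be illustrated numerically. The code below is not the authors' implementation: the function name lsp_project, the use of classical MDS to place the control points, the uniform 1/k neighbourhood weights, and the dense solver are all assumptions made for illustration, assuming only NumPy. It shows the general scheme of anchoring a few control points in 2-D and solving a least-squares system so that every remaining point lands near the average of its high-dimensional neighbours.

```python
import numpy as np

def lsp_project(X, n_controls=10, k=8, seed=0):
    """Sketch of an LSP-style projection (hypothetical helper): pin a few
    control points in 2-D, then place every other point as the mean of its
    high-dimensional neighbours by solving a least-squares system."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    controls = rng.choice(n, size=n_controls, replace=False)

    # Place the control points with classical MDS on their pairwise squared
    # distances (a stand-in for whatever technique projects the control set).
    D = np.linalg.norm(X[controls][:, None] - X[controls][None, :], axis=-1) ** 2
    J = np.eye(n_controls) - np.ones((n_controls, n_controls)) / n_controls
    B = -0.5 * J @ D @ J
    vals, vecs = np.linalg.eigh(B)
    ctrl_2d = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))

    # k-nearest-neighbour graph in the original space (dense, for brevity).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nbrs = np.argsort(dist, axis=1)[:, :k]

    # Laplacian-style rows: each point should equal the mean of its neighbours.
    A = np.zeros((n + n_controls, n))
    b = np.zeros((n + n_controls, 2))
    for i in range(n):
        A[i, i] = 1.0
        A[i, nbrs[i]] -= 1.0 / k
    # Extra rows pin every control point to its 2-D coordinates.
    for row, c in enumerate(controls):
        A[n + row, c] = 1.0
        b[n + row] = ctrl_2d[row]

    # One least-squares solve yields both output coordinates at once.
    Y, *_ = np.linalg.lstsq(A, b, rcond=None)
    return Y

# Tiny usage example on random high-dimensional data.
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(200, 50))
    print(lsp_project(X).shape)  # (200, 2)
```

Applied to term-vector representations of a document collection, such a solve would tend to place content-similar documents close together, which is the text-map scenario the abstract evaluates; a production version would presumably use sparse matrices and approximate nearest-neighbour search to keep the number of distance computations small.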

FAPESP's process: 04/01756-2 - Haim Levkowitz | University of Massachusetts - United States
Grantee: Maria Cristina Ferreira de Oliveira
Support Opportunities: Research Grants - Visiting Researcher Grant - International
FAPESP's process: 04/09888-5 - InfoVis2: a repository of visual mining and information visualization and sonification techniques
Grantee: Rosane Minghim
Support Opportunities: Regular Research Grants