Information visualization

Information visualization is the interdisciplinary study of the visual representation of large-scale collections of non-numerical information, such as files and lines of code in software systems, and of the use of graphical techniques to help people understand and analyze data. In contrast with scientific visualization, information visualization focuses on abstract data sets, such as unstructured text or points in high-dimensional space, that do not have an inherent 2D or 3D geometrical structure.

Overview

The term "information visualization" could be taken to subsume all developments in data visualization, information graphics, knowledge visualization, scientific visualization and visual design. At this level, almost anything, if sufficiently organized, is information of a sort: tables, graphs, maps and even text, whether static or dynamic, provide some means to see what lies within, determine the answer to a question, find relations, and perhaps apprehend things which could not be seen so readily in other forms. But today the term "information visualization" in scientific research is generally applied to the visual representation of large-scale collections of non-numerical information.

Information visualization focuses on the creation of approaches for conveying abstract information in intuitive ways. Visual representations and interaction techniques take advantage of the human eye's broad-bandwidth pathway into the mind, allowing users to see, explore, and understand large amounts of information at once.

Some examples

Visualizing such data structures requires new user-interface and visualization techniques, and the area is now evolving into a discipline of its own. This area of information visualization differs from classical scientific visualization, although the two fields are related: in information visualization the data to be visualized are not the results of mathematical models or large-sized data sets, but abstract data with their own inherent structure. Examples of such data are:

  • internal data structures of various programs, like compilers, or trace information for massively parallel programs;
  • WWW site contents;
  • operating system file spaces;
  • data returned from various database query engines, e.g., for digital libraries.
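The third example above, an operating system's file space, can be reduced to the node/edge structure that a graph or tree visualization consumes. A minimal sketch in Python (the helper name `filesystem_graph` is hypothetical, chosen for illustration):

```python
import os
import tempfile

def filesystem_graph(root):
    """Walk a directory tree and return (nodes, edges) lists of the
    kind a graph- or tree-visualization layout would take as input."""
    nodes, edges = [], []
    for dirpath, dirnames, filenames in os.walk(root):
        nodes.append(dirpath)
        for name in dirnames + filenames:
            child = os.path.join(dirpath, name)
            if name in filenames:
                nodes.append(child)      # directories are added when walked
            edges.append((dirpath, child))
    return nodes, edges

# Build a small throwaway tree and extract its graph structure.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "src"))
open(os.path.join(root, "src", "main.c"), "w").close()
nodes, edges = filesystem_graph(root)
```

The resulting edge list is exactly the parent/child relation that tree-layout algorithms (e.g., treemaps or node-link diagrams) operate on.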

Another characteristic of the field is that its tools are deliberately aimed at widely available environments, such as general-purpose workstations, the Web, and PCs, rather than at high-end, expensive, specialized computing equipment.

Link with visual analytics

Information visualization has some overlapping goals and techniques with visual analytics. There is currently no clear consensus on the boundaries between these fields, but broadly speaking the three areas can be distinguished as follows: scientific visualization deals with data that has a natural geometric structure (e.g., MRI data, wind flows); information visualization handles abstract data structures such as trees or graphs; and visual analytics is especially concerned with sensemaking and reasoning.

Human cognitive capabilities

Visual analytics seeks to marry techniques from information visualization with techniques from computational transformation and analysis of data. Information visualization itself forms part of the direct interface between user and machine. Information visualization amplifies human cognitive capabilities in six basic ways:

  1. by increasing cognitive resources, such as by using a visual resource to expand human working memory,
  2. by reducing search, such as by representing a large amount of data in a small space,
  3. by enhancing the recognition of patterns, such as when information is organized in space by its time relationships,
  4. by supporting the easy perceptual inference of relationships that are otherwise more difficult to induce,
  5. by perceptual monitoring of a large number of potential events, and
  6. by providing a manipulable medium that, unlike static diagrams, enables the exploration of a space of parameter values.

These capabilities of information visualization, combined with computational data analysis, can be applied to analytic reasoning to support the sense-making process.

History

Since the introduction of data graphics in the late 1700s, visual representations of abstract information have been used to demystify data and reveal otherwise hidden patterns. The advent of graphical interfaces in the 1990s enabled direct interaction with visualized information, giving rise to over a decade of information visualization research. Information visualization seeks to augment human cognition by leveraging human visual capabilities to make sense of abstract information, providing a means by which humans, whose perceptual abilities remain constant, can grapple with ever-growing volumes of data. The term "information visualization" itself was coined by Stuart K. Card, Jock D. Mackinlay and George G. Robertson in 1989. The field of information visualization that has emerged since the 1990s derives, according to Stuart K. Card in 1999, from several communities:

  • Work in information graphics dates back to William Playfair at the end of the 18th century, who was among the earliest to use abstract visual properties such as line and area to represent data visually; classical methods of plotting have been developed ever since. In 1967 Jacques Bertin published the first theory of graphics, which identified the basic elements of diagrams and described a framework for their design. In 1983 Edward Tufte published a theory of data graphics that emphasized maximizing the density of useful information. Both Bertin's and Tufte's theories became well known and influential in the various communities that led to the development of information visualization as a discipline.
  • Within statistics, John Tukey began a movement in 1977 with his work on "Exploratory Data Analysis", which influenced the data-graphics community. The emphasis of this work was not on the quality of graphics but on the use of pictures to give rapid statistical insight into data. For example, the box-and-whisker plot allowed an analyst to see at a glance the five numbers that characterize a distribution. In the 1988 book "Dynamic Graphics for Statistics", William S. Cleveland explicated new visualizations of data in this area. A particular problem here is how to visualize data sets with many variables; see, for example, Inselberg's parallel coordinates method from 1990.
  • In 1986 the National Science Foundation launched an important new initiative on scientific visualization with the work of Bruce H. McCormick. The first IEEE Visualization Conference was held in 1990, bringing together a community ranging from earth-resource scientists and physicists to computer scientists in supercomputing.
  • In the artificial intelligence community there was an interest in the automatic design of visual presentations of data. The effort was catalyzed by Jock D. Mackinlay's thesis, which formalized Bertin's design theory, added psychophysical data, and used these to generate presentations automatically.
  • Finally the user interface community saw advances in graphics hardware opening the possibility of a new generation of user interfaces.
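Tukey's box-and-whisker plot, mentioned above, draws a distribution's five-number summary: minimum, lower quartile, median, upper quartile, and maximum. A minimal sketch (quartile conventions vary between implementations; this one uses Tukey's hinges):

```python
def five_number_summary(values):
    """Min, lower quartile, median, upper quartile, max -- the five
    numbers a box-and-whisker plot draws. Uses Tukey's hinge convention:
    when n is odd, the median is included in both halves."""
    xs = sorted(values)

    def median(seq):
        n = len(seq)
        mid = n // 2
        return seq[mid] if n % 2 else (seq[mid - 1] + seq[mid]) / 2

    n = len(xs)
    half = (n + 1) // 2                      # lower half, median included if n odd
    lower, upper = xs[:half], xs[n - half:]
    return xs[0], median(lower), median(xs), median(upper), xs[-1]

summary = five_number_summary([1, 3, 5, 7, 9, 11, 13])
```

For the seven values above this yields (1, 4.0, 7, 10.0, 13): the box spans 4.0 to 10.0 with a line at the median 7, and the whiskers reach 1 and 13.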

In 2003 Ben Shneiderman stated that the field had emerged from research in slightly different directions, mentioning graphics, visual design, computer science and human-computer interaction, as well as, more recently, psychology and business methods.

Information visualization topics

Visualization provides deep insight into the structure of data. There are graphical tools such as coplots, multiway dot plots, and the equal-count algorithm, and there are fitting tools such as loess and bisquare that fit equations, nonparametric curves, and nonparametric surfaces to data.
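The loess fitting tool mentioned above works by fitting, at each point, a weighted least-squares line to that point's nearest neighbours, with weights falling off by the tricube function. A minimal pure-Python sketch of this idea (real implementations add robustness iterations and degree-2 local fits):

```python
def loess(xs, ys, span=0.5):
    """Locally weighted linear regression, a minimal loess-style sketch:
    for each x, fit a tricube-weighted least-squares line to its nearest
    neighbours and evaluate that line at x."""
    n = len(xs)
    k = max(2, int(span * n))                      # neighbourhood size
    fitted = []
    for x0 in xs:
        # indices of the k nearest neighbours of x0, and tricube weights
        idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
        dmax = max(abs(xs[i] - x0) for i in idx) or 1.0
        w = {i: (1 - (abs(xs[i] - x0) / dmax) ** 3) ** 3 for i in idx}
        # weighted least-squares line through the neighbourhood
        sw = sum(w.values())
        mx = sum(w[i] * xs[i] for i in idx) / sw
        my = sum(w[i] * ys[i] for i in idx) / sw
        sxx = sum(w[i] * (xs[i] - mx) ** 2 for i in idx)
        sxy = sum(w[i] * (xs[i] - mx) * (ys[i] - my) for i in idx)
        b = sxy / sxx if sxx else 0.0
        fitted.append(my + b * (x0 - mx))
    return fitted

# Exactly linear data should be reproduced exactly by the local lines.
xs = list(range(6))
ys = [2 * x + 1 for x in xs]
smoothed = loess(xs, ys, span=0.5)
```

Because each local fit is a straight line, perfectly linear data pass through unchanged, while noisy data are smoothed toward their local trend.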

Specific methods and techniques

Software and toolkits

OpenLink AJAX Toolkit
OpenLink AJAX Toolkit is a JavaScript-based toolkit for browser-independent Rich Internet Application development. It includes a rich collection of UI widgets/controls, an event-management system, and a platform-independent data-access layer called AJAX Database Connectivity. OpenLink AJAX Toolkit is fully OpenAjax Alliance conformant.

Prefuse
Prefuse is a Java-based toolkit for building interactive information visualization applications. It supports a rich set of features for data modeling, visualization, and interaction, providing optimized data structures for tables, graphs, and trees, a host of layout and visual-encoding techniques, and support for animation, dynamic queries, integrated search, and database connectivity.

XEE
XEE (Starlight) is a visual language for data processing and ETL tasks. It was designed for the Starlight Information Visualization System as a method for producing and processing XML data.

See further: List of information graphics software

Information visualization applications

Information visualization is increasingly applied as a critical component in scientific research, digital libraries, data mining, financial data analysis, market studies, manufacturing production control, and drug discovery.

Information visualization experts

Stuart K. Card
Stuart K. Card is an American researcher. He is a Senior Research Fellow at Xerox PARC and one of the pioneers of applying human factors in human–computer interaction. The 1983 book The Psychology of Human-Computer Interaction, which he co-wrote with Thomas P. Moran and Allen Newell, became very influential in the field, partly for introducing the Goals, Operators, Methods, and Selection rules (GOMS) framework. His current research is in developing a supporting science of human–information interaction and visual-semantic prototypes to aid sensemaking.

George W. Furnas
George Furnas is a professor and Associate Dean for Academic Strategy at the School of Information of the University of Michigan. Furnas has also worked at Bell Labs, where he earned the moniker "Fisheye Furnas" while working with fisheye visualizations. He is a pioneer of latent semantic analysis, and is also considered a pioneer of the concept of the Mosaic of Responsive Adaptive Systems (MoRAS).

James D. Hollan
James D. Hollan directs the Distributed Cognition and Human-Computer Interaction Laboratory at the University of California, San Diego. His research explores the cognitive consequences of computationally based media; the goal is to understand the cognitive and computational characteristics of dynamic interactive representations as the basis for effective system design. His current work focuses on cognitive ethnography, computer-mediated communication, distributed cognition, human-computer interaction, information visualization, multiscale software, and tools for analysis of video data.

