In computer science and human-computer interaction, the user interface (of a computer program) refers to the graphical, textual and auditory information the program presents to the user, and the control sequences (such as keystrokes with the computer keyboard, movements of the computer mouse, and selections with the touchscreen) the user employs to control the program.
Types
As of 2009, the following types of user interface are the most common:
* Graphical user interfaces (GUI) accept input via devices such as the computer keyboard and mouse and provide articulated graphical output on the computer monitor. There are at least two different principles widely used in GUI design: object-oriented user interfaces (OOUIs) and application-oriented interfaces.
* Web-based user interfaces or web user interfaces (WUI) accept input and provide output by generating web pages, which are transmitted via the Internet and viewed by the user with a web browser program. Newer implementations use Java, AJAX, Adobe Flex, Microsoft .NET, or similar technologies to provide real-time control in a separate program, eliminating the need to refresh a traditional HTML-based web page. Administrative web interfaces for web servers and networked computers are often called control panels.
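The web-based style above can be sketched with Python's standard http.server module. This is a minimal, hypothetical example: the handler name, page content, and port are made up for illustration, not part of any real application.

```python
# Sketch of a web user interface: the program generates an HTML page
# that the user views in a web browser.
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_page():
    """Generate the HTML the browser will display (hypothetical content)."""
    return "<html><body><h1>Hello</h1></body></html>"

class HelloUI(BaseHTTPRequestHandler):
    """Serve the generated page in response to every GET request."""
    def do_GET(self):
        body = render_page().encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    print(render_page())
    # To actually serve the page (port 8000 chosen arbitrarily):
    # HTTPServer(("localhost", 8000), HelloUI).serve_forever()
```

The key point of the pattern is the division of labor: the program produces markup, and all rendering and input handling on the user's side is delegated to the browser.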
User interfaces that are common in various fields outside desktop computing:
* Command line interfaces, where the user provides the input by typing a command string with the computer keyboard and the system provides output by printing text on the computer monitor. Used by programmers and system administrators, in engineering and scientific environments, and by technically advanced personal computer users.
* Tactile interfaces supplement or replace other forms of output with haptic feedback methods. Used in computerized simulators etc.
* Touch user interfaces are graphical user interfaces that use a touchscreen display as a combined input and output device. Used in many types of point-of-sale terminals, industrial processes and machines, self-service machines, etc.
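The command-line style described above can be illustrated with a toy read-eval-print loop. The command names (`echo`, `upper`) are hypothetical, chosen only for this sketch:

```python
# Minimal sketch of a command-line interface: the user provides input
# by typing a command string, and the system responds by printing text.
def handle(line):
    """Dispatch one typed command string to a handler (hypothetical commands)."""
    cmd, _, arg = line.partition(" ")
    if cmd == "echo":
        return arg             # print the argument back unchanged
    if cmd == "upper":
        return arg.upper()     # transform the argument
    return f"unknown command: {cmd}"

def repl(lines):
    """Process a sequence of command strings and collect the output lines."""
    return [handle(line) for line in lines]

if __name__ == "__main__":
    for out in repl(["echo hello", "upper hello"]):
        print(out)
```

In a real shell the loop would read from standard input (for example with `input()`), but the parse-dispatch-print cycle is the essence of the interface style.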
Other types of user interfaces:
* Attentive user interfaces manage the user's attention by deciding when to interrupt the user, what kind of warnings to issue, and the level of detail of the messages presented.
* Batch interfaces are non-interactive user interfaces in which the user specifies all the details of the batch job in advance of processing and receives the output when all the processing is done. The computer does not prompt for further input after processing has started.
* Conversational Interface Agents attempt to personify the computer interface in the form of an animated person, robot, or other character (such as Microsoft's Clippy the paperclip), and present interactions in a conversational form.
* Crossing-based interfaces are graphical user interfaces in which the primary task consists in crossing boundaries instead of pointing.
* Gesture interfaces are graphical user interfaces that accept input in the form of hand gestures, or mouse gestures sketched with a computer mouse or a stylus.
* Intelligent user interfaces are human-machine interfaces that aim to improve the efficiency, effectiveness, and naturalness of human-machine interaction by representing, reasoning, and acting on models of the user, domain, task, discourse, and media (e.g., graphics, natural language, gesture).
* Motion tracking interfaces monitor the user's body motions and translate them into commands; such an interface is currently being developed by Apple.[1]
* Multi-screen interfaces employ multiple displays to provide more flexible interaction. This approach is often used in computer game interaction, both in commercial arcades and, more recently, in the handheld market.
* Noncommand user interfaces, which observe the user to infer their needs and intentions without requiring them to formulate explicit commands.
* Object-oriented user interfaces (OOUIs) are based on object-oriented metaphors, allowing users to manipulate simulated objects and their properties.
* Reflexive user interfaces where the users control and redefine the entire system via the user interface alone, for instance to change its command verbs. Typically this is only possible with very rich graphic user interfaces.
* Tangible user interfaces, which place a greater emphasis on touch and on the physical environment or its elements.
* Text user interfaces are user interfaces that output text but accept other forms of input in addition to, or in place of, typed command strings.
* Voice user interfaces, which accept input and provide output by generating voice prompts. The user provides input by pressing keys or buttons, or by responding verbally to the interface.
* Natural-language interfaces are used for search engines and on web pages; the user types in a question and waits for a response.
* Zero-input interfaces get input from a set of sensors instead of querying the user with input dialogs.
* Zooming user interfaces are graphical user interfaces in which information objects are represented at different levels of scale and detail, and where the user can change the scale of the viewed area in order to show more detail.
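The batch style in the list above contrasts sharply with the interactive styles: all inputs are supplied before processing begins, and the output arrives only when everything is done. A small sketch (the job format, a list of number lists to be summed, is a made-up stand-in for real work):

```python
# Sketch of a batch interface: every detail of the job is specified in
# advance, and the program never prompts for further input.
def run_batch(jobs):
    """Process all jobs without interaction and return the full results."""
    results = []
    for job in jobs:                  # no prompting once processing starts
        results.append(sum(job))      # stand-in for the real processing step
    return results                    # output delivered only at the end

if __name__ == "__main__":
    # The "job deck": all inputs given up front.
    print(run_batch([[1, 2, 3], [10, 20]]))
```

Note the absence of any `input()` call: by the time `run_batch` starts, the user's involvement is over, which is precisely what makes the interface non-interactive.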