Tuesday, 20 April 2021

Interactive Systems and User Interface

In interactive systems, the user and the system exchange information regularly and dynamically. Norman’s execution/evaluation model is a useful way of understanding the nature of interaction: 

* The user has a goal (something to achieve).

* The user looks at the system and works out how to perform a series of tasks to accomplish the goal. 

* The user carries out a series of actions (providing input to the system by touching a screen, pressing buttons, speaking words, etc.). 

* The user looks at the results of these actions and attempts to assess whether or not the goal has been achieved. 
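The cycle above can be sketched as a simple loop. This is a minimal illustration in Python; the `Counter` "system" and the `interact` function are hypothetical, chosen only to make the execution and evaluation phases explicit.

```python
# Minimal sketch of Norman's execution/evaluation cycle.
# Counter and interact are illustrative names, not from any real framework.

class Counter:
    """A toy 'system': the user can press a button to increment a value."""
    def __init__(self):
        self.value = 0          # the visible system state

    def press_increment(self):  # the action the user can execute
        self.value += 1

def interact(system, goal_value):
    """One user/system interaction following the execution/evaluation cycle."""
    steps = 0
    while True:
        # Evaluation: perceive the system state and compare it to the goal.
        if system.value >= goal_value:
            return steps        # goal achieved
        # Execution: choose and carry out an action that moves toward the goal.
        system.press_increment()
        steps += 1

counter = Counter()
steps_taken = interact(counter, goal_value=3)
```

The loop makes the two halves of the model visible: the `press_increment` call is the execution side, and the comparison against `goal_value` is the evaluation side.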

A good interactive system is one where: 

  • Users can easily work out how to control the system to accomplish their goals. 
  • Users can easily assess the results of their actions on the system.

What interactive systems do we use in our day-to-day lives? The term interactive system can be applied to a far broader collection of devices than we might first think, such as: 

  • The World Wide Web
  • Mobile phones
  • Cash dispensing machines
  • Windows operating systems
  • Car navigation systems
  • Data entry systems
  • Video recorders 
  • Machine-driven call centres (e.g. for telephone banking) 
  • Workflow systems to coordinate a team’s work efforts.


The original interactive systems were command-line systems, which strictly controlled the interaction between the human and the computer. The user was required to know the commands that could be issued and how their arguments were to be arranged. The UNIX operating system and DOS (Disk Operating System) are classic examples of this category. Users were required to enter data in a particular sequence. The options for the output of data were also strictly controlled and usually limited. Such systems normally placed a high demand on the user to memorize commands and the syntax for issuing them. 

Command-line systems progressively gave way to a second generation of menu-based, form-based, and dialog-based systems that eased some of the load on memory. An automated teller machine (ATM) is a good example of a form-based system in which users are given a tightly controlled set of possible actions. Data entry systems, commonly form- or dialog-oriented, offer the user a limited set of choices but greatly relieve the memory demands of the earlier command-line systems.

Next, the third generation of interactive computing was introduced by Xerox Corporation in 1980. The Xerox Star was the outcome of half a dozen years of research and development through which the desktop metaphor, mouse, windows, icons, and bit-mapped displays were all brought together and made to work. The Xerox Star was emulated in the Lisa and Macintosh, first presented by Apple Computer Inc. in the mid-1980s. The windows, icons, menus, and pointer (WIMP) approach was popularized worldwide by Microsoft in the Windows family of operating systems introduced in the 1990s. With the maturation of WIMP interfaces, also known as graphical user interfaces (GUIs), interaction moved from command-based to direct manipulation. 

In command-based systems, the user specifies an action and then an object on which that action is to be performed. In a direct manipulation system, an object is selected first, and then the user specifies the action to be performed on that object. The most recent developments in interactive systems have focused on visualization, virtualization, and agents. During the 1980s and 1990s, there were many efforts to take advantage of the human capability to process information visually. At the simplest level, consider that a human looking at a picture on a television screen has no problem discerning a pattern that consists of millions of individual pixels per second, changing in both time and space. Visualization systems manipulate information at high levels of aggregation, making the information more accessible to users.
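The difference between the two interaction orders can be sketched in code. This is a hypothetical illustration; the `Document`, `run_command`, and `Selection` names are invented for the example.

```python
# Contrasting command-based (action -> object) with
# direct manipulation (object -> action). All names are illustrative.

class Document:
    def __init__(self, name):
        self.name = name
        self.deleted = False

    def delete(self):
        self.deleted = True

# Command-based: the user names the action first, then the object,
# as in typing "delete report" at a command line.
def run_command(action, target):
    getattr(target, action)()

# Direct manipulation: the user selects an object first (e.g. clicks
# an icon), then applies an action to the current selection.
class Selection:
    def __init__(self):
        self.selected = None

    def select(self, obj):
        self.selected = obj

    def apply(self, action):
        getattr(self.selected, action)()

doc1 = Document("report")
run_command("delete", doc1)     # action first, then object

doc2 = Document("notes")
ui = Selection()
ui.select(doc2)                 # object first...
ui.apply("delete")              # ...then the action
```

Both paths end in the same state change; only the order in which the user supplies the action and the object differs.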

In the 1990s, researchers began to experiment with extending interactive systems from symbolic interaction - mice, icons, and pointers - to virtual systems. In these systems, every effort was made to allow the user to explore a virtual world with little or no translation to symbolic form. Thus, using visualization techniques and novel input devices such as data gloves, hand movements could be used to manipulate virtual objects represented graphically in a virtual world. This virtual environment was presented to the user via two display screens, each of which provided a slightly different perspective, giving the user a stereoscopic view of a virtual space that appeared to have depth. Work on virtual and artificial reality continues on a number of fronts, including a field known as telemedicine. 

The next generation of interactive systems, represented by agents and embedded systems, will yet again change how humans and computers interact. Direct manipulation environments will still be around for many years to come. At the same time, we have begun to see both agents and embedded systems make their appearance. Embedded systems can be as simple as the analog sensor systems that open a department store door or turn on the lights when someone enters a room. At a more complex level, most cars being built today include air bag deployment systems and antilock brakes that operate invisibly, gathering data from the environment and inserting computer control between our actions and the environment. As air bag deployment systems become more sophisticated, they react based not simply on acceleration data but also on the weight of the individuals occupying the seats and their relative position (leaning forward or back) in the seat.

The basic programming paradigm had to change from a process-driven approach to an event-driven one. In earlier systems, the program’s main process controlled what the user could do. Now, it was possible for the user to initiate a broad range of actions by selecting an object - a window, an icon, a text box. This required some method for collecting events and handling them. The X Window System on UNIX was one of the early well-known systems for doing this. Each graphical component of the interface is capable of producing one or more events. For example, a window might be opened or closed, generating an event. Similarly, a button might be pressed, or the text in a text box might be changed. The programmer’s task is to present a coordinated set of components that can generate events. The programmer must also write code that initiates some action when an event occurs. These code fragments are called event-handling functions. In object-based and object-oriented programming (OOP) environments, handling events is made easier through object classes that associate default event-handling methods with specific classes of objects. For example, the code for changing a button’s appearance when it is pressed may be provided as a default method of the button class. 
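The event-dispatch pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not the API of any real GUI toolkit: a component keeps a list of registered handlers per event, the class supplies a default handler, and the programmer attaches additional event-handling functions.

```python
# Minimal event-driven sketch: a component generates events, and the
# programmer registers handler functions for them. The Button class
# and its methods are illustrative, not from any real toolkit.

class Button:
    def __init__(self, label):
        self.label = label
        self.look = "normal"
        # The class associates a default event-handling method with
        # the "pressed" event, as OOP toolkits typically do.
        self._handlers = {"pressed": [self._default_pressed]}

    def on(self, event, handler):
        """Register an additional event-handling function."""
        self._handlers.setdefault(event, []).append(handler)

    def fire(self, event):
        """Dispatch an event to every registered handler, in order."""
        for handler in self._handlers.get(event, []):
            handler(self)

    def _default_pressed(self, source):
        # Default behaviour: change the button's appearance when pressed.
        self.look = "depressed"

log = []
ok = Button("OK")
# The programmer's own event-handling function for this button.
ok.on("pressed", lambda source: log.append(f"{source.label} clicked"))

ok.fire("pressed")   # the user presses the button; both handlers run
```

Firing the event runs the class's default handler (updating the button's look) and then the programmer's handler, mirroring how toolkits layer custom event handling on top of built-in behaviour.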

The user interface (UI), in the industrial design field of human-computer interaction, is the space where interactions between humans and systems occur. The aim of this interaction is to allow effective operation and control of the system from the human end, while the system simultaneously feeds back information that aids the operator’s decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, heavy machinery operator controls, hand tools, and process controls. The design considerations applicable when creating user interfaces are related to, or involve, such disciplines as ergonomics and psychology. Usually, the goal of user interface design is to produce a user interface that makes it easy (self-explanatory), enjoyable (user-friendly), and efficient to operate a system in the way that produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output, and that the system minimizes undesired outputs to the human. 

With the increased use of personal computers and the relative decline in a common awareness of heavy machinery, the term user interface is generally assumed to mean the graphical user interface, while industrial control panel and machinery control design discussions more commonly refer to human-machine interfaces. Other terms for user interface are man-machine interface (MMI) and, when the machine in question is a computer, human-computer interface. All effective interfaces share eight qualities or characteristics: 

  1. Clarity: The interface avoids ambiguity by making everything clear through language, hierarchy, flow, and metaphors for visual elements. 
  2. Concision: It’s easy to make an interface clear by over-clarifying and labeling everything, but this leads to interface bloat, where there is simply too much on the screen at the same moment. If too many things are on the screen, finding what you’re looking for is difficult, and the interface becomes tiresome to use. The real challenge in making a good interface is to make it concise and clear at the same time. 
  3. Consistency: Keeping your interface consistent across your application is vital because it allows users to recognize usage patterns. 
  4. Efficiency: Time is money and a good interface should make the user more productive through shortcuts and good design. 
  5. Familiarity: Even if someone uses an interface for the first time, certain elements can still be familiar. Real-life metaphors can be used to communicate meaning.
  6. Responsiveness: A good interface should not feel slow. This means that the interface should provide good feedback to the user about what’s happening and whether the user’s input is being successfully processed. 
  7. Aesthetics: While we don’t need to make an interface attractive for it to do its work, making something look good will make the time users spend using the application more enjoyable; and happier users can only be a good thing. 
  8. Forgiveness: A good interface should not punish users for their mistakes but should instead provide the means to recover from them. 
__________________________________________________________________________
Reference: Brijendra Singh and Shikha Gautam, "Systems and Software Process", Narosa Publishing House, Delhi, 2020, ISBN: 978-81-8487-661-1


 
