Understanding the UI Layer and Its Execution Framework
What is the UI Layer?
Composition of the UI Layer
The Window Manager
Understanding the UI Layer and Its Execution Framework
What is the UI Layer?
Interactive applications are implemented and executed using the user interface (UI) software layers
(collectively the UI layer). The UI layer refers to a set of software that operates above the core operating system (and underneath
the application). It encapsulates and exposes system functions for
Fluid input and output (I/O)
Facilitation of the development of I/O functionalities (in the form of an application programming interface/library [API] or toolkit)
Run-time management of graphical applications and UI elements, often manifested as windows or graphical user interface (GUI) elements (in the form of a separate application, often called the window manager)
Understanding the UI Layer and Its Execution Framework
Composition of the UI Layer
Since most interfaces are graphical, the UI layer uses a two- or three-dimensional (2-D or 3-D) graphical system on which the GUI elements are implemented (lower part of Figure 5.1). To summarize, the UI layer is largely composed of (a) an API for creating and managing the user interface elements (e.g., windows, buttons, menus) and (b) a window manager that allows users to operate and manage applications through its own user interface.
Most interfaces are graphical.
True False
Understanding the UI Layer and Its Execution Framework
Composition of the UI Layer - Image
Figure 5.1 illustrates the UI layer as part of the system software in many computing platforms.
Understanding the UI Layer and Its Execution Framework
Composition of the UI Layer
The user interacts with the window/GUI-based applications using various input and output devices. At the same time, aside from the general applications, the user interacts with the computer itself and manages multiple application windows/tasks (e.g., resizing, focusing, cutting and pasting) using the window manager, which runs in the background.
Understanding the UI Layer and Its Execution Framework
The Window Manager
The window manager is regarded as both an application and an API. User applications are developed using the APIs that represent abstracted I/O-related functionalities of the UI layer, such as those for window management (resizing, iconifying, dragging, copy and paste, etc.), GUI elements and widgets (windows, menus, buttons, etc.), and basic windowing (creating/destroying windows, activating/deactivating windows, etc.). These APIs are in turn abstractions over the even lower-level APIs for 2-D/3-D graphics and the operating system. Note that the architecture described here applies equally to non-window-based systems, such as layer-based systems (e.g., mobile phones). Through such architecture and abstraction, it becomes much easier to develop and implement interactive applications and their user interfaces.
Input and Output at the Low Level
At the Lowest Level
At a Higher Level
Input and Output at the Low Level
At the Lowest Level
At the lowest level, inputs and outputs are handled by the interrupt mechanism of the system software (operating system). An interrupt is a signal to the processor indicating that an event (usually an I/O event) has occurred and must be handled. The interrupt signal is interpreted so that the address of its handler procedure can be looked up and the handler executed, while the ongoing process is briefly suspended. After the handler procedure finishes, the suspended process resumes.
The arrival of an interrupt is checked at a very fast rate as part of the processor's execution cycle. In practice, this means the processor is always listening for incoming events, ready to serve them as needed.
The interrupt mechanism is often contrasted with polling (also known as busy waiting). In polling, the processor (rather than the I/O device) initiates input or output. To carry out an I/O task, the processor enters a loop, continually checking the I/O device's status to see whether it is ready, and incrementally accomplishes the I/O task. This form of I/O is poor at supporting asynchronous, user-driven (anytime) I/O and wastes CPU time by preventing other, non-I/O processes from making progress.
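As a rough illustration (not from the text), the Python sketch below shows busy waiting: the processor-side loop keeps checking a hypothetical device-ready flag and does nothing else until the data arrives. The device_ready() and read_device() functions are invented stand-ins for a real device-status check.

```python
import itertools

_ticks = itertools.count()

def device_ready():
    # Hypothetical stand-in for checking a device status flag; here the
    # "device" simply becomes ready after a number of polls.
    return next(_ticks) > 100_000

def read_device():
    # Hypothetical stand-in for reading the device's data buffer.
    return b"key press"

def poll_for_input():
    # Busy waiting: the CPU spins in this loop, doing no useful work, until
    # the device reports readiness -- the deficiency contrasted with the
    # interrupt mechanism described above.
    while not device_ready():
        pass
    return read_device()

print(poll_for_input())
```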
What is polling also called?
busy waiting / the window manager / widgets / waiting
Input and Output at the Low Level
At a Higher Level
At a higher level, the I/O operation is often described in terms of events and event handlers, which is in fact
an abstraction of the lower-level interrupt mechanism.
This is generally called the event-driven architecture in which programs are developed in terms of events (such
as mouse clicks, gestures, keyboard input, etc.) and their corresponding handlers. Such information can be captured in
the form of a table and used for efficient execution. Figure 5.2 shows the rather complicated interrupt mechanism
abstracted into the form of a simple event handler table.
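One way to picture such a table (a minimal sketch, assuming plain Python dictionaries rather than any particular toolkit) is a mapping from event types to handler functions, with a small dispatch routine that looks up and invokes the handler:

```python
# A minimal event-handler table: event type -> handler function.
def on_mouse_click(event):
    print("clicked at", event.get("x"), event.get("y"))

def on_key_press(event):
    print("key:", event.get("key"))

handler_table = {
    "mouse_click": on_mouse_click,
    "key_press": on_key_press,
}

def dispatch(event):
    # Look up the handler for this event type and invoke it, mirroring how
    # the interrupt mechanism maps a signal to its handler procedure.
    handler = handler_table.get(event["type"])
    if handler is not None:
        handler(event)

dispatch({"type": "key_press", "key": "a"})
```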
The most basic UI object in today’s visually oriented UI system would be the window (or layer). A window is a
rectangular portion of the screen associated with a given application that is used as a space and channel for
interacting with the application.
Other UI objects include buttons, menus, icons, forms, dialog boxes, text boxes, and so forth. These are often
referred to as GUI objects or widgets.
Most typically, GUI-based interactive applications would have a top window that includes all other UI objects or
widgets that are logically and/or spatially subordinate to it (Figure 5.3).
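As an illustration only, using Python's standard Tkinter toolkit (not necessarily the toolkit assumed by the text), a top window with a few subordinate widgets might be set up like this:

```python
import tkinter as tk

root = tk.Tk()                        # top window for the application
root.title("Example application")

menubar = tk.Menu(root)               # menu attached to the top window
menubar.add_command(label="File")
root.config(menu=menubar)

button = tk.Button(root, text="OK")   # widgets are children of the top window,
textbox = tk.Entry(root)              # both logically and spatially
button.pack()
textbox.pack()

root.mainloop()                       # hand control over to the event loop
```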
Events, UI Objects, and Event Handlers
GUI objects
The most basic UI object in today’s visually oriented UI system would be the:
window / door / widget / object
Events, UI Objects, and Event Handlers
Widgets and Concurrency
With current operating systems mostly supporting concurrency, separate windows/widgets for concurrently running applications can coexist, overlapping one another, and the user can switch which of them holds the current focus. That is, when there are multiple windows (and one mouse/keyboard), the user carries out an action to designate the active or current window in focus, to which the input events will be channeled.
Two major methods for focusing are:
click-to-type
move-to-type
In the former, the user has to explicitly click on the window before providing input to it (regardless of the mouse position, the last object clicked on remains in focus); in the latter, the window over which the mouse cursor hovers receives the focus. The move-to-type method is generally regarded as less convenient because of the likelihood of unintended focus changes due to accidental mouse movements.
While not all UI layers are modeled and implemented in an object-oriented fashion, many recent ones are. Thus we can
think of generic or abstract object classes for a window and other UI objects and widgets as being organized
hierarchically (Figure 5.4). Moreover, we can designate the background screen space as the default root system window (which becomes automatically activated upon system start), onto which child application windows and GUI elements (e.g., icons, menus) are placed. The background also naturally becomes the top window for the window manager process.
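A minimal sketch of such a class hierarchy (illustrative Python with invented class names, not any specific toolkit's classes) could look like this:

```python
class UIObject:
    """Abstract base: anything that can be drawn and can receive events."""
    def handle_event(self, event): ...
    def draw(self): ...

class Window(UIObject):
    def __init__(self):
        self.children = []     # subordinate widgets and child windows

class RootWindow(Window):
    """The background screen space, activated automatically at system start;
    child application windows and GUI elements are placed onto it."""

class Widget(UIObject): ...

class Button(Widget): ...
class Menu(Widget): ...
class Icon(Widget): ...
```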
Events, UI Objects, and Event Handlers
UI Layers in Object-oriented Fashion
Figure 5.4 An event being dispatched to the right UI object handler for a given application (organized as a set of UI objects, such as the top window, menus, text boxes, and a dialog box, with associated event handlers in a hierarchical manner) from the application event queue.
Events, UI Objects, and Event Handlers
Event handling
Whether it is the root (background) window, an application (top) window, or a GUI widget, each such interaction channel or object receives input from the user through input devices such as the keyboard, mouse, etc.
The physical input from the user/devices is converted into an event (e.g., by the device drivers and operating system),
which is simply data containing information about the user’s intent or action.
Aside from the event value itself (e.g., which key was pressed), an event usually contains additional information
such as its type, a time stamp, the window to which it was directed, and screen coordinates (e.g., in the case of an event
activated by a mouse or a stylus). These events are put into a queue by the operating system (or the windowing system) and dequeued and dispatched, e.g., according to the current focus, to the target program or process, which then invokes the corresponding handler.
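A sketch of such an event record and queue in Python (the Event class and its field names are illustrative, not any specific windowing system's API):

```python
import time
from dataclasses import dataclass, field
from queue import Queue

@dataclass
class Event:
    type: str          # e.g., "key_press", "mouse_click"
    value: object      # the event value itself, e.g., which key was pressed
    window: str        # the window to which the event is directed
    x: int = 0         # screen coordinates (mouse/stylus events)
    y: int = 0
    timestamp: float = field(default_factory=time.time)

event_queue = Queue()

# The OS / windowing system enqueues events as they arrive from the devices...
event_queue.put(Event("key_press", "a", window="editor"))
event_queue.put(Event("mouse_click", None, window="browser", x=120, y=48))

# ...and dequeues/dispatches them, e.g., according to the current focus.
focused_window = "editor"
while not event_queue.empty():
    ev = event_queue.get()
    if ev.window == focused_window:
        print("dispatching", ev.type, "to", ev.window)
```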
Events, UI Objects, and Event Handlers
Two-tier Event-queuing System
Figure 5.5 shows the two-tier event-queuing system in greater detail. There is a system-level event queue that dispatches events at the top, application level. Each application or process also typically manages its own event queue, dispatching events to its own UI objects. The event is captured by the proper UI object as it traverses down the application's hierarchical UI structure, e.g., from top to bottom. Figures 5.4 and 5.5 illustrate this process. Then the event handler (also sometimes called the callback function) associated with that UI object is activated in response to the captured event.
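The traversal can be sketched roughly as follows (illustrative Python, with a toy hit test based on a target name standing in for a real spatial/focus test):

```python
class UIObject:
    def __init__(self, name, handler=None, children=None):
        self.name = name
        self.handler = handler          # callback function for this UI object
        self.children = children or []

    def captures(self, event):
        # Hit test: does this object capture the event (e.g., by name or region)?
        return event.get("target") == self.name

    def dispatch(self, event):
        # Traverse the application's UI hierarchy from top to bottom until the
        # proper object captures the event and its callback is invoked.
        if self.captures(event) and self.handler is not None:
            self.handler(event)
            return True
        return any(child.dispatch(event) for child in self.children)

top = UIObject("Top Window", children=[
    UIObject("File Menu"),
    UIObject("Textbox 1", handler=lambda e: print("Textbox 1 handler:", e["type"])),
])
top.dispatch({"target": "Textbox 1", "type": "key_press"})
```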
Events, UI Objects, and Event Handlers
Two-tier Event-queuing System
What is the event handler also called?
callback function
Events, UI Objects, and Event Handlers
Pseudo Events
The events do not necessarily have to be generated externally by the interaction devices; indeed, sometimes they are
generated internally for special purposes (these are sometimes called pseudo-events).
For instance, when a window is resized, in addition to the resizing event itself, the internal content of the window
must be redrawn, and the same goes for the other windows occluded or newly revealed by the resized window. Special
pseudo-events are enqueued and conveyed to the respective applications/windows.
Events, UI Objects, and Event Handlers
Pseudo Events
In the case of resizing/hiding/activating and redrawing windows, it is the individual application's responsibility, rather than the window manager's, to update its display contents, because only the respective application has the knowledge of how to update its content. Thus a special redraw pseudo-event is sent to the application with information about which region is to be updated (Figure 5.6).
The window content might need to be redrawn not because of window management commands such as resizing and window activation, but due to the needs of the application itself, which can generate special pseudo-events for redrawing parts of its own window. More generally, UI objects can generate pseudo-events to create chain effects. For example, when a scroll bar is moved, both the window content and the scroll bar position have to be updated.
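A small sketch of this chain effect (plain Python, with hypothetical names): the scroll handler enqueues redraw pseudo-events for both the window content and the scroll bar.

```python
from queue import Queue

event_queue = Queue()

def on_scroll(event):
    # Respond to the user's scroll, then enqueue pseudo-events so that both
    # the window content and the scroll bar are redrawn as a chain effect.
    event_queue.put({"type": "redraw", "region": "window_content"})
    event_queue.put({"type": "redraw", "region": "scroll_bar"})

on_scroll({"type": "scroll", "delta": -3})
while not event_queue.empty():
    pseudo = event_queue.get()
    print("redrawing", pseudo["region"])
```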
Event-Driven Program Structure
Based on what we have discussed so far, the event-driven program structure generally takes the form depicted in Figure 5.7.
The first initialization part of the program creates the necessary UI objects for the application and declares the
event-handler functions and procedures for the created UI objects.
Then the program enters a loop that automatically keeps removing an event from the application event queue and invoking
the corresponding handler (i.e., dispatching the event). The development environment often hides this part of the
program so that the developer does not have to worry about such system-level implementations.
However, depending on the development toolkit, the developer may have to program this part explicitly as well.
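For example, in Python's Tkinter (used here purely as an illustration, not the toolkit assumed by the text), the initialization part creates the UI objects and registers their handlers, while mainloop() hides the dispatch loop:

```python
import tkinter as tk

# 1. Initialization: create the UI objects for the application...
root = tk.Tk()
label = tk.Label(root, text="Press any key")
label.pack()

# ...and declare/register the event handlers for those UI objects.
def on_key(event):
    label.config(text=f"You pressed: {event.keysym}")

root.bind("<Key>", on_key)

# 2. The dispatch loop: repeatedly remove an event from the application event
#    queue and invoke its handler. The toolkit hides this loop inside mainloop().
root.mainloop()
```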
Output
Explicit response
Time of computation
Output
Explicit response
Interactive behavior that is purely computational will simply be carried out by executing the event-handler procedure.
However, response to an event or application behavior is often manifested by explicit visual, aural, and haptic/tactile
output as well.
In many cases, the event handlers only compute the response behavior, i.e., the needed changes in data or the new output in a chosen modality (e.g., visual, aural, haptic, tactile). A separate step that refreshes the display based on the changed screen data is then called as the last part of the event-processing loop.
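A minimal sketch of this separation (illustrative Python): the handler only updates the application data and marks the display as dirty, and the refresh step runs at the end of each loop iteration.

```python
needs_redraw = False
scene = {"text": ""}

def on_key(event):
    # The handler only computes the needed change in the screen data...
    global needs_redraw
    scene["text"] += event["key"]
    needs_redraw = True

def refresh_display():
    # ...and a separate refresh step sends the changed data to the display.
    print("drawing:", scene["text"])

# Simplified event-processing loop with the refresh as its last step.
for ev in ({"type": "key", "key": "a"}, {"type": "key", "key": "b"}):
    on_key(ev)
    if needs_redraw:
        refresh_display()
        needs_redraw = False
```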
Analogous processes will be called for sending commands to output devices of other modalities as well (see the last line
in Figure 5.7). Sometimes, with multimodal output, the outputs in different modalities need to be synchronized (e.g.,
output visual and aural feedback at the same, or nearly the same, time). However, not many interactive programming
frameworks or toolkits offer provisions for such a situation.
Output
Time of computation
While internal computation takes relatively little time (in most cases), processing and sending the new/changed data to
the display devices can take a significant amount of time.
For instance, heavy use of 3-D graphical objects can be computationally expensive (e.g., on a mobile device without a graphics subsystem), and this can become a bottleneck in the event-processing loop, thereby reducing interactivity. Thus the rendering and sensing parts are sometimes separated into independent threads and processed at different rates to ensure real-time interactivity.
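A rough sketch of that separation (illustrative Python using the standard threading module; the rates and names are arbitrary):

```python
import threading
import time

running = True
latest_input = {"x": 0, "y": 0}
lock = threading.Lock()

def sensing_loop():
    # Sample input at a high rate so interaction stays responsive.
    while running:
        with lock:
            latest_input["x"] += 1       # stand-in for reading a real device
        time.sleep(1 / 120)              # ~120 Hz input sampling

def rendering_loop():
    # Render at a lower, independent rate; slow drawing no longer blocks input.
    while running:
        with lock:
            snapshot = dict(latest_input)
        print("render frame with input", snapshot)
        time.sleep(1 / 30)               # ~30 Hz rendering

threading.Thread(target=sensing_loop, daemon=True).start()
threading.Thread(target=rendering_loop, daemon=True).start()
time.sleep(0.2)                          # let the threads run briefly
running = False
```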
Internal computation takes a significant amount of time.
True False
Summary
In this chapter, we looked at the inner workings of the general underlying software structure (UI layer or UI execution
framework) on which interactive programs operate.
Most UI frameworks operate in similar ways according to an event-driven structure. The hardware input devices generate events that are conveyed to the software interfaces (i.e., UI objects), and the event-handling code processes them to produce output.
The UI layer sitting above the operating system (OS) provides the computational framework for such an
event-driven processing model and makes useful abstractions of the lower OS details for easier and more intuitive interactive
software and interface development.
The next chapters introduce toolkits and development frameworks that make interface development even
more convenient and faster.
Image Gallery
Complex interrupt mechanism abstracted as an event-handler table
Event queuing at the top application level
Event being dispatched to the right UI object handler for a given application.
Questions?
Time for a Quiz!
You have 30 minutes to answer 15 questions.
Good luck!
Name one UI object.
button
Aside from the event value itself (e.g., which key was pressed), an event usually contains additional information such as:
the layer from which it was called / screen coordinates / the window to which it was directed / a time stamp / a date / its type
What does the term layer refer to?
GUI objects are also called widgets.
True False
In what ways can a user interact with the computer?
focusing / loading / closing / resizing / opening / cutting and pasting
What is the physical input from the user/devices converted into?
an event
While internal computation takes a significant amount of time, processing and sending the new/changed data to the display devices takes relatively little time.
True False
Connect terms with their explanations.
Explanations:
It is often described in terms of events and event handlers
Synonym for GUI objects
The most basic UI object in today’s visually oriented UI system
Terms:
The monitor
The window
Busy waiting
The I/O operation
Widgets
Event handler
At the lowest level, inputs and outputs are handled by the?
interrupt mechanism
Implement an example of a callback function.
What is the physical input from the user/devices converted into?
an event
Specify the order of the UI software layers.
GUI API (Toolkit)
Windowing API
Window Manager System/API
Explain what event-driven architecture is.
Current operating systems don't support concurrency.
True False
At the lowest level, inputs and outputs are handled by the?
interrupt mechanism
End of Quiz!
How would you rate the quiz? (1-very easy, 5-very hard)