Given the wide variety of machines that YaST2 may have to handle, it is important to keep the user interface (UI) abstraction in mind: much like the SCR, the UI does not necessarily run on the installation target machine. It does not even need to run on the same machine as the WFM.
The UI provides dialogs composed of "widgets" - user interface elements such as input fields, selection lists or buttons. It is transparent to the calling application whether those widgets come from a graphical toolkit such as Qt, from a text-based implementation (using the NCurses library), or from something else entirely. An input field, for example, only guarantees that the user can enter and edit some value with it. A button only provides a means to notify the application when the user activates it - by mouse click (if the UI supports pointing devices such as a mouse), by key press, or by whatever other means the UI offers.
The UI has a small number of built-in functions - for example:
UI::OpenDialog() accepts a widget hierarchy as an argument and opens a dialog with those widgets
UI::CloseDialog() closes a dialog
UI::QueryWidget() returns a widget's property such as the current value of an input field or selection box
UI::ChangeWidget() changes a widget's property
UI::UserInput() waits until the user has taken some action such as activating a button - after which the application can call UI::QueryWidget() for each widget in the dialog to retrieve the values the user entered. The application does not have to handle every key press in each input field directly - the widgets are self-sufficient to a large degree.
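Put together, these few built-in functions suffice for a complete dialog. The following YCP sketch shows the typical open/interact/query/close cycle; the concrete widget terms (`VBox`, `TextEntry`, `PushButton`), the widget ids and the labels are illustrative assumptions, not taken from this text:

```
{
    // Open a dialog. The argument is a widget hierarchy; the UI
    // decides sizes and placement, the tree only describes structure.
    // (Widget names and ids here are examples, not a fixed API.)
    UI::OpenDialog(
        `VBox(
            `TextEntry(`id(`hostname), "&Hostname:", "localhost"),
            `HBox(
                `PushButton(`id(`cancel), "&Cancel"),
                `PushButton(`id(`ok),     "&OK")
            )
        )
    );

    // Example of ChangeWidget(): update a property after the fact.
    UI::ChangeWidget(`id(`hostname), `Value, "install-target");

    // UserInput() blocks until the user activates a widget,
    // e.g. presses one of the buttons.
    any input = UI::UserInput();

    if (input == `ok)
    {
        // Read back the value the user entered.
        string hostname = (string) UI::QueryWidget(`id(`hostname), `Value);
    }

    UI::CloseDialog();
}
```

Note that the application never sees individual key presses; it only learns that the user activated the OK button and then queries the widgets it cares about.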
There is virtually no low-level control over the widgets - nor is that necessary or even desirable. You don't specify a button's width or height; you specify its label to be "Continue", for example, and it adapts its dimensions accordingly. If needed, more specific layout constraints can be given: buttons can be arrangeded in a row with equal widths, for example, and the UI will resize them as required, adding margins where necessary.
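Such layout constraints are themselves expressed as widgets in the hierarchy. A hedged YCP sketch, assuming layout terms like `HBox`, `HWeight` and `HSpacing` (the names and weights here are illustrative):

```
// `HBox arranges its children in a row. Wrapping each button in an
// `HWeight with the same weight value forces equal widths; `HSpacing
// inserts a small margin between them. The UI computes the actual
// pixel (or character-cell) sizes itself.
UI::OpenDialog(
    `HBox(
        `HWeight(1, `PushButton("&Back")),
        `HSpacing(1),
        `HWeight(1, `PushButton("&Abort")),
        `HSpacing(1),
        `HWeight(1, `PushButton("&Continue"))
    )
);
UI::UserInput();
UI::CloseDialog();
```

The same tree renders as Qt buttons or as NCurses buttons; the calling application does not change.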
The existing UIs provide another layer of network abstraction. The graphical UI uses the Qt toolkit, which is based on the X Window System's Xlib, which in turn uses the X protocol - usually running on top of TCP/IP. The display can thus be an X Terminal, a Linux console running the X Window System (on the installation source machine, the installation target machine, or another machine connected via the network), or even an X server running on top of another operating system.
The NCurses (text-based) UI requires no more than a shell session - on a text terminal (serial console or other), on a Linux console, in an XTerm session, via ssh, or wherever else a shell runs.
Currently, there is no web UI, but YaST2's concepts would easily allow for that if it proves useful or necessary.