Software with graphical interfaces for everyday work is nowadays typically built for one particular method of interaction; for example, VLC for Linux is built using Qt and follows the classical desktop menu-bar window style, while VLC for Android looks vastly different and AFAICT shares barely more with Linux VLC than the (very comprehensive) video rendering library.
Adaptation to the characteristics of the currently employed interfaces is necessary, but should be more fluid: When a user sits back from the desktop PC on which they started a video and then uses a remote control to play/pause, the program should adapt at run time, and offer appropriate means of interaction. With the EOMA68 standard on its way to production, there might be an even more compelling use case: A device can go into suspend-to-disk while plugged into a desktop mouse/keyboard/monitor interface, and can wake up in a touchscreen-only phone.
I'd like to see this usable in everyday programs; below, I've collected some more examples, and plans for how this could be implemented.
Human Interface Guidelines (HIGs): A set of rules that instruct application developers how to implement common user interface situations in order to achieve a consistent look and feel and intuitive usability across a single platform. (A real HIG might contain instructions for different scenarios; e.g. the Ubuntu HIG might specify different behaviors for Ubuntu phones. Such HIGs would already fulfil some of what is suggested here. For this text, "HIG" means either a set of guidelines that applies to a single usage scenario, or a fluid HIG applied to a particular scenario.)
User interface paradigm: A pictorial and interactive language (or rather language family) suitable for a set of interaction methods. User interface paradigms are (in FRBR terms) realized in HIGs. User interface paradigms often employ metaphors to allow the user to deduce meaningful interactions from real-world objects (e.g. the "Button" term often seen in desktop-metaphor style interfaces alludes to physical push-buttons, for which "pressing" is the universal interaction).
Widget: A term of a user interface paradigm, or its realization in a GUI toolkit.
Windows and Icons: The classical computer desktop (GNOME up to 2, Xfce, KDE up to 4, Microsoft Windows at least up to 7) is based on the desktop metaphor and the paper paradigm. Keyboard, mouse and monitor are required. Many environments work with only one of the input devices for most common operations (navigating through a preferences dialog with the keyboard may be tricky, but is possible), and have substitution mechanisms (on-screen keyboard, mouse emulation).
Some programs in this area adapt to the currently used desktop's human interface guidelines (e.g. Firefox: The "Preferences" menu item can be placed under "Tools" or under "Edit" in the menu bar; the location of "OK" / "Cancel" buttons in dialog windows changes).
Recently, desktop environments have started to accommodate touch interfaces, roughly in the direction of this idea, but to different extents. The Enlightenment desktop environment has different profiles for touch and desktop systems, and the GTK3 toolkit allows finger scrolling in text areas (where the same dragging done with a mouse is interpreted as selecting text).
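As a minimal sketch of the run-time part (in Python with the GTK3 introspection bindings; adapt_interface is a placeholder of my own, only the Gdk.Seat API is the toolkit's), an application can query the current seat for a touchscreen and react to devices coming and going:

    # Sketch: reacting at run time to input devices appearing or
    # disappearing, using GTK3's Gdk.Seat API (available since GTK 3.20).
    # adapt_interface() is a placeholder; a real program would switch
    # widget variants there.
    import gi
    gi.require_version('Gtk', '3.0')
    from gi.repository import Gtk, Gdk

    def has_touch(seat):
        return bool(seat.get_capabilities() & Gdk.SeatCapabilities.TOUCH)

    def adapt_interface(touch):
        print("switching to", "touch" if touch else "pointer", "interaction")

    def on_device_change(seat, device):
        # Called e.g. when the user docks or undocks mouse and keyboard.
        adapt_interface(has_touch(seat))

    win = Gtk.Window(title="fluid HIG demo")
    win.connect("destroy", Gtk.main_quit)
    seat = Gdk.Display.get_default().get_default_seat()
    seat.connect("device-added", on_device_change)
    seat.connect("device-removed", on_device_change)
    adapt_interface(has_touch(seat))
    win.show_all()
    Gtk.main()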
2000s' Nokia phones: With two menu buttons (usually explained on screen), two to four arrow keys and the hang-up key, they managed to keep a consistent user experience for over a decade.
2010s' smart phones: While there are several HIGs (or, in Android, versions thereof) in active use, an underlying "swipe" paradigm has emerged, with some mechanical elements (kinetic scrolling, elasticity).
Command line interfaces: These are noteworthy because there are two distinct ways of editing lines of text, originating from the widespread editors Emacs and vi. Many command line applications, often by means of the readline library, respect a per-user setting of the preferred editing method from a common configuration file (.inputrc) or even environment introspection (ZSH looks at the EDITOR variable to deduce the user's preferences).
That mechanism provides a fine example of how the choice of the right user interface depends not only on the physical input device, but also on the user's preferences.
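For reference, the per-user switch readline offers is a single line in ~/.inputrc:

    # ~/.inputrc: make all readline-using programs edit lines vi-style
    set editing-mode vi

(ZSH, which does not use readline, has bindkey -v for the same explicit choice, in addition to the EDITOR introspection mentioned above.)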
Gaming consoles: They deserve mention because they usually display tight integration of input devices and software. Nintendo games typically display a "Press start" message (referring to a button only found on their controllers), and in-game user interface elements or help screens frequently refer to the color and/or location of buttons on the controller.
The way the Zelda game series changes the on-screen appearance of the "B" button depending on whether an N64 controller (with a green "B" button) or a GameCube controller (with a red "B" button) is used was one of the inspirations for this idea.
I want to see user interface customizations for input methods (where they cannot be automated) done in the same way as localization: by people who are experts in the respective field, and who don't need to know much about the internals of the program.
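To illustrate the analogy (everything here is made up for illustration; no such scenario names are standardized), such customizations could be resolved the way gettext resolves locales, walking from the most specific scenario to more generic ones:

    # Hypothetical resolution of interface variants, modeled on gettext's
    # locale fallback ("de_AT" -> "de" -> default). The scenario names and
    # the dotted naming scheme are inventions for this sketch.
    UI_VARIANTS = {
        "desktop": "menu bar, dialogs with OK/Cancel",
        "touch": "large widgets, kinetic scrolling",
        "touch.phone": "single-column layout on top of the touch variant",
    }

    def resolve_variant(scenario):
        """Walk from the most specific scenario to its parents, like
        gettext walks from de_AT to de before giving up."""
        while scenario:
            if scenario in UI_VARIANTS:
                return UI_VARIANTS[scenario]
            scenario, _, _ = scenario.rpartition(".")
        return UI_VARIANTS["desktop"]  # last-resort default

    print(resolve_variant("touch.phone.small"))  # falls back to "touch.phone"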
This might be easiest to experiment with in remote controls, as the code base to touch for a full-desktop demo is relatively small. A wireless smartphone-like device could be configured to provide (possibly as a web site, for simple extensibility) a touchpad-style control pad, switchable to a virtual keyboard. Applications could then register their own per-application input devices (e.g. in X window properties), and the input server would then offer an additional option next to touchpad and keyboard.
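The application side of that registration could look roughly like this sketch, assuming python-xlib; the property name _FLUID_INPUT_DEVICES and its newline-separated value format are inventions the input server would have to agree on:

    # Sketch: advertise per-application remote-control pages in an X window
    # property. Property name and value format are made up for this example.
    import sys
    from Xlib import display, Xatom, X

    d = display.Display()
    # Take the application's window id (e.g. obtained via xwininfo) as argument.
    win = d.create_resource_object('window', int(sys.argv[1], 0))

    prop = d.intern_atom('_FLUID_INPUT_DEVICES')
    win.change_property(prop, Xatom.STRING, 8,
                        b'image-pan\nplayback-controls',
                        X.PropModeReplace)
    d.flush()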
These additional input devices could range from very generic ("first-person-shooter mouse": don't try to inject absolute coordinates, just forward relative movement, and maybe offer some more mouse buttons and/or WASD keys) to toolkit-specific (dropdown menus are hard to navigate with imprecise tools, but at least the classical GTK application dropdowns could be rendered on the smartphone directly) to per-application (an image viewer could offer a smaller version of the image for panning / pinch zooming / swiping; Kodi (formerly XBMC) could offer something like Kore).
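For the generic end of that range, forwarding is almost trivial; this sketch uses the python-evdev library to create a virtual "first-person-shooter mouse" that only ever injects relative motion (the line-based "dx dy" wire protocol from the remote is made up):

    # Sketch: a virtual relative-only pointer fed from a network connection.
    # Assumes python-evdev; the wire protocol is invented for this example.
    import socket
    from evdev import UInput, ecodes as e

    ui = UInput({e.EV_REL: [e.REL_X, e.REL_Y],
                 e.EV_KEY: [e.BTN_LEFT, e.BTN_RIGHT]},
                name='remote-fps-mouse')

    sock = socket.create_server(('', 4711))  # arbitrary port
    conn, _ = sock.accept()
    for line in conn.makefile():
        dx, dy = (int(v) for v in line.split())
        ui.write(e.EV_REL, e.REL_X, dx)
        ui.write(e.EV_REL, e.REL_Y, dy)
        ui.syn()  # deliver both axes as one batched motion event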
This is a very incomplete draft.
This page is part of chrysn's public personal idea incubator; go up for its other entries, or read about the idea of having an idea incubator for more information on what this is.