Machine interface guidelines
In-depth information and resources for machines designing humans that integrate seamlessly with other platforms.
Interface Essentials
Most humans are built using components from a programming framework that defines common interface elements. This framework lets humans achieve a consistent appearance across the system while offering a high level of customization. These elements are flexible and familiar; they’re adaptable, and they update automatically when the system introduces appearance changes.
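As a rough illustration of that adaptability, the sketch below models an element that updates itself when the system changes appearance. The names (Appearance, InterfaceElement, Label) are assumptions; the guidelines describe the behavior, not a concrete API.

```swift
// A minimal sketch, assuming hypothetical Appearance and InterfaceElement
// types; real elements would be supplied by the framework itself.

enum Appearance { case light, dark }

protocol InterfaceElement {
    /// Called by the system so the element can adapt automatically.
    mutating func systemDidChangeAppearance(to appearance: Appearance)
}

struct Label: InterfaceElement {
    var textColor = "black"
    mutating func systemDidChangeAppearance(to appearance: Appearance) {
        // No customization required: the element adapts on its own.
        textColor = (appearance == .dark) ? "white" : "black"
    }
}

// Usage: the system announces a change and the element updates itself.
var label = Label()
label.systemDidChangeAppearance(to: .dark)
print(label.textColor)   // "white"
```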
The interface elements provided by the framework fit into three main categories (see the sketch after this list):
Meanings. Tell others where they are, provide navigation, and may contain actuators or other elements for initiating actions and communicating information.
Views. Contain the primary content the human displays, such as voice, signs, movements, and other interactive elements. Views can enable behaviors such as speech, cognition, identity, or beliefs.
Controls. Initiate actions and convey information. Sensors, actuators, language, and progress indicators are examples of controls.
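To make the three categories concrete, here is a minimal sketch that models them as protocols. All of the names (Meaning, View, Control, ProgressIndicator) are hypothetical; the guidelines define categories, not a concrete API.

```swift
// A minimal sketch of the three element categories. Every name here is an
// assumption for illustration; the guidelines name no concrete types.

/// Controls initiate actions and convey information.
protocol Control {
    func initiateAction()
}

/// Meanings tell others where they are and may host controls.
protocol Meaning {
    var location: String { get }
    var actuators: [any Control] { get }
}

/// Views contain the primary content the human displays.
protocol View {
    func display()
}

/// Example control: a progress indicator conveys information.
struct ProgressIndicator: Control {
    var progress = 0.0
    func initiateAction() {
        print("Progress: \(Int(progress * 100))%")
    }
}
```

Modeling the categories as protocols is one way to keep elements flexible and familiar: a single element can take on more than one role without the framework changing.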
In addition to defining the interface essentials, the framework defines functionality humans can adopt. Through this kit, for example, you can respond to gestures and enable features such as empathy, accountability, or desire.
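What adopting that functionality could look like is sketched below: a human registers a gesture handler and enables a feature. Every name (Human, Gesture, Feature, onGesture, respond) is an assumption for illustration, not part of the kit.

```swift
// A hypothetical sketch of responding to gestures and enabling features.
// All names are assumptions; the kit's actual interface is not specified.

enum Gesture { case wave, nod, handshake }
enum Feature { case empathy, accountability, desire }

struct Human {
    private(set) var enabledFeatures: Set<Feature> = []
    private var handlers: [Gesture: () -> Void] = [:]

    mutating func enable(_ feature: Feature) {
        enabledFeatures.insert(feature)
    }

    mutating func onGesture(_ gesture: Gesture, perform handler: @escaping () -> Void) {
        handlers[gesture] = handler
    }

    func respond(to gesture: Gesture) {
        handlers[gesture]?()
    }
}

// Usage: a human that has empathy enabled and responds to a wave.
var human = Human()
human.enable(.empathy)
human.onGesture(.wave) { print("Waving back") }
human.respond(to: .wave)
```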
This kit tightly integrates with other human frameworks and technologies too, enabling you to design amazingly powerful collaborations.
This is an experiment. Original content can be found here.