Goal
Provide the means for specifying from where to listen for input, such as a device like the mouse or a Wacom tablet.
Motivation
The example application currently listens for input coming from the mouse, via ImGui::IsItemActive() and ImGui::GetMouseDragDelta(). But if you want input from somewhere else, you're out of luck. Furthermore, there is no interface for choosing an input source.
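For reference, here is a minimal sketch of that mouse-only pattern. The widget and the position variable are hypothetical; only the two ImGui calls are the ones the example actually uses.

```cpp
#include "imgui.h"

// Minimal sketch of the current mouse-only pattern. The handle widget and
// `position` are made up for illustration; only IsItemActive() and
// GetMouseDragDelta() are taken from the example application.
static void DragHandle(ImVec2& position)
{
    ImGui::Button("handle");
    if (ImGui::IsItemActive())
    {
        // Drag accumulated since the mouse button was pressed on this item
        const ImVec2 delta = ImGui::GetMouseDragDelta();
        position.x += delta.x;
        position.y += delta.y;
        ImGui::ResetMouseDragDelta();  // consume the delta each frame
    }
}
```

Anything other than the mouse has no way into that code path.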
DAWs lack a standard input device like the mouse and keyboard; instead, each track has a dedicated section where the end user specifies which device that track should listen to once it comes time to record.
Here's what that looks like in e.g. Bitwig.
Explore
Would such an interface make sense for 2d/3d content creation?
Tracks could then represent individual characters, or parts of a character like an arm or a hand. One hand is driven by Mouse 1, the other by Mouse 2. The feet could be driven by something more high-level, like a pre-authored animation loop of a walk cycle. The local time of that animation clip could be driven by a linear pedal, capable of outputting values between e.g. 0 and 127. With that, you could potentially animate a walk cycle in real time using two mice and a pedal.
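As a back-of-the-envelope sketch, mapping such a pedal value onto the local time of a clip could look like the following. Every name here is hypothetical; nothing below exists in the project yet.

```cpp
// Hypothetical sketch: a pedal value (0-127, MIDI-style) normalised into the
// local time of a pre-authored walk-cycle clip.
struct AnimationClip
{
    float duration;  // length of the clip, in seconds
};

static float ClipTimeFromPedal(int pedalValue, const AnimationClip& clip)
{
    const float normalised = static_cast<float>(pedalValue) / 127.0f;  // 0..1
    return normalised * clip.duration;                                 // 0..duration
}

// e.g. a pedal at 64 on a 1.2 s walk cycle lands roughly 0.6 s into the clip
```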
Layout
Each track contains a number of channels. The more channels you have, the more space is made available within a track, behind each of the channels. Maybe that's a good spot?
 _____________________________
| m s  track 1                |
|  ____________   channel 1 o |
| |            |  channel 2 o |
| | space here |  channel 3 o |
| |____________|  channel 4 o |
|_____________________________|
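If that space works out, the selector itself could be as simple as a combo box per track. A rough sketch, assuming ImGui and a made-up device list:

```cpp
#include "imgui.h"

// Hypothetical sketch: a per-track device selector drawn in the free space
// behind the channels. The device names and selection index are made up.
static void DrawTrackDeviceSelector(int* selectedDevice)
{
    static const char* devices[] = { "Mouse 1", "Mouse 2", "Wacom", "Pedal" };
    ImGui::PushItemWidth(120.0f);
    ImGui::Combo("##device", selectedDevice, devices, IM_ARRAYSIZE(devices));
    ImGui::PopItemWidth();
}
```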
Device Capabilities
Some devices are capable of providing 2d position data, like a mouse. A keyboard is somewhat able, if you consider the WASD or arrow keys. MIDI devices typically support notes, modulation, pitch bend and such, and you still assign the whole shebang to a given track. The track then records each of these capabilities.
Is there an equivalent we could apply to computer peripherals like the mouse and keyboard? The keyboard, for instance, could provide both button presses and second-order data like a position accumulated from the arrow keys over time.
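One possible way to model it, sketched below with made-up names: a device advertises a set of capabilities, the whole device is assigned to a track, and the track records whichever capabilities come in.

```cpp
#include <cstdint>

// Hypothetical sketch of device capabilities as a bitmask; all names are
// illustrative only.
enum Capability : std::uint8_t
{
    Capability_Position2D   = 1 << 0,  // mouse, tablet, arrow keys over time
    Capability_Buttons      = 1 << 1,  // keyboard keys, mouse buttons, MIDI notes
    Capability_Continuous1D = 1 << 2,  // pedal, mod wheel, pitch bend
};

struct Device
{
    const char*  name;
    std::uint8_t capabilities;  // OR of Capability flags
};

// A track is assigned the whole device and records every capability it provides
static const Device kDevices[] = {
    { "Mouse 1",  Capability_Position2D | Capability_Buttons },
    { "Keyboard", Capability_Buttons | Capability_Position2D },
    { "Pedal",    Capability_Continuous1D },
};
```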