A single-file, immediate-mode sequencer widget for C++17, Dear ImGui and EnTT




Try it

You can build it yourself, or download a pre-built version and fool around.


| Key | Description |
|:------------------|:------------------------------------|
| QWER | Switch between tools |
| Click + Drag | Manipulate the coloured squares |
| Click + Drag | Move events around in time |
| Alt + Click + Drag | Pan in the Event Editor |
| Delete | Delete all events |
| Space | Play and pause |
| F1 | Toggle the ImGui performance window |
| F2 | Toggle the Theme Editor |
| Backspace | (debug) Pause the rendering loop |
| Enter | (debug) Redraw one frame |


It's a sequence editor, in the spirit of MIDI authoring software like Ableton Live, Bitwig and FL Studio, where each event carries a start time, a duration and a handle to your custom application data.

Heads up!

This is a work in progress, alpha at best, and is going through changes (that you are welcome to participate in!)

What makes Sequentity different, and inspired its name, is that it is built as an Entity-Component-System (ECS), and that each event is a combination of start time, length and custom application data, as opposed to individual events for start and end. That makes it suitable for e.g. (1) creating a dynamic rigid body, (2) editing said body whilst maintaining a reference to what got created, and (3) deleting the body at the end of the event.

```cpp
entt::registry registry;

auto entity = registry.create();
auto& track = registry.assign<Sequentity::Track>(entity, "My first track");

auto& channel = Sequentity::PushChannel(track, MyEventType, "My first channel");

auto& event = Sequentity::PushEvent(channel, 10, 5);  // time, length

while (true) {
    ImGui::Begin("Event Editor");
    Sequentity::EventEditor(registry);
    ImGui::End();
}
```

What can I use it for?

If you need to record anything in your application, odds are you need to play something back. If so, you may also need to edit what got recorded, in which case you can use something like Sequentity.

I made this for recording user input in order to recreate application state exactly such that I could record once more, on-top of the previous recording; much like how musicians record over themselves with various instruments to produce a complete song. You could theoretically decouple the clock-time aspect and use this as playback mechanism for undo/redo, similar to what ZBrush does, and save that with your scene/file. Something I intend on experimenting with!


  • Build upon the decades of UI/UX design found in DAWs like Ableton Live and Bitwig
  • Visualise 1-100'000 events simultaneously, with LOD if necessary
  • No more than 1 ms per call on an Intel-level GPU
  • Fine-grained edits to properties of individual events up close
  • Coarse-grained bulk-edits to thousands of events from afar

Is there anything similar?

I'm sure there are, however I was only able to find one standalone example, and only a few others embedded in open source applications.

If you know any more, please let me know by filing an issue!

Finally, there are others with a similar interface but different implementation and goal.


  • Per-event application data Attach any of your application data to an event, and retrieve it later
  • Per-event, channel and track coloring To stay organised with lots of incoming data
  • Consolidated Events Individual events with a start and length, as opposed to separate events for start and end
  • Overlapping Events Author events
  • Single-file library Distributed as a single .h file of ~1'000 lines for easy inclusion into your project
  • Cross-fade Overlap the end of one event with the start of another for a cross-blend, to do e.g. linear interpolation
  • Event Priority When events overlap, determine the order in which they are processed
  • Group Events For drawing and manipulating multiple groups of entities together
  • Mini-map Like the one in Sublime Text; for when you've got lots of events
  • Event Scaling Sometimes, the input is good, but could use some fine-tuning
  • Event Cropping Other times, part of the input isn't worth keeping
  • Track Folding For when you have way too much going on
  • Track, Channel and Event Renaming Get better organised
  • Custom Event Tooltip Add a reminder for yourself or others about what an event is all about
  • Event Vertical Move Implement moving of events between tracks (i.e. vertically)
  • Zoom Panning works by holding ALT while click+dragging. Zooming needs something like that.
  • One-off events Some things happen instantaneously

[Demo GIFs: event editing, zooming and the example tools]

Design Decisions

  • No class instance ImGui widgets generally don't require an instance, and neither does Sequentity
  • Events -> Channels -> Tracks Events are organised into these three groups
  • Integer Event Type Leaving definition and interpretation of types to the application author
  • Integer Time Time is represented as samples, rather than frames*
  • 1 Entity, 1 Track Events ultimately operate on components relative to some entity
  • void* for application data In search of a better alternative, as it complicates cleanup. Let me know!
  • No clock time The application is responsible for managing the event loop

* The difference being that a sample is a complete snapshot of your application/game state, whereas a frame is a (potentially fractional) point in time, e.g. 1.351f


These are going into GitHub issues shortly.

  • Stride There are a few values that work, but make no sense, like stride
  • Bug, hot-swap tool Translate something and switch tool without letting go
  • Bug, event at end Click to add an event on the end frame, and it'll create one erroneously
  • Cosmetics, transitions Duration of transitions is based on a solid 60 fps, should be relative wallclock time
  • Refactor, Unify data types Data types in Sequentity are a mixture of native, Magnum and ImGui.
  • Smooth Panning and Zooming Any change to these should have a nice smoothing effect
  • Drag Current Time You can, but it won't trigger the time-changed callback

Open Questions

I made Sequentity for another (commercial) project, but made it open source in order to seek help from the open source community. This is my first sequencer-like project and in fact my first C++ project (with <4 months of experience using the language), so I expect lots of things to be ripe for improvement.

Here are some of the things I'm actively looking for answers to and that you are welcome to strike up a dialog about in a new issue. (Thank you!)

  • Start, End and Beyond Events are currently authored and stored in memory like they appear in the editor; but in your typical MIDI editor the original events don't look like this. Instead, events are standalone, immutable. An editor, like Cubase, then draws each consecutive start and end pair as a single bar for easy selection and edits. But do they store it in memory like this? I found it challenging to keep events coming in from the application together. For example, if I click and drag with the mouse, and then click with my Wacom tablet whilst still holding down the mouse, I would get a new click event in the midst of drag events, without any discernible way to distinguish the origin of each move event. MIDI doesn't have this problem, as an editor typically pre-selects from which device to expect input. But I would very much like to facilitate multiple mice, simultaneous Wacom tablets, eye trackers and anything capable of generating interesting events.
  • How do we manage selection? Sequentity manages the currently selected event using a raw pointer in its own State component, is there a better way? We couldn't store selection state in an event itself, as they aren't the ones aware of whether their owner has got them selected or not. It's outside of their responsibility. And besides, it would mean we would need to iterate over all events to deselect before selecting another one, what a waste.

On top of these, there are some equivalent Application Open Questions for the Tools and Input handling which I would very much like your feedback on.


Sequentity is distributed as a single-file library, with the .h and .cpp files combined.

  1. Copy Sequentity.h into your project
  2. #define SEQUENTITY_IMPLEMENTATION in one of your .cpp files
  3. #include "Sequentity.h" wherever you need it
  4. See below


  • ImGui Which is how drawing and user input is managed
  • EnTT An ECS framework, this is where and how data is stored.


Sequentity can draw events in time, and facilitate edits to be made to those events interactively by the user. It doesn't know nor care about playback, that part is up to you.

New to EnTT?

An EnTT Primer

Here's what you need to know about EnTT in order to use Sequentity.

  1. EnTT (pronounced "entity") is an ECS framework
  2. ECS stands for Entity-Component-System
  3. Entities are identifiers for "things" in your application, like a character, a sound or UI element
  4. Components carry the data for those things, like the Color, Position or Mesh
  5. Systems operate on that data in some way, such as adding +1 to Position.x each frame

It works like this.

```cpp
// You create a "registry"
entt::registry registry;

// Along with an entity
auto entity = registry.create();

// Add some data..
struct Position {
    float x { 0.0f };
    float y { 0.0f };
};

registry.assign<Position>(entity, 5.0f, 1.0f);  // 2nd argument onwards passed to constructor

// ..and then iterate over that data
registry.view<Position>().each([](auto& position) {
    position.x += 1.0f;
});
```
A "registry" is what keeps track of which entities have which components assigned, and "systems" can be as simple as a free function. I like to think of each loop as its own system, like that one up there iterating over positions. Single responsibility, and able to perform complex operations that involve multiple components.

Speaking of which, here's how you combine components.

```cpp
registry.view<Position, Color>().each([](auto& position, auto& color) {
    position.x += color.r;  // assuming a Color component with a float `r`
});
```

This function is called on every entity with both a position and color, and combines the two.

Sequentity then is just another component, assigned via registry.assign<Sequentity::Track>(entity).

This component then stores all of the events related to this entity. When the entity is deleted, the Track is deleted alongside it, taking all of the events of this entity with it.


You could also keep the entity, but erase the track, using registry.remove<Sequentity::Track>(entity).


And when you're fed up with entities and want to go home, then just reset the registry (registry.reset() in the EnTT version used here).


And that's about it as far as Sequentity goes, have a look at the EnTT Wiki along with my notes for more about EnTT. Have fun!

Here's how you draw.

```cpp
// Author some data
entt::registry registry;
auto entity = registry.create();

// Events may carry application data and a type for you to identify it with
struct MyEventData {
    float value { 0.0f };
};

enum {
    MyEventType = 0
};

auto& track = registry.assign<Sequentity::Track>(entity); {
    track.label = "My first track";
    track.color = ImColor::HSV(0.66f, 0.5f, 1.0f);
}

auto& channel = Sequentity::PushChannel(track, MyEventType); {
    channel.label = "My first channel";
    channel.color = ImColor::HSV(0.33f, 0.5f, 1.0f);
}

auto& event = Sequentity::PushEvent(channel); {
    event.time = 1;    // e.g.
    event.length = 50; // e.g.
}

// Draw it!
Sequentity::EventEditor(registry);
```


And here's how you query.

```cpp
const int time { 13 };
Sequentity::Intersect(track, time, [](const auto& event) {
    if (event.type == MyEventType) {

        // Do something interesting
    }
});
```

The example application uses events for e.g. translations, storing a vector of integer pairs representing position. For each frame, data per entity is retrieved from the current event and correlated to a position by computing the time relative to the start of an event.

Event Handlers

What you do with events is up to you, but I would recommend you establish so-called "event handlers" for the various types you define.

For example, if you define Translate, Rotate and Scale event types, then you would need:

  1. Something to produce these
  2. Something to consume these

Producers in the example applications are so-called "Tools" and operate based on user input like the current mouse position. The kind of tool isn't necessarily bound or even related to the type of event it produces. For example, a TranslateTool would likely generate events of type TranslateEvent with TranslateEventData, whereby you may establish an equivalent TranslateEventHandler to interpret this data.

```cpp
enum EventTypes_ : Sequentity::EventType {
    TranslateEvent = 0,
};

struct TranslateEventData {
    int x;
    int y;
};

void TranslateEventHandler(entt::entity entity, const Sequentity::Event& event, int time) {
    auto& position = Registry.get<Position>(entity);
    auto& data = *static_cast<TranslateEventData*>(event.data);
    // ...
}
```


Tracks are sorted in the order of their EnTT pool.

```cpp
Registry.sort<Sequentity::Track>([this](const entt::entity lhs,
                                        const entt::entity rhs) {
    // Index being an assumed component defining your draw order
    return Registry.get<Index>(lhs) < Registry.get<Index>(rhs);
});
```

State - such as the zoom level, scroll position, current time and min/max range - is stored in your EnTT registry which is (optionally) accessible from anywhere. In the example application, it is used to draw the Transport panel with play, stop and visualisation of current time.

```cpp
auto& state = registry.ctx<Sequentity::State>();
```

State is automatically created by Sequentity if you haven't already done so. You may want to manually create state for whatever reason, which you can do like this.

```cpp
auto& state = registry.set<Sequentity::State>();
state.current_time = 10;
```

This would draw the event editor with the current time set to 10.


Sequentity provides 1 ECS component, and 2 additional inner data structures.

```cpp
/**
 * @brief A Sequentity Event
 */
struct Event {
    TimeType time { 0 };
    TimeType length { 0 };

    ImVec4 color { ImColor::HSV(0.0f, 0.0f, 1.0f) };

    // Map your custom data here, along with an optional type
    EventType type { EventType_Move };
    void* data { nullptr };

    /**
     * @brief Ignore start and end of event
     *
     * E.g. crop = { 2, 4 };
     *       ______________________________________
     *      |//|                              |////|
     *      |//|______________________________|////|
     *      |  |                              |    |
     *      |--|                              |----|
     *  2 cropped from start             4 cropped from end
     */
    TimeType crop[2] { 0, 0 };

    /* Whether or not to consider this event */
    bool enabled { true };

    /* Events are never really deleted, just hidden from view and iterators */
    bool removed { false };

    /* Extend or reduce the length of an event */
    float scale { 1.0f };

    // Visuals, animation
    float height { 0.0f };
    float thickness { 0.0f };
};

/**
 * @brief A collection of events
 */
struct Channel {
    const char* label { "Untitled channel" };

    ImVec4 color { ImColor::HSV(0.33f, 0.5f, 1.0f) };

    std::vector<Event> events;
};

/**
 * @brief A collection of channels
 */
struct Track {
    const char* label { "Untitled track" };

    ImVec4 color { ImColor::HSV(0.66f, 0.5f, 1.0f) };

    bool solo { false };
    bool mute { false };

    std::unordered_map<EventType, Channel> channels;

    // Internal
    bool _notsoloed { false };
};
```


All data comes in the form of components with plain-old-data, including state like panning and zooming.



See Todo for now.

  • Safe guards for operator overloading?



    This might be due to the limits of what I know so I'm not sure if it's an issue or a support request :)

    I'm using ImGui's math overloads by using the define IMGUI_DEFINE_MATH_OPERATORS


    doing that conflicts with yours, and I have no idea how to overcome that... I'm using those in a bunch of places in my own libraries.

    Is there a way to make the overloads available only inside a namespace for instance or a way to avoid clashes?


    opened by melMass 3
  • License?



    Would you consider licensing the software under a standard bsd-2 license? I wouldn't be permitted to provide a copy of the software or the software product being developed for a customer.


    opened by bsthorpe 3
  • Event cropping with Head and Tail sections on Event clips



    I'd use Sequentity in our demotool, as seemingly this is the most mature one out there for ImGui I could find. However we needed to modify Event ranges from the GUI (instead of PushEvent). So I took a day and did this pull-request. I've seen there were already plans to make it; I hope what I did aligns with those plans actually.


    One caveat I haven't paid too much attention to yet: the Example application resets event.length during playback, when the current time hits the start of the event. I hope that's coming from the example application and not from Sequentity itself; if that's the case, sorry for my oversight. I'm not yet too familiar with EnTT (seems to be a nice thing tho, as I can see from the Example code)

    So as you can see I haven't touched the Example application actually to deal with changeable event length.

    I hope you like it, if you have any feedback, I can try to fix it or conform to it.

    opened by microdee 3
  • Custom deleter for application data



    Avoid memory leaks.


    Events carry your application data as a void* which means Sequentity can't help you with the removal of said data.

    // Store
    MyData* data = new MyData;
    Sequentity::Event event;
    event.data = static_cast<void*>(data);

    // Retrieve
    MyData* data = static_cast<MyData*>(event.data);

    Instead, your application is solely responsible for both creating and removing of related memory. That's no good..

    One way of tackling this could be a unique_ptr with a custom deleter, see suggestion here

    opened by alanjfs 1
  • Newbie question... How to generate the VS project...?


    Hey @alanjfs , thanks for sharing this. It looks really cool.

    Sorry for this silly question, Just learning to compile the project. But could you write some lines about generating the projects?

    What am I doing wrong?

    • Way 1: I downloaded Ninja and added it to the system path environment (Because it was mentioned into Example/CMakeSettings.json)

    From the sequentity-master/example folder I run cmake -G Ninja, and I am getting these errors:

    PS F:\openFrameworks\addons\ofxSurfingImGui\examples\5_Sequentity\MISC\sequentity-master\sequentity-master\Example> cmake -G Ninja

    CMake Warning:
      No source or binary directory provided.  Both will be assumed to be the
      same as the current working directory, but note that this warning will
      become a fatal error in future CMake releases.
    -- The CXX compiler identification is unknown
    CMake Error at CMakeLists.txt:3 (project):
      No CMAKE_CXX_COMPILER could be found.
      Tell CMake where to find the compiler by setting either the environment
      variable "CXX" or the CMake cache entry CMAKE_CXX_COMPILER to the full path
      to the compiler, or to the compiler name if it is in the PATH.
    -- Configuring incomplete, errors occurred!
    See also "F:/openFrameworks/addons/ofxSurfingImGui/examples/5_Sequentity/MISC/sequentity-master/sequentity-master/Example/CMakeFiles/CMakeOutput.log".
    See also "F:/openFrameworks/addons/ofxSurfingImGui/examples/5_Sequentity/MISC/sequentity-master/sequentity-master/Example/CMakeFiles/CMakeError.log".
    PS F:\openFrameworks\addons\ofxSurfingImGui\examples\5_Sequentity\MISC\sequentity-master\sequentity-master\Example>
    • Way 2: When using the GUI version of CMake I have come further and created the VS2017/VS2019 projects, but I am having errors when compiling:




    opened by moebiussurfing 3
  • Per-track input



    Provide the means for specifying from where to listen for input, such as a device like the mouse or a Wacom tablet.


    The example application currently listens for input coming from the mouse, via ImGui::IsItemActive() and ImGui::GetMouseDragDelta(). But if you wanted input from somewhere else you're out of luck. Furthermore, there is no interface for choosing an input source.

    DAWs lack a standard input like mouse and keyboard, and instead dedicate a section on each track from which the end user specifies which device a given track is meant to listen to once it comes time to record.

    Here's what that looks like in e.g. Bitwig.



    Would such an interface make sense for 2d/3d content creation?

    Inputs could be:

    • Mouse
    • Keyboard
    • Wacom Pen
    • Touchscreen
    • Midi XY Pad
    • More

    Tracks could then represent individual characters or parts of a character like an arm or a hand. One hand is driven by Mouse 1, the other by Mouse 2. The feet could be driven by something more high-level, like a pre-authored animation loop of a walk cycle. The local time of that animation clip could be driven by a linear pedal, capable of outputting values between e.g. 0-127. With that, you could potentially animate a walkcycle in real-time using 2 mice and a pedal.


    Each track contains a number of channels. The more channels you have, the more space is made available within a track, behind each of the channels. Maybe that's a good spot?

    | m  s                track 1 |
    |  ____________   channel 1 o |
    | |            |  channel 2 o |
    | | space here |  channel 3 o |
    | |____________|  channel 4 o |

    Device Capabilities

    Some devices are capable of providing 2d position data, like a mouse. A keyboard is somewhat able, if you consider the WASD or arrow keys. For MIDI, devices typically support both notes and modulation, pitch and such. And you still assign the whole shebang to a given track. The track then records each of these capabilities.

    Is there an equivalent we could apply for computer peripherals like mouse and keyboard? The keyboard being able to provide both button presses and second-order data like position via the arrow keys over time.

    opened by alanjfs 0
  • Input Handling 2.0


    A discussion topic regarding the way input is managed in the example application and how it can be improved.


    Translate input from any device - e.g. mouse, Wacom, Wii or XBox controller, eye tracker - to generic input, independent of device origin, with support for two or more devices operating in parallel, such as two mice.


    The coloured squares in the example application are currently controlled by click + dragging with your mouse. It should also work with a touch screen, courtesy of GLFW translating those hardware events into mouse events for us.

    Next I'd like to "map" the position of a square to one input - such as the mouse position - and the rotation of another to another input - such as the angle of my Wacom pen - and the color of another to whether or not the H-key on my keyboard is pressed, red if it is, blue otherwise.

    I figure there are a total of 4 different kinds of input that we as humans are able to provide the computer, irrespective of hardware, in either relative or absolute form, at various resolutions.


    • On/Off for any number of keys
    • 1D Range
    • 2D Range
    • 3D Range


    | Event | Type | Mode | Resolution |
    |:------------------|:------------|:---------|:-----------|
    | Mouse 2D | 2D Range | Rel | 16-bit |
    | Mouse 2D | 2D Range | Abs | 0-screen |
    | Mouse Key | On/Off | | 3-10 |
    | Keyboard Key | On/Off | | 20-50 |
    | Keyboard WASD | 2D Range | Rel | 20-50 |
    | Wacom Position | 2D Range | Abs | 16-bit |
    | Wacom Pressure | 1D Range | Abs | 0-4096 |
    | Wacom Angle | 2D Range | Abs | 0-4096 |
    | Playstation Key | On/Off | | 4-12 |
    | Playstation D-Pad | 2D Range | Abs | 4-12 |
    | Playstation Range | 1D Range | Abs | 4-12 |
    | Playstation Touch | 2D Range | Rel | 256 |
    | iPad Touch | 2D Range | Abs | 256 |
    | iPad Gyro | 2D Range | Abs | 256 |
    | Index 3D | 3D Range | Abs | 16-bit |
    | Index Key | On/Off | | 5-10 |
    | Index Finger | 1D Range | Abs | 256 |
    | GPS | 2D Range | Abs | 32-bit |
    | Midi Key | On/Off | | 0-127 |
    | Midi Knob | 1D Range | Rel | 127 |
    | Midi Slider | 1D Range | Abs | 127 |
    | Midi 2D | 2D Range | Abs | 127 |
    | Midi 2D | 2D Range | Rel | 127 |
    | Midi Velocity | 1D Range | Abs | 0-127 |
    | Midi Aftertouch | 1D Range | | 0-127 |
    | Motion Capture | 3D Range | | 16-bit |
    | Gesture | On/Off | | 1-n |

    I'd like to build my application around these 4 fundamental input types, and enable the user to pick any of these as sources from which to generate it.


    I'm not sure.

    I figure there must at least be a translation layer. Something dedicated to interpreting the data coming from the device, like the mouse, Xbox or Valve Index controllers, the keyboard and so forth.

    Your average application already provides two of these translation layers, for your mouse and for your keyboard.

    void Application::mousePressEvent(...) {}
    void Application::keyPressEvent(...) {}

    That's great, we can translate these into our 4 general-purpose input handlers.

    void Application::buttonEvent(...) {}
    void Application::range1DEvent(...) {}
    void Application::range2DEvent(...) {}
    void Application::range3DEvent(...) {}

    And now we can respond to these throughout our application, instead of to the mouse or keyboard directly. Then, when support for a new device is added, we can simply translate it to one or more of these 4 general purpose events.

    To poll or not to poll

    This one is always tricky. We don't care for events that happen more often than once every frame, and when we do care we want them to happen either at the beginning or end of each iteration.

    For example, if an event comes in before the scene is rendered, then we can take it into account. If it comes in during a render, it's somewhat pointless. But that's exactly what could happen in the current example application, as drawing and receiving events are entirely separate and happen independently. (As far as I can tell?)

    So polling seems the better option; at least in terms of predictability which I would trade for performance, if that is actually a tradeoff.


    So input can come at any time, great. But some inputs have a distinct beginning and an end, like dragging. Dragging is a combination of a button being pressed, a series of range2Ds (in the case of a mouse), followed by a button being released.

    Other input is a fire-and-forget type deal, like keyboard presses. Those are easier to conceptualise.

    example application 
    opened by alanjfs 0
  • Tools 2.0


    A discussion topic regarding the current implementation of the "tools" in the example application.



    Overcome limitations of the current implementation.

    1. Tools are called every frame, even when not "active"
    2. Tools are free functions, to avoid issues with type when storing a _activeTool in the application
    3. Tools do not support multiple inputs in parallel, e.g. wacom tablet + mouse movements
    4. Tools need metadata, like a label and type, which are currently independent
    5. Tools can only manipulate the next frame, i.e. they do nothing (useful) during pause
    6. Tools could be entities, but are not
    7. Inputs are assigned to entities being manipulated, should maybe be assigned to the tool instead?

    Overview of Current Implementation

    The user interacts with the world using "tools".

    A tool doesn't immediately modify any data, instead a tool generates events. Events are then interpreted by the application - via an "event handler" - which in turn modify your data. The goal is being able to play back the events and reproduce exactly what the user did with each tool.

    Tool     Event    Application
    | |        _
    | |------>| |        _
    | |       | |------>| |
    |_|       | |       | |
              |_|       | |
                        | |
                        | |

    The application has a notion of an "Active Tool" which is called once per frame.


    The tool does nothing, unless there is an entity with a particular component called Active along with some "input".


    The Active component is assigned by the UI layer, in this case ImGui, whenever an entity is clicked.


    These are the three states of any entity able to be manipulated with a Tool.

    • Activated entity has transitioned from being passive to active, this happens once
    • Active entity is activated, and being manipulated (e.g. dragged)
    • Deactivated entity transitioned from active back to passive, happens once


    Overall I'm looking for thoughts on the current system, I expect similar things have been done lots of times, perhaps with the exception of wanting to support (1) multiple inputs in parallel, e.g. two mice and (2) wanting the user to ultimately assign an arbitrary input to arbitrary tools, e.g. swap from the mouse affecting position to the Wii controller.

    Ultimately, the application is meant to facilitate building of a physical "rig", where you have a number of physical input devices, each affecting some part of a character or world. Like a marionettist and her control bar.

    Code wise, there are a few things I like about the current implementation, and some I dislike.


    • Tristate I like the Activated, Active and Deactivated aspect; I borrowed that from ImGui, which seems to work quite nicely and is quite general.
    • Events rule I like that tools only have a single responsibility, which is to generate events. That means I could generate events from anywhere, like from mouse move events directly, and it would integrate well with the sequencer and application behavior.
    • Encapsulation I also like that because events carry the final responsibility, manipulating events is straightforward and intuitive, and serialising those to disk is unambiguous.
    • Generic inputs And I like how inputs are somewhat general, but I'll touch on inputs in another issue to keep this one focused on tools.


    • UI and responsibility I don't like the disconnect between how Active is assigned from the UI layer
    • Inputs on the wrong entities I don't like how inputs e.g. InputPosition2D are associated with entities like "hip", "leftLeg" etc. rather than a tool itself, which seems more intuitive.
    example application 
    opened by alanjfs 2
  • Arrangement Editor



    Organise groups of events at a high level.


    DAWs like Ableton Live, Bitwig, FL Studio and others enable you to work with a group of events called a "clip" in a loop, to then later arrange this clip alongside other clips on a global timeline. That's very handy for when you've got too many events to manage individually. Swap out the surgical knife for a chainsaw.


    There's lots of reference to draw from here. Bitwig does a good job at this I find.


    It can even combine that with effects and a clip view, all in one screen, without being overwhelming.


    Live does something similar that also works well.


    As does Helio.


    And Cubase.


    Both Cubase and Logic are able to draw both arrangement and event editor at the same time, drawing events from the currently selected clip in the arrangement view, which I especially like and think could be a good fit for Sequentity as well.


    opened by alanjfs 0
  • Curve Editor



    A general-purpose editor of the application data associated with an event.


    Events ultimately represent your application data, and you typically edit that elsewhere. But sometimes the data is general enough for it to be suitable for a basic graph editor, like position or rotation over time.


    Where DAWs implement modulation/velocity editors, we could make a curve editor akin to Blender, Maya and Houdini.

    Bitwig implements an editor for stepped keys, without interpolation between values, and thus doesn't really qualify as "curves". That works, though as a user I never really was a fan and typically re-record instead to avoid the finicky interface.


    Live does the same.


    Helio does the same as well, but with a handy slice-tool.


    Cubase is getting closer to curves, whereby lines are drawn between events as opposed to boxes or emptiness.


    What we want though is closer to that of Blender and Maya.



    In each of these references, I think a separate window/panel is a good fit, with an option to overlay the data on its parent event.
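    The stepped-versus-curve distinction boils down to how keyed values are evaluated between keys. Here's a minimal sketch of both evaluation modes; Key and the evaluate functions are hypothetical names for illustration, not part of Sequentity.

    ```cpp
    #include <cassert>
    #include <cstddef>
    #include <vector>

    struct Key {
        int time;
        float value;
    };

    // Stepped (Bitwig-style): hold the value of the most recent key
    float evaluate_stepped(const std::vector<Key>& keys, int time) {
        float value = keys.front().value;
        for (const Key& key : keys) {
            if (key.time > time) break;
            value = key.value;
        }
        return value;
    }

    // Linear (closer to Blender/Maya): interpolate between surrounding keys
    float evaluate_linear(const std::vector<Key>& keys, int time) {
        if (time <= keys.front().time) return keys.front().value;
        if (time >= keys.back().time)  return keys.back().value;
        for (std::size_t i = 1; i < keys.size(); ++i) {
            if (keys[i].time >= time) {
                const Key& a = keys[i - 1];
                const Key& b = keys[i];
                float t = float(time - a.time) / float(b.time - a.time);
                return a.value + t * (b.value - a.value);
            }
        }
        return keys.back().value;
    }

    int main() {
        std::vector<Key> keys { {0, 0.0f}, {10, 1.0f} };
        assert(evaluate_stepped(keys, 5) == 0.0f);  // holds the previous key
        assert(evaluate_linear(keys, 5) == 0.5f);   // halfway between keys
        return 0;
    }
    ```

    A proper curve editor would swap the linear segment for a spline with per-key tangents, but the evaluation interface stays the same, which is what makes it general-purpose.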
