THE COMPUTER MOUSE AS A MIDI INSTRUMENT: a new feature for the open-source project The Amanuensis: Automated Songwriting and Recording

in #utopian-io · 6 years ago (edited)

logo by @camiloferrua

Repository

https://github.com/to-the-sun/amanuensis

The Amanuensis is an automated songwriting and recording system aimed at ridding the process of anything left-brained, so one need never leave a creative, spontaneous and improvisational state of mind, from the inception of the song until its final master. The program will construct a cohesive song structure, using the best of what you give it, looping around you and growing in real-time as you play. All you have to do is jam and fully written songs will flow out behind you wherever you go.

If you're interested in trying it out, please get a hold of me! Playtesters wanted!

New Features

  • What feature(s) did you add?

The Singing Stream is a vital aspect of The Amanuensis, allowing you to turn any stream of data into MIDI notes. One obvious stream had been missing up to this point, however: data from the computer's mouse. Now the horizontal position, the vertical position and mouse clicks can all be used, letting your mouse serve as your instrument.

Combined with an eye tracker, this allows you to play notes based solely on where your eyes are looking on the screen, leaving your hands free to play any other instruments at the same time.
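The real implementation is a Max patcher, so there's no text to show here, but for anyone who prefers reading code, below is a very rough TypeScript sketch of the general idea. None of these names exist in the repository; the pitch and velocity ranges are just placeholders.

```typescript
// Illustrative only: the actual logic is a Max patcher, not TypeScript.
// Hypothetical helpers sketching how mouse data could become MIDI notes.

interface MouseSample {
  x: number;        // horizontal position in pixels
  y: number;        // vertical position in pixels
  clicked: boolean; // true on a mouse-button press
}

// Scale a value from 0..inMax into outMin..outMax, clamped.
function scale(value: number, inMax: number, outMin: number, outMax: number): number {
  const t = Math.min(Math.max(value / inMax, 0), 1);
  return outMin + t * (outMax - outMin);
}

// Map a sample to a note: x picks the pitch across two octaves, y picks the
// velocity (top of the screen = loudest), and a click triggers the note-on.
function mouseToMidi(sample: MouseSample, screenW: number, screenH: number) {
  const pitch = Math.round(scale(sample.x, screenW, 48, 72));       // C3..C5
  const velocity = Math.round(scale(screenH - sample.y, screenH, 1, 127));
  return sample.clicked ? { status: 0x90, pitch, velocity } : null; // 0x90 = note-on
}

// Example: a click near the right edge of a 1920x1080 screen.
console.log(mouseToMidi({ x: 1700, y: 200, clicked: true }, 1920, 1080));
```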

  • How did you implement it/them?

If you're not familiar with it, Max is a visual language, and textual representations like those shown for each commit on GitHub aren't particularly comprehensible to humans. You won't find any of the commenting there either. Therefore, I'll present the work that was completed using images instead. Read the comments in those to get an idea of how the code works; I'll keep my description here focused on the process of writing that code.

These are the primary commits involved:

A new subpatcher was created to handle the mouse functionality. It operates in much the same way as the hid subpatcher (for human interface devices, specifically game controllers and the like), except that every computer can be assumed to have a working mouse, so its streams can be initialized by default rather than waiting for each new control to come in. This simplifies things a bit, for example by alleviating the need for dynamic scripting.


the new mouse subpatcher, complete with commenting
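As a loose text analogy of what that difference amounts to (every identifier below is hypothetical; the real code is the subpatcher shown above), the hid subpatcher has to register streams lazily because it can't know which controls a device will offer, while the mouse subpatcher can register its streams up front:

```typescript
// Rough TypeScript analogy; the actual logic lives in Max subpatchers.
const streams = new Map<string, number[]>();

function registerStream(name: string): void {
  if (!streams.has(name)) {
    streams.set(name, []); // create the stream the first time it is named
  }
}

// The hid subpatcher can't know in advance which controls a given game
// controller exposes, so each stream is registered lazily, the first time
// data from that control arrives.
function onHidValue(control: string, value: number): void {
  registerStream(control); // dynamic: wait and see what shows up
  streams.get(control)!.push(value);
}
onHidValue("hid_button_3", 1);

// Every computer can be assumed to have a working mouse, so its streams can
// simply be registered by default at startup, with no dynamic bookkeeping.
["mouse_x", "mouse_y", "mouse_click"].forEach(registerStream);
```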

In addition, the process of initialization has been reworked to be less convoluted and more streamlined. Previously the audio driver streams were arbitrarily chosen as the first to initialize, and only after that were the other streams allowed to proceed. This served only to ensure that the UI menu listing all the sources was cleared at the appropriate time and not re-cleared later on.

As more and more stream sources were added, this led to a cumbersome chain of events, so I reworked the system so that all sources are free to "race" toward initialization, with the first to finish being the one to clear the menu.

In the process, ---initialize_stream was introduced, which each stream now uses to explicitly initialize itself. Previously, the streams themselves were continually monitored for new arrivals, and each new one was initialized when it was spotted. This was not at all efficient and even seemed to start failing in inexplicable ways when the mouse streams were added, which is the main reason I decided to bypass the whole issue with this overhaul.


the reworked initialize subpatcher, complete with commenting. Each of the other stream source subpatchers has also been reworked (in a manner resembling p mouse in the first screenshot above) to utilize ---initialize_stream, which is received here
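For the gist of the new flow without reading the patcher, here is a loose TypeScript analogy. The menu functions and initializeStream are hypothetical stand-ins (the latter for the ---initialize_stream message); the point is simply that each source announces itself explicitly, and whichever one arrives first clears the menu.

```typescript
// Rough TypeScript analogy of the reworked initialization; not actual code
// from the repository.
let menuCleared = false;
const menu: string[] = [];

// Stand-ins for the UI menu that lists all the stream sources.
function clearMenu(): void {
  menu.length = 0;
}
function addMenuItem(name: string): void {
  menu.push(name);
}

// Each stream source now initializes itself explicitly, instead of being
// discovered by continually monitoring the pool of streams. Whichever source
// gets here first "wins the race" and clears the menu; the rest just append.
function initializeStream(name: string): void {
  if (!menuCleared) {
    clearMenu();
    menuCleared = true;
  }
  addMenuItem(name);
}

// The audio driver no longer has to go first; any arrival order works.
initializeStream("mouse_x");
initializeStream("audio_in_1");
initializeStream("hid_button_3");
console.log(menu); // ["mouse_x", "audio_in_1", "hid_button_3"]
```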

GitHub Account

https://github.com/to-the-sun


Welcome back @to-the-sun! It's great to see you're still actively working on the project and adding new features. As always, it's difficult for me to give feedback about something I don't know anything about, so I apologise for that. Anyway, here are a couple of things:

  • The commit was made 19 days ago; try to keep it under 14 days when submitting to Utopian.
  • Not really applicable for this post since you only changed one line, but in timelineGL.js there are a lot of things that could be improved. I'd recommend installing a linter and running it over the file, as it will probably fix the majority of the problems.
  • Good commit messages and pictures with comments - made it a lot more understandable for me!

Any closer to being satisfied with the Amanuensis, or do you still have plenty of ideas?


Your contribution has been evaluated according to Utopian policies and guidelines, as well as a predefined set of questions pertaining to the category.

To view those questions and the relevant answers related to your post, click here.


Need help? Chat with us on Discord.

[utopian-moderator]

Thanks @amosbastian! I have CRPS (severe nerve damage) in my hands and I suffered another major injury a couple months ago, so I was completely out of it for a while. I still don't really have the use of my hands, but I'm slowly working my way back into being productive and it feels good. Unfortunately I'm still stuck using voice recognition on my computer for everything, so the going is very slow compared to what I would be accomplishing otherwise.

  • Sorry, I didn't realize how strung out the commit was getting (I accomplish so little in a day, I feel like I'm stuck in a time warp), but everything I outlined here was actually done within the last 14 days. I'll try to be more careful in the future.
  • A linter huh… You mean the thing that VS Code is always trying to get me to install but I keep telling it to go to hell because the one I had for Python was constantly annoying me with pop-ups that got in the way of what I was doing? :) Ha ha, I've been avoiding them like the plague, but I suppose I wasn't even entirely sure what they were for, admittedly. Perhaps I can look into it again. Is there anything in particular you were seeing?
  • gives a tip of the hat

Always closer, but I'll never be satisfied. I do hope to be getting to a point soon where all of the fundamental, functional aspects of the system are in place and I can move on to focusing more heavily on the AI aspect: getting it smarter about identifying rhythm and aberrations in rhythm and, beyond that, larger patterns of all kinds (e.g. repeated riffs, the scale, choruses, etc.). Then there are always my grand schemes of turning it into an entire rhythm game, as I outlined once in a blog post, and eventually bringing it to mobile platforms… The list goes on. This thing is basically my life's work. I won't be done with it until I'm dead, or possibly until I've used it to transform myself into some sort of virtual songwriting incarnation, perpetually churning out a Pandora's music box of more and more enthralling auditory patterns…

Oh man, that's terrible to hear. I remember you mentioning something like this before (I think you also had a video where you were using voice recognition software), so I really hope everything goes well and you recover as much as you can. It's great that you're working your way back to being productive; hopefully it continues, as I can imagine it being very frustrating.

Linters are great! They basically do all the formatting for you, which is really nice as you can just focus on the coding.

Is there anything in particular you were seeing?

Well, there are a lot of little things that could be improved, like using let and const instead of just var - once you get ESLint you'll see. ;) Anyway, hope everything is okay and you keep producing cool updates for the Amanuensis!
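(For anyone curious what that looks like in practice, this is the sort of thing ESLint's no-var and prefer-const rules flag; the snippet below is a generic illustration, not code from timelineGL.js.)

```typescript
// `var` is function-scoped and hoisted, so it leaks out of the block it
// appears to belong to; ESLint's no-var rule would flag both of these.
var total = 0;
for (var i = 0; i < 4; i++) {
  total += i;
}
console.log(total, i); // 6 4 — `i` is still alive after the loop

// Preferred: block-scoped bindings, const by default, let only when reassigned.
let sum = 0;
for (let j = 0; j < 4; j++) {
  sum += j;
}
const label = `sum = ${sum}`;
console.log(label); // "sum = 6"
```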

Thank you for your review, @amosbastian! Keep up the good work!

Hi @to-the-sun!

Your post was upvoted by @steem-ua, new Steem dApp, using UserAuthority for algorithmic post curation!
Your post is eligible for our upvote, thanks to our collaboration with @utopian-io!
Feel free to join our @steem-ua Discord server

Hey, @to-the-sun!

Thanks for contributing on Utopian.
We’re already looking forward to your next contribution!

Get higher incentives and support Utopian.io!
Simply set @utopian.pay as a 5% (or higher) payout beneficiary on your contribution post (via SteemPlus or Steeditor).

Want to chat? Join us on Discord https://discord.gg/h52nFrV.

Vote for Utopian Witness!