I recently gave a talk to my local tech network. You know, those things that used to be get-togethers and networking events for tech people, entrepreneurs, start-up people, innovators, inventors, etc., and that people discovered could be done over Zoom during the pandemic. And they obviously work, because we are still doing them.
Anyways, a casual comment I made at one of these tech chats turned into a talk about one of the things that I do, inspired by my soundtrack entry in the famous Westworld competition organised by Spitfire Audio a couple of years ago. It seems that there's a lot of interest in how current technology can make working with audio and music a lot easier than it was in the previous century, and so I basically just did a bit of show and tell...
Remember 'big' presentations? Photo by Sigmund on Unsplash
Now when I say 'show and tell', I do mean exactly that. I never wanted to do yet another boring presentation full of slides with bullet points. But just watching someone share their screen for an hour is also not so great - I've been in quite a few Zoom calls where person after person shared their screen and worked on software, and after watching someone else tweaking MaxForLive for a while, you kind of want to do some programming yourself. Probably my least favourite calls have been the ones where a series of musicians talk for about 30 seconds on some of their techniques, and then spend 20 minutes doing DAWless improvisation. It's the inevitability of it - you get 30 seconds of interesting information, and just when you start to learn about a technique that might be useful, they say: '...and here's a track I put together using a different approach...'. Cue 20 minutes of doodling...
So, no slides, no bullet points, and not too much boring screen sharing. It's a challenging recipe, so I used online videos (mostly YouTube - although I subscribe to Nebula and love it, YouTube has the advantage of being accessible (and I'm struggling to think of any other advantage...)), web-pages instead of photos 'from the internet', and yes, some screen sharing where I avoided any code and concentrated on showing interactive arranging stuff.
At the end of it, I thought that I should capture it, so that others could have a similar experience, and so the rest of this blog is just the resources that I used, minus the potentially boring screen sharing where I probably droned on about doing music for pictures. So you get just the good stuff to browse through as you wish, and that's all upside, as far as I can tell...
(When I type: 'Just the good stuff', there's a caveat, but you probably know that already - you have to wade through me adding all of these explanatory words. Unless you just ignore my words and click on the videos, of course...)
"And now, over to Martin..." <screen goes black>
Resources...
To set the scene, I used an opening music clip - 'Journey across the Red Planet', an excellent piece of music from Paul Thomson, which demonstrates some of the sounds from the Spitfire Audio 'Abbey Road One' library. (Paul is one of the two founders of Spitfire Audio, a cutting-edge UK 'sample library' company: https://www.spitfireaudio.com ) I explained that 'everything you are hearing is produced by a computer, using recordings of real instruments'.
I suggested that they should close their eyes for a minute or so, listen(!), then open them and look for the connections between what was happening on the screen and what they could hear. The video shows a DAW (Logic) playing the music, and so you can get some sense of how a DAW uses lots of individual tracks of virtual instruments to reproduce music, and there were piano rolls and MIDI Controller editing shots that illustrate that there's a lot of fine, detailed control. Overall, the linkage between the music and the video is shown pretty effectively, but then Spitfire Audio do make very good videos. So, yes, I started with virtually an advert for Spitfire Audio, but then I do have quite a few of their libraries, LABS instruments and a lot of the associated Pianobook.co.uk instruments, so I'm slightly biased. If you've read this blog for any time, then you will have seen that I've been to various events at their HQ (back before Pan Demic and her band put the world on pause for a while...) and I've met Christian Henson and Paul Thomson... (But do they remember me?)
Anyways, the music and the video serve as that all-important bridge, where you leave the real world, and enter the artificial world of 'the talk'. I've never liked the idea that putting up a slide that shows the title of your talk, followed by another slide that lists your life achievements in bullet points, is the perfect way to move people out of their default mind-set and into one where they are ready to fully engage in a presentation. Closing your eyes and listening helps too, and it often puts any older members of the audience to sleep, so they can't ask tricky questions about DIN sync in the 1970s.
Anyways, I introduced virtual instruments, and how they replay recordings of real instruments. Or unreal instruments: I showed them my BankOSC MaxForLive device, which makes 32-oscillator drones and sweep sounds, and basically makes it sound like you have a humungous hardware modular synth, when actually all you have is Ableton Live and a free bit of software that I published on MaxForLive.com.
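For anyone curious about what a 'bank of oscillators' actually amounts to, here's a rough sketch in Python (not Max, and definitely not the actual BankOSC patch) of the basic idea: sum lots of slightly detuned oscillators and you get a thick, slowly shifting drone. All of the numbers here (32 oscillators, a 110 Hz base pitch, the detune range, the output filename) are just illustrative assumptions.

```python
# A minimal sketch of the 'bank of detuned oscillators' idea behind a drone sound.
# Not the BankOSC device itself - just the principle, with made-up parameter values.
import numpy as np
import wave

SR = 44100          # sample rate in Hz
DURATION = 10.0     # length of the drone in seconds
NUM_OSC = 32        # number of oscillators in the 'bank'
BASE_FREQ = 110.0   # arbitrary base pitch (A2)

t = np.arange(int(SR * DURATION)) / SR
rng = np.random.default_rng(0)

drone = np.zeros_like(t)
for _ in range(NUM_OSC):
    detune = rng.uniform(-0.5, 0.5)      # small random detune in Hz per oscillator
    phase = rng.uniform(0, 2 * np.pi)    # random start phase avoids an initial click
    drone += np.sin(2 * np.pi * (BASE_FREQ + detune) * t + phase)

drone /= NUM_OSC                         # scale the sum back into the [-1, 1] range

# Write a 16-bit mono WAV so the result can be auditioned in any audio player.
samples = (drone * 32767).astype(np.int16)
with wave.open("drone_sketch.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(SR)
    wf.writeframes(samples.tobytes())
```

The detuned oscillators slowly drift in and out of phase with each other, which is where that characteristic 'huge modular synth' thickness and movement comes from; sweeps are just the same trick with the frequencies or levels changing over time.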