lpTouch iPad app

Hey everyone, just wanted to let you know about a pretty cool iOS app I've been using lately. I know I don't normally post app recommendations on the blog, but I found this one especially useful.  If you have Logic and an iPad, this is a really easy-to-use control surface app for Logic Pro. Easy to understand, and very simple to pair with your computer as well.

There are two skins to choose from (light and dark), support for the new Retina iPads, and 5 screen layouts:

– Main mixer screen.
– Pans and sends.
– Channel strip.
– EQ page.
– Instrument parameters.

I used this quite a lot over the weekend, and I have to say it's one of the first iOS control surface apps that I haven't found fiddly to use or set up.  The controls are large enough to easily grab the right one, and knob control can be set up to use linear or circular dragging motions.  Worth a look for any iPad and Logic user wanting a simple but powerful control surface for the two.  Best of all, it's only $4.99.


The Flickering Dark

This is a sort of proof of concept for a new type of live PA I've been working on for the last 8 days.  At only 20 minutes long it's not so much a demo as an experiment, a way for me to see if this is a valid way to play live.


There are some benefits to being left home alone (plus a dog) for 16 days while the wife goes on vacation with her sister.  Knowing I'd have this time to myself to work any schedule and do whatever I wanted, I planned on writing a LOT of music.  I prepped material for a new Machinedrum live set, bought some new apps for the iPad, and even prepped some songwriting templates in Live just in case I got an idea.  In short, I got all the BS out of the way before she even left 🙂

Of course, things never go according to plan, and literally on the day she left I got this left-field idea to try to get a working live PA set up with Stylus RMX and Omnisphere in Ableton Live.  I've tried it a few times before, but always ran into hurdles that kept me from getting it set up in a fluid, performable way.

The key this time was realizing I could use Live's Looper devices, much like I use the Elektron RAM machines in my Machinedrum live sets.  So I have one instance of RMX and one of Omnisphere (Omni) in the set, and I use them both in Multi mode.  This way I can use a Multi in each device for each of my "songs" in the set.  With a Looper on each of those tracks, I can capture the audio from them and have it start looping immediately while I switch to a new Multi on the plug-ins.  Switching the Multi in Stylus was the only time I needed to use the trackpad, in fact.

The only tricky bit was figuring out how to fade from the audio looping in Looper on each track to the new material I had just loaded.  I ended up using an audio effect rack, with one chain for the Looper and one for the dry audio.  By mapping a track fader on the APC to the chain selector, I could crossfade from the looped material to the new stuff for the next song.
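For the curious, the crossfade behind that chain-selector mapping can be sketched in a few lines of Python. The equal-power curves here are my own illustration of the idea, not necessarily what Live uses internally:

```python
# Sketch of the crossfade behind mapping a fader to a rack's chain
# selector: chain A (the Looper) fades out as chain B (the dry audio)
# fades in.  Equal-power curves keep the perceived level constant.
# The 0-127 range mirrors a MIDI fader; the names are illustrative.
import math

def chain_gains(fader: int) -> tuple[float, float]:
    """Return (looper_gain, dry_gain) for a MIDI fader value 0-127."""
    t = fader / 127.0                      # normalize to 0.0-1.0
    looper = math.cos(t * math.pi / 2)     # 1.0 -> 0.0
    dry = math.sin(t * math.pi / 2)        # 0.0 -> 1.0
    return looper, dry

# At the extremes only one chain is audible; in the middle both sit
# about 3 dB down, so the summed level stays roughly constant.
print(chain_gains(0))    # (1.0, 0.0)
print(chain_gains(127))  # roughly (0.0, 1.0)
```

In practice you'd just set up the two chain-selector zones to overlap in the rack and let Live do this math for you.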

I used 3 tracks of drums from Stylus, and 4 tracks of synths in Omnisphere.  The APC40 handles clip launching and tweaking all the Stylus and Looper parameters.  (Stylus and Omni both have excellent MIDI mapping utilities, btw.)

Here are a couple of views of the Live Set:

I used Omni TR on the iPad2 to control everything in Omnisphere, from switching sounds, to tweaking everything live, to selecting the Multi for each song.  Strangely, I'd get an audible glitch when switching Multis with TR, a super-short pop.  Even more strangely, this didn't get recorded in the audio I saved to post online.  Go figure!

Anyway, pretty happy with it overall, even if it does sound a little confusing on paper.  I’ll start working on some more material for the set over the next month or so.  Fun stuff, enjoy!

What’s Next?

First up, sorry for the blog seeming to wind down over the last few weeks.  Every year the end of summer becomes an incredibly busy time for me, both with the mastering business and in my personal life.  I'm hoping to get back on a weekly schedule now that things are returning to normal.  And thanks to all the people who continue to recommend my studio to their friends and fellow musicians.  Recommendations like this still help a lot, so I really appreciate it.


I've been spending a lot of time trying to figure out what type of music project to tackle next. I've been getting the itch to put together a new live set; I just seem to be in a performance-oriented mindset lately.  Having just wrapped up the Monomachine and Machinedrum live project, I thought it might be an interesting exercise to see if I could do a live set 100% on the iPad.

So far I've done two complete songs on the iPad, one using Garageband (Slip), and one using NanoStudio (Slat).  So I know that there are apps out there capable of making some good sounds. I just hadn't spent much time thinking about the platform as a live tool, aside from, say, Ableton Live controllers like TouchOSC or TouchAble.  I really didn't want to just use a controller though; I wanted to create and perform an entire set using one app natively on the iPad.

Of course, I often get ideas like this without doing any research first, and it soon became apparent that there just aren't a lot of apps out there capable of being used live.  I was willing to deal with a lot of limitations, however, so I started looking into apps that might fit the bill.  My only real requirement was that I could write songs or patterns in the app and somehow switch from one to another on the fly.  That way I could chain songs together into a full hour-long set, hopefully with at least some measure of real-time control and sound tweaking too.

One of the first apps that caught my eye was TechnoBox2, which is basically a really nice-looking 909/808/303 clone à la ReBirth.  Much cleaner interface though, with 2 synths and 2 drum machines active at any time.  It supports up to 12 patterns per device, and you can switch between them in real time.  However, each pattern can only be one bar long, and with only 12 on offer, I wasn't sure I could do anything interesting for more than 20 minutes straight.  Plus, I'm not sure the old-school Roland sound is really something I want to limit myself to for this project.  Been there, done that. In the end, I decided to pass.

Honestly, for the longest time I just didn't find anything else that looked like it would suit my needs, even allowing for some severe limitations on my working methods.  Most of the apps that seemed at all oriented toward live use were straight-up drum machines like Bleep!Box or Korg's iElectribe.  Powerful and fun tools, but I've owned a real ER-1 in the past and know how frustrating it can be to create good synth sounds from a drum synth.

Then I started looking into possible workarounds with one of the DAW-style apps like BeatMaker or NanoStudio.  They certainly offer a lot of sound-generating capabilities via sampling or synthesis, but I just couldn't come up with a fluid way of using them in a live situation.  Too much jumping around between screens, trying to both tweak sounds and find some way of switching songs on the fly.

Then by chance I ran into a review of an app called Electrify, which to be honest looked perfect.  A sort of mini version of Ableton Live, it has 8 tracks and 8 scenes available at once, X-Y pads for tweaking effects, and the ability to load your own samples.  More than anything, it looked like a fun way to create your own loops and grooves, and it even has a pretty decent factory set of samples to get you started.  I set to work making material for a new live set, and I had a great time doing so.

At first.

Unfortunately, the latest version of the app is super buggy, and before long I was running into issues and other weirdness that more or less made me give up on the idea for now.  For what I want to do, I think it's definitely the best app out so far, but it's just not stable enough, or even fully usable, to see this project through.  I've been writing music for too long to feel I need to soldier on with buggy software, so for now I'm stuck waiting for the developer to release an update that sorts out the confirmed issues.  The good news is that he's aware of them and has promised an update soon.


So, I then started considering doing another Machinedrum-only live set.  I know a lot of people have told me they feel my Machinedrum (MD) sets sound really basic compared to my usual downtempo productions, but honestly, using the MD is the most fun I've had making music lately.  I know it very well and can get ideas down pretty quickly on it, so it just feels like a real instrument to me.  It's one of the only bits of hardware I've used that feels truly performance-oriented.  Same sort of vibe I get from, say, playing my guitar.

I decided that if I was going to go down this route again though, I needed some new samples for the UW aspect of the Machinedrum.  For this project I wanted to stick with the internal synthesis engines for all of the drum and percussion sounds, and use the samples for my instruments and synths.  I spent the last week or so using Live and softsynths like Omnisphere and Synplant to create about 40 samples, totalling only 1.3MB.  Remember, the MD-UW MKII only has 2.5MB of sample space, and I need to save some of that for on-the-fly resampling live 🙂  It's always fun trying to get samples as small as possible, and a good reminder these days that you don't need GBs' worth of samples to make compelling music.  At least I hope it's compelling….
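As a rough sanity check on those numbers, here's the arithmetic in Python, assuming 16-bit mono samples at 44.1kHz (the exact format the MD-UW stores may differ, so treat the figures as illustrative):

```python
# Back-of-the-envelope check on the sample budget above, assuming
# 16-bit mono audio at 44.1 kHz.
SAMPLE_RATE = 44100       # samples per second
BYTES_PER_SAMPLE = 2      # 16-bit mono

def seconds_for(megabytes: float) -> float:
    """Seconds of audio that fit in the given number of MB."""
    return megabytes * 1024 * 1024 / (SAMPLE_RATE * BYTES_PER_SAMPLE)

total = seconds_for(2.5)   # the whole MD-UW sample memory
used = seconds_for(1.3)    # the 40 samples above
print(f"{total:.1f}s total, {used:.1f}s used, "
      f"{used / 40:.2f}s average per sample")
# -> 29.7s total, 15.5s used, 0.39s average per sample
```

In other words, the whole machine holds under half a minute of audio, which is why keeping each sample well under a second matters so much.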

Just when I had all the prep work wrapped up and was ready to start writing though, I suddenly got it into my head that maybe it was time to revisit the idea of turning Ableton Live and the APC40 into a super groovebox.  Something I could use not only to quickly sketch up loops and grooves, but also to perform and manipulate them in a live setting.  All centered on MIDI loops and softsynths.  But as this is already getting pretty long, I think I'll save that topic for another time.  🙂


Still got room for a couple more questions for next week’s Production Q&A as well, so if you have any questions, please submit them via email or in the comments of this post asap.  Thanks!

…and tigers and bears.

Well, despite all the warnings not to, last night I updated my OS to OS X Lion.  I thought I'd share some of my thoughts on it so far for those still on the fence, or wisely waiting a little bit longer.  The main thing that made me decide to give it a try was seeing that the latest beta of Live (8.2.5b1) is now listed as Lion-compatible. I figure I have a current Time Machine backup of my Snow Leopard (SL) install if things just aren't working, and I'm going to need to upgrade to Lion eventually anyway.  The fact that you can redownload it if needed means I can always reinstall Lion at a later date.

I'm running it on a 2010 2.66GHz i7 MacBook Pro, for those who are curious.

The download and install process was pretty simple: only 45 minutes to download, and roughly 20 minutes to install.  No fancy launch screens after the install completes this time; your computer just boots back to the desktop, looking just like SL did before the install.  There are some new icons in the Dock, mainly for Mission Control and the App Store, and the current user name is now displayed in the menu bar (easy enough to Cmd-drag it off though).

Once you start using the computer more, the changes become more obvious.  The main thing I'd read about was people hating the new reversed scrolling behaviour.  It was a little weird at first, but honestly, within a couple minutes I was used to it, and I think it makes more sense.  The one thing I didn't realize is that the scroll wheel on my mouse is also reversed now, and that I'm having a harder time getting used to.  The scrolling preference is global across all devices, so you can't have different settings for the trackpad and a mouse, for instance.

The fact that the scroll bars now auto-hide à la iOS is super nice too; it saves a lot of space on the screen.  The full-screen mode that most OS X apps now have is also REALLY nice in my opinion.  I always loved that Ableton Live could do this, as it really makes you focus on the app you're using.  Super glad to have this natively in Logic now, without having to resort to preference hacks like in the past.  It's really useful in Safari and Preview too.

When apps are in full-screen mode, you can three-finger swipe left or right on the trackpad to switch between apps or get back to your desktop.  It works well, makes sense, and looks nice too.  Slightly confusing for me was the fact that you now two-finger swipe left or right in Safari to go back or forward in your history, and the animation looks really similar to the one used to switch between apps.  It takes me a second to remember that I'm in the history and not switching to a different app, just because the two actions are so similar visually.

The new saved state in apps is OK, but I'm not sure how I feel about it yet.  It's weird not seeing the little x in the red close button reminding you to save, though your document title does show "Edited" next to it until you do.  I'm not sure I really like how opening a closed app recalls the last document I worked on, though.  For instance, if I close TextEdit after jotting down some notes, relaunching it later pulls up those notes instead of a new blank document.  Ditto for Safari: it no longer opens with my homepage, it opens with the last website I was on when I closed it.  Not a huge deal, as what I want is only a key command away, but it's kind of weird to get used to.

Mission Control and Launchpad are honestly really not my thing; both seem way too cluttered to save any time.  Mission Control is just a clusterfuck of information on the screen at once if you have a lot of apps open, and I really don't think it's organized that intuitively.  I don't use Spaces at all though, so perhaps those kinds of users will find it easier to navigate.  Likewise, Launchpad is supposed to mimic the home screen of an iOS device, but instead you get a ton of icons spread across multiple pages.  Worse, they're in all sorts of random order too, so it's not that easy to find what you need.  You can rearrange them and create folders, but honestly, for me it's easier to just Spotlight-search for an app if I need it and it's not in my Dock.

On the music app side, things appear to be working OK so far.  I had one hiccup where Omnisphere was not recalling the correct patch in a saved Live project, but after closing and reopening the project, it's now as it should be.  Logic and my mastering setup (Wave Editor) appear to be working fine too, though I haven't run them long enough to be absolutely sure.  I don't have a ton of plug-ins and instruments, but after trying them all last night, everything seems to be working OK.

A real concern for me was reading that my version of QuickBooks (2009, which I use for my business accounting) was not going to be supported in Lion, and that I would need to pay $170 to upgrade to QuickBooks 2011.  As anyone with their own business will tell you, the accounting software is the backbone of the business, so when it's working fine you're really loath to mess with updates.  A little more digging showed that most of the issues with the 2009 version had to do with printing, which I never do, so I decided to chance it under Lion.  So far it appears to be running fine, so hopefully I can hold out a little longer before updating QuickBooks.

Overall it's been a pretty painless update for me, and for now at least, I think I'm going to stick with Lion.  Some aspects, like full-screen mode, I really enjoy, while things like Launchpad and Mission Control seem more like gimmicks to me.  The one thing that really strikes me, though, is that since so much of this is centered around new gestures for navigation, if you're still using a mouse you'll miss out on a lot of the new features.  It's definitely made me seriously consider using a Magic Trackpad instead of a mouse, even though I generally find the mouse more accurate.  We'll see I guess!

Anyway, those are my thoughts for now.  I'll post updates in the comments if something changes or if I decide to revert to SL.  Feel free to ask any questions or post comments about your own upgrade experiences too.


I also wanted to take a second to thank everyone for their kind words about the new Inner Portal Studio website and my services.  I appreciate all the nice sentiments people sent my way about that.

Out Of Office

I find it pretty funny: after years of spending a lot of time and money on nice monitors for the studio, proper acoustic treatment, a really nice chair, and generally making things as comfortable as possible, as soon as the weather turns nice all I want to do is get out of there.  🙂

As most people can probably tell by looking at any of my album or song covers, I love being out in nature.  Sitting inside when the weather is nice (heck, even when it isn't sometimes) can be torture, and one of the last things I can focus on is writing music.  I think that's one reason I love Seattle so much: so much incredible scenery close by, but also a lot of rainy days where I'm forced to stay in the studio and get things done.

Needless to say, I was pretty happy a few years ago when I got my first laptop and realized I could make music while out and about.  I'd often grab it, my DJ headphones, and a small MIDI controller (an Edirol PCR-M1 at first) and head to a local park to make music.  Even for a portable setup it was a little overkill, kind of heavy, and honestly not very discreet.  I'd often get people coming up to me wanting to see what I was doing, and the last thing I want in that kind of situation is to talk to strangers while trying to write music.


Over time I slowly stopped bringing the MIDI keyboard and started just using the QWERTY keys to enter notes, a handy function I first started using with Ableton Live, and sometimes Logic too.  After the keyboard started staying home, I eventually stopped bringing the large DJ headphones and switched to smaller earbuds (Shure E2c's then, E3c's now).  Now I no longer looked like some weird musician in the park, and instead like some loser doing office work at the park instead of enjoying nature 🙂  Plus, it's not like I need perfect audio in a situation like that; I'm not doing a final mixdown.  I just need to hear roughly what I'm doing, and any critical listening is going to be done back in the studio.

But while it was a huge improvement in portability over my first attempts, it was still kind of bulky to bring a backpack padded enough for my laptop.  And truth be told, I was always nervous about something happening to the laptop, which by then was my main production machine too.  I'd ride my bike to parks a lot, and you never know when you might crash, or when some stranger might decide they want your laptop.

So I was really interested when a friend told me about a program called "Bhajis Loops" that ran on a Palm Pilot.  Finally, something I could put in my pocket and make music on!  I quickly bought a Palm TX1 just to use for Bhajis, and suddenly my options for locations to make music on the go increased greatly.  At the time, Bhajis was one of the best options for music making in such a small form factor, and it was loads of fun.  There were plenty of limitations (you couldn't do a lot before running out of CPU power, or killing the battery), but it was still pretty powerful for what it was.

It was a great improvement in portability, and it meant I could now bring my music-making tools further out into the woods.  Since I wasn't lugging a large, heavy laptop around, I was no longer confined to local parks that often had a lot of other people in them.  I'd toss the Palm in my Camelbak and hit the trails on my mountain bike, looking for nice spots to spend the day writing music.  Or I'd just go to a local forest, pick a direction, and start walking until I found somewhere comfortable to work for a while.  Finally, beautiful locations AND solitude!

As great as Bhajis was then, there were still some things that kept me on the lookout for a new tool to fill the same role.  For one, syncing with the laptop was always kind of hit and miss, especially once I got my MacBook Pro.  More a fault of Palm than Bhajis, but still something that could make you pull your hair out when you were trying to get data back and forth in the studio.

Today things are of course a lot easier.  With an iPhone/iPad and NanoStudio or Garageband, I have exponentially more power and options at my fingertips.  Syncing and data transfer are dead simple, and the tools are just plain easier to use.  Sound quality is better too; the Palm TX1 always had a slight whine in the headphones that could get to you.  🙂

There are still a lot of times these days I'll just grab my iPhone and head out into the woods to work on music.  In a bit of irony, now that I have devices more capable of creating finished songs, I no longer really focus on that.  I find I'm much more productive just working on melodies, maybe mangling a sample, or playing around with random ideas looking for a new hook.  Or sampling sounds to mangle later, or taking pictures for new album covers.  Then I can transfer all of that to the laptop back in the studio, with the hard part of coming up with new song ideas already done.  I feel less pressure to 'finish the song' and can just enjoy making music in scenic places.  After all, that's ultimately what getting out of the studio is all about.


A lot of people seemed to like the Production Q&A I did last week, so it looks like it's something I'll continue to do.  So if you've got any questions you'd like me to tackle, please send them my way.  I haven't gotten many questions for round two yet!

Also, check back in a couple days, the new downtempo DJ set I did for the RK2 Podcast 5 Year Anniversary will be going live on June 19th.

New Track – “Slip”

Slip – Downtempo 04-18-2011
(right click the name above to play or download)

As I mentioned in my Touching Matters blog post a couple weeks ago, I recently got an iPad2.  One of the apps I was most interested in trying out was Apple’s new Garageband for the iPad, as I know the desktop version is usually under-appreciated considering how much it can do for the price.  So I decided to see how much I could do by writing an entire song in just that one app on the iPad.

For the most part it was a very enjoyable process, and I was able to complete the track in only a few days without once opening the help documentation.  You don't get a lot of instruments in the iPad version, but the ones they do give you sound pretty good and are fun to play, and they cover a good range of sounds too.  The Smart Guitar in particular was more fun than I expected, and the best part is that you can ignore all the default picking patterns and just play it manually yourself (which is what I did in this song).

At first I was a little bummed out that you can't go back and edit anything you've recorded, at least not on a piano roll like in the desktop version (or any DAW for that matter).  However, if you zoom in a lot, you CAN edit regions by slicing them into smaller regions, cutting and pasting them where you want, and then rejoining them into new regions.  So you still have quite a bit of flexibility when it comes to editing, provided you have the patience to work that way.  It reminded me a lot of my old SP-808 back in the day, so it wasn't a huge deal for me.

It wasn't all fun and games though; there were some frustrating moments too.  For instance, without per-track effects (you only get a global reverb and delay) or EQ, balancing everything was a little rough at times.  All of the instruments are very full-sounding, so they tend to mush up in the low end when you add enough of them together.  This track only uses 7 of the available 8 tracks, and I had to fight that a little bit.  Especially with the drums: the low end on the kicks and toms is monstrous!  Luckily the Classic Drum Machine kit I used offers a high-pass filter, so I was able to tame most of the subby low end (still a touch much for my own liking though).

More discouraging, though, was the fact that the much-hyped ability to import projects created on the iPad into Garageband on the Mac didn't work correctly.  The guitar parts in this song did not play right at all in the Mac version, despite the fact that I'm using the latest versions of both.  Notes were skipped, and the string bends I played in the drop and outro were ignored on the Mac.  That was a letdown, as I was forced to record this via the headphone out on the iPad.  It still sounds OK, but I'm sure not as good as a direct render would have.

Anyway, the version you hear was recorded straight from the iPad2 with no additional processing other than normalizing to -0.3dBFS and converting to a 320kbps MP3.  I didn't do any mastering on the file; this is exactly how it sounds straight from the app.
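For anyone curious what normalizing to -0.3dBFS actually does, here's a minimal Python sketch of the gain math. This is purely illustrative (the real file went through an audio editor); it just shows scaling float samples so the peak lands at the target level:

```python
# Peak-normalize a buffer of float samples (-1.0..1.0 range) so the
# loudest sample sits at the target level in dBFS.
def normalize(samples: list[float], target_dbfs: float = -0.3) -> list[float]:
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return samples[:]               # silence: nothing to scale
    target = 10 ** (target_dbfs / 20)   # -0.3 dBFS is ~0.966 linear
    gain = target / peak
    return [s * gain for s in samples]

audio = [0.05, -0.2, 0.5, -0.35]
peak_after = max(abs(s) for s in normalize(audio))
print(round(peak_after, 3))  # ~0.966, i.e. -0.3 dBFS
```

The point of the -0.3dB headroom is simply to keep the peaks safely below full scale, since lossy encoding and playback converters can push them slightly higher.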

Touching Matters

So, like millions of other people, I now own an iPad.  I like Apple products, I admit it, but even I was a bit skeptical about whether I really needed one when they were first announced.  I have a current MacBook Pro and an iPhone4 that I use all the time, and it was tough to see a use case where the iPad was really going to make much of a difference between those two.

So, I waited to see how well it would be accepted, and what a new revision might bring.  I admit, I love the retina display on the iPhone4 and really hoped that was coming to the second version of the iPad.  The first iPad also felt a touch sluggish to me compared to the iPhone, no doubt due to having less memory for one thing.  I also wanted to see how the music community adopted it, since at the time the Lemur was really the only viable touch screen device for music apps.

Well, by Winter NAMM 2011 it was pretty apparent that the music manufacturers were paying attention and embracing it completely.  Akai and others were coming out with new hardware to interface with it, more apps than ever were being written specifically for it, and most importantly for me, Spectrasonics was releasing the Omni TR app.  Since Omnisphere is my main synth these days, this alone was incentive for me to take the plunge.  It got rid of the pain-in-the-ass need to manually map a generic controller to a software synth, and it embraced a method of controlling a synth ideally suited to a touchscreen (the Orb).

And of course, shortly after this, Apple announced the iPad2.  While lacking the retina display, it was faster, lighter, and thinner, the usual adjectives one expects of Apple updates.  So, a couple of weeks after it launched, I managed to finally get my hands on one.  I went with the base 16GB Wi-Fi model, as I'm still not convinced I'll use it enough to warrant the 3G connection or larger storage yet.  I'm not going to review the iPad2 itself, as there are countless other places you can get that info, and my thoughts largely echo what you'll read there.  Instead, I wanted to share my thoughts on where it fits into my music-making workflow, and how it fills that niche between the iPhone4 and a laptop.

If you've used any iOS device in the past, then using the iPad is as simple as can be. Well, at least I thought it would be.  The first thing I did was sync all my favorite apps, plus a couple of new ones I bought specifically for the iPad, and then set about organizing them all like I had on my iPhone.   Right away you notice just how bad iPhone apps look on the iPad in 2x mode, and suddenly all the complaints I had read from people wanting iPad-native versions made sense.  I mean, they're definitely usable, but they just look really poor scaled up that big.  So it was a trip back to the App Store to see which ones had iPad-specific versions, and to download those instead.

In most cases this was well worth the time spent, as the iPad-native apps not only look better, they usually provide much better functionality too.  The only one I found to be worse on the iPad was Yelp, but that's just me and not worth going into.  Weather apps, news apps, the normal day-to-day stuff you might use are by and large a much better experience on the iPad.  Not a huge surprise, but I was shocked at how much better they were laid out and how user-friendly they were with the larger screen.  News360, WeatherBug, NPR, Twitter, and Zite all stand out on this front; the iPad versions will change the way you think about getting information in the future, whatever that information may be.

The other interesting thing for me was how poorly my previous method of organizing things on the iPhone translated to the iPad.  In the past I had folders set up for different groups of apps and shortcuts: one for music apps, one for video apps, one for news websites, one for the forums I frequent, etc.  On the iPad version of Safari, however, you retain the Bookmarks Bar of the desktop version, so you can access all of your normal web bookmarks without leaving Safari.  On the iPhone this is not the case, so home screen icons are the fastest way to navigate to those pages.  That way of working is counter-productive on the iPad though; it just doesn't make sense to leave Safari and go back to your home screen each time you want to access a new website.  Add the fact that you can fit more icons on the larger iPad home screen, and I find I end up using far fewer folders and icons for Safari-related items in general.  It seemed counter-intuitive at the time, but treating it more like my laptop and less like a larger iPhone was definitely the better way to work.

In general, the fact that I use it more like my laptop and less like my phone surprised me.  It's easier to hold and work with in landscape mode.  Typing is DEFINITELY easier in this orientation, and not as hard as one would think.  I'm still faster on my laptop keys by a long shot (and no, this post was not written on my iPad, how cliché), but I can type out decent-length posts on the iPad without much thought.  The only thing that's mildly distracting is iOS auto-correcting technical words I did spell right into something totally unrelated.  But it's learning with use and getting better at this. 🙂

Other than not being able to download and upload stored files, I can honestly see this replacing a laptop for most people.  One of the first things I do each day is sit in my living room with my laptop, checking email and various forums while I drink my morning coffee and plan out the rest of my day.  The iPad fits this role perfectly, and I have to admit it's nice not having to disconnect the laptop from my studio rig of soundcards, hard drives, monitors, headphones, etc. each morning (not that it was really THAT hard anyway, but still).  Anything requiring a longer reply or more detailed work, I just email myself a link and check it later in the studio.  It works great; for everyday use I can see why so many people like the iPad.  It takes a little getting used to, but once you do, you don't miss the laptop or larger screen that much at all.

Which brings me to what most people reading this probably care about: music apps.  I'll try to squash some disappointment right away and state that I don't have a ton of music apps yet, and if people know me by now, they know I'm not one to hoard apps or plug-ins anyway.  Still, I have managed to try quite a few across different categories, so I'll share my thoughts on those.

For the most part, the mini-DAW type apps like Beat Maker, NanoStudio, and now Garageband work great.  The larger display makes working with them as easy as can be, and I see no reason why someone couldn’t get some really great sounding demos or scratch ideas down anywhere they want.  The keys on their keyboards are easy to play, and moving around the apps is pretty simple.  This is one of my main goals for my iPad: using it to capture melodies and other ideas to expand on later in the studio.  I’m primarily using NanoStudio for this, though now I REALLY want an iPad-specific version of it.

The new Garageband app is really well done, I have to admit.  You can’t edit the MIDI data you record, and you’re limited to the preset sounds they give you, but both of these are hardly show stoppers IMO.  Opening the work you do in Garageband on the Mac works flawlessly, and you can even open the files in Logic 9 directly too (though most of the instruments won’t work, the MIDI data all shows up fine).  I couldn’t care less about the new Smart Instruments they tout, but for quickly getting ideas down, even somewhat complex ones, Apple did a great job with this app, and other developers have some catching up to do IMHO.  Sound quality is excellent as well, and the velocity sensing you might have read about works better than you would think.

In terms of iOS controller apps, I haven’t tried TouchAble or Griid yet, but I do have TouchOSC from some stuff I was doing on my iPhone, and that works great as well.  It’s really no different from the iPhone version, just more screen space to fill with your own creations.  The Logic template that comes with TouchOSC works perfectly too, if a little busy for my tastes.  I’ll eventually redo my DJing template for TouchOSC on the iPad, but I’m still not sold on the idea of using a touchscreen for all my DJing needs.  I can see it would be possible, but the touchscreen is still not precise enough for me, and you always need to look at it to see exactly what you’re touching.  It’s a good backup, or for when I want to add some type of visual element, but I’m old-school I guess and still prefer dedicated hardware mixers at this point.  Maybe after some more use, or with a better app, I’ll change my mind.
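Under the hood, apps like TouchOSC are just sending OSC messages over Wi-Fi to your computer.  As a rough sketch of what one fader move looks like on the wire (the `/1/fader1` address here is a typical default-layout style name, not anything specific to my templates):

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying a single float argument.

    Per the OSC 1.0 spec: strings are null-terminated and padded to a
    4-byte boundary, and floats are 32-bit big-endian.
    """
    def pad(s: bytes) -> bytes:
        # Always terminate with at least one null, then pad to 4 bytes.
        return s + b"\x00" * (4 - len(s) % 4)

    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# A fader at 75% of its travel on a hypothetical layout page:
msg = osc_message("/1/fader1", 0.75)
```

The receiving side (TouchOSC’s bridge, or an OSC-aware app like Live with a helper) just decodes the address and value and maps it to a parameter, which is why these apps are so easy to repurpose.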

There are quite a few synth options on the iPad as well, and so far the one I’ve enjoyed the most is Synthtronica.  The GUI is hit or miss at times, but it uses some cool formant shaping and filtering techniques that work well on the iPad.  It’s definitely worth the price if you’re looking for something unique and playable.  And of course apps like Bloom and Trope work just as well on the iPad as on the smaller iPhone.  Ambient goodness, just larger.  🙂

On a side note, while it’s likely overpriced, the new Smart Cover works great for propping the iPad at a really nice angle when typing or working on music apps.  Ditto for watching movies.  I’m not trying to sound like an ad or anything, but if you’re on the fence about getting one, my vote is to go for one of the cheaper plastic ones.  It’s been more useful than I expected, even if most of them come in garishly bright colors.

So, some final thoughts.  My main intention when getting the iPad was for casual use at home, and for mobile music making. For casual use it’s been all I expected and then some, once I got over some small organizational habits I carried over from my iPhone experience.  Web surfing is fast and fluid, movies look great, and it’s not as hard to type on as one would think.  For the average person, I could definitely see something like this replacing a laptop or a netbook; it’s almost a no-brainer if you don’t need much file access.

For music making, well… I’m still split.  For sitting out on my deck, or just in different rooms of the house, there’s no question I’ll use it a lot to sketch out song ideas when I just don’t feel like being in the studio.  I imagine I’ll bring it with me quite a bit to local parks like I used to with my iPhone for this reason too, though since it’s larger that will mean lugging it about in a backpack instead of just my pocket.  It’s thin, small, and light, but still not as portable as a phone, obviously.  It’s even capable enough to handle creating complete songs, though I think I would still prefer doing that in the studio (as it should be).

As a DJ or general purpose live controller, there’s no doubt that it can be quite good at what it does.  I’m just not yet convinced that I want to work like that.  I certainly plan on working with it some more like this though, just to give it a fair chance at winning me over.  The flexibility and customization are there; I’m just not sure yet about sliding my fingers on glass to control parameters precisely.  Maybe the right app will win me over in time.

Dedicated control apps like Omni TR are a different matter though, and on that front I’m completely happy.  It changes the way I look at and interact with Omnisphere entirely, and for me, that was worth the cost of the iPad alone.  It’s like having a brand new synth, in fact. And on that note, there’s no denying how cool it is that for only $5-20 you can easily get some pretty cool sounding and powerful music tools via the App Store.  For those on a budget, $500 for an iPad plus $100 for apps will get you some really nice music making tools.  All of which can run on battery power for up to 10 hours.

So while I might not be totally sold on the idea of a touch screen for making music all the time, there’s no denying that it’s something a lot of people and companies are going to be exploring in the near future.  I’m glad to at least have the hardware on hand to try out anything new that comes out, and if ultimately the touchscreen thing fizzles, well, at least it was a lot cheaper than most hardware synths!

I’ll update my thoughts on all this as time goes on.  If anyone has any specific questions they want me to answer, just post them in the comments and I’ll get right on it.  Thanks for reading, and until next time, peace and beats.



Synths Reborn

I have to admit something.  Every time I see a new software synthesizer released these days, a small part of me runs weeping to the corner and cries for days.  Tears and tears of frustration, I literally own stock in Kleenex thanks to software synths.

Ok, maybe it’s not that bad, but it seems like over and over again I see synths released that miss the opportunity to really take advantage of the computer and its input devices to do something new and unique.  Some examples:

– Emulating vintage analog synths.  Ok, I get it, some of these are incredibly rare or financially out of reach for most musicians.  But do we really need another Minimoog clone?  It’s time to move on and create this generation’s own legacy of synthesis, instead of always putting the tools of the past on a pedestal as the ‘ideal’ synth.

– A GUI that’s just too darn small and cluttered.  Face it, the days of everyone using a 1024×768 display are gone.  With most people today using large LCD monitors or high-resolution laptops, you end up with a synth panel that requires reading glasses just to operate (and yes, my eyesight is fine, thanks to recent Lasik surgery).  If you must cater to the lowest common denominator by using a lower resolution for your GUI, at least give us the option to choose a higher-res version if we want.

Similarly, cramming as many controls as possible into the GUI to avoid multiple pages or scrolling is admirable, but not if it makes them a pain to use.  You have to hunt around to find what you need, or be super careful when grabbing a control so that you don’t inadvertently adjust the wrong parameter.  Spread them out!

– Knobs.  Seriously, I think the virtual knob on a software synth panel is one of the most unwieldy forms of control there is. Even with things like linear control mode, it’s just not a motion or visual element that works well with a mouse.  The only benefit I can see is that it allows more controls to fit into a space, and then we’re back to my point above.

– This isn’t Star Trek.  I get that a lot of computer musicians and programmers are sci-fi fans, but this trend of making things look futuristic merely for the sake of it can get out of hand.  Logic’s synths are a prime example: fancy looking to the point of being distracting.  Simple and to the point can often be a lot better; there’s real benefit to a clean design over a cluttered one.

– On the flip side, some synths are so simple that they are just boring and uninspiring to look at every day.  I get that a synth’s sound should be the important part, but as an artist, I take inspiration when writing music from all the elements of my surroundings.  Live’s instruments fall into this category for me.  Boring, same-looking controls crammed into a tiny little display. I understand that’s just the way Ableton likes to style things, but it ends up giving their synths no real identity for me.  The synths look like the effects, which look like the rest of the app, and in the end you just have this bland mess of samey looking sameness. Boring, no matter how they sound (and they sound pretty good btw).

– Oh look, yet another 2- or 3-OSC subtractive synth!  Wow, talk about going out on a limb!  The same 4 filter types, the same 2-3 envelopes and LFOs, and the same included effects.  I understand it’s the synthesis method most people are familiar with, but honestly, it’s been done to death by now.  If you know what you’re doing, you can make almost all of these sound exactly the same, so what’s the point?

Ok, so those are a few of the things I really don’t like about some of the virtual synths out there.  Rant-fest over.  To be fair, there are some developers who’ve really started to rethink how we interact with a music making tool on the computer.  Some of them are quite clever too:

– Synplant.  Pretty obvious one here, but I think Magnus really hit this one out of the park.  You might not like its sound all the time, but I applaud someone finally releasing something truly different.  Not only does it have a really wild new way of controlling the synthesis behind the scenes, but it forces you to actually LISTEN to what you are doing when creating patches, since none of the usual controls are immediately visible.  Gasp, the horror!  🙂

– Native Instruments Kore.  Not so much the plug-in or the hardware, but the fact that they were one of the first to start tagging their preset libraries with descriptive attributes.  I never thought categorizing sounds by their type (i.e. bassline, lead, pad, etc) was the best way to do it.  Often a lead can be a killer bassline if you just play it a couple octaves down, or a bassline can make a killer drone just by turning up the amp env attack.  By categorizing things in a more descriptive way regardless of the type of sound, NI came up with a really good way of approaching things from the way an artist might think.

– Omnisphere 1.5.  The new orb method of controlling and playing the synth from a touchpad is a brilliant idea.  The orb itself isn’t totally new, as the Lemur has had a similar object for years now.  But the real star of the show is the fact that you can ‘roll the dice’ and Omnisphere will intelligently remap the controls assigned to the orb for you.  This is huge in my opinion, and overcomes the major hurdle most synths suffer from when used with MIDI controllers: namely, that you have to manually assign anything you want to control, and that slows down the process of music creation too much to be viable in the heat of the moment.

Sure, some companies like Novation have tried to create auto-mapping schemes to handle this for you too, but in my experience the results are often very hit or miss.  Too often it’s dependent on how the parameters in the synth are ordered, and many times the first 8 controls you’re presented with aren’t things really suited to real time tweaking.  With Omni 1.5, you know that the parameters assigned to the orb will actually do something you’d want to control in an expressive way.
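The difference between those two approaches can be sketched in a few lines.  The parameter names and the “performance” tags below are invented for illustration, not taken from any real synth or host:

```python
# Two ways a host could map a synth's parameter list onto hardware knobs.

params = [
    {"name": "Master Tune",   "performance": False},
    {"name": "Voice Count",   "performance": False},
    {"name": "Osc1 Waveform", "performance": False},
    {"name": "Osc1 Semitone", "performance": False},
    {"name": "Filter Cutoff", "performance": True},
    {"name": "Filter Res",    "performance": True},
    {"name": "Env Attack",    "performance": True},
    {"name": "LFO Rate",      "performance": True},
]

def naive_map(param_list, knobs=8):
    """Grab the first N parameters in declared order -- order-dependent."""
    return [p["name"] for p in param_list[:knobs]]

def curated_map(param_list, knobs=8):
    """Prefer parameters the sound designer tagged as worth tweaking live."""
    return [p["name"] for p in param_list if p["performance"]][:knobs]
```

With only four knobs, the naive scheme wastes them all on setup parameters like tuning and voice count, while the curated scheme lands straight on the expressive ones.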

Those are just a few positive examples that have stood out to me in my own use; I’m sure there are others out there.

However, I’d love to see more developers take things one step further.  Instead of presenting people with the same synth controls they’ve seen for years, come up with not only new control schemes, but new ways of presenting them as well.  Use more descriptive terms for a control, and use interactive GUI control elements to change more than one thing at a time.  Parameter names like color shift, warmth, distance, time constants, modernize, etc are all things that are vague enough to give programmers plenty of flexibility in what they control behind the scenes, and yet descriptive enough that musicians will know right off the bat how they might affect the sound.
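As a toy sketch of what I mean, a single “warmth” control could fan out to several engine parameters at once (the parameter names and curves here are completely made up for illustration):

```python
def apply_warmth(warmth: float) -> dict:
    """One musician-facing 'warmth' macro (0..1) drives several hidden values.

    Hypothetical mapping: warmer = darker filter, a touch more saturation,
    and a little chorus widening -- the artist never sees these directly.
    """
    return {
        "filter_cutoff_hz": 18000 - warmth * 14000,  # darker as warmth rises
        "saturation":       0.1 + warmth * 0.6,      # gentle drive increase
        "chorus_mix":       warmth * 0.3,            # subtle widening
    }
```

The programmer decides what “warmth” really does per patch; the musician just hears the sound get warmer, which is the whole point.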

Stop using the same old knob, fader, and x-y control schemes we’ve seen time and time again.  What if visually stacking a pile of pebbles made a sound more complex, and the color of the stones affected how bright or dark the sound is?  Or a picture of the night sky, where the number of stars controls how many layers are used in the sound, and the number and direction of comets controls how the sound evolves over time.  You could even save your favorite sounds as constellations.

Maybe these are dumb ideas off the top of my head, I admit, but I think there are a lot more ways out there to create new sounds in a way that engages the artist, without forcing them to try and turn a tiny little knob with a mouse.  When you think about how these might be projected on a screen during a live performance, it really opens up a whole new era of how electronic musicians not only interact with their sounds, but how those sounds are presented to other people as well.

Here’s to hoping the near future brings about a revolution in software synthesis.