AI musicians don’t “think” like humans, but they can still surprise you. In producing a full album with Logic Pro’s AI session players, I found myself challenged by their quirks, inspired by their flexibility, and occasionally frustrated by their limitations. This article examines whether creativity can emerge from algorithmic tools—and what their rise means for the next generation of music makers.
The stories are true. I did it. I can’t hide behind any plausible deniability. I worked for a full year in a studio with two AI musicians, functioning as a band I call “Fret Salad.” I played guitars and bass; the AI session musicians handled drums and keyboards. They were an interesting pair to hang out with, and much different from the myriad musicians I have had the pleasure of making music with over my musical career. And neither of them owes me money!
The introduction of Logic Pro’s bass and keyboard “session players” was just the thing I had been looking for: the ability, in one program, to create a full musical composition with realistic performances from software players. Using those players means you can skip assembling drums from loops and let the drum session player bring its own expertise. The drum controls are the most advanced of the three, since the drummer has been around in previous versions of Logic Pro. You control the drummer’s overall activity, and you can set regions and execute changes in drum style, intensity, and complexity throughout the song.
Likewise, a similar set of controls is in place for the keyboard player. The player’s sound can again be fine-tuned, with the ability to dictate both left-hand and right-hand style. The players also react to other cues during the song; sections identified as “verse” are treated differently than “chorus” or “outro.” Once again, the ability to create different “regions” and apply different rules to each is key.
While I did put considerable time into working with Logic’s new bass player, I was never fully happy with the results. It should be somewhat obvious that I have a horse in this race, so to speak, as I play bass. I also know what I want to hear in my songs, and the session player never produced what I was looking for. I’m quite pleased with the bass parts I recorded myself, thank you very much.
But I did work extensively with the other two session players, on drums and keyboards. Each had its quirks, and the control you have operates within the context of a song, where many directions can conflict with or override one another.
The drummer handled the job with quiet confidence. Once you’ve locked in the beat pattern and tempo, you can give the drummer different mixes of marching orders and see what you have. Overall, I’d say the drummer reacts fairly predictably. The base beat is there, you control how many fills and the intensity of the attack. You can change these every few measures, or keep the same one all song long.
The keyboard player was more nuanced and had some hard-to-define options. Some of the changes in style were hard to perceive once implemented, perhaps coming out more strongly in different compositions. While I used the keyboard player to introduce a whole range of synth and instrument sounds in my songs, the player was most compatible with piano and organ sounds. It was in these instances that the keyboard player interacted best with the stringed instruments. For both the Steinway piano and Hammond B3 sounds, the session player shines, and I recorded several songs with one or more of those sounds. For the synth, brass, and string sounds, some songs worked better than others. The Mellotron parts I used in one song are incredible, likely playing exactly what I would have played.
I interacted with the players enough to know that some experimentation was in order to match my guitar sound with whatever the session keyboard player was playing. Fortunately, within Logic Pro, that experimentation can continue after the recording is done, on both the keyboard sound and the guitar plus effects stack. You can also retroactively choose a drum kit to match the overall sound of the band. These abilities are game changers.
While the Session Players in Logic Pro are competent and solid players, they did little to “wow” me. Don’t get me wrong; they did the job and did it fairly well. They did so, however, from a safe zone and without taking any real musical risks. But then, I doubt we’d expect anything different. These players also do not solo, in any way. That made the keyboard player a lousy sax player. So that instrument was not used on this album.
That all being said, the players played their parts effectively. They covered the basics and provided fills and lead-ins as required. As I stated above, the piano and Hammond B3 players felt very alive, much like playing with a human player. Those instruments took the lead and played a major role. The synth, strings, and horns each had their own specific style. For example, one of the horn sounds used on the album was described as “grungy horn hits,” which is a specific approach independent of player style.
I think we are in the early stages of AI-based instrument players. My observation is that vocalists will get a lot more attention from the AI community, since for vocals AI can play a unique role in modeling an artist’s own instrument, the voice. Once that is accomplished, perhaps instruments will receive the same treatment.
So, after this year-long experience, am I a fan of using AI players? You bet. I miss the humans and will have projects with them again soon, but I’ve added a powerful component to my music-making process. I’m using it going forward and I expect it will continue to improve. I’d recommend everyone use it to improve and enhance their music.