In this article we cover an interesting music tech startup called PopGun.
Artificial intelligence (AI) is set to become one of the most groundbreaking and defining technologies of the 21st century, and a new wave of potential is emerging for AI in creative industries – one of which happens to be musical composition. For songwriters, artificial intelligence is an especially fraught subject, and music composed by AI rather than humans is controversial for two reasons. First, there is an enduring fear and disbelief in some quarters that an AI will ever be able to create music as well as a human can. Second, there is a fear that if it can, it becomes the next tech trend sucking income away from working musicians.
Sceptics warn that artificial intelligence will replace the artists we love and end creativity as we know it. They ask: will technology that learns by watching and listening to us enhance human creativity, or replace it? Considering that music is inherently subjective and emotional, the notion of having a machine quantify and optimize for “aesthetic quality” understandably ruffles some feathers.

PopGun is one of the startups exploring AI's ability to create music. Specializing in AI-composed music, PopGun uses deep learning to create original pop music, and it hopes its deep-learning-based technology can play a positive role for human musicians. PopGun is the brainchild of artificial intelligence (AI) engineers Adam Hibble and Stephen Phillips. Phillips, well known in music tech circles, built the hit-predicting algorithm behind We Are Hunted, which was sold to Twitter in 2013 and eventually became Twitter Music.
The Brisbane startup trains computers to play and sing their own pop hits. PopGun aims not only to identify songs that will be popular, but to feed their raw audio into a neural network that learns to compose and perform its own chart-toppers.
The company won a place on the prestigious Techstars Music accelerator program. Its first project is ‘Alice’, an AI that plays piano with humans: she listens to your piano playing, then tries to play melodies that complement it. PopGun sees Alice as a creative foil for musicians, as well as a teacher and playing companion for people learning the piano for the first time. Alice started playing the piano three months ago, and has been learning in the same way that children do: listening to thousands of songs and more experienced players, then mimicking them and practising a lot.
She/it (more on that later) can now listen to and understand what a human plays, then reply with what she thinks might come next. Many early attempts at AI-generated pop songs, such as last September's “Daddy's Car” out of the Sony Computer Science Laboratory, relied on human lyricists and arrangers to improve their commercial prospects and to reduce the “creepiness” for which AI music has been criticized. There were plenty of apprehensions around Alice's launch: what does that sound like? What will it play? On the human element in particular, some argue computers will never be capable of creativity, but PopGun believes humans will use their creativity anyway – which opens up the possibility of a future in which AI-created art becomes mainstream. “When we talked to artists, that's what turned this from something they feared to something they were deeply intrigued by. They wanted to play with it and hear what the bloody thing sounds like!”
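PopGun has not published how Alice works beyond the fact that it is deep learning trained on audio, so the following is only a toy sketch of the listen-then-reply loop, using a first-order Markov chain over MIDI note numbers in place of a neural network (all function names here are hypothetical):

```python
import random

def train_markov(melodies):
    """Build a first-order transition table: note -> list of observed next notes."""
    table = {}
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            table.setdefault(a, []).append(b)
    return table

def respond(table, phrase, length=8, seed=0):
    """'Listen' to a user phrase, then reply by sampling likely continuations."""
    rng = random.Random(seed)
    note = phrase[-1]  # start the reply from the user's last note
    reply = []
    for _ in range(length):
        candidates = table.get(note)
        if not candidates:
            # Unseen note: fall back to restarting from the phrase opening.
            note = phrase[0]
            candidates = table.get(note, [note])
        note = rng.choice(candidates)
        reply.append(note)
    return reply

# Example: learn from a short C-major phrase, then answer a user's opening.
table = train_markov([[60, 62, 64, 65, 67, 65, 64, 62, 60]])
reply = respond(table, [60, 62, 64])
```

A real system would model timing, dynamics, and raw audio rather than bare note numbers, but the call-and-response shape – condition on what the human played, then emit a plausible continuation – is the same.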
“PopGun is a generative music company that measures success indicators from streaming services. We use that data to train AIs to write music to match consumer habits.” “If we make a top 40 hit, it means we have solved music,” Mr Hibble said. “If you can understand the waveforms behind music, the same algorithms can be used to improve anything whose activity can be expressed as a waveform, like devices that monitor bodily functions.” That knowledge could then have applications in other industries, such as medicine and advertising.
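Hibble's waveform claim can be illustrated with a far simpler model than a deep network: a linear autoregressive predictor fitted by least squares will forecast the next sample of any signal that can be expressed as a waveform, whether that is a musical tone or a heartbeat trace. This is purely an illustrative sketch, not PopGun's method:

```python
import math
import numpy as np

def fit_ar(signal, order=2):
    """Least-squares fit of x[t] ≈ c[0]*x[t-1] + ... + c[order-1]*x[t-order]."""
    X = np.array([[signal[t - k - 1] for k in range(order)]
                  for t in range(order, len(signal))])
    y = np.array(signal[order:])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(signal, coeffs):
    """One-step-ahead prediction from the most recent samples."""
    return float(sum(c * signal[-k - 1] for k, c in enumerate(coeffs)))

# A pure 440 Hz tone sampled at 8 kHz; an order-2 model captures a
# sinusoid exactly, so the predicted next sample matches the true one.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(200)]
coeffs = fit_ar(tone[:-1], order=2)
predicted = predict_next(tone[:-1], coeffs)
```

The same fitting code runs unchanged on any other sampled signal; real-world waveforms need far higher-capacity models, which is where the deep learning comes in.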
The startup is at present focusing on teaching its software to “sing”. “Deep learning is not the same thing as machine learning. It's got the same general feel, and the ideas are 20 years old, but the tools are all 2-3 years old, and the big breakthroughs on how to do it are all new,” he says.
“The AlphaGo guys realised they had to get an AI that could play humans, and then clone that AI to play itself a billion times. So the way to true novelty in music is going to start with an AI you can play with. Then we can clone that and see what happens!” says Phillips. He feels PopGun is “racing” against Google's Magenta team to understand how soon that point will come, and what it will mean for musicians, listeners and all the middlemen between them.