Creating Spotify Music Player using SwiftUI
In this article we will implement a Spotify-style music player UI using SwiftUI. We will design a custom music progress bar, implement audio playback with play/pause, and animate the progress bar. You can download the source code from here.
Let's begin by creating a simple ZStack and filling it with a gradient.
ZStack {
    LinearGradient(colors: [.pink, .cyan], startPoint: .top, endPoint: .bottom)
}
Next we will add a VStack on top of this LinearGradient. The VStack will hold the album image view, the song info, and the progress bar. So let's add that:
VStack(alignment: .leading, spacing: 0) {
    Image("album1")
        .resizable()
        .frame(width: 350, height: 350)
        .aspectRatio(contentMode: .fit)
        .cornerRadius(10)
        .shadow(radius: 5)
        .padding()
    HStack {
        VStack(alignment: .leading, spacing: 0) {
            Text("Lost In Istanbul")
                .font(.title2)
                .foregroundColor(.white)
                .bold()
            Text("Brianna")
                .font(.subheadline)
                .foregroundColor(.white)
        }
        Spacer()
        Image(systemName: "suit.heart.fill")
            .resizable()
            .frame(width: 20, height: 20)
            .foregroundColor(.green)
    }
    .padding()
    // ... more code here
}
Right underneath the image view there is an HStack. Inside it we have a VStack containing the text views for the song title and artist, a Spacer, and an Image view for the heart (like) button.
Right underneath the HStack we will add the progress bar along with the time-elapsed and total-duration labels. While SwiftUI provides a built-in progress view, I wanted to build one from the ground up using SwiftUI views. It's surprisingly easy to make a fully animatable progress bar in seconds with SwiftUI.
Here is how to do it:
ZStack(alignment: .leading) {
    GeometryReader { proxy in
        Rectangle()
            .frame(maxWidth: .infinity, maxHeight: 3)
            .foregroundColor(.black)
            .opacity(0.3)
            .padding()
        Rectangle()
            .frame(width: (proxy.size.width - 32) * progress, height: 3)
            .foregroundColor(.white)
            .padding()
            .overlay(
                HStack(spacing: 0) {
                    Spacer()
                    Circle()
                        .frame(width: 12, height: 12)
                        .padding()
                        .foregroundColor(.white)
                }
            )
    }
    .frame(maxWidth: .infinity, maxHeight: 40)
}
Create a ZStack and, inside it, a GeometryReader. We insert all the views inside the GeometryReader so that we have access to the parent view's frame, which will be useful when we animate the view. Inside the GeometryReader, add a Rectangle() with a maxWidth of .infinity and a maxHeight of 3; this is the background track of the progress bar. Then add a second Rectangle() view, whose width is animatable. As you can see, its width is (proxy.size.width - 32), the full width of the background rectangle after accounting for the horizontal padding, multiplied by the progress variable. We will calculate progress shortly, and then the whole thing will make sense. To get the circle you see on the progress bar, we use SwiftUI's overlay modifier to add a Circle() view right at the trailing edge of the top Rectangle view.
Right underneath this is another HStack with two time labels: the one on the left shows the elapsed time and the one on the right shows the total duration of the song. The code for these views is as follows:
HStack(spacing: 0) {
    Text(self.getFormattedProgressTime(totalSongDuration: self.currentTime))
        .font(.custom("san francisco", size: 12))
        .foregroundColor(.white)
    Spacer()
    Text(self.getFormattedProgressTime(totalSongDuration: Int(self.nowPlayingViewModel.durationTime)))
        .font(.custom("san francisco", size: 12))
        .foregroundColor(.white)
}
.padding()
.padding(.top, -28)
The remaining items in the UI are the shuffle, repeat, play/pause, and forward/back buttons. The best way to arrange all these buttons horizontally with equal spacing is an HStack, plus a nice little trick that avoids having to use Spacers. First, create an array of strings holding the names of the button label images. I am using SF Symbols, and here is the array of image names I will use as button labels:
var systemNamePics = ["shuffle", "backward.end.fill", "pause.circle.fill", "forward.end.fill", "repeat"]
Next, inside an HStack, create a ForEach loop over the systemNamePics array, and for each item create a ZStack with a maxWidth of .infinity. This arranges five ZStacks horizontally with equal spacing between them. We then populate each ZStack with its image. This is how the code looks:
HStack {
    ForEach(systemNamePics, id: \.self) { item in
        ZStack {
            if item == "pause.circle.fill" || item == "play.circle.fill" {
                Button(action: {
                    self.isPlaying.toggle()
                    if self.isPlaying {
                        self.playAudio()
                    } else {
                        self.pauseAudio()
                    }
                }, label: {
                    Image(systemName: self.isPlaying ? "pause.circle.fill" : "play.circle.fill")
                        .resizable()
                        .frame(width: 60, height: 60)
                        .foregroundColor(.white)
                })
            } else if item == "backward.end.fill" {
                Button(action: {}, label: {
                    Image(systemName: item)
                        .resizable()
                        .frame(width: 30, height: 30)
                        .foregroundColor(.white)
                })
            } else if item == "forward.end.fill" {
                Button(action: {}, label: {
                    Image(systemName: item)
                        .resizable()
                        .frame(width: 30, height: 30)
                        .foregroundColor(.white)
                })
            } else {
                Button(action: {}, label: {
                    Image(systemName: item)
                        .resizable()
                        .frame(width: 20, height: 20)
                        .foregroundColor(.white)
                })
            }
        }
        .frame(maxWidth: .infinity)
    }
}
.padding(.top, 10)
Using switch or if statements we can supply the appropriate label for each button.
This is how the full UI code should look:
NavigationView {
    ZStack {
        LinearGradient(colors: [.pink, .cyan], startPoint: .top, endPoint: .bottom)
        VStack(alignment: .leading, spacing: 0) {
            Image("album1")
                .resizable()
                .frame(width: 350, height: 350)
                .aspectRatio(contentMode: .fit)
                .cornerRadius(10)
                .shadow(radius: 5)
                .padding()
            HStack {
                VStack(alignment: .leading, spacing: 0) {
                    Text("Lost In Istanbul")
                        .font(.title2)
                        .foregroundColor(.white)
                        .bold()
                    Text("Brianna")
                        .font(.subheadline)
                        .foregroundColor(.white)
                }
                Spacer()
                Image(systemName: "suit.heart.fill")
                    .resizable()
                    .frame(width: 20, height: 20)
                    .foregroundColor(.green)
            }
            .padding()
            VStack(alignment: .leading, spacing: 0) {
                ZStack(alignment: .leading) {
                    GeometryReader { proxy in
                        Rectangle()
                            .frame(maxWidth: .infinity, maxHeight: 3)
                            .foregroundColor(.black)
                            .opacity(0.3)
                            .padding()
                        Rectangle()
                            .frame(width: (proxy.size.width - 32) * progress, height: 3)
                            .foregroundColor(.white)
                            .padding()
                            .overlay(
                                HStack(spacing: 0) {
                                    Spacer()
                                    Circle()
                                        .frame(width: 12, height: 12)
                                        .padding()
                                        .foregroundColor(.white)
                                }
                            )
                    }
                    .frame(maxWidth: .infinity, maxHeight: 40)
                }
                HStack(spacing: 0) {
                    Text(self.getFormattedProgressTime(totalSongDuration: self.currentTime))
                        .font(.custom("san francisco", size: 12))
                        .foregroundColor(.white)
                    Spacer()
                    Text(self.getFormattedProgressTime(totalSongDuration: Int(self.nowPlayingViewModel.durationTime)))
                        .font(.custom("san francisco", size: 12))
                        .foregroundColor(.white)
                }
                .padding()
                .padding(.top, -28)
                HStack {
                    ForEach(systemNamePics, id: \.self) { item in
                        ZStack {
                            if item == "pause.circle.fill" || item == "play.circle.fill" {
                                Button(action: {
                                    self.isPlaying.toggle()
                                    if self.isPlaying {
                                        self.playAudio()
                                    } else {
                                        self.pauseAudio()
                                    }
                                }, label: {
                                    Image(systemName: self.isPlaying ? "pause.circle.fill" : "play.circle.fill")
                                        .resizable()
                                        .frame(width: 60, height: 60)
                                        .foregroundColor(.white)
                                })
                            } else if item == "backward.end.fill" {
                                Button(action: {}, label: {
                                    Image(systemName: item)
                                        .resizable()
                                        .frame(width: 30, height: 30)
                                        .foregroundColor(.white)
                                })
                            } else if item == "forward.end.fill" {
                                Button(action: {}, label: {
                                    Image(systemName: item)
                                        .resizable()
                                        .frame(width: 30, height: 30)
                                        .foregroundColor(.white)
                                })
                            } else {
                                Button(action: {}, label: {
                                    Image(systemName: item)
                                        .resizable()
                                        .frame(width: 20, height: 20)
                                        .foregroundColor(.white)
                                })
                            }
                        }
                        .frame(maxWidth: .infinity)
                    }
                }
                .padding(.top, 10)
            }
            Spacer()
        }
        .padding(.top, 120)
    }
    .edgesIgnoringSafeArea(.all)
    .navigationBarTitleDisplayMode(.inline)
}
Now it's time to implement the audio and animate the progress bar. Let's create a simple view model that contains the logic for setting up AVKit. Create a new file, name it NowPlayingViewModel, and add this code:
import Combine
import AVKit

class NowPlayingViewModel: ObservableObject {
    var player: AVAudioPlayer!
    @Published var durationTime: Int = 0

    init() {}

    public func setupAudio(musicFileName: String) {
        guard let data = NSDataAsset(name: musicFileName)?.data else {
            print("Error: could not load \(musicFileName) from the asset catalog")
            return
        }
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
            player = try AVAudioPlayer(data: data)
            self.durationTime = Int(player.duration)
        } catch let error {
            print(error.localizedDescription)
        }
    }

    public func playAudio() {
        guard let player = player else { return }
        player.play()
    }

    public func pauseAudio() {
        guard let player = player else { return }
        player.pause()
    }
}
We are creating a class called NowPlayingViewModel that conforms to ObservableObject. It has a player property of type AVAudioPlayer. The setupAudio function takes musicFileName, a string naming a local mp3 file stored in Xcode's asset catalog, and loads it as an NSDataAsset. We then configure the shared AVAudioSession for playback and create an AVAudioPlayer from the mp3 data, handling any error the throwing initializers may produce, and we set the durationTime property here as well. Finally, there are two functions, playAudio() and pauseAudio(), which play or pause the audio.
Now let's create an instance of this object in our main ContentView:
@StateObject var nowPlayingViewModel = NowPlayingViewModel()
We then create another function called setupAudio in ContentView, which calls the view model's setupAudio function to set up the AVAudioPlayer:
func setupAudio() {
    self.nowPlayingViewModel.setupAudio(musicFileName: "istanbul")
}
We also add some helper functions to play and pause the audio:
func playAudio() {
    self.nowPlayingViewModel.playAudio()
    _ = timer.connect()
}

func pauseAudio() {
    self.nowPlayingViewModel.pauseAudio()
}
Let's now create a Timer publisher that publishes a value every second. In ContentView, add this line:
let timer = Timer.publish(every: 1, on: .main, in: .common)
The timer connects (starts) whenever the playAudio() function is invoked.
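As written, the timer keeps ticking even while playback is paused. One possible refinement, not part of the original source: `connect()` returns a `Cancellable`, which you can store and cancel when the user pauses. The `timerCancellable` property below is a hypothetical addition to ContentView.

```swift
import Combine

// Hypothetical addition: keep the Cancellable returned by connect()
// so the timer can be stopped while playback is paused.
@State private var timerCancellable: Cancellable?

func playAudio() {
    self.nowPlayingViewModel.playAudio()
    timerCancellable = timer.connect()  // start (or restart) the timer
}

func pauseAudio() {
    self.nowPlayingViewModel.pauseAudio()
    timerCancellable?.cancel()          // stop ticking while paused
}
```

This is just a sketch; keeping the original `_ = timer.connect()` works fine for this demo, it simply lets the timer fire while paused (which is harmless, since the progress value stops changing).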
We then use the .onAppear() modifier to invoke the setupAudio() function, so that our player is set up with the audio file as soon as the view loads. We use the .onReceive() modifier to receive a value from the timer publisher every second. Inside its closure we do the following:
ZStack {
    // ... code here ...
}
.onAppear {
    self.setupAudio()
}
.onReceive(timer) { input in
    self.getCurrentProgress()
    self.currentTime = Int(self.nowPlayingViewModel.player.currentTime)
}
Let's create the getCurrentProgress() function:
func getCurrentProgress() {
    let progress = CGFloat(self.nowPlayingViewModel.player.currentTime / self.nowPlayingViewModel.player.duration)
    self.progress = progress
}
This function grabs the duration of the song and the current elapsed time, divides the two, and yields the elapsed progress as a value between 0 and 1. We assign this value to a @State property called progress, which is multiplied into the width of the top rectangle in our custom progress view. Since the function is invoked every second, the labels and progress bar animate as the song plays.
Just for reference, this is the Rectangle() view whose width is updated every second:
Rectangle()
    .frame(width: (proxy.size.width - 32) * progress, height: 3)
    .foregroundColor(.white)
    .padding()
    .overlay(
        HStack(spacing: 0) {
            Spacer()
            Circle()
                .frame(width: 12, height: 12)
                .padding()
                .foregroundColor(.white)
        }
    )
Also in the .onReceive closure, we assign the currentTime of the playing song to the @State property currentTime, which is shown underneath the progress bar and updated every second. The currentTime and duration returned by AVAudioPlayer are in seconds, so we need to format them into minutes and seconds. We do this with the getFormattedProgressTime function:
private func getFormattedProgressTime(totalSongDuration: Int) -> String {
    let seconds = totalSongDuration % 60
    let minutes = (totalSongDuration / 60) % 60
    let secondFormat = String(format: "%02d", seconds)
    return "\(minutes):\(secondFormat)"
}
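To sanity-check the formatter's arithmetic, here is the same computation run on a concrete value (a hypothetical 245-second song):

```swift
import Foundation

// Same arithmetic as getFormattedProgressTime, on a 245-second song:
let total = 245
let seconds = total % 60                  // 5
let minutes = (total / 60) % 60           // 4
let label = "\(minutes):\(String(format: "%02d", seconds))"
print(label)  // 4:05
```

Note the "%02d" format, which zero-pads the seconds so that, for example, 5 seconds renders as "05" rather than "5".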
So this is how you can build a Spotify music player UI using SwiftUI. You can download the full source of this project from my GitHub repo (here).