Month: March 2017

How to create a volume meter in Swift 3

We’re going to create this simple volume meter with AVFoundation and a progress view.

You can also apply this to incoming audio (microphone), add audio effects, etc., but for now a player will do the job.
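For instance, here’s a minimal sketch of metering the microphone instead. This is my own illustration, not part of the tutorial; it assumes mic permission is granted and an AVAudioSession is configured:

    //Sketch: meter microphone input instead of a file.
    //Note: in the iOS 10 SDK inputNode is optional, hence the force unwrap;
    //on newer SDKs drop the "!".
    let engine = AVAudioEngine()
    let input = engine.inputNode!
    input.installTap(onBus: 0, bufferSize: 1024, format: input.outputFormat(forBus: 0)) { buffer, _ in
        let samples = buffer.floatChannelData![0]
        let level = fabs(samples[Int(buffer.frameLength) - 1])   //Same single-sample read as below
        //Store `level` wherever your meter reads from
    }
    engine.prepare()
    try? engine.start()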

-First we start by adding a Progress view to the storyboard

[Screenshot: the progress view added to the storyboard]

 

-Create an outlet for the progress view in the ViewController and name it volumeMeter.

-Import AVFoundation

import AVFoundation

 

-Add an audio file to your project navigator; you can use mine if you want

Click Save Link As… to download it.

-Add these properties to the ViewController class

    var engine: AVAudioEngine!      //The audio engine
    var player: AVAudioPlayerNode!  //The player of the audio file
    var file: AVAudioFile!          //Where we're going to store the audio file
    var timer: Timer?               //Timer to update the meter every 0.1 s
    var volumeFloat: Float = 0.0    //Where we're going to store the volume float

 

-In viewDidLoad(), add this code, which initializes the engine and loads the .m4a audio into an AVAudioFile.

Note: replace the “forResource” and “ofType” parameter strings with your audio file’s name and extension.

        //Init engine and player
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()

        //Look for the audio file in the project bundle
        let path = Bundle.main.path(forResource: "Electronic", ofType: "m4a")!
        let url = URL(fileURLWithPath: path)

        //Create the AVAudioFile and read it into a buffer
        do {
            file = try AVAudioFile(forReading: url)
        } catch {
            print("Error reading audio file: \(error)")
        }
        let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length))
        do {
            try file.read(into: buffer)
        } catch {
            print("Error reading file into buffer: \(error)")
        }

 

-Next we’re going to attach the player to the engine and connect it to the main mixer

        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: buffer.format)

 

-We’re going to installTap on the mainMixerNode to extract that juicy float


        //installTap with a bufferSize of 1024 and the processingFormat of the current audio file, on bus 0
        engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: file.processingFormat) {
            (buffer, time) in

            let dataptrptr = buffer.floatChannelData!           //Get the channel pointers
            let dataptr = dataptrptr.pointee                    //First channel's buffer of floats
            let datum = dataptr[Int(buffer.frameLength) - 1]    //Read a single (the last) sample

            //Store the float in the variable. The tap runs on an audio thread;
            //the timer will read this value on the main thread.
            self.volumeFloat = fabs(datum)
        }
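Reading a single sample makes the meter jumpy. As an alternative (a sketch of mine, not the tutorial’s method), you could compute the RMS of the whole buffer inside the same tap using vDSP_rmsqv from Accelerate (add import Accelerate at the top of the file):

            //Alternative tap body: RMS of the whole buffer instead of one sample.
            //Requires import Accelerate; this is a sketch, not the tutorial's code.
            let channelData = buffer.floatChannelData!.pointee
            var rms: Float = 0.0
            vDSP_rmsqv(channelData, 1, &rms, vDSP_Length(buffer.frameLength))
            self.volumeFloat = rms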

 

-Next we’re going to start the engine and play the file.


        //Schedule the buffer and loop it for demo purposes
        player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)

        engine.prepare()
        do {
            try engine.start()
        } catch {
            print("Error starting the engine: \(error)")
        }

        player.play()

 

-Create a timer to update the volumeMeter every 0.1 seconds

        //start timer to update the meter
        timer = Timer.scheduledTimer(timeInterval: 0.1 , target: self, selector: #selector(updateMeter), userInfo: nil, repeats: true)

 

-Finally we’re going to create a new function named updateMeter() that the timer will call every 0.1 s to update the progress view with the new float value.


    func updateMeter() {
        self.volumeMeter.setProgress(volumeFloat, animated: false)

        if volumeMeter.progress > 0.8 {     //Turn red if the volume is LOUD
            volumeMeter.tintColor = UIColor.red
        } else {                            //Else green
            volumeMeter.tintColor = UIColor.green
        }
    }

 

Conclusion:
This is not a perfect volume meter, but it’s a start for more advanced meters.
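One easy refinement, for example: smooth the displayed value with an exponential moving average so the bar doesn’t jitter. A sketch; `smoothed` here is a hypothetical new Float property on the view controller, initialized to 0.0:

    //Sketch: steady the meter with an exponential moving average.
    //`smoothed` is a hypothetical Float property (start it at 0.0).
    func updateMeter() {
        let alpha: Float = 0.3      //0...1; higher reacts faster, lower is smoother
        smoothed = alpha * volumeFloat + (1 - alpha) * smoothed
        volumeMeter.setProgress(smoothed, animated: false)
    }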

Github or die

[Demo: the finished volume meter]

How to create a SoundCloud-like waveform in Swift 3

 

The first thing you need is the array of floats from the audio file, and we’re going to use AVFoundation for that.

import AVFoundation

Create a struct outside the class to store the audio information:

struct ReadFile {
    static var arrayFloatValues: [Float] = []
    static var points: [CGFloat] = []
}

-Add the audio file you want to analyze to the project navigator
-Add this code in viewDidLoad()
Note: Make sure to change the forResource and withExtension parameters.

override func viewDidLoad() {
        super.viewDidLoad()

        //Look for the sample2.m4a audio file
        let url = Bundle.main.url(forResource: "sample2", withExtension: "m4a")
        let file = try! AVAudioFile(forReading: url!)   //Read the file into an AVAudioFile
        //Format of the file: non-interleaved 32-bit floats
        let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: file.fileFormat.sampleRate, channels: file.fileFormat.channelCount, interleaved: false)

        let buf = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(file.length))   //Buffer
        try! file.read(into: buf)   //Read the floats
        //Store the first channel's floats in the struct
        ReadFile.arrayFloatValues = Array(UnsafeBufferPointer(start: buf.floatChannelData?[0], count: Int(buf.frameLength)))
    }

-Create a new Swift file and name it DrawWaveform
-Subclass UIView and import the UIKit and Accelerate frameworks

import UIKit
import Accelerate

class DrawWaveform: UIView {

    //This is where we're going to draw the waveform
    override func draw(_ rect: CGRect) {

    }

}

But before drawing we need to downsample the array of floats, because we don’t need to draw ALL of them.
To do this we’re going to use the Accelerate framework.

-Create a new function in the new Swift file named convertToPoints()

-Copy this code inside convertToPoints(). I’ll explain it in detail in another post, but in short it reduces the size of the array and converts it to CGFloat.


func convertToPoints() {
        //Take the absolute value of every sample (we only care about amplitude)
        var processingBuffer = [Float](repeating: 0.0,
                                       count: ReadFile.arrayFloatValues.count)
        let sampleCount = vDSP_Length(ReadFile.arrayFloatValues.count)
        vDSP_vabs(ReadFile.arrayFloatValues, 1, &processingBuffer, 1, sampleCount)

        //THIS IS OPTIONAL: convert to dB and clip to [noiseFloor, 0]
        //    var zero: Float = 1
        //    vDSP_vdbcon(processingBuffer, 1, &zero, &processingBuffer, 1, sampleCount, 1)
        //    var noiseFloor: Float = -50.0
        //    var ceil: Float = 0.0
        //    vDSP_vclip(processingBuffer, 1, &noiseFloor, &ceil, &processingBuffer, 1, sampleCount)

        //How many samples get averaged into one drawn point
        let samplesPerPixel = 150
        let filter = [Float](repeating: 1.0 / Float(samplesPerPixel),
                             count: samplesPerPixel)
        let downSampledLength = ReadFile.arrayFloatValues.count / samplesPerPixel
        var downSampledData = [Float](repeating: 0.0,
                                      count: downSampledLength)
        //Decimate: average each window of samplesPerPixel samples into one value
        vDSP_desamp(processingBuffer,
                    vDSP_Stride(samplesPerPixel),
                    filter, &downSampledData,
                    vDSP_Length(downSampledLength),
                    vDSP_Length(samplesPerPixel))

        //Convert the [Float] to a [CGFloat] array
        ReadFile.points = downSampledData.map { CGFloat($0) }
    }
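To get a feel for what vDSP_desamp is doing, here’s a tiny self-contained example of my own (not from the project): with a box filter, each output value is the average of one window of input samples.

    import Accelerate

    //Average every 4 input samples into one output value
    let input: [Float] = [1, 1, 1, 1, 3, 3, 3, 3]
    let window = 4
    let filter = [Float](repeating: 1.0 / Float(window), count: window)
    var output = [Float](repeating: 0.0, count: input.count / window)
    vDSP_desamp(input, vDSP_Stride(window), filter, &output,
                vDSP_Length(output.count), vDSP_Length(window))
    print(output)   //[1.0, 3.0]: each point is the mean of a 4-sample window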

    

Now we’re going to draw the waveform with the downsampled array.
Copy this code inside the draw function of the subclass.

It uses UIBezierPath and loops through the array of floats to draw each bar, then applies a stroke to fill it with color.

Here x is the distance between bars and y is the amplitude of each bar.

    override func draw(_ rect: CGRect) {

        //Downsample and convert to [CGFloat]
        self.convertToPoints()

        //The waveform on top
        let aPath = UIBezierPath()
        //The mirrored waveform on the bottom
        let aPath2 = UIBezierPath()

        //Line width
        aPath.lineWidth = 2.0
        aPath2.lineWidth = 2.0

        //Distance between bars
        let x: CGFloat = 2.5

        //Start drawing at the vertical center
        aPath.move(to: CGPoint(x: 0.0, y: rect.height / 2))

        //Loop the array
        for point in ReadFile.points {
            //Next location to draw
            aPath.move(to: CGPoint(x: aPath.currentPoint.x + x, y: aPath.currentPoint.y))

            //y is the amplitude of each bar
            aPath.addLine(to: CGPoint(x: aPath.currentPoint.x, y: aPath.currentPoint.y - (point * 70) - 1.0))

            aPath.close()
        }

        //Stroke it with an orange color
        UIColor.orange.set()
        aPath.stroke()
        //If you want to fill it as well
        aPath.fill()

        //Reflection of the waveform
        aPath2.move(to: CGPoint(x: 0.0, y: rect.height / 2))

        for point in ReadFile.points {
            aPath2.move(to: CGPoint(x: aPath2.currentPoint.x + x, y: aPath2.currentPoint.y))

            //y is the amplitude of each bar, drawn downward this time
            aPath2.addLine(to: CGPoint(x: aPath2.currentPoint.x, y: aPath2.currentPoint.y + (point * 50)))

            aPath2.close()
        }

        //Stroke the reflection with an orange color, semi-transparent
        UIColor.orange.set()
        aPath2.stroke(with: CGBlendMode.normal, alpha: 0.5)

        //If you want to fill it as well
        aPath2.fill()
    }
 

The next step is to go to the storyboard, create a UIView, and change its class to ‘DrawWaveform’
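If you’d rather skip the storyboard, you could also add the view in code. A sketch (the frame values here are arbitrary):

    //In the view controller, after the floats have been loaded:
    let waveform = DrawWaveform(frame: CGRect(x: 0, y: 100, width: view.bounds.width, height: 200))
    waveform.backgroundColor = UIColor.clear   //So only the bars are drawn
    view.addSubview(waveform)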

Build and Run.

[Screenshot: the rendered waveform]

Here’s the GitHub link

App Store release and first sale!

Finally, my app ‘Loop maker’ is in the App Store, new and alive! It took me a long time to build since I got distracted with Objective-C and another framework that was later deprecated. It was a long process, but I learned a lot, from UX to low-level Core Audio to dealing with the tedious App Store. I’m getting downloads but no in-app purchases; maybe I should get better at marketing or build a bigger product. Anyway, I feel great, proud, and energetic! I also made my first sale. Can I call myself an entrepreneur?

Link for the app: https://goo.gl/0kjbAz

[Screenshot: Loop maker in the App Store]

I presented my app to an app developers meetup group and met many developers; it was nice to network. I noticed that I suck at presenting my own projects, so I should work on my public speaking skills and my English.

Change of plans now! Haha, wow, life doesn’t go as planned, does it… I had to give my parents money, and I got offered an interesting job for my next rotation: “Radio access network engineer”. The reason I say it’s a good option is that I’m moving to an apartment that’s $680/month instead of $1,200/month, which aligns with my goal of being an entrepreneur. I’m also not going to have to pay $125 for parking and $30 for the gym, which brings my cost of living way down. I also finished paying off my car loan ($12k), and next is my student loan ($4k).

My studio apartment will look like an office and I can already see friends saying “this is where you live?”, but I don’t care. I don’t care about traveling. I don’t care about having a beer after work. I don’t care about watching Netflix. I don’t care about politics. I don’t care about Religion. I don’t care about kids. I don’t care about the news. I don’t care about mortgages. I don’t care about the new Nintendo. I care about changing my life and I’m willing to work for it.

I’m already thinking about my next app idea…