Group notifications in iOS

September 24, 2018

I’ve recently been working with notifications in iOS on a project still in beta development. What made this project interesting, compared with typical notification work, was the sheer volume of notifications the device would be receiving. In most cases a device receives a small handful of notifications prompting the user to perform an in-app action, usually a one-time process. Under the project specification the device needed to go beyond this and receive notifications under the following conditions:

  • Every x number of seconds.
  • Storing a maximum of 7 at any time (see the sketch after this list).
  • Multiple notification types (simple text or image).
  • Varying notification processing.
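
Nothing in UserNotifications enforces that 7-notification cap for you. Below is a minimal sketch, assuming the cap is applied by pruning delivered notifications after each delivery; enforceNotificationCap is an illustrative helper, not the project’s actual code:

import UserNotifications

// Illustrative only: trim delivered notifications down to the newest `limit`
func enforceNotificationCap(limit: Int = 7) {
    let center = UNUserNotificationCenter.current()
    center.getDeliveredNotifications { delivered in
        // Sort newest first, then collect the identifiers beyond the cap
        let overflow = delivered
            .sorted { $0.date > $1.date }
            .dropFirst(limit)
            .map { $0.request.identifier }
        if !overflow.isEmpty {
            center.removeDeliveredNotifications(withIdentifiers: overflow)
        }
    }
}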

By default, notifications inform the user that something is happening in-app that they “need to take care of”. These notifications are meant to be unobtrusive and not detract from the Apple experience. iOS 12 introduces grouped notifications, a new feature that enables an application to corral its notifications into manageable chunks for the user to process.

The code part

Most of the hard work has already been done by Apple’s engineers, so all that’s needed is to set a thread identifier on your notification and pass it in the notification’s payload:

let content = UNMutableNotificationContent()
content.title = "My notifications"
content.body = "Group notifications by Jay Cohen"
content.threadIdentifier = "swiftycoder-team-ios"

Whilst the identifier is optional, if your notifications require different actions you should really set one. For remote notifications the same identifier is delivered in the payload as thread-id:

{
    "thread-id" : "swiftycoder-fitness-ios"
}

What’s great about the identifier is that you can have more than one. In this project’s case only one was needed, but there also had to be a way to determine notification hierarchy. For this I used summaryArgumentCount: this integer tells the system how many items the notification represents, which it uses when building the group’s summary text. There’s also summaryArgument, which provides a description of what the notification is about. This is passed inside the alert dictionary of the payload and shouldn’t be confused with your notification’s content title. To put it all together and create 5 local notifications for testing, you can use the example below:

let center = UNUserNotificationCenter.current()

for _ in 1...5 {
    let content = UNMutableNotificationContent()
    content.title = "Time to get moving"
    content.body = "Start by jogging on the spot, aim for a medium heart rate."
    content.categoryIdentifier = "alarm"
    content.userInfo = ["start_time": "14:09"]
    content.sound = UNNotificationSound.default
    content.threadIdentifier = "swiftycoder-fitness-ios"
    content.summaryArgument = "Fitness app"
    content.summaryArgumentCount = 3

    // Deliver each notification 5 seconds after scheduling
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)

    let request = UNNotificationRequest(identifier: UUID().uuidString, content: content, trigger: trigger)
    center.add(request)
}

An example of the equivalent remote notification payload in JSON looks like:

{
    "aps" : {
        "alert" : {
            "body" : "Start by jogging on the spot, aim for a medium heart rate.",
            "title" : "Time to get moving",
            "summary-arg" : "Fitness app",
            "summary-arg-count" : 3
        },
        "thread-id" : "swiftycoder-fitness-ios"
    }
}

As you can see, iOS 12 makes it incredibly easy to create grouped notifications. The bulk of the custom work comes from handling each notification’s actions separately, which is launched via the AppDelegate and passed off to a notification service.
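
As a rough illustration of that hand-off (NotificationService here is a hypothetical type, not the project’s actual service):

import UIKit
import UserNotifications

extension AppDelegate: UNUserNotificationCenterDelegate {
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                didReceive response: UNNotificationResponse,
                                withCompletionHandler completionHandler: @escaping () -> Void) {
        let content = response.notification.request.content
        // Route each group to its own processing path via the thread identifier
        switch content.threadIdentifier {
        case "swiftycoder-fitness-ios":
            NotificationService.shared.handleFitness(content) // hypothetical service
        default:
            NotificationService.shared.handleDefault(content)
        }
        completionHandler()
    }
}

Remember to assign the delegate (e.g. UNUserNotificationCenter.current().delegate = self) early in the app’s launch so responses are delivered.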

I didn’t want this article to be too long, so I’ve just shown the “How to group your notifications” part. If you need further help understanding notification hand-off, get in touch. I also teach Swift and iOS via Savvy.

Working with sound

September 13, 2018

I’d been wanting to use the AudioKit framework, and building Sense’d, an ASMR app to help relieve stress, provided the perfect opportunity. Being an ex-music producer, it was a skill I wanted to use in iOS, but I lacked the “great” idea to build. Sense’d came about whilst zoning out on YouTube watching ASMR videos and thinking of the locations where I relaxed the most.

Sense'd image showcase

The main challenge was getting hold of medium-to-high quality audio with a decent length to avoid loop fatigue. Each recorded area is a mix of various recordings that were processed in Apple Logic before being mastered in Propellerhead’s Reason. An interesting feature of Sense’d is the repeat action, allowing you to stay in a location as long as you need. Each recorded area is a seamless loop, which AudioKit uses to fade the sound and play a visual waveform.
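
A minimal sketch of that looping playback, assuming AudioKit 4’s AKPlayer API (the file name is illustrative, not from the app bundle):

import AudioKit

// Loop a recorded area seamlessly and fade it in to avoid a hard start
let file = try AKAudioFile(readFileName: "forest-loop.m4a")
let player = AKPlayer(audioFile: file)
player.isLooping = true    // seamless repeat keeps the user in a location
player.fade.inTime = 2.0   // gentle fade-in at playback start
AudioKit.output = player
try AudioKit.start()
player.play()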

Because of the broad target audience the UI/UX needed to be clean and minimal, without fancy transitions or animations. It was the broadest audience I’d worked with to date, as ASMR and stress relief appeal to a wide range of ages and genders.

The layout was straightforward, as was the image processing. Most of the work came from plotting the waveform via AudioKit: by default the audio waveform displays a value of zero, which leaves a horizontal line across the UIView component. A simple fade-in on audioVizView was all that was needed to solve the problem.

fileprivate func analyseAudioFile() {
    // Remove any existing plot before drawing a new one
    if plot != nil {
        plot.removeFromSuperview()
    }
    // Attach a rolling waveform plot to the player's output
    plot = AKNodeOutputPlot(player, frame: audioVizView.bounds)
    plot.plotType = .rolling
    plot.shouldFill = true
    plot.shouldMirror = true
    plot.shouldCenterYAxis = true
    plot.color = AKColor.white
    plot.backgroundColor = .clear
    audioVizView.addSubview(plot)
    // Fade the view in slowly so the initial flat line is never visible
    if isPlaying {
        UIView.animate(withDuration: 10) {
            self.audioVizView.alpha = 0.5
        }
    } else {
        audioVizView.alpha = 0
    }
}

Release

Sense’d is available on the App Store and is optimised for iPhone and iPad. The app size is around 200MB, which is great considering the amazing sound quality: it contains 4 locations plus a bonus location, with each recording lasting around 1 hour. Sense’d is available to download below for £1.99 (currency dependent).

Download Sense'd from the App Store

Geofencing in iOS

August 27, 2018

Geofencing in iOS is the ability to create a boundary (fence) around a geographical location. As a proof of concept (POC) I had to create a geofencing model that would create fences on the fly when a device reached a certain geographical point. The model would then fire off local and push notifications, giving the device data to interrogate. The end code remains the copyright of the agency I worked for, but a simplified version is shown below.

...
// Fences array - populated from an external source
var fences: [Fence] = [
  Fence(title: "Fence 1", coordinate: [51.5074, -0.1278], radius: 20.0, ...),
  Fence(title: "Fence 2", coordinate: [40.7128, -74.0060], radius: 20.0, ...),
  Fence(title: "Fence 3", coordinate: [52.5200, 13.4050], radius: 20.0, ...)
]

fileprivate func createFences() {
  // Check the device can monitor circular regions
  if CLLocationManager.isMonitoringAvailable(for: CLCircularRegion.self) {
    // Clear any existing regions
    for monitored in locationManager.monitoredRegions {
      locationManager.stopMonitoring(for: monitored)
    }
    // Loop through the fences array, monitor each region and draw it on an MKMapView
    for fence in fences {
      let coordinate = CLLocationCoordinate2DMake(fence.coordinate[0], fence.coordinate[1])
      let region = CLCircularRegion(center: coordinate, radius: fence.radius, identifier: fence.title)
      region.notifyOnEntry = true
      region.notifyOnExit = true
      locationManager.startMonitoring(for: region)
      let circle = MKCircle(center: coordinate, radius: fence.radius)
      mapView.add(circle)
    }
  }
  else {
    print("[APP]: Unable to track user regions")
  }
}
...

The example above creates geofences around London, New York and Berlin with a radius of 20 metres (CLCircularRegion radii are specified in metres). Each fence is added to an MKMapView object for visual reference during testing. In order to check which fences to create, the device had to get the user’s location at an accuracy level between 10 and 100m. The tighter the accuracy required, the longer it took for GPS data to be received, and even then it wasn’t always accurate. region.notifyOnEntry and region.notifyOnExit would then be used to show notifications or perform background operations.
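
Those entry and exit events arrive through the standard CLLocationManagerDelegate callbacks. A simplified sketch (not the agency’s production code; GeofenceViewController is a hypothetical owner class) might look like:

import CoreLocation

extension GeofenceViewController: CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Fire a local notification or kick off background work here
        print("[APP]: Entered fence \(region.identifier)")
    }

    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        print("[APP]: Exited fence \(region.identifier)")
    }
}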

The unique part of the model was knowing which fences to create as the device moved throughout a given day. I approached this by giving each Fence object two extra values: the first being the distance from its nearest Fence neighbour in 360-degree space, and the second being its distance from a central point in the world. All three geographical values would then be used to calculate the next series of boundaries to create. As this was a memory-hungry operation it required background threading to maximise performance, and profiling via Instruments to avoid bad memory allocation. Once fences were returned by the method it was simply a case of populating the array and using the function above.
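
The ranking model itself remains the agency’s IP, but a simplified sketch of the background-threaded distance calculation might look like this (nearestFences and deviceLocation are illustrative names, and the neighbour/central-point values are omitted):

import CoreLocation

// Illustrative only: rank fences by straight-line distance from the device
// on a background queue, then hand the nearest back on the main thread
// ready to be passed to createFences()
func nearestFences(to deviceLocation: CLLocationCoordinate2D,
                   from allFences: [Fence],
                   limit: Int = 3,
                   completion: @escaping ([Fence]) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let here = CLLocation(latitude: deviceLocation.latitude,
                              longitude: deviceLocation.longitude)
        let nearest = allFences.sorted {
            let a = CLLocation(latitude: $0.coordinate[0], longitude: $0.coordinate[1])
            let b = CLLocation(latitude: $1.coordinate[0], longitude: $1.coordinate[1])
            return a.distance(from: here) < b.distance(from: here)
        }.prefix(limit)
        DispatchQueue.main.async {
            completion(Array(nearest))
        }
    }
}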

Overall the POC was a success and the model worked; as a result, the client took the concept and planned to implement it in their roadmap early next year.