Building an iOS Photo-sharing and Geolocation Mobile Client and API

Last Updated: 25 March 2014


Location-based mobile apps are a central part of any modern product’s strategy. Heroku’s simple deployment process and handling of server-side complexities make it an ideal platform for the backing APIs that power these types of apps.

For additional resources, please see the iOS Quickstart, or Getting Started with Rails 3.x on Heroku.

This article will guide you through the process of developing a photo sharing service with a native iOS client and Rails backend. Though Rails was chosen because of its ability to quickly create JSON APIs, in reality the iOS app can integrate with a back-end API implemented in any language.

Code for the Rails application and the iOS Client is available on GitHub.

Final Product

Prerequisites

  • Basic Objective-C knowledge, including a development environment running Mac OS X with Xcode 4.2 installed.
  • The Heroku Toolbelt has been installed and configured.
  • Basic Ruby knowledge, including an installed version of Ruby 1.9.2, Rubygems, Bundler, and Rails 3.2.
  • A PostgreSQL server running locally. Installation instructions can be found in Local Postgres Installation.
  • A Heroku user account. Signup is free and instant.

Deploy API to Heroku

Because of the scope and complexity of the project being created, you may find it useful to clone and deploy the code now, and use it as a reference as you read along.

Write Your Rails application

Following the client-server model, this guide implements both a Rails application on the server, and an iOS client that communicates with it.

We strongly recommend using PostgreSQL during development. If you do not have PostgreSQL installed please see this guide.

You may be starting from an existing app or the included reference app. If not, create a Rails application using the following commands:

$ rails new geo-photo --database=postgresql
create
create  README.rdoc
...
Your bundle is complete! Use `bundle show [gemname]` to see where a bundled gem is installed.
$ cd geo-photo

Store your Rails application in Git

The git version control system is used to deploy application code to Heroku. Initialize a local git repository and make your first commit.

$ git init
$ git add .
$ git commit -m "Initial import"

Deploy to Heroku

Create the app:

$ heroku create
Creating sharp-day-6513... done, stack is cedar
http://sharp-day-6513.herokuapp.com/ | git@heroku.com:sharp-day-6513.git
Git remote heroku added

And deploy your code:

$ git push heroku master
Counting objects: 65, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (51/51), done.
Writing objects: 100% (65/65), 25.90 KiB, done.
Total 65 (delta 4), reused 0 (delta 0)

-----> Heroku receiving push
-----> Ruby/Rails app detected
-----> Installing dependencies using Bundler version 1.1.rc.7
...
-----> Launching... done, v5
       http://sharp-day-6513.herokuapp.com deployed to Heroku

The app is now deployed and running on Heroku. Check that the app is up with the heroku open command. You should see the default Rails index page as you did locally.

Generate a photo resource

Now it’s time to add some functionality. This application will create and manage photos taken at a particular location. First, generate a Photo resource for the API with a lat and lng property, to store the latitude and longitude of the photo’s coordinates:

$ rails generate resource Photo lat:decimal lng:decimal
      invoke  active_record
      create    db/migrate/20120213225717_create_photos.rb
...

Since the lat and lng columns will hold geospatial coordinates, open the migration generated from the previous command and set a precision and scale:

db/migrate/20120421110xyz_create_photos.rb

With this precision and scale designated, Rails will map those columns to BigDecimal, which guards against floating point imprecision errors.

class CreatePhotos < ActiveRecord::Migration
  def change
    create_table :photos do |t|
      t.decimal :lat, :precision => 15, :scale => 10
      t.decimal :lng, :precision => 15, :scale => 10

      t.timestamps
    end
  end
end
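To see why fixed-precision decimals matter here, compare plain binary floats with BigDecimal in a Ruby console (a quick illustration, not part of the app):

```ruby
require 'bigdecimal'

# Binary floats accumulate rounding error that a decimal(15,10) column avoids.
float_sum   = 0.1 + 0.2
decimal_sum = BigDecimal("0.1") + BigDecimal("0.2")

puts float_sum              # 0.30000000000000004
puts decimal_sum.to_s("F")  # 0.3
```

The same guarantee applies when coordinates round-trip through the database.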

Add image upload columns

In addition to having a location, Photo objects have an image file. Image files will be stored on Amazon S3 using Paperclip, a gem that provides file attachment functionality to Active Record.

Add the following lines to your Gemfile:

gem 'paperclip', '~>2.6.0'
gem 'aws-sdk', '~>1.3.4'

Re-install your Gem dependencies (to generate a new Gemfile.lock):

$ bundle install

With paperclip installed, generate a migration to add the columns required to store file attachments to the Photo model:

$ rails generate paperclip Photo image
      create  db/migrate/20120213231616_add_attachment_image_to_photo.rb

At this point the migrations are ready to update the database schema. Update the config/database.yml file to reference the correct database user and password for the development and test environments. Then create the local development database and run the migrations.

$ bundle exec rake db:create db:migrate
==  CreatePhotos: migrating ===========
-- create_table(:photos)
...

To configure how image uploads should be processed and stored, add the following to the top of the photo model:

app/models/photo.rb

has_attached_file :image,
                  :styles => { :thumbnail => "100x100#" },
                  :storage => :s3,
                  :s3_credentials => S3_CREDENTIALS

S3_CREDENTIALS is a constant that will be defined in an initializer. Create config/initializers/s3.rb with the following:

For more information on storing configuration as environment variables see Configuration and Config Vars.

config/initializers/s3.rb

S3_CREDENTIALS = {
  :access_key_id     => ENV['S3_KEY'],
  :secret_access_key => ENV['S3_SECRET'],
  :bucket => ENV['S3_BUCKET']
}

Working locally, you can set environment variables in a .env file in the root of your project, and Foreman will automatically load them at runtime.
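Under the hood, the .env convention is simple: each KEY=VALUE line becomes an environment variable for the spawned processes. A minimal Ruby sketch of the idea (not Foreman's actual implementation; the values are placeholders):

```ruby
# Parse KEY=VALUE lines the way a .env file is interpreted, and load them
# into the process environment.
dotenv = <<-ENVFILE
S3_KEY=ABCDEFGHIJK1234567890
S3_BUCKET=myapp-photo-bucket
ENVFILE

dotenv.each_line do |line|
  key, value = line.strip.split("=", 2)
  ENV[key] = value if key && value
end

puts ENV['S3_BUCKET']   # myapp-photo-bucket
```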

Log in to your Amazon Web Services account, or create an account if you don’t already have one. Go to the Security Credentials section of your account, and copy the Access Key ID and Secret Access Key into .env in the following format:

.env

S3_KEY=ABCDEFGHIJK1234567890
S3_SECRET=1234567890+ABCDEFGHIJKLMNO
S3_BUCKET=myapp-photo-bucket
RACK_ENV=development

Also be sure to create and reference an S3 bucket to hold the images.

Since access keys and other configuration values shouldn’t be stored in version control, add .env to .gitignore.

$ echo '.env' >> .gitignore

Add controller actions

Rails controllers handle incoming requests that are dispatched to them according to the routes file. In Rails, a resource controller maps its actions to HTTP requests in the following way:

Method   Path             Action
GET      /photos          index
POST     /photos          create
GET      /photos/new      new
GET      /photos/1/edit   edit
GET      /photos/1        show
PUT      /photos/1        update
DELETE   /photos/1        destroy

For this example, only index, show, and create will be implemented.

Implement these methods for PhotosController like so:

app/controllers/photos_controller.rb

class PhotosController < ApplicationController
  respond_to :json

  def index
    @photos = Photo.all
    respond_with({:photos => @photos}.as_json)
  end

  def show
    @photo = Photo.find(params[:id])
    respond_with(@photo)
  end

  def create
    @photo = Photo.create(params[:photo])
    respond_with(@photo)
  end
end

The first line, respond_to :json, specifies that this controller will only respond to requests with the HTTP header Accept: application/json, or with the .json path extension. Its complement, respond_with, renders the passed object as a JSON response.

The rest of the controller is standard boilerplate, finding and creating resources as requested.

Test that the controller is hooked-up with curl or a web browser:

$ foreman start
$ curl http://localhost:5000/photos.json
{"photos":[]}

The response came back fine, but no Photo records have been created yet, so the collection is empty.

Seed records to be used during development can be generated by running foreman run rake db:seed, which executes db/seeds.rb.

Foreman is used here so that the S3 credentials in your .env file are loaded into the environment.
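If you’re writing your own seeds file, something like the following sketch works (the center coordinates and the jitter helper are illustrative, not part of the reference app):

```ruby
# db/seeds.rb (sketch): scatter sample photos around downtown San Francisco.
CENTER_LAT, CENTER_LNG = 37.775, -122.4183333

# Offset a coordinate by up to +/- `spread` degrees.
def jitter(value, spread = 0.02)
  value + (rand * 2 - 1) * spread
end

10.times do
  lat, lng = jitter(CENTER_LAT), jitter(CENTER_LNG)
  # In the real seeds file this would be: Photo.create(:lat => lat, :lng => lng)
  puts format("Seeded photo at (%.6f, %.6f)", lat, lng)
end
```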

Now retry the previous curl command, or hit refresh in your browser:

$ curl http://localhost:5000/photos.json
{"photos":[{"lat": 37.775, "lng": -122.4183333}, ...]}

It’s important to get in the habit of deploying early and often. Commit your changes and deploy your app to Heroku.

$ git add .
$ git commit -m "Adding Photos model and controller"
$ git push heroku master
Counting objects: 8, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (5/5), done.
Writing objects: 100% (8/8), 2.53 KiB, done.
Total 8 (delta 0), reused 0 (delta 0)

-----> Heroku receiving push
-----> Ruby/Rails app detected
-----> Installing dependencies using Bundler version 1.1.rc.7
...
-----> Launching... done, v6
       http://sharp-day-6513.herokuapp.com deployed to Heroku

The app is deployed but doesn’t have its configuration variables set, which is necessary before seeding the database. Use heroku config:set to set the AWS credentials and bucket.

$ heroku config:set S3_KEY=mykey S3_SECRET=mysecret S3_BUCKET=bucketname
Adding config vars and restarting app... done, v5

Run the rake tasks to migrate the database and generate seed data:

$ heroku run bundle exec rake db:migrate db:seed
Running bundle exec rake db:migrate db:seed attached to terminal... up, run.1
...

Test that the app is running and responding to API calls with curl.

$ curl http://sharp-day-6513.herokuapp.com/photos.json
{"photos":[{"lat": 37.775, "lng": -122.4183333}, ...]}

Diagnose API errors using the heroku logs -t command, which will stream the app logs to your console in real time.

You now have a photo-uploading API live and running remotely on Heroku. Now it’s time to create the native iOS mobile application that will interact with this API.

Create Your iOS client app

Open up Xcode and select “File ▶ New ▶ New Project…”, or use the keyboard shortcut, ⇧⌘N.

New Project Step 1

When prompted to choose a template for your new project, select iOS - Application on the sidebar, and choose the “Empty Application” template. Click “Next” to continue.

New Project Step 2

In the next step, enter your Product Name, Company Identifier, and Class Prefix (optional). For “Device Family”, select “iPhone”. Make sure that the checkboxes for “Use Core Data”, “Use Automatic Reference Counting”, and “Include Unit Tests” are unchecked. Click “Next” to continue.

New Project Step 3

Finally, select a directory to save your new project to, check the box to create a local Git repository for this project, and click “Create”.

Declare iOS dependencies with CocoaPods

Instead of using CocoaPods to manage dependencies, you could also install AFNetworking manually: download the latest release and add the enclosed “AFNetworking” directory to your project folder.

CocoaPods manages library dependencies for your Xcode project, similar to the way Bundler manages Ruby gem dependencies.

Install CocoaPods:

$ sudo gem install cocoapods
$ pod setup

CocoaPods dependencies are declared in Podfile. In this case, AFNetworking will be used to communicate with the web API. Create the following Podfile in the root of your iOS project:

Podfile

platform :ios
dependency 'AFNetworking', '0.9'

Run pod install GeoPhoto.xcodeproj to set up the dependencies:

$ pod install GeoPhoto.xcodeproj
Installing AFNetworking (0.9.0)
Generating support files
[!] From now on use `GeoPhoto.xcworkspace' instead of `GeoPhoto.xcodeproj'.

Following the instructions from CocoaPods, close GeoPhoto.xcodeproj and open GeoPhoto.xcworkspace. This workspace file contains the build targets for your project as well as its dependencies.

Add location frameworks

In order to work with maps and location information, the MapKit and CoreLocation frameworks are required.

Select the blue project file icon at the top of project navigator, and select the app target in the sidebar.

Add Location Frameworks Step 1

Under the “Build Phases” tab, expand the “Link Binary With Libraries” phase, and click the “+” button.

Add Location Frameworks Step 2

Search for and add both the MapKit and CoreLocation frameworks.

Add Location Frameworks Step 3

Create a photo class

In Xcode, select “File ▶ New ▶ New File…”, or use the keyboard shortcut, ⌘N.

New Model 1

When prompted to choose a template for your new file, select iOS - Cocoa Touch on the sidebar, and choose the “Objective-C class” template. Click “Next” to continue.

New Model 2

Name the class “Photo” and specify it as a subclass of NSObject.

Photo will act as the model for the iOS client. It is responsible for fetching and initializing objects from the server. In order to be projected on a map view, it will also conform to the MKAnnotation protocol.

Photo.h

#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>
#import <MapKit/MapKit.h>

Import the Foundation, Core Location, and Map Kit frameworks. Core Location and Map Kit provide APIs to work with and visualize location data.

@interface Photo : NSObject <MKAnnotation> {
@private
    CLLocationDegrees _latitude;
    CLLocationDegrees _longitude;
}

Declare the Photo class, which inherits from NSObject and conforms to the MKAnnotation protocol. MKAnnotation defines a set of methods to implement that allow classes to be projected on an MKMapView.

@property (strong, nonatomic, readonly) CLLocation *location;

Rather than exposing latitude and longitude individually as properties, a single CLLocation property is provided as an object representation of the location.

- (id)initWithAttributes:(NSDictionary *)attributes;

This initializer will take in deserialized data from the server to construct a Photo object from its JSON representation.

Now in the implementation:

Photo.m

- (id)initWithAttributes:(NSDictionary *)attributes {
    self = [super init];
    if (!self) {
        return nil;
    }

    _latitude = [[attributes valueForKeyPath:@"lat"] doubleValue];
    _longitude = [[attributes valueForKeyPath:@"lng"] doubleValue];

    return self;
}

Initialize the Photo object using data from the server. The lat and lng key paths here correspond to the property names used in the Rails application.

- (NSString *)title {
    return [NSString stringWithFormat:@"Photo at (%f, %f)", _latitude, _longitude];
}

- (CLLocationCoordinate2D)coordinate {
    return CLLocationCoordinate2DMake(_latitude, _longitude);
}

MKAnnotation requires the methods title and coordinate to be implemented. If you imagine a pin annotation on a map view, like in the Maps application, coordinate determines where on the map to drop the pin, and title is the text in the callout that is displayed when tapping the annotation.

Create a photos view controller

In Xcode, select “File ▶ New ▶ New File…”, or use the keyboard shortcut, ⌘N.

New Controller 1

When prompted to choose a template for your new file, select iOS - Cocoa Touch on the sidebar, and choose the “UIViewController subclass” template. Click “Next” to continue.

New Controller 1

Name the class “PhotosViewController” and specify it as a subclass of UIViewController.

PhotosViewController will fetch photos from the server and display them on a map.

PhotosViewController.m

- (void)loadView {
    [super loadView];

    self.mapView = [[[MKMapView alloc] initWithFrame:self.view.bounds] autorelease];
    self.mapView.delegate = self;
    self.mapView.showsUserLocation = YES;
    [self.view addSubview:self.mapView];
}

Create and initialize the mapView property, setting its frame to the bounds of the view set up in the super implementation of loadView. Set the controller as the map view delegate, and set showsUserLocation = YES, to display the blue dot location indicator.

- (void)viewDidLoad {
    [super viewDidLoad];

    self.title = NSLocalizedString(@"GeoPhoto", nil);

    NSURL *url = [NSURL URLWithString:@"http://localhost:5000/photos.json"];
    [[AFJSONRequestOperation JSONRequestOperationWithRequest:[NSURLRequest requestWithURL:url] success:^(NSURLRequest *request, NSHTTPURLResponse *response, id JSON) {
        for (NSDictionary *attributes in [JSON valueForKeyPath:@"photos"]) {
            Photo *photo = [[[Photo alloc] initWithAttributes:attributes] autorelease];
            [self.mapView addAnnotation:photo];
        }
    } failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error, id JSON) {
        NSLog(@"Error: %@", error);
    }] start];
}

Create and start an AFJSONRequestOperation to asynchronously fetch the photo JSON from the Rails application. In the success block, enumerate the attributes dictionaries keyed at photos, create and initialize a Photo object for each, and add them as an annotation to the map view.

- (MKAnnotationView *)mapView:(MKMapView *)mapView
            viewForAnnotation:(id<MKAnnotation>)annotation
{
    if (![annotation isKindOfClass:[Photo class]]) {
        return nil;
    }

    static NSString *AnnotationIdentifier = @"Pin";
    MKAnnotationView *annotationView = [mapView dequeueReusableAnnotationViewWithIdentifier:AnnotationIdentifier];
    if (!annotationView) {
        annotationView = [[MKAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:AnnotationIdentifier];
        annotationView.canShowCallout = YES;
    } else {
        annotationView.annotation = annotation;
    }

    annotationView.image = [UIImage imageNamed:@"photo-placeholder.png"];
    AFImageRequestOperation *operation = [AFImageRequestOperation imageRequestOperationWithRequest:[NSURLRequest requestWithURL:[(Photo *)annotation thumbnailImageURL]] success:^(UIImage *image) {
        annotationView.image = image;
    }];
    [[NSOperationQueue mainQueue] addOperation:operation];

    return annotationView;
}

With the controller set as the delegate, and with photo objects added as annotations on the map view, -mapView:viewForAnnotation: returns the view to be drawn on the map that corresponds to each annotation. This is analogous to the UITableViewDataSource method, -tableView:cellForRowAtIndexPath:, when managing the contents of a UITableView.

First there is a check to see whether the specified annotation is a Photo, returning nil early if it is not. Because mapView.showsUserLocation is YES, an annotation for the current location has also been added; by returning nil for it, the standard blue current-location dot is displayed correctly.

Declare a reuse identifier and create or dequeue a reusable annotation view. An AFImageRequestOperation is created and enqueued on the main operation queue to load the annotation view’s image asynchronously, setting it once it has downloaded.

Configure AppDelegate

In order to see PhotosMapViewController in action, configure AppDelegate like so:

AppDelegate.h

#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>

@property (strong, nonatomic) UIWindow *window;
@property (strong, nonatomic) UINavigationController *navigationController;

@end

AppDelegate.m

#import "AppDelegate.h"

#import "PhotosViewController.h"

@implementation AppDelegate
@synthesize window = _window;
@synthesize navigationController = _navigationController;

- (void)dealloc {
    [_window release];
    [_navigationController release];
    [super dealloc];
}

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]] autorelease];
    self.window.backgroundColor = [UIColor whiteColor];

    PhotosViewController *viewController = [[PhotosViewController alloc] initWithNibName:nil bundle:nil];
    self.navigationController = [[[UINavigationController alloc] initWithRootViewController:viewController] autorelease];
    [self.window addSubview:self.navigationController.view];

    [self.window makeKeyAndVisible];

    return YES;
}

@end

Declare and synthesize a navigationController property. In application:didFinishLaunchingWithOptions:, create and initialize a PhotosViewController and pass it as rootViewController when initializing navigationController. Add the navigation controller’s view as a subview to the window.

Build and run

In Xcode, click the “Run” play button, or use the keyboard shortcut, ⌘R. If everything worked, the app should be displaying the results fetched from the API. (If you aren’t in a location with a nearby seeded record, zoom out to see pins elsewhere)

Build & Run

Add location services

Right now the /photos.json API call returns all of the Photo records on the server. The next step is to have it return only photos near a specified location.

In app/models/photo.rb of the Rails project add the following:

COORDINATE_DELTA = 0.05

scope :nearby, lambda { |lat, lng|
  where("lat BETWEEN ? AND ?", lat - COORDINATE_DELTA, lat + COORDINATE_DELTA).
  where("lng BETWEEN ? AND ?", lng - COORDINATE_DELTA, lng + COORDINATE_DELTA).
  limit(64)
}

Querying with a bounding box is a simple and fast way to get a subset of all of the records. Because of the ellipsoid shape of the earth, the area of a bounding box with a fixed coordinate-based dimension will vary depending on where it is located. On average, a degree of latitude, for instance, is approximately 111km (69 miles), but varies with a range from 110.57km (68.70 miles) at the equator up to 111.70km (69.41 miles) at the poles.

Since this needs only to be an approximation to get nearby photos, this heuristic works just fine for now. By setting COORDINATE_DELTA = 0.05, scope :nearby will fetch photos within approximately ±5km (~ ±3 miles) of the center coordinate.
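The arithmetic behind scope :nearby can be checked in plain Ruby. The helper below is illustrative, mirroring the two BETWEEN clauses in the scope as Ruby ranges:

```ruby
COORDINATE_DELTA = 0.05

# Mirror the BETWEEN clauses from scope :nearby.
def bounding_box(lat, lng, delta = COORDINATE_DELTA)
  { :lat => (lat - delta)..(lat + delta),
    :lng => (lng - delta)..(lng + delta) }
end

box = bounding_box(37.775, -122.4183333)
puts box[:lat].include?(37.775 + 0.03)   # true  -- a few km away
puts box[:lat].include?(37.775 + 0.30)   # false -- well outside the box
```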

In PhotosController, modify index to look for lat and lng parameters and return only photos nearby that:

app/controllers/photos_controller.rb

def index
  lat, lng = params[:lat], params[:lng]
  if lat and lng
    @photos = Photo.nearby(lat.to_f, lng.to_f)
    respond_with({:photos => @photos})
  else
    respond_with({:message => "Invalid or missing lat/lng parameters"}, :status => 406)
  end
end
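One caveat with this guard is worth knowing: `if lat and lng` only checks that the parameters are present, and String#to_f silently converts malformed input to 0.0, so a request with lat=abc would search near the equator rather than fail. A quick illustration in plain Ruby:

```ruby
# String#to_f never raises; malformed input becomes 0.0.
puts "37.775".to_f   # 37.775
puts "abc".to_f      # 0.0

# Float() is the strict alternative if you want malformed input to raise.
begin
  Float("abc")
rescue ArgumentError => e
  puts "rejected: #{e.message}"
end
```

For this tutorial the permissive behavior is acceptable; a production API would likely validate the parameters more strictly.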

Next, add geolocation functionality to the iPhone client so that it sends the appropriate lat and lng parameters:

PhotosViewController.m

@property (strong, nonatomic, readwrite) CLLocationManager *locationManager;

Add a locationManager property. CLLocationManager monitors the current positioning of the device, and reports changes to its delegate.

Declare PhotosViewController as conforming to the CLLocationManagerDelegate protocol by adding it to the list of protocols between angle brackets on the @interface line in the header file.

Create and initialize the location manager in viewDidLoad:

self.locationManager = [[[CLLocationManager alloc] init] autorelease];
self.locationManager.delegate = self;
self.locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
self.locationManager.distanceFilter = 80.0f;
self.locationManager.purpose = NSLocalizedString(@"GeoPhoto uses your location to find nearby photos", nil);
[self.locationManager startUpdatingLocation];

A location manager’s desiredAccuracy property determines what hardware to use in order to determine location. This should be set to the lowest acceptable accuracy in order to conserve battery.

distanceFilter sets a threshold for the change in distance required to send a message to the delegate. Since a change in location is often set up to trigger a network request in location-based applications, it is important to find the right balance.

purpose is a string presented to the user in a dialog box the first time the application requests location information. The user may allow or deny access to location. An application should be designed to gracefully handle cases where the user denies access, however this case will not be covered in this example.

Remove the JSON request code in viewDidLoad:; this will instead be refactored into a new class: GeoPhotoAPIClient.

GeoPhotoAPIClient is an AFHTTPClient subclass that provides a convenient interface to interact with a webservice:

GeoPhotoAPIClient.h

#import <Foundation/Foundation.h>
#import "AFHTTPClient.h"

@interface GeoPhotoAPIClient : AFHTTPClient

+ (GeoPhotoAPIClient *)sharedClient;

@end

GeoPhotoAPIClient.m

#import "GeoPhotoAPIClient.h"

#import "AFJSONRequestOperation.h"

NSString * const kAFGeoPhotoAPIBaseURLString = @"http://localhost:5000/";

@implementation GeoPhotoAPIClient

+ (GeoPhotoAPIClient *)sharedClient {
    static GeoPhotoAPIClient *_sharedClient = nil;
    static dispatch_once_t oncePredicate;
    dispatch_once(&oncePredicate, ^{
        _sharedClient = [[self alloc] initWithBaseURL:[NSURL URLWithString:kAFGeoPhotoAPIBaseURLString]];
    });

    return _sharedClient;
}

- (id)initWithBaseURL:(NSURL *)url {
    self = [super initWithBaseURL:url];
    if (!self) {
        return nil;
    }

    [self registerHTTPOperationClass:[AFJSONRequestOperation class]];

    // Accept HTTP Header; see http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.1
  [self setDefaultHeader:@"Accept" value:@"application/json"];

    return self;
}

@end

The class method +sharedClient returns a shared singleton instance of the API client. In -initWithBaseURL:, the client adds Accept: application/json as a default header to always ask for JSON, and registers AFJSONRequestOperation as the kind of operation created to handle JSON requests.

The code that used to request photos will instead be refactored as part of the Photo model. This new version also adds a location parameter for retrieving nearby photos:

+ (void)photosNearLocation:(CLLocation *)location
                     block:(void (^)(NSSet *photos, NSError *error))block
{
    NSMutableDictionary *mutableParameters = [NSMutableDictionary dictionary];
    [mutableParameters setObject:[NSNumber numberWithDouble:location.coordinate.latitude] forKey:@"lat"];
    [mutableParameters setObject:[NSNumber numberWithDouble:location.coordinate.longitude] forKey:@"lng"];

    [[GeoPhotoAPIClient sharedClient] getPath:@"/photos" parameters:mutableParameters success:^(AFHTTPRequestOperation *operation, id JSON) {
        NSMutableSet *mutablePhotos = [NSMutableSet set];
        for (NSDictionary *attributes in [JSON valueForKeyPath:@"photos"]) {
            Photo *photo = [[[Photo alloc] initWithAttributes:attributes] autorelease];
            [mutablePhotos addObject:photo];
        }

        if (block) {
            block([NSSet setWithSet:mutablePhotos], nil);
        }
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        if (block) {
            block(nil, error);
        }
    }];
}

Back in PhotosViewController.m, implement the CLLocationManagerDelegate method locationManager:didUpdateToLocation:fromLocation::

PhotosViewController.m

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    [Photo photosNearLocation:newLocation block:^(NSSet *photos, NSError *error) {
        if (error) {
            [[[UIAlertView alloc] initWithTitle:NSLocalizedString(@"Nearby Photos Failed", nil) message:[error localizedFailureReason] delegate:nil cancelButtonTitle:NSLocalizedString(@"OK", nil) otherButtonTitles:nil, nil] show];
        } else {
            [self.mapView removeAnnotations:[self.mapView annotations]];
            [self.mapView addAnnotations:[photos allObjects]];
        }
    }];
}

locationManager:didUpdateToLocation:fromLocation: is performed each time the location manager updates its current location, making it a convenient time to request nearby photos.

This code is a modified version of the code just removed from viewDidLoad. The request to /photos now includes lat and lng query parameters supplied by the newLocation argument of the method.

Also notice in the success block that the map view has its existing annotations removed before adding new ones. This prevents duplicate annotations from being displayed on the map.

Build and Run to see the application now asking for permission to access your current location, and then displaying only nearby photos.

Add photo uploading

The last feature to add is perhaps the most important of all: uploading photos taken on the phone.

Implement the following class method for the Photo model to upload an image to create a photo object:

Photo.m

+ (void)uploadPhotoAtLocation:(CLLocation *)location
                        image:(UIImage *)image
                        block:(void (^)(Photo *photo, NSError *error))block
{
    NSMutableDictionary *mutableParameters = [NSMutableDictionary dictionary];
    [mutableParameters setObject:[NSNumber numberWithDouble:location.coordinate.latitude] forKey:@"photo[lat]"];
    [mutableParameters setObject:[NSNumber numberWithDouble:location.coordinate.longitude] forKey:@"photo[lng]"];

    NSMutableURLRequest *mutableURLRequest = [[GeoPhotoAPIClient sharedClient] multipartFormRequestWithMethod:@"POST" path:@"/photos" parameters:mutableParameters constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
        [formData appendPartWithFileData:UIImageJPEGRepresentation(image, 0.8) name:@"photo[image]" fileName:@"image.jpg" mimeType:@"image/jpeg"];
    }];

    AFHTTPRequestOperation *operation = [[GeoPhotoAPIClient sharedClient] HTTPRequestOperationWithRequest:mutableURLRequest success:^(AFHTTPRequestOperation *operation, id JSON) {
        Photo *photo = [[[Photo alloc] initWithAttributes:[JSON valueForKeyPath:@"photo"]] autorelease];

        if (block) {
            block(photo, nil);
        }
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        if (block) {
            block(nil, error);
        }
    }];
    [[GeoPhotoAPIClient sharedClient] enqueueHTTPRequestOperation:operation];
}

+uploadPhotoAtLocation:image:block: asynchronously uploads an image to create a Photo object. First, a parameters dictionary is created with the lat and lng of the specified location. Next, an NSURLRequest object is created, with its HTTP body constructed as a multipart form with fields for the parameters and the image to upload. Then, an HTTP request operation is created using that URL request object; its success block constructs a Photo object from the server response, and passes that into the specified block. Finally, the operation is enqueued into the HTTP client’s shared operation queue, which kicks off the operation.

UIImagePickerController provides interfaces to capture photos and video with the camera, as well as to access media from the camera roll and photo library.

In PhotosViewController.h add UIImagePickerControllerDelegate and UINavigationControllerDelegate to the list of protocols for PhotosViewController.

Next, add a button to the navigation bar, hook up an image picker to be presented when a user taps on that button, and have the selected image uploaded to the server.

PhotosViewController.m

Add the following to -viewDidLoad:

self.navigationItem.rightBarButtonItem = [[[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemCamera target:self action:@selector(takePhoto:)] autorelease];

The takePhoto: method does not exist, so implement that now:

#pragma mark - Actions

- (void)takePhoto:(id)sender {
    UIImagePickerControllerSourceType sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        sourceType = UIImagePickerControllerSourceTypeCamera;
    }

    UIImagePickerController *imagePickerController = [[[UIImagePickerController alloc] init] autorelease];
    imagePickerController.delegate = self;
    imagePickerController.sourceType = sourceType;
    [self.navigationController presentViewController:imagePickerController animated:YES completion:nil];
}

An image picker controller is configured and presented by the navigation controller. Since not all devices have cameras (including the iOS Simulator), the source type defaults to the saved photos album, but uses the camera if available.

Next, implement the UIImagePickerControllerDelegate method imagePickerController:didFinishPickingImage:editingInfo::

PhotosViewController.m

#pragma mark - UIImagePickerControllerDelegate

- (void)imagePickerController:(UIImagePickerController *)imagePickerController
        didFinishPickingImage:(UIImage *)image
                  editingInfo:(NSDictionary *)editingInfo
{
    [imagePickerController dismissModalViewControllerAnimated:YES];
    [Photo uploadPhotoAtLocation:self.locationManager.location image:image block:^(Photo *photo, NSError *error) {
        if (error) {
            [[[UIAlertView alloc] initWithTitle:NSLocalizedString(@"Upload Failed", nil) message:[error localizedFailureReason] delegate:nil cancelButtonTitle:NSLocalizedString(@"OK", nil) otherButtonTitles:nil, nil] show];
        } else {
            [self.mapView addAnnotation:photo];
        }
    }];
}

Since the user is done interacting with the image picker, call dismissModalViewControllerAnimated:, which will remove the modal view.

In the callback block of uploadPhotoAtLocation:image:block:, check whether the upload failed for whatever reason, and display an alert with a message explaining the error. Otherwise, add the photo as an annotation to the map.

Build and Run again to see your working, finished application.

Final Product

Troubleshooting

If you push up your app and it crashes (heroku ps shows state crashed), check your logs with heroku logs to find out what went wrong.

Unable to build Xcode project


If your Xcode project uses ARC and fails to “Build and Run”, and the Issue Navigator reports errors such as “ARC forbids explicit message send of ‘release’” or “‘release’ is unavailable”, then your project includes source files that cannot be compiled with ARC.

To fix this, select your project file at the top of the Source Navigator, select your active target under the “Targets” section, and click the “Build Phases” tab in the editor screen. Expand the “Compile Sources” phase, and for each source file that does not support ARC, add the following compiler flag: -fno-objc-arc. You should now be able to successfully build your project.

Next steps

At this point you have a location-based photo sharing service, with an iPhone app communicating with a web API deployed to Heroku. From here, you can build out a fully-featured application, ready to be sold and distributed on the App Store.

Here are some ideas for next steps:

  • Add user accounts and / or authentication with Facebook, Twitter, or another OAuth provider.
  • Enable more complex geospatial capabilities using the RGeo gem and a Heroku Postgres database with PostGIS.
  • Implement social networking features like commenting, liking, or sharing to other social networks.