Freedom, the 2014 edition

Freedom… it ain’t so free.

I’ve been fighting for freedom my whole adult life: from a controlling father (great intentions, questionable methods), to a US immigration system that’s hell-bent on deporting you, to “daddy issues,” to work environments that depress. Most of my time was spent on gaining freedom: freedom to make my own decisions, to shape my personality instead of inheriting it, to do what I love and love whom I love, to have time to create and delight, to not worry about having to leave the country I call home - the US of A. This fight clinically depressed me, exhausted me, and, I’d say, reduced my IQ by at least 20-30 points. But at the end of last year I saw the light at the end of the tunnel, and on this Independence Day I celebrate it getting brighter and brighter, despite the thick haze still surrounding it.

This year is my year to step the fuck up. Here’s what happened recently and what’s coming up!

New City

San Francisco… just… WOW. I can’t put it any other way. Moving here was the best decision I ever made in my life. The only thing I miss is my best friend Sebastian, and I hope he moves here as soon as he’s done with his obligations in Nashville, TN.

New Eyes

On Thursday, June 26, 2014, I had LASIK surgery performed on my eyes. A week later, I have near-20/20 vision and barely remember what it was like not to see perfectly. It’s amazing. This July 4th I celebrate my independence from glasses and contact lenses!

New Job

I quit Substantial 3 days ago. They are awesome people and a very capable company, but as far as software engineering goes, my only passion now is iOS. So I got an even awesomer [sic] job where I’ll do iOS development all day, every day. I want to get great at it, and I want to have fun doing it. A raise to finally get my salary back to the level I had in Nashville, and a small chance of a windfall if the startup succeeds, don’t hurt either. This does mean I have to restart my green card process once again… but YOLO.

‘XXX’ Hobby

Yes, I love sex – doing it, watching it, shooting it. I don’t understand how something so natural, so instinctive, and so necessary to our happiness and survival gets so abused, oppressed, and disrespected in most cultures, religions, and communities around the world, even in this country. Sex is life, literally. The human body is a work of art. For a few years I’ve been deliberating about learning photography and shooting nudes. It starts this summer. I look forward to meeting beautiful women and exploring my creative side. I couldn’t be more excited!

No More Hiding

I am not a gay or bisexual person. But I am a simultaneously monogamous and polyamorous person. There have been girls that floored me so much, I wanted nothing and no one else (although I wouldn’t say no if she invited other girl(s) to join us for sex). At the same time, I can’t escape the overwhelming feeling that I can, in fact, be in love with two girls at the same time. It’s a very specific, real feeling; for example, three girls is a no-go. I don’t expect most people to understand this. But I also don’t feel like explaining myself or hiding this part of me anymore. This just is, and it’s as real as any other part of me. I am OK with it.

It’s funny how your world changes when you stop hiding. I’ve already met a girl who not only understands, but is similar. If you go to OK Cupid, for example, and search for “poly”, you’ll find that there’s a slice of the population that feels the same way. And, like me, they have trouble finding similarly minded people. Poly isn’t about sex and orgies; poly is about removing all jealousy, insecurities, and various cultural restrictions to the point where you feel nothing but purity toward another person. It’s an incredible level of trust; come what may, you won’t betray it. It’s so, so beautiful.

Building Things: Coffee Machine, Apps for Blind/Deaf, Robotics

This year is the year to build – cool iOS applications, an expanded circle of friends, a photo portfolio… but also stuff with Arduino and Raspberry Pi. I don’t know the latter yet, but I want to learn. I have ideas, such as a badass and affordable (read: mass-market) coffee machine. Keurig is cool and all, but omg, can we do better than that! I have the next 2-3 weeks all to myself, so I’ll buy some basic components and get the ball rolling.

New Friends

I’ve been working hard over the last month to meet new people. I used to be an ass people didn’t want to hang out with. But despite my rough edges, at heart I always loved people. There’s nothing I want more than to find people I can trust, love, and share an innocent connection with, people who won’t fuck you over for something as unimportant as money, power, or ego. If you are in the same camp, send me a tweet. :)

No Judging

As this wise man points out, you can’t be truly free if you judge people.

Burning Man

Someone please sell me a ticket! :D Do you want me to beg? I’ll beg. I just know it will be a life-changing experience. Can’t wait!

Damn, it feels good to be a gangsta. Happy Independence Day!

The LASIK experience

I don’t show it, but I am petrified. The papers I signed list “Death” as a potential outcome, twice. The numerous other potential side effects listed are arguably scarier than death, at least to a young single man in his prime. What if I am that 0.1%? Is correcting -1.5 diopters worth the risk? What if the laser errors out on me because, like all software, it probably has bugs? And what if…

“C’mon, dear, let’s get you in there, we’re ready,” I hear the nurse say.

I recognize my doctor; with him are 3 assistants I haven’t seen before. Everyone’s smiling. There’s a leather bed attached to a machine. I lie down, and anesthetic drops go into my eyes. The first ones feel like water and make me want to blink. But the drops act fast; I can see the second round coming down, but I can’t feel them hitting my eye. Like rain on a windshield – they land and wash away without seemingly touching me. Since I can’t feel them, I no longer want to blink as they hit. It’s a strange feeling that, in hindsight, makes sense.

My left eyelids are taped to my cheek and forehead next. Another drop comes down. The doctor brings the flap-creating laser above my eye and presses it down. I feel pressure around the eye socket; it’s uncomfortable, but nothing to write home about. “Try to focus on the dot, Alex, this only takes 14 seconds. Great. We have a lock. Here we go,” says an assistant. “10 seconds left. Your vision should start going out now, yes?” says another assistant. I murmur, “Yep.” “That’s normal. And… done. Now the doctor will lift the flap.”

The femtolaser is removed, and all I can see are blurry lights. I can tell the doctor is moving some small instrument over my eye to lift the flap, but I don’t feel it. My vision clears up a bit once the flap is lifted. “Try to focus on the green dot.” The bigger machine’s arm, the main laser, comes down, and I focus on the dot. “Just 4 seconds, Alex,” I hear someone say. “We have a lock,” says another voice. I faintly hear the machine kick in, and a second later I smell burnt… something. But the 4 seconds are over fast. “That’s it. We’re putting the flap back on.” More drops come down. “We’ll keep this eye open for a few more seconds, and then move on to the next one.” 10 seconds later the tape on my eyelids is removed, I can blink, and I’m asked to close my left eye and open the right one.

Another minute and the right eye is done. I’m asked to open both eyes and am helped off the bed. Expecting to be temporarily blind, I instead find myself able to see better than I did before I walked in. “Don’t be so surprised,” says my coordinator, who appeared right on time. “Your vision is fixed now; all that’s left is for the flap to heal.” Technology – isn’t it fucking amazing?

I’m given a Vicodin and a sleeping pill; the Vicodin will kick in right about the time the eye anesthetic drops start wearing off. Perfect. A friend drives me home, and I take a 4-hour nap (I’m told it’s most important to keep the eyes closed for the first 4-6 hours). I get up for an hour to snack (noticing that I can see just fine), then go back to bed for a full night’s sleep.

The next morning I don’t even feel like I had surgery, and my eyesight is nearly 20/20. My eye doctor says he can barely see where the flap was cut; it’s a faint line even under magnification. “It’s crazy what they do these days,” he murmurs.

In the next post I’ll share what I learned about the type of lasers you want to find for your own surgery and why it’s important to get 2nd and 3rd opinions, so stay tuned!

My 26-yo Friend is Dying, and You CAN Help. Please help.

Suzy is an amazing human being; she was a healthy, fit MIT grad when she was diagnosed with 2 cancers at age 26. She has had 6 chemos and is holding up well, but she’s now bankrupt and cannot afford the 10% down payment for her 7th. The down payment is $4300. This means she’ll be saddled with the other 90% as debt, BUT she may live to actually have a chance at paying it off.

PLEASE donate what you can here: http://www.gofundme.com/zsuzsastrong.

I’ve donated hundreds of dollars over the last few months. I recently saved up to buy the camera of my dreams, a Leica; I will be returning it this week so that I can donate more money. I won’t let my friend die. 

Suzy is a scientist, a lovely human being, and a highly inspiring friend who helped me set high goals back when I was still in Ukraine. I _know_ she will have a great, positive impact on this planet and on humanity in general if she is allowed to live. Let’s please save her. 

Thank you, immeasurably.

How to mass-rename files/change prefix in Xcode

Let’s say you created a BOOMWackadoo project. You’ve been working on it for a while, and now have a ton of files starting with “BOOMWackadoo,” as well as a ton of BOOMWackadoo references in code. One day a friend says, “Why don’t you just remove BOOM? Wackadoo seems cool enough.” If you realize your friend is right, here’s how you implement the advice:

  1. If you specified your project’s prefix as “BOOM” when you created the project, change it to an empty string as shown here. (The UI is the same in Xcode 5 as it was in Xcode 4.) Now close your project in Xcode.
  2. Time to mass-rename all your files on disk. Download the Automator template linked at the end of this article and run it. Make sure that in the “Rename Finder Items: Replace Text” workflow you select “full name” from the drop-down to the right of the “Find” text field. FYI, for some reason I had to run this Automator template twice to get all the files renamed.
  3. Now that you’ve renamed all the files, it’s time to change all the references in your .xcodeproj file! Open your Wackadoo.xcodeproj file in some sort of real editor, like Sublime Text 2. (Notice that the step above renamed the file from BOOMWackadoo.xcodeproj to Wackadoo.xcodeproj.) In the sidebar, right-click Wackadoo.xcodeproj and select ‘Find in Folder…’. Type BOOMWackadoo in the Find field and Wackadoo in the Replace field. Push the “Case Sensitive” button on the left (unless your file name casing isn’t consistent, in which case shame on you). Hit Replace, confirm, and make sure to save the changes! Now you can close Sublime.
  4. Finally, we need to change all the code references. Open your Wackadoo project in Xcode, and don’t try to build unless you just like the color red. Press Command + 3 to open the Find Navigator (or click the magnifying glass icon at the top of the Navigator sidebar). Tap the “Find” text above the first text box and select Replace > Text > Starting with, then type “BOOMWackadoo” (the old name) in the search box and hit Enter. This will search for all occurrences and enable the Replace All button. Type “Wackadoo” in the “With” text box, then hit Replace All (if it asks whether you want to enable snapshots and create one, what you pick is up to you).

Das is it! You should be able to build and run your “BOOM”-less project now.

Dungeon Highway

A simple, fun, 8-bit-style iOS and Android infinite-runner game by my new colleagues at Substantial! It’s a hobby project and is free, so go get it and let’s compete for high scores (my current high score in the release version is 33k in Hardcore mode).

So proud of them, and hope we make more games.

Click here to get Dungeon Highway from the iOS App Store.

Click here for Google’s Play Store.

I’ve been playing this for a few days now, and it’s the perfect game for when you need a few minutes away from work – so much fun when you barely survive time after time after time.

Testing a Camera App in iOS Simulator

As of today, the iOS Simulator doesn’t use your dev machine’s camera to simulate an iDevice’s camera – all you get is a black screen. I walked the slow route of testing my first iOS app, Progress, on actual devices instead of in the simulator for a long time before I realized there’s a simple workaround for this.

I also wanted to show how to start the camera asynchronously, in case you weren’t already doing that, and how to use AVFoundation for the camera (I was scared of it myself at first, but trust me – my only regret is not diving in sooner).

Let’s say we have a CameraViewController – how you get to it is up to you. I wrote my own container view controller that pushes/pops the camera VC and acts as its delegate. You may use a navigation controller, or display it modally, etc.

CameraViewController.h:

#import <UIKit/UIKit.h>

@protocol CameraDelegate <NSObject>

- (void)cameraStartedRunning;
- (void)didTakePhoto:(UIImage *)photo;

@end

@interface CameraViewController : UIViewController

@property (nonatomic, weak) id<CameraDelegate> delegate;

- (void)startCamera;
- (void)stopCamera;
- (BOOL)isCameraRunning;

@end

Your container or calling VC will implement the CameraDelegate protocol, spin up an instance of CameraViewController, register itself as the delegate on it, and call startCamera. Once the camera is ready to be shown, cameraStartedRunning will be called on the delegate controller on the main thread.
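
As a rough sketch of that wiring (ContainerViewController is a hypothetical name of mine – structure this however your app dictates):

#import "CameraViewController.h"

@interface ContainerViewController : UIViewController <CameraDelegate>

@property (nonatomic) CameraViewController *cameraVC;

@end

@implementation ContainerViewController

- (void)showCamera
{
	// Add the camera VC as a child, but keep its view invisible until the camera is live.
	self.cameraVC = [CameraViewController new];
	self.cameraVC.delegate = self;
	[self addChildViewController:self.cameraVC];
	self.cameraVC.view.frame = self.view.bounds;
	self.cameraVC.view.alpha = 0;
	[self.view addSubview:self.cameraVC.view];
	[self.cameraVC didMoveToParentViewController:self];
	[self.cameraVC startCamera];
}

- (void)cameraStartedRunning
{
	// Called on the main thread once the session is running; reveal the preview.
	// (See "Why So Asynchronous?" below for an animated version.)
	self.cameraVC.view.alpha = 1;
}

- (void)didTakePhoto:(UIImage *)photo
{
	// Hand the photo off to the next screen (e.g. an edit screen) here.
}

@end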

Starting the Camera

CameraViewController.m, part 1:

#import "CameraViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface CameraViewController ()

@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) UIView *cameraPreviewFeedView;
@property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

@property (nonatomic) UILabel *noCameraInSimulatorMessage;

@end

@implementation CameraViewController {
	BOOL _simulatorIsCameraRunning;
}

- (void)viewDidLoad
{
	[super viewDidLoad];

	self.noCameraInSimulatorMessage.hidden = !TARGET_IPHONE_SIMULATOR;
}

- (UILabel *)noCameraInSimulatorMessage
{
	if (!_noCameraInSimulatorMessage) {
		CGFloat labelWidth = self.view.bounds.size.width * 0.75f;
		CGFloat labelHeight = 60;
		_noCameraInSimulatorMessage = [[UILabel alloc] initWithFrame:CGRectMake(self.view.center.x - labelWidth/2.0f, self.view.bounds.size.height - 75 - labelHeight, labelWidth, labelHeight)];
		_noCameraInSimulatorMessage.numberOfLines = 0; // wrap
		_noCameraInSimulatorMessage.text = @"Sorry, no camera in the simulator... Crying allowed.";
		_noCameraInSimulatorMessage.backgroundColor = [UIColor clearColor];
		_noCameraInSimulatorMessage.hidden = YES;
		_noCameraInSimulatorMessage.textColor = [UIColor whiteColor];
		_noCameraInSimulatorMessage.shadowOffset = CGSizeMake(1, 1);
		_noCameraInSimulatorMessage.textAlignment = NSTextAlignmentCenter;
		[self.view addSubview:_noCameraInSimulatorMessage];
	}

	return _noCameraInSimulatorMessage;
}

- (void)startCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = YES;
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
		return;
	}

	if (!self.cameraPreviewFeedView) {
		self.cameraPreviewFeedView = [[UIView alloc] initWithFrame:self.view.bounds];
		self.cameraPreviewFeedView.center = self.view.center;
		self.cameraPreviewFeedView.backgroundColor = [UIColor clearColor];

		if (![self.view.subviews containsObject:self.cameraPreviewFeedView]) {
			[self.view addSubview:self.cameraPreviewFeedView];
		}
	}

	if (![self isCameraRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
			AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

			if (!self.captureSession) {

				self.captureSession = [AVCaptureSession new];
				self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

				NSError *error = nil;
				AVCaptureDeviceInput *newVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
				if (!newVideoInput) {
					// Handle the error appropriately.
					NSLog(@"ERROR: trying to open camera: %@", error);
				}

				AVCaptureStillImageOutput *newStillImageOutput = [AVCaptureStillImageOutput new];
				NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
				[newStillImageOutput setOutputSettings:outputSettings];

				if ([self.captureSession canAddInput:newVideoInput]) {
					[self.captureSession addInput:newVideoInput];
				}

				if ([self.captureSession canAddOutput:newStillImageOutput]) {
					[self.captureSession addOutput:newStillImageOutput];
					self.stillImageOutput = newStillImageOutput;
				}

				NSNotificationCenter *notificationCenter =
				[NSNotificationCenter defaultCenter];

				[notificationCenter addObserver: self
									   selector: @selector(onVideoError:)
										   name: AVCaptureSessionRuntimeErrorNotification
										 object: self.captureSession];

				if (!self.captureVideoPreviewLayer) {
					[NSThread executeOnMainThread: ^{
						self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
						self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
						self.captureVideoPreviewLayer.frame = self.cameraPreviewFeedView.bounds;
						[self.cameraPreviewFeedView.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];
					}];
				}
			}

			// this will block the thread until camera is started up
			[self.captureSession startRunning];

			[NSThread executeOnMainThread: ^{
				[self.delegate cameraStartedRunning];
			}];
		});
	} else {
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
	}
}

...

At the top, we have some properties:

AVCaptureSession is the main object that handles AVFoundation-based video/photo stuff. Whether you’re capturing video or photo, you use it.

AVCaptureStillImageOutput is an object you use to take a still photo of AVCaptureSession’s video feed.

AVCaptureVideoPreviewLayer is a subclass of CALayer that displays what the camera sees. Because it’s a CALayer, you can insert it into any UIView’s layer, including the controller’s own view. However, to separate concerns and for easier management, I don’t recommend using the VC’s view as the parent for that layer. Here, you can see that I use a separate cameraPreviewFeedView as the container for the AVCaptureVideoPreviewLayer.

There’s also a private _simulatorIsCameraRunning boolean. It’s there because, when we run the app in the simulator, we can’t use AVCaptureSession, and hence we can’t ask AVCaptureSession whether the camera is running. So we need to track whether the camera is on or off manually.

The noCameraInSimulatorMessage property is self-explanatory and entirely optional – it just lazy-instantiates a label and adds it to the main view. viewDidLoad simply shows or hides this label as appropriate. We check the TARGET_IPHONE_SIMULATOR macro to know when we’re in the simulator; Apple provides it in TargetConditionals.h, which gets pulled in with the standard framework headers.

With the variables in place, the logic is simple. When startCamera is called, we check whether we’re running in the simulator (TARGET_IPHONE_SIMULATOR will be YES) and, if so, set _simulatorIsCameraRunning to YES. We then tell the delegate that the camera started and exit the method. If we’re not in the simulator, however, we start up an AVFoundation-based camera asynchronously. Note that if a call to [self isCameraRunning] returns YES (the code for that method is in part 2 below), we simply tell the delegate that we’re already running without doing anything else.

Sidenote: if you’re way smarter than me, you may be saying, “Wait, NSThread doesn’t define an executeOnMainThread selector!” I award you geek points and explain that it’s just a category method I added to ensure a block executes on the main thread – you’ll see the code for it at the end of the post (tip of the hat to Marco Arment for that one).

Why So Asynchronous?

Since the camera can take a second or two to start up, and perception is reality, this is your opportunity to trick the user by making the app feel faster when it’s really not any faster at all. For example, I mentioned having a custom container controller that implements the CameraDelegate protocol. When the user wants to start the camera, that container VC adds the CameraViewController as a child controller but makes its view transparent (alpha = 0). To the user this is unnoticeable, as they keep seeing the previous VC’s view. The container VC then calls startCamera on the newly added camera VC and immediately starts performing the “I’m switching to camera” animation on the currently shown VC. Our animation is a bit elaborate, but you can do whatever fits your app. Even if you just fade out the current VC’s view over 0.7 seconds or so while the camera is loading, the app will feel faster because something is happening.

Then, whenever the container VC receives the cameraStartedRunning message from the camera VC, it quickly fades in the camera VC’s view that it kept transparent until now, and it also fades in the appropriate overlay controls (such as the shutter button) if they are separate from the camera VC’s view (in our app, they are). By “quickly fades in” I mean 0.25 or 0.3 seconds – but tune it as you see fit.

It’s pretty crazy how much faster even a simple fade-out/fade-in feels compared to just going to a black screen and waiting for the camera to spin up, even though the time it takes for the camera to open is about the same in both cases. This is true even though our fade-in animation technically delays full camera appearance by a couple hundred milliseconds. Perception is reality!
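
To make the trick concrete, here’s a minimal sketch building on the hypothetical container VC from earlier; previousVC and overlayControls are stand-in names for whatever your app uses:

- (void)switchToCamera
{
	// The camera VC's view was added earlier at alpha 0; start the camera now.
	[self.cameraVC startCamera];

	// Fade out the current screen while the camera spins up in the background.
	[UIView animateWithDuration:0.7 animations:^{
		self.previousVC.view.alpha = 0;
	}];
}

// An animated take on cameraStartedRunning from the earlier sketch.
- (void)cameraStartedRunning
{
	// Quickly reveal the preview and the overlay controls once the session is live.
	[UIView animateWithDuration:0.3 animations:^{
		self.cameraVC.view.alpha = 1;
		self.overlayControls.alpha = 1;
	}];
}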

Alright, now to part 2 of CameraViewController.m:

...

- (void)stopCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = NO;
		return;
	}

	if (self.captureSession && [self.captureSession isRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^ {
			[self.captureSession stopRunning];
		});
	}
}

- (BOOL)isCameraRunning
{
	if (TARGET_IPHONE_SIMULATOR) return _simulatorIsCameraRunning;

	if (!self.captureSession) return NO;

	return self.captureSession.isRunning;
}

- (void)onVideoError:(NSNotification *)notification
{
	NSLog(@"Video error: %@", notification.userInfo[AVCaptureSessionErrorKey]);
}

- (void)takePhoto
{
	dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
		if (TARGET_IPHONE_SIMULATOR) {
			[self.delegate didTakePhoto: [UIImage imageNamed:@"Simulator_OriginalPhoto@2x.jpg"]];
			return;
		}

		AVCaptureConnection *videoConnection = nil;
		for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
			for (AVCaptureInputPort *port in [connection inputPorts]) {
				if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
					videoConnection = connection;
					break;
				}
			}
			if (videoConnection) {
				break;
			}
		}

		[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
														   completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
		 {
			 if (error || !imageSampleBuffer) {
				 // Handle the error appropriately.
				 NSLog(@"ERROR: capturing still image: %@", error);
				 return;
			 }

			 NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

			 [self.delegate didTakePhoto: [UIImage imageWithData: imageData]];
		 }];
	});
}
@end

stopCamera does nothing but set our ‘fake’ boolean if we’re in the simulator, and actually stops the camera when we’re on a device. Notice that you don’t see it called anywhere inside this controller. Why? If your case isn’t like mine, you definitely SHOULD call it from within viewWillDisappear in CameraViewController (and perhaps from within dealloc as well), as sketched below. Otherwise, the camera will keep running in the background until iOS kills it (and I don’t know what the rules are for that), draining the battery and turning your users into an angry mob.
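
A minimal sketch for that simple case (this goes in CameraViewController.m):

- (void)viewWillDisappear:(BOOL)animated
{
	[super viewWillDisappear:animated];

	// Shut the session down as soon as the camera screen goes away.
	[self stopCamera];
}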

If your case is like mine, however, your users will see some sort of edit screen after taking a picture, with an option to retake it, and they may need to go back and forth a lot. To make sure they don’t have to wait for the camera to start up every time, I keep the camera running for a while after each shot (15 seconds in the code below). This means that, every time the user takes a picture, I have to set up (and reset) a timer to kill the camera afterwards. Where you put this timer code will depend on your app architecture, but do test well to make sure it always executes, so that AVCaptureSession isn’t left running in the background for long.

Back to the code… The isCameraRunning method is self-explanatory, and here we finally see our fake camera status boolean come in handy.

If AVCaptureSession blows up for some reason (device out of memory, internal Apple code error, etc.), AVCaptureSessionRuntimeErrorNotification will be sent and onVideoError: will be called. Notice that we subscribe to that notification when setting up self.captureSession back inside startCamera.

Last but not least is takePhoto. I wrapped it in a dispatch call, but frankly I’m not sure it helps any, since Apple seems to block all threads when it extracts a photo. But if that changes in the future, I’ll be ready. This is where you’ll find the main workaround code that makes the camera work in the simulator: we simply return a canned picture to the delegate. Well, I’m returning the same picture every time in this code, but you can imagine having a set of pictures from which you select a random one each time, as sketched below.
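
A minimal sketch of that idea (the extra file names are hypothetical – bundle whatever sample photos you like, then call [self randomSimulatorPhoto] in the simulator branch of takePhoto):

- (UIImage *)randomSimulatorPhoto
{
	// Pick a random bundled sample photo for simulator runs.
	NSArray *samples = @[ @"Simulator_OriginalPhoto@2x.jpg",
						  @"Simulator_SecondPhoto@2x.jpg",
						  @"Simulator_ThirdPhoto@2x.jpg" ];
	NSUInteger index = arc4random_uniform((uint32_t)samples.count);
	return [UIImage imageNamed:samples[index]];
}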

Wrap-Up

I’m sure you have more questions about AVFoundation, such as “what is AVCaptureConnection or AVMediaTypeVideo, and why do I see the word ‘video’ so often when I only care about still images?”, but this is just how it’s done, and it’s outside the scope of this post to explain why. Just copy, paste, and enjoy =) And if it breaks, don’t blame me.

Now, what calls takePhoto? Your delegate view controller, most likely. The AVFoundation camera is bare-bones – it has no shutter button, etc. So you’ll have to create your own “overlay” controls view that shows a shutter button, a retake button, maybe a flash on/off button, and so on. In my app, the custom container controller I mentioned before is also the delegate for such an overlay view. When the user taps the shutter button on the overlay view to take a photo, the overlay view sends a “plzTakePhotoMeow” message (wording is approximate) to its delegate (the custom container VC I keep talking about), and that delegate then calls takePhoto on the CameraViewController instance. In turn, when the camera VC is done getting an image from AVCaptureSession, it returns the image to its delegate (the same custom container VC). The custom container VC can then remove the camera VC from the view hierarchy and instantiate the next VC, such as EditPhotoVC, passing it the photo from the camera. A rough sketch of that wiring is below.
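
Here’s that overlay wiring sketched out; CameraOverlayDelegate, overlayDidTapShutter, and shutterButtonTapped: are hypothetical names of mine, not a real API:

@protocol CameraOverlayDelegate <NSObject>
- (void)overlayDidTapShutter; // the polite version of "plzTakePhotoMeow"
@end

@interface CameraOverlayView : UIView
@property (nonatomic, weak) id<CameraOverlayDelegate> delegate;
@end

@implementation CameraOverlayView

// Wired to the shutter button via addTarget:action:forControlEvents:.
- (void)shutterButtonTapped:(UIButton *)sender
{
	// The container VC implements this and forwards it to the camera VC's takePhoto.
	[self.delegate overlayDidTapShutter];
}

@end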

Ok, I promised code for that NSThread category, so here it is:

@interface NSThread (Helpers)

+ (void)executeOnMainThread:(void (^)(void))block;

@end

@implementation NSThread (Helpers)

+ (void)executeOnMainThread:(void (^)(void))block
{
	if (!block) return;

	if ([[NSThread currentThread] isMainThread]) {
		block();
	} else {
		dispatch_sync(dispatch_get_main_queue(), ^ {
			block();
		});
	}
}

@end

Finally, the code for the timer to stop the camera (if you go this route) is below. First we define a few properties on the controller that is the delegate to CameraViewController (again, in my case it’s the container VC):

@property (atomic) BOOL openingCameraView;
@property (nonatomic) NSTimer *stopCameraTimer;

And then, in didTakePhoto:, canceledTakingPhoto: (if your overlay controls UI has that button), etc., spin up the timer:

[NSThread executeOnMainThread:^{
	// display the next VC here
	...

	// start a timer to shut off the camera
	if (!self.stopCameraTimer || !self.stopCameraTimer.isValid) {
		self.stopCameraTimer = [NSTimer scheduledTimerWithTimeInterval:15 target:self selector:@selector(stopCameraAfterTimer:) userInfo:nil repeats:NO];
	}
}];

The actual method the timer references is below; it just ensures we don’t stop the camera while the camera VC is shown (that would be an oopsie) or while we’re in the process of showing it (self.openingCameraView is set to YES at the beginning of the method responsible for adding the camera VC to the view hierarchy):

- (void)stopCameraAfterTimer:(NSTimer *)timer
{
	if (!self.openingCameraView && self.currentlyShownVC != self.cameraVC) {
		[self.cameraVC stopCamera];
	}
}

I hope this helps! If I made errors or you have questions, please contact me on Twitter.

Happy coding in 2014!