Dungeon Highway

A simple, fun, 8-bit-style iOS and Android infinite-runner game by my new colleagues at Substantial! It’s a free hobby project, so go get it and let’s compete for high scores (my current high score in the release version is 33k in Hardcore mode).

So proud of them, and hope we make more games.

Get Dungeon Highway from the iOS App Store.

Or grab it from the Google Play Store.

I’ve been playing this for a few days now and it’s the perfect game for when you need a few minutes away from work – so much fun when you barely survive time after time after time.

Testing a Camera App in iOS Simulator

As of today, the iOS Simulator doesn’t use your dev machine’s camera to simulate an iDevice’s camera – all you get is a black screen. I took the slow route of testing my first iOS app, Progress, on actual devices for a long time before I realized there’s a simple work-around for this.

I also wanted to show how to start the camera asynchronously in case you weren’t already doing that, and how to use AVFoundation for the camera (I was scared of it myself at first, but trust me – I don’t regret taking the plunge).

Let’s say we have a CameraViewController – how you get to it is up to you. I wrote my own container view controller that pushes/pops the camera VC and acts as its delegate. You may use a navigation controller, or display it modally, etc.

CameraViewController.h:

#import <UIKit/UIKit.h>

@protocol CameraDelegate <NSObject>

- (void)cameraStartedRunning;
- (void)didTakePhoto:(UIImage *)photo;

@end

@interface CameraViewController : UIViewController

@property (nonatomic, weak) id<CameraDelegate> delegate;

- (void)startCamera;
- (void)stopCamera;
- (BOOL)isCameraRunning;
- (void)takePhoto;

@end

Your container or calling VC will implement the CameraDelegate protocol, spin up an instance of CameraViewController, register itself as the delegate on it, and call startCamera. Once the camera is ready to be shown, cameraStartedRunning will be called on the delegate controller on the main thread. 
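To make the wiring concrete, here is a minimal sketch of the calling side. The ContainerViewController name and showCamera method are made up for illustration – substitute whatever presentation approach you use:

// Hypothetical calling side – any VC that adopts CameraDelegate will do.
#import "CameraViewController.h"

@interface ContainerViewController : UIViewController <CameraDelegate>
@property (nonatomic) CameraViewController *cameraVC;
@end

@implementation ContainerViewController

- (void)showCamera
{
	if (!self.cameraVC) {
		self.cameraVC = [CameraViewController new];
		self.cameraVC.delegate = self;
	}
	[self.cameraVC startCamera];
}

- (void)cameraStartedRunning
{
	// Called on the main thread once the preview feed is live –
	// this is the moment to reveal the camera UI.
}

- (void)didTakePhoto:(UIImage *)photo
{
	// Hand the photo off to the next screen.
}

@end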

Starting the Camera

CameraViewController.m, part 1:

#import "CameraViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface CameraViewController ()

@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) UIView *cameraPreviewFeedView;
@property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

@property (nonatomic) UILabel *noCameraInSimulatorMessage;

@end

@implementation CameraViewController {
	BOOL _simulatorIsCameraRunning;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

	self.noCameraInSimulatorMessage.hidden = !TARGET_IPHONE_SIMULATOR;
}

- (UILabel *)noCameraInSimulatorMessage
{
	if (!_noCameraInSimulatorMessage) {
		CGFloat labelWidth = self.view.bounds.size.width * 0.75f;
		CGFloat labelHeight = 60;
		_noCameraInSimulatorMessage = [[UILabel alloc] initWithFrame:CGRectMake(self.view.center.x - labelWidth/2.0f, self.view.bounds.size.height - 75 - labelHeight, labelWidth, labelHeight)];
		_noCameraInSimulatorMessage.numberOfLines = 0; // wrap
		_noCameraInSimulatorMessage.text = @"Sorry, no camera in the simulator... Crying allowed.";
		_noCameraInSimulatorMessage.backgroundColor = [UIColor clearColor];
		_noCameraInSimulatorMessage.hidden = YES;
		_noCameraInSimulatorMessage.textColor = [UIColor whiteColor];
		_noCameraInSimulatorMessage.shadowOffset = CGSizeMake(1, 1);
		_noCameraInSimulatorMessage.textAlignment = NSTextAlignmentCenter;
		[self.view addSubview:_noCameraInSimulatorMessage];
	}

	return _noCameraInSimulatorMessage;
}

- (void)startCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = YES;
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
		return;
	}

	if (!self.cameraPreviewFeedView) {
		self.cameraPreviewFeedView = [[UIView alloc] initWithFrame:self.view.bounds];
		self.cameraPreviewFeedView.center = self.view.center;
		self.cameraPreviewFeedView.backgroundColor = [UIColor clearColor];
		[self.view addSubview:self.cameraPreviewFeedView];
	}

	if (![self isCameraRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
			AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

			if (!self.captureSession) {

				self.captureSession = [AVCaptureSession new];
				self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

				NSError *error = nil;
				AVCaptureDeviceInput *newVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
				if (!newVideoInput) {
					// Couldn't open the camera – bail out rather than run an input-less session.
					NSLog(@"ERROR: trying to open camera: %@", error);
					return;
				}

				AVCaptureStillImageOutput *newStillImageOutput = [AVCaptureStillImageOutput new];
				NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
				[newStillImageOutput setOutputSettings:outputSettings];

				if ([self.captureSession canAddInput:newVideoInput]) {
					[self.captureSession addInput:newVideoInput];
				}

				if ([self.captureSession canAddOutput:newStillImageOutput]) {
					[self.captureSession addOutput:newStillImageOutput];
					self.stillImageOutput = newStillImageOutput;
				}

				NSNotificationCenter *notificationCenter =
				[NSNotificationCenter defaultCenter];

				[notificationCenter addObserver: self
									   selector: @selector(onVideoError:)
										   name: AVCaptureSessionRuntimeErrorNotification
										 object: self.captureSession];

				if (!self.captureVideoPreviewLayer) {
					[NSThread executeOnMainThread: ^{
						self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
						self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
						self.captureVideoPreviewLayer.frame = self.cameraPreviewFeedView.bounds;
						[self.cameraPreviewFeedView.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];
					}];
				}
			}

			// this will block the thread until camera is started up
			[self.captureSession startRunning];

			[NSThread executeOnMainThread: ^{
				[self.delegate cameraStartedRunning];
			}];
		});
	} else {
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
	}
}

...

At the top, we have some properties:

AVCaptureSession is the main object that handles AVFoundation-based video/photo stuff. Whether you’re capturing video or photo, you use it.

AVCaptureStillImageOutput is an object you use to take a still photo of AVCaptureSession’s video feed.

AVCaptureVideoPreviewLayer is a subclass of CALayer that displays what the camera sees. Because it’s a CALayer, you can insert it into any UIView, including the controller’s own view. However, to separate concerns and for easier management, I don’t recommend using the VC’s view as the parent for that layer. Here, you can see that I use a separate cameraPreviewFeedView as the container for the AVCaptureVideoPreviewLayer.

There’s also a private _simulatorIsCameraRunning boolean. It’s here because when we run the app in the simulator, we can’t use AVCaptureSession, and hence we can’t ask it whether the camera is running – so we have to track whether the camera is on or off manually.

The noCameraInSimulatorMessage property is self-explanatory and entirely optional – it just lazy-instantiates a label and adds it to the main view. viewDidLoad simply shows/hides this label as appropriate. We check TARGET_IPHONE_SIMULATOR to know when we’re in the simulator – it’s a macro Apple provides in TargetConditionals.h, which is pulled in for you when you import UIKit (via Foundation).

With the variables in place, the logic is simple. When startCamera is called, we check if we’re running in a simulator (TARGET_IPHONE_SIMULATOR will be YES) and, if so, set _simulatorIsCameraRunning to YES. We then tell the delegate that the camera started and exit out of the method. If we’re not in the simulator, however, then we start up an AVFoundation-based camera asynchronously. Note that if a call to [self isCameraRunning] returns YES (code for that method is in part 2 below), we simply tell the delegate that we’re already running without doing anything else.

Sidenote: if you’re way smarter than me, you may be saying, “Wait, NSThread doesn’t define executeOnMainThread selector!” I award you geek points and explain that it’s just a category method I added to ensure a block executes on the main thread – you’ll see code for it at the end of the post (tip of the hat to Marco Arment for that one).

Why So Asynchronous?

Since the camera can take a second or two to start up, and perception is reality, this is your opportunity to trick the user by making the app feel faster when it’s really not any faster at all. For example, I mentioned having a custom container controller that implements the CameraDelegate protocol. When the user wants to start the camera, that container VC adds the CameraViewController as a child controller but makes its view transparent (alpha = 0). To the user this is unnoticeable, as they keep seeing the previous VC’s view. The container VC then calls startCamera on the newly-added camera VC and immediately starts performing the “I’m switching to camera” animation on the currently shown VC. Our animation is a bit elaborate, but you can do whatever fits your app. Even if you just fade out the current VC’s view over 0.7 seconds or so while the camera is loading, the app will feel faster because something is happening.

Then, whenever the container VC receives the cameraStartedRunning message from the camera VC, it quickly fades in the camera VC’s view that it had kept transparent until now, and it also fades in the appropriate overlay controls (such as the shutter button) if they are separate from the camera VC’s view (in our app, they are). By “quickly fades in” I mean 0.25 or 0.3 seconds – but tune it as you see fit.
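Here’s a rough sketch of that dance from the container VC’s side – a minimal illustration, assuming hypothetical currentVC and overlayControlsView properties (your structure and timings will differ):

// Hypothetical container VC code for the "feels faster" trick.
- (void)switchToCamera
{
	[self addChildViewController:self.cameraVC];
	self.cameraVC.view.alpha = 0; // invisible while the camera spins up
	[self.view addSubview:self.cameraVC.view];
	[self.cameraVC didMoveToParentViewController:self];

	[self.cameraVC startCamera];

	// Meanwhile, distract the user with the outgoing animation.
	[UIView animateWithDuration:0.7 animations: ^{
		self.currentVC.view.alpha = 0;
	}];
}

- (void)cameraStartedRunning
{
	// The feed is live – quickly reveal it along with the overlay controls.
	[UIView animateWithDuration:0.3 animations: ^{
		self.cameraVC.view.alpha = 1;
		self.overlayControlsView.alpha = 1;
	}];
}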

It’s pretty crazy how much faster even a simple fade out/fade in feels compared to just going to a black screen and waiting for the camera to spin up, even though the time it takes for the camera to open is about the same in both cases. This is true even considering the fact that our fade in animation technically delays full camera appearance by a couple hundred milliseconds. Perception is reality!

Alright, now to part 2 of CameraViewController.m:

...

- (void)stopCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = NO;
		return;
	}

	if (self.captureSession && [self.captureSession isRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^ {
			[self.captureSession stopRunning];
		});
	}
}

- (BOOL)isCameraRunning
{
	if (TARGET_IPHONE_SIMULATOR) return _simulatorIsCameraRunning;

	if (!self.captureSession) return NO;

	return self.captureSession.isRunning;
}

- (void)onVideoError:(NSNotification *)notification
{
	NSLog(@"Video error: %@", notification.userInfo[AVCaptureSessionErrorKey]);
}

- (void)takePhoto
{
	dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
		if (TARGET_IPHONE_SIMULATOR) {
			// The simulator work-around: hand the delegate a bundled sample photo.
			[NSThread executeOnMainThread: ^{
				[self.delegate didTakePhoto: [UIImage imageNamed:@"Simulator_OriginalPhoto@2x.jpg"]];
			}];
			return;
		}

		AVCaptureConnection *videoConnection = nil;
		for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
			for (AVCaptureInputPort *port in [connection inputPorts]) {
				if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
					videoConnection = connection;
					break;
				}
			}
			if (videoConnection) {
				break;
			}
		}

		[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
														   completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
		 {
			 if (error || !imageSampleBuffer) {
				 NSLog(@"ERROR: taking photo: %@", error);
				 return;
			 }

			 NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

			 [NSThread executeOnMainThread: ^{
				 [self.delegate didTakePhoto: [UIImage imageWithData: imageData]];
			 }];
		 }];
	});
}
@end

stopCamera does nothing but set our ‘fake’ boolean if we’re in the simulator, and actually stops the camera when we’re on a device. Notice that you don’t see it called anywhere inside this controller. Why? If your case isn’t like mine, you definitely SHOULD call this from within viewWillDisappear in CameraViewController (and perhaps from within dealloc as well). Otherwise, the camera will keep running in the background until iOS kills it (and I don’t know what the rules are for that), draining the battery and turning your users into an angry mob.
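If you go the viewWillDisappear route, the override is a two-liner:

- (void)viewWillDisappear:(BOOL)animated
{
	[super viewWillDisappear:animated];
	[self stopCamera];
}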

If your case is like mine, however, your users will see some sort of edit screen after taking a picture, with an option to retake it, and they may need to go back and forth a lot. To make sure they don’t have to wait for the camera to start up every time, I keep the camera running for 10 seconds. This means that, every time the user takes a picture, I have to set up (and reset) a timer to kill the camera after 10 seconds. Where you put this timer code will depend on your app architecture, but do test well to make sure it always executes so that AVCaptureSession isn’t left running in the background for long.

Back to the code… The isCameraRunning method is self-explanatory, and here we finally see our fake camera-status boolean come in handy.

If AVCaptureSession blows up for some reason (device out of memory, internal Apple code error, etc.), the AVCaptureSessionRuntimeErrorNotification is posted and onVideoError: gets called. Notice we subscribed to it when setting up self.captureSession back inside startCamera.

Last but not least is takePhoto. I wrapped it in a dispatch call, but frankly I am not sure it helps any, since Apple seems to block all threads while it extracts a photo. But if that changes in the future, I’ll be ready. This is where you’ll find the main work-around code that makes the camera work in the simulator: we simply return a canned picture to the delegate. Well, I’m returning the same picture every time in this code, but you can imagine having a set of pictures from which you select a random one each time.

Wrap-Up

I’m sure you have more questions about AVFoundation such as “what is AVCaptureConnection or AVMediaTypeVideo and why do I see the word ‘video’ so often when I only care about still images?”, but this is just how it’s done and it’s outside of the scope of this post to describe why. Just copy, paste, and enjoy =) And if it breaks, don’t blame me.

Now, what calls takePhoto? Your delegate view controller, most likely. An AVFoundation camera is barebones – it has no shutter button or other UI. So you’ll have to create your own “overlay” controls view with a shutter button, a retake button, maybe a flash on/off button, etc. In my app, the custom container controller I mentioned before is also the delegate for such an overlay view. When the user taps the shutter button to take a photo, the overlay view sends a “plzTakePhotoMeow” message (wording is approximate) to its delegate (the custom container VC I keep talking about), which then calls takePhoto on the CameraViewController instance. In turn, when the camera VC is done getting an image from AVCaptureSession, it returns the image to the delegate (the same custom container VC). The container VC can then remove the camera VC from the view hierarchy and instantiate the next VC, such as an EditPhotoVC, passing it the photo from the camera.
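In code, the overlay wiring can be as simple as this sketch (the protocol, method names, and helpers are hypothetical):

// Hypothetical overlay protocol; the container VC adopts it.
@protocol CameraOverlayDelegate <NSObject>
- (void)overlayShutterTapped; // a.k.a. "plzTakePhotoMeow"
@end

// In the container VC:
- (void)overlayShutterTapped
{
	[self.cameraVC takePhoto];
}

- (void)didTakePhoto:(UIImage *)photo
{
	[self removeCameraVC];                // hypothetical helper
	[self showEditScreenWithPhoto:photo]; // hypothetical helper
}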

Ok, I promised code for that NSThread category, so here it is:

@implementation NSThread (Helpers)

+ (void)executeOnMainThread:(void (^)())block
{
	if (!block) return;

	if ([[NSThread currentThread] isMainThread]) {
		block();
	} else {
		dispatch_sync(dispatch_get_main_queue(), ^ {
			block();
		});
	}
}

@end

Finally, the code for the timer to stop the camera (if you go this route) is below. First we define a few properties on the controller that is the delegate to CameraViewController (again, in my case it’s the container VC):

@property (atomic) BOOL openingCameraView;
@property (nonatomic) NSTimer *stopCameraTimer;

And then, in didTakePhoto:, canceledTakingPhoto: (if your overlay controls UI has that button), etc., spin up the timer:

[NSThread executeOnMainThread:^{
		// display the next VC here
		...

		// start a timer to shut off camera
		if (!self.stopCameraTimer || !self.stopCameraTimer.isValid) {
			self.stopCameraTimer = [NSTimer scheduledTimerWithTimeInterval:10 target:self selector:@selector(stopCameraAfterTimer:) userInfo:nil repeats:NO];
		}
	}];

The actual method the timer fires is below; it just ensures we don’t stop the camera while the camera VC is shown (that would be an oopsie) or while we’re in the process of showing it (self.openingCameraView is set to YES at the beginning of the method responsible for adding the camera VC to the view hierarchy):

- (void)stopCameraAfterTimer:(NSTimer *)timer
{
	if (!self.openingCameraView && !(self.currentlyShownVC == self.cameraVC)) {
		[self.cameraVC stopCamera];
	}
}

I hope this helps! If I made errors or you have questions, please contact me on Twitter.

Happy coding in 2014!

Progress – iPhone 4 Performance (Programming)

Merry Christmas, everyone!

I’ve been working on Progress, my first iOS app, on and off since February. I did 99% of the coding so far, Edward Sanchez did all of the graphic design, our friend Samuel Iglesias helped with some UI tricks, and all 3 of us collaborated on UX. Today, on my last full day of version 1.0 development, I implemented the last major piece of “polish” code, one that I’m particularly proud of, and it’s focused on performance.

When I first ran the app on an iPhone 4, I could barely scrub through 4 full-screen pictures per second. But today, check this out (Vimeo link):

iPhone 4 Scrubbing Performance

Yes, that is an iPhone 4 blazing through full-screen pictures without any trickery (no GPUImageView yet or anything, just your normal UIImageView). To get here, I had to do three things:

1. Store each photo in 3 different versions:

  • “Full-size” version at 1936×2592 (the iPhone 4 camera resolution, the lowest common denominator), compressed to 40% (as in, the iOS SDK’s 0.4 on a scale of 0 to 1 – this is really much better compression than JPEG’s 40%, as you will be hard pressed to notice any artifacts whatsoever). Between 150 and 300KB per photo.
  • “Device-optimized”, i.e., whatever your device screen resolution is (640×960 for 3.5-inch screens, 640×1136 for 4-inch screens), compressed at 45% from the full-size version above. Due to the double compression, 45% (0.45) was as low as I could go. About 75–100KB per photo.
  • “Low quality”, the same resolution as device-optimized but at a 25% compression level. This is about 40–50KB per photo. I can probably compress this further, but right now 25% works.

2. Use NSOperationQueues when loading the photos. I normally use GCD, but queues are perfect here. I start loading a device-optimized version for photo #1, but if photo #2 is requested while the device-optimized photo-loading queue still has an unfinished operation in it, I cancel that operation and start a low-quality photo-loading operation instead. At the same time, I throw a new NSOperation to load a device-optimized version (this time for photo #2), but with the low-quality photo operation as the dependency and with the same completion block. So as soon as the low-quality photo is loaded, the device-optimized version begins to load and then replaces the low-quality version. Or, the operation loading device-optimized version is cancelled once again if the user jumps to the next photo while all this is going on.

This is really just for devices like iPhone 4 and, to a lesser degree, iPod Touch 5G and iPhone 4S, and even there it happens so fast that the longest a user sees the low-quality version is maybe for 1/10th of a second – they don’t even get to notice the heavy compression on the low-quality photo. But they do get to notice the general shape/features, which is what the app is about, so for us it’s important to show _something_ instead of waiting for the device-optimized version.
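Here’s a minimal sketch of that cancel-and-chain logic, simplified to always load the low-quality version first (the real logic only falls back to low quality when the device-optimized load can’t keep up). The queue and imageView properties and the two load helpers are placeholders, not the app’s actual code:

// Hypothetical sketch: low-quality loads first, device-optimized chains after it.
- (void)showPhotoAtIndex:(NSUInteger)index
{
	// A newer request supersedes any in-flight device-optimized load.
	[self.deviceOptimizedQueue cancelAllOperations];

	// Small and fast – gets *something* on screen immediately.
	NSOperation *lowQualityOp = [NSBlockOperation blockOperationWithBlock: ^{
		UIImage *image = [self loadLowQualityPhotoAtIndex:index];
		dispatch_async(dispatch_get_main_queue(), ^{
			self.imageView.image = image;
		});
	}];

	// Replaces the low-quality version as soon as it's done.
	NSOperation *deviceOptimizedOp = [NSBlockOperation blockOperationWithBlock: ^{
		UIImage *image = [self loadDeviceOptimizedPhotoAtIndex:index];
		dispatch_async(dispatch_get_main_queue(), ^{
			self.imageView.image = image;
		});
	}];
	[deviceOptimizedOp addDependency:lowQualityOp];

	[self.lowQualityQueue addOperation:lowQualityOp];
	[self.deviceOptimizedQueue addOperation:deviceOptimizedOp];
}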

3. The piece I added today: pre-caching. I remember reading @mattt’s article on NSCache some months ago, where he talked about the mythical totalCostLimit property of NSCache. Mythical it is indeed, but Apple does use the size of an image in bytes as an example of what the cost can mean. In this app, I have 3 NSCache instances, one for each of the photo versions listed above, each with a different totalCostLimit. The NSCache instance for device-optimized photos has a totalCostLimit of 5242880, for example, which is 5MB written out as a number of bytes. So when the app loads, I launch a background task to pre-cache as many device-optimized photos as I can before I hit that limit. With current average photo sizes, that’s about 50-60 pictures. I take note of all the photos I wasn’t able to cache, and then run another pass to cache low-quality versions of those photos (a 2MB limit for that cache), which is another 30-40 pictures or so. The iPhone 4 manages to read and cache at least 15-20 photos per second, so by the time an average user tries scrubbing through their photos, chances are most of those photos will already be cached in either device-optimized or low-quality form.
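A minimal sketch of that pre-caching pass, assuming a hypothetical deviceOptimizedCache property and an array of file paths (NSCache evicts on its own once the cost limit is hit; the manual byte count just mirrors my “take note of what didn’t fit” bookkeeping):

// Hypothetical pre-caching pass built around NSCache's totalCostLimit.
- (void)preCachePhotosAtPaths:(NSArray *)deviceOptimizedPhotoPaths
{
	NSCache *cache = self.deviceOptimizedCache;
	cache.totalCostLimit = 5242880; // 5MB, written out as a number of bytes

	dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
		NSUInteger bytesCached = 0;
		NSMutableArray *leftovers = [NSMutableArray array];

		for (NSString *path in deviceOptimizedPhotoPaths) {
			NSData *imageData = [NSData dataWithContentsOfFile:path];
			if (bytesCached + imageData.length > cache.totalCostLimit) {
				// Didn't fit under the limit – note it for the low-quality pass.
				[leftovers addObject:path];
				continue;
			}
			[cache setObject:[UIImage imageWithData:imageData]
					  forKey:path
						cost:imageData.length];
			bytesCached += imageData.length;
		}

		// ...run the same pass over leftovers against the 2MB low-quality cache...
	});
}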

The power of the last item is what the video above really demonstrates – between the time the app loaded and the time I started scrubbing, all of the photos were already cached. That’s what the 76/76 number in the top-right “debug” area means – there were 76 total requests from the scrubber to show a photo, and for all 76 of them the app was able to show a device-optimized photo. If it hadn’t been able to keep up, the third column (the empty black area) would show the number of low-quality photos it had to fetch and show temporarily before catching up and replacing them with device-optimized versions (which was the case before today). Success!

Outside of the general performance tricks that I still need to work on, such as minimizing alpha-blending, there’s another caching-related task I want to do before I can move on with peace of mind – forward-caching. For example, if we’re looking at photo #1 and start scrubbing to the right, going to #2, etc., and photos 2 through 9 are already cached, I need to start caching #10, #11, and so forth. When the user stops on a photo, I need to cache 5 photos or so on each side. Sure, they’ll still catch up with me eventually if they scrub fast enough, but it will be a smooth experience for them until then. And if they slow down for just a moment, I’ll again be X cached photos ahead of them, which at normal scrubbing speed means an “infinite” smooth scrubbing experience.

Forward-caching isn’t essential for version 1.0, though – I’m content to ship with what we have.

Tesla Motors Facts vs Concerns: Fight!

I was so outraged at the lack of fact-checking in Car & Driver’s December 2013 article on Tesla Motors that it spurred me to write this post. Having an opinion is absolutely fine; presenting facts that are quite literally opposite of what is known to be true, however, is not fine – especially in print. The majority of this post isn’t about that article, it’s about various concerns I’ve heard and seen elsewhere; only the first section addresses Car & Driver’s article.

I’ll write an opinion piece on Tesla later – how I see their strategy, where they are going, etc. I do own some Tesla shares and feel it would lack integrity for me to mix facts with my (supposedly biased) opinion.

The company lives off of government subsidies

Car & Driver’s Aaron Robinson stated that “…the company is going to have to earn more dollars making cars that currently don’t generate profits on their own [without government credits]” and “Tesla will soon have to swim on its own” (again implying it’s only afloat with government’s help).

Facts: Mr. Robinson is referring to ZEV (aka “clean air”) credits, which Tesla receives for each EV sold and then re-sells to gasoline car companies, and to the $7,500 federal + various state discounts per EV. In its latest quarterly earnings report, Tesla indicated a profit margin of 21% per car excluding ZEV credits, and 22% with the credits (so only 1 percentage point of the margin is now attributable to any government credits). In a few months, by the end of 2013, Tesla expects to achieve a 25% profit margin per vehicle even with no profit from ZEV credits. As for the federal $7,500 credit and the related state credits, Tesla Motors does not receive any money from those – the money goes directly to the consumer. And, of course, Tesla paid off all the debt it owed the government back in May, nine years ahead of schedule. In comparison, GM’s gross profit margin is 13.3%, and Ford’s – 16.5%.

Commentary: I’m no expert at finding things – for instance, I couldn’t find any sense of journalistic integrity in that Car & Driver article no matter how hard I looked – but I was able to find the facts above with just three quick Google queries.

Tesla Motors’ low profits are worrisome

Facts: Tesla Motors has about $745 million cash on hand and, according to its senior leadership, isn’t worried about running out of money. So the profits are being heavily re-invested, which is to be expected for a young company. (10 years is very young in the car industry.) Much of the money is going toward ramping up production from 21,500 cars this year to 500,000 cars in 3-4 years, maybe more. A great deal of money is also going toward their Supercharger network, additional stores in the US and Europe as well as expansion into Asia, and service centers – unlike conventional car companies, Tesla operates the entire vertical chain.

The batteries will stop holding charge in a few years

Facts: The Model S comes with an 8-year battery warranty, even if you recharge it at a Supercharger every single time. While capacity per unit of battery weight has seen only slow improvement in the past decade, the ability to hold that charge after numerous recharges has improved dramatically. Six or seven years ago, for example, it was reasonable to expect a laptop battery to hold only 80% of its charge after 300 recharge cycles. I just checked my 2012 MacBook Pro’s battery – after 289 recharge cycles, it still holds 99.7% of its original capacity. Tesla’s battery tech is similarly top-notch, as evidenced by the fact that both Daimler and Toyota are buying Tesla’s tech for their own EVs.

You can’t go from NY to LA in a Model S

Facts: True, for now – the trip currently requires some planning and a few extra days. But only until the end of 2013: after that, you’ll be able to make the trip with no planning, using Superchargers only, according to Elon Musk. Bonus: you’ll pay $0 in fuel costs on that trip. Notably, you can already travel along the Western seaboard (San Diego to Vancouver), and by the end of 2013 or early 2014 you will be able to go from Miami to NY as well, says Tesla.

Model S’ range drops significantly in cold climate

Facts: False. The Netherlands is Tesla’s highest per-capita sales country, and it’s quite cold over there. You should expect a range drop of maybe 10% in cold weather – and by cold I’m referring to around -20 Fahrenheit and below, not Florida-standard cold (which I hear is around 65 degrees?).

Teslas are too expensive!

Facts: They may end up so, but the sticker prices aren’t the prices you’re looking for… EVs cost more upfront, yet require substantially less maintenance and save money on fuel. With Model S specifically, Supercharger use is free for life so if there’s one next to you, you’ll almost never have to pay to recharge the car. Once you add up all the savings, and compare the final figure to the cost of a gasoline car plus gas/maintenance expenses, a Tesla may still be too expensive for you right now, but you’ll be surprised at how much closer the final numbers are.

Aren’t we just shifting the pollution to coal-burning plants?

Facts: Yes, but only a third of it. A power plant is about 3 times as efficient as a small internal combustion engine in a car – it wastes ~25% of the energy vs the car’s ~75%, so producing the same useful energy at a plant burns roughly a third of the fuel and emits roughly a third of the pollution. There’s just not enough space in a car to capture the energy lost as engine heat and use it to heat up steam to turn a turbine that would generate more energy; there’s plenty of space for that in a plant. Gasoline cars also don’t recover energy from braking and slowing down like hybrids and EVs do. That 1/3 of emissions will be reduced even further once Tesla builds out solar arrays for the Superchargers – the company has promised to offset all energy needed for the Superchargers with renewable sources.

The grid won’t be able to handle that much demand for electricity

Facts: This is a valid concern and another reason for Tesla to power Superchargers from renewable energy sources. Grid capacity is a macro issue not specific to Tesla, however, and many big players – from utility companies to governments to a host of top universities – are working on the problem. We’re also not using the existing capacity as efficiently as we could, so there’s some room to grow. For right now, the grid can accommodate the Teslas being produced.

I heard about the fires – Model S doesn’t seem safe

Facts: The Model S is, without any hyperbole, the safest car the US Government has ever tested – that’s according to the government itself. For comparison, there were 1.27 fatalities and 80 injuries (at least half of them with lasting consequences) per 100 million vehicle miles traveled (VMT) in 2008. Tesla’s Q3 2013 earnings revealed that Tesla owners have now traveled over 100 million miles, and a few days ago Elon Musk reported that there have been 0 fatalities and 0 permanent injuries so far. So apples to apples, that’s 0 and 0 for Tesla cars vs 1.27 fatalities and 40 permanent injuries for average cars. As far as car fires go, over 150,000 of them occur in the US annually, resulting in about 800 people injured and 200 dead. As of today, Nov 13 2013, Tesla has a much better record than the average car on both the number of fires per vehicle mile driven and the number of injuries from said fires (none). Notably, all 3 owners whose cars caught on fire have already bought or want to buy another Tesla. If I thought a car was at fault, I wouldn’t buy the same one again – would you?

EVs take forever to charge, which makes them too inconvenient

Facts: True. However, a Model S has the best range of all EVs: 208 real miles on the smaller battery and 265 miles on the bigger one, according to the government. If your commute is under 100 miles one way, you don’t need to worry about recharging during the day, and the car gets a complete charge off a wall outlet overnight. Tesla’s Supercharger network is free to use for life, too. A Supercharger can recharge a Model S to half the capacity in 22-25 minutes, and to full capacity – in about 45. This is down from 30 and 60 minutes respectively earlier this year, and the time may be reduced further in the near future (Tesla will reportedly be testing faster charging times in Germany soon). Finally, if you’re in a hurry, Musk says you’ll be able to swap the battery pack in half the time it takes to put gas in your car for about the same price as you’d pay for a full tank of gas – just pick up your original battery on your way home (fully recharged). If you did a double take on that – yes, that technology is real and, currently, Model S is the only EV in the world with a swappable battery pack.

GM/Ford/XYZ will crush Tesla Motors

Well, that’s an opinion. Time will tell. What we do know is that Musk’s PayPal started an online payment revolution from a tiny office and defeated eBay, forcing them to buy PayPal; Tesla’s cars have the longest range and the best safety and performance characteristics of all EV competition; and SpaceX manages to fly missions using 1/10th the money NASA needed. Being big isn’t always an advantage; sometimes, it’s what holds you down.

Did I miss any? Let me know on Twitter! @AlexDaUkrainian

29 Years of Awesome, and Why YOU Rock

I’m 29 today! Naturally, I’ll take this opportunity to explain why you are awesome, even if it has nothing to do with my birthday. Keep on reading, fellow life traveler.

But first – wow, what a roller coaster ride it’s been! I mean, holy cow – from a tiny village full of mud and livestock (mooo) to an American suburban “dream” with luxury cars, motorcycles, and good friends. I’ve been the love of my family and I’ve been the outcast. I’ve been betrayed, and I’ve been forgiven; I found true love, and I lost it. Some knew me as over-confident, unstoppable, and hyper-productive and some – as depressed, lonely, and continually sick. I’ve both made 6 figures per year and teetered on the verge of bankruptcy. I helped change lives and I’ve desperately needed help myself. And in a few weeks, I’m about to change location for the 13th time in my life, despite still not having 100% certainty whether the destination will be San Francisco or Canada (99% certain for SF, it’s just not under my control).

Today, I once again find myself staring into the face of uncertainty. It’s been 11 years since I moved to the U.S. of America, and I am still a nonresident alien (read: no ‘green card’). That’s despite earning my college education here, working many jobs, and being an exemplary citizen who pays a ton in taxes. But you know what? Fuck it. I’ve studied brain science, psychology, and philosophy for years and let me tell you that “fuck it” is the best wisdom I will ever be able to give you. Because it’s all in our heads, my friend. All of it.

Real stuff happens in the world, and absolutely we should react to it with reason and evidence-based judgement. But our attitude towards it all? Our motivation? Our psychological well-being? It’s all in our heads. We have automatic responses to things we’ve been forcefully attached to – parents’ approval, love connections, definitions of success or what’s good and bad, social expectations, religious beliefs, etc. – but all of those things can be changed, morphed, or even entirely dissolved with but a single switch in your head. A switch you can flip.

How do you find the switch? By realizing that the world isn’t magically “better” than you are and other people are not smarter – like you, they all just are. Everything around you has been built by folks that are as smart or dumber than you are (credit: my dad). They were all just trying to figure out what the fuck was going on and how to co-exist amid it all, and all of them, without a single exception, made a lot of mistakes, too. Even the figures we’re taught to idealize, like Gandhi, MLK, and Mother Teresa, were just people who stopped listening to the noise and started sending out their own signal into the world. One day, they simply said, this is who I am from now on and I don’t need anyone’s approval for it – I’ll start being that today. And the rest of us, just as capable and great, huddled around them not because they were better than us, or somehow stronger – we did it because they became a constant, a rare constant amidst the chaos that is the rest of the world. And I’m here to tell you – you are Gandhi. You are MLK. You are Mother Teresa. You are everything and everyone, and the only difference between you who is all-capable and you who is meek and weak is your decision to be so.

Now, some have talents – music, quicker math, etc. – but that’s not what gives strength or greatness. Hell, I’m smart – used to be real smart – and I’ve spent many a month in depression. And no, you won’t magically become rich just by thinking you will. But you will be strong because it’s not a thing you get from outside of you. And you will be on the path to having your inner greatness expressed in the real, outside world. And you will be happy. You will be happy because you will know that the only thing you need to be happy is your own approval and definition of happy. Whether you’re trying to create a piece of software or you’re trying to find food and shelter for the night, the only two possible outcomes are identical – you’ll either succeed or you won’t. But who you are during that process, while on that journey, is entirely up to you.

See, no matter how damn good your life is, there’ll always be shit to worry about – will my baby get sick and die in her sleep tonight? Will the boss approve my idea tomorrow? Will the US adopt communism next year? And if it’s a reasonable concern then make reasonable plans and steps to counter it. But in the process – give up the idea of control. Have both the humility and faith to say, “I can’t know the future. Ever. But until that future arrives, I can choose what to believe about it.” Choose to believe that it will all turn out. Why?

Because life, my dear reader, is the biggest joke ever played on you. Life, you see, is a self-fulfilling prophecy. If we believe life sucks, and bad shit happens – our attitudes make us feel down and depressed, the world is less willing to interact with us, everything that does happen is interpreted as crap, and we often choose overly defensive and safe actions that lead to less win and more suckiness. Or, we choose to believe that life rocks and our attitude goes up, people want to hang out with us more so we meet more people and build a better network, and we end up taking more risks that lead to bigger rewards, and so once again the results confirm our initial bias about life. But in the end, no one will tell you which way was right and which way was wrong. Nor will you be able to judge because, from your perspective, your results will confirm whatever bias you started with! And that’s the joke of life – I hope you laughed.

So separate yourself from the noise, confusion, and the pain of the rest of the world. The world doesn’t know any better. Your parents made mistakes. Your friends made mistakes. Don’t spend your life proving them right or wrong, or earning their love or approval. Don’t spend your lives being afraid of being ashamed or stressing over every achievement to prove someone wrong. Fuck it, fuck all of their opinions. Their opinions are just as biased, as imperfect, and mean as nothing as your opinions. (See what I did there with that ‘nothing’ word? I’m proud of it. No, I don’t care what you think – I’m still proud of it.)

Define yourself. Obviously, don’t do illegal shit – that’s another topic. But choose to believe that life will turn out. Choose to believe that you’re great and amazing. Choose to believe that you’re loved. (Do you love yourself? See, told you. Wait, you don’t? Well start today. See, now you’re loved.) Choose to believe that you don’t need to be loved by everyone (even if it’s someone you really want to be loved by). You don’t need approval, you don’t need direction. You just need you. Find friends who support you in that manner, who will stand for who you really, truly are (as defined by you). Ditch the friends who lead you astray (seriously, fuck them, you owe them nothing). Ditch all of your friends if none of them support you in being stronger.

Because I tell you – you’re goddamn amazing. You’re beautiful. And you were so even before I just said it. This life, I don’t think it means what you think it means… it means nothing. It means what you make it mean. So make it mean whatever makes you happy. Enjoy it. Go play with your kids, or go ride that bicycle, or go solve malaria or poverty.

Pick the direction of your self-fulfilling prophecy, Captain. You need not look at others, you need no approval or praise – you are the highest praise you can get. That said, just FYI, I have full faith in you. You rock!

And to myself I say – happy birthday, Alex! You are alright, too. I’ve so, so missed you.

Why are you still here?? Go be awesome.

The Conundrum of Power… J.D. Power

J.D. Power hasn’t released the underlying data for their latest tablet owner satisfaction survey, so the headline “Samsung Ranks Highest in Owner Satisfaction with Tablet Devices” left many confused about how Apple’s iPad could score more circles and still lose in the end. Here’s how:

A) J.D. Power does not rate or compare tablets, and it never has. What they do is rate owner satisfaction with a particular device, in isolation from other devices and satisfaction ratings, and then they compare the results. Even if one tablet were capable of teleportation and shooting phasers, as long as owners of another tablet report that they are more satisfied with their tablets (maybe they are not aware of tablets that can teleport people, or maybe they just like taking the bus), the other tablet will rank higher in owner satisfaction. The words “in owner satisfaction” are crucial.

B) The scale is only absolute within the domain of each manufacturer. We can assume that most owners surveyed only owned one of the tablets. Hence, their satisfaction is relative. Just like one person can be very happy with their $3,000 Corolla while another person is unsatisfied with their $40,000 Lexus IS 250 (perhaps because their friend just got the more powerful IS 350, or because they’re a spoiled little brat who never worked for the car to begin with, and instead got it as a gift from their rich grandparents who are too sweet and naive to understand that they are inflicting a deep, permanent psychological disability on their grandchild through these expensive gifts that the kid has no ability to appreciate due to the fact that such appreciation would imply an understanding of hardship and work, the very two things the grandparents have made unnecessary for the said child), in the same fashion you can have some owners be happy with a crappier tablet while others are less happy with a better tablet. J.D. Power doesn’t measure if one tablet is worse than the other; it measures how satisfied with them their owners are.

C) The circles do not represent absolute ratings, such as when critics rate movies or when CNet rates products; instead, they show manufacturers’ relative placement within the group. So if 3 tablets get performance ratings (on a scale of 1 to 200) of 149, 150, and 151, they will respectively get 1 circle, 3 circles, and 5 circles, despite the fact that their performance is rated as virtually identical. (Note: I said “rated,” not “is”.) From this we can conclude that Apple barely won the 4 categories, and lost the Cost category by a wider margin. Again, such close scores do not mean that Samsung tablets’ performance or features actually were nearly identical to those of Apple tablets; they mean that survey participants’ perception of those metrics was similar – participants who presumably had not used the other company’s tablets and hence had little to no frame of reference.

Of course, we can’t conclude an exploration of a conundrum without pointing fingers – that’s like being Santa and doing all the work of delivering presents without taking advantage of the free cookies and milk. :) So I blame J.D. Power for not disclosing the underlying scores with every report, thus making their chosen method of presentation extremely confusing; I blame the media for spinning this as if J.D. Power had said that one tablet is better or worse than another; and – let’s be candid with ourselves – I blame those of us who quietly accepted the results of this survey for years while the iPad was on top, but suddenly started demanding full disclosure of data and an immediate explanation from J.D. Power when the iPad didn’t come in as #1.

Let’s all hope iPhone’s rating never slips, or none of us will feel safe coming online and playing on Twitter for weeks thereafter. Now excuse me as I use my crappy, 2nd-place tablet to teleport myself to Hawaii.

The Case for Ashton Kutcher

I’ve noticed several high-profile people critiquing Lenovo’s hire of Ashton Kutcher as Product Engineer. I respect many of those people, yet I couldn’t disagree more with their critique; in fact, I find it very hypocritical.

Who was Steve Jobs when he started Apple with Wozniak et al? Nobody. He had no degree to qualify him for the job. He had no engineering chops. He was a college dropout, a hippie, and a druggie. But he had a passion about something. And that passion transformed him, and allowed him to play the roles of product engineer, salesman, leader, and CEO, among others.

On the other hand, we have Bill Gates – a true engineer. How much have you enjoyed the user experience of Windows over the years? Zune? Windows Phone? Windows tablets (pre-Surface, especially)? IBM is run by engineers – their products are anything but simple and easy to use. BMW has some world-class engineers – and their UI is one of the most complex and frustrating in the business. Engineers create great technology, but they often don’t know how to integrate and simplify it to produce superb user experience.

Ashton clearly stated that his role would revolve around bringing a better user experience to the already-excellent Lenovo hardware offerings; nobody said anything about designing electrical circuits. Time and again we’ve all agreed with Jobs and Tim Cook when they said, “Apple is an intersection of Liberal Arts and Technology.” And yet, when Lenovo hires someone from the liberal arts side to help their technology, we laugh. Why? This isn’t any different from what Apple has been doing.

I don’t know if Ashton will do a great job. But that’s the thing: I don’t know, so I give him the benefit of the doubt. I don’t pretend to know because it’s impossible for me to know. And, if anything, I see many parallels to Apple here, which is a good thing. I know Ashton is extremely passionate about technology. He has a successful record in technology investments. He has read more books than I maybe ever will. He surely knows enough about technology to tell a good investment from a bad one. His new role is to decide which features and design choices are a good investment of employees’ time and which aren’t – that doesn’t sound all that different from what he has already been doing.

Lastly, some people have issues with the word “engineer” being in his title. Well, that’s just ignorance of the English language – we don’t see Doctor of Philosophy degree recipients operating on patients. Like the word “doctor,” “engineer” is a versatile term.

I wish you the best of luck, Ashton – you have the passion, that’s for sure. And as far as I can tell, that’s the most important thing in the world. Martin Luther King wasn’t in a college marching band, yet he organized some of the best marches in history. And all he needed, he told us, was a dream.

Oh, and can we please have the next Lenovo laptop weigh exactly two and a half pounds?