Paid Upgrades and the Mac App Store

There’s been some solid debate recently around Mac App Store pricing and the idea of paid upgrades versus continual ownership and updates.   It’s relevant, and it is a change in revenue model for software companies.   Wil Shipley, always a thoughtful and thought-provoking individual, weighed in with this opinion, largely proposing a way for the Mac App Store (and by inference, I would say, the iTunes iOS app store) to provide for paid upgrades on major revisions of software.   It’s valid from a more traditional point of view on software revenue and pricing.

A very solid counterpoint was provided by The Mental Faculty in the blog post Paid Upgrades and the Mac App Store.

I believe there is a missing piece there: major upgrades can be treated differently, by Apple and others.

My expectation is as follows.   Initial prices for software will drop.   Not to $0.99 or any of that; major software is going to cost money.   $10-$30 for a lot of normal apps that might now go for $50-$70.   Those apps will have free updates for their lifecycle.   Then, when a major upgrade is released, it’s a *new* application.   Migrating data will be a bit of a pain under sandboxing and application security, but a bit of Dropbox or other transfer ingenuity will alleviate that.

My rationale?   It comes down to selling a “license” for the duration of a version, rather than for the duration of a year or the like.   The great part for the customer is that the license never expires (unless the software no longer works after an OS or device upgrade).   The upside for the developer is a lower cost of adoption, plus a recurring revenue stream for solid bits of new work.   It is a change in the upgrade and sales model.   The other option, with what already exists today, is to introduce major feature uplifts as paid in-app purchases.

I don’t think it’s either full price new versions or free upgrades for life.   There’s a lot of capability in the revenue models Apple has in the app stores they run, and while those models are nowhere near exhaustive, they are, I believe, sufficient to support a wide range of development models and companies.

I expect that the iLife and iWork applications will go this way as well, and as that happens, iCloud is how those apps will move data and settings between versions.   Mountain Lion will be a new app on the app store, even though it’s an “upgrade”.   The supporting evidence: Apple showed us long ago that the OS could be a lower-cost upgrade than, say, Microsoft provides.   This gives people incentive to upgrade much more readily than a higher price point would, yet everyone pays the same price.

Every sale is an upgrade.   From the previous major version to the current version, or the first version to the latest version.   One price upgrades.   Even if it’s upgrading from nothing, as a new customer.

One platform, two, three, more? – Original Post November 24, 2010

James Governor over at Red Monk (a great bunch of analysts I alternately seem to agree with and take issue with) posted a take on the mobile platform development continuum and HTML5 as an alternative. Take a read and the disagreement I voice below will have a better context.   I’m reposting the reply here because it is a relevant discussion and I leave him the right to do with his site as he will, and that includes having it go away.   So I’ll keep my work where I can get at it.   On my laptop in MarsEdit and on the web on my own blog as well.   :-)

I feel James is getting a bit absolutist for an analyst who usually sees how things fit together more completely.   It’s not a winner-take-all end game.   Some apps work better on a local platform: native toolchain, optimized into the hardware, fixed system, no browser barrier.   They usually have a higher development cost than the average web app.   These are, of course, generalizations.

Just look at the Samsung Galaxy response.   Some people like it.   Different form factor.   Different purpose.   An iPad app onto a Galaxy?   Most wouldn’t feel right or be a great experience.   The web gives you a “common denominator” approach, and that usually gets offset by a pile of JavaScript conditionals adjusting for the nuances of browsers.

I tell you, trying to use a mutual fund screening system on the web today was horribly painful.   The state and the ability to pull things in and out of a spreadsheet app just aren’t there.   The app is older and could be improved with some newer approaches and technology, but really, I want my <em>data</em> to be able to flow back and forth securely between devices, but when I’m working, I want it to be <em>here</em>.   That’s why the browsers keep adding in desktop features.   Because the experience in the browser is and has been inferior.

WebGL?  Offline Storage?  Hardware graphics acceleration?   Codec hardware acceleration?   These are all desktop features that get pulled into the browser over time as it trails behind.   It’s not surprising, and it’s not going to change.  Standards move slower than most proprietary innovation.  Web apps are usually for a cheaper and/or broader approach, with a radically different revenue model.

It just baffles me that development gets lumped into a bucket of “proprietary platform” or “web standards”.  They are two ends of a continuum.   OpenGL is pretty cross-platform in the bulk of the API, and it’s well defined on experience.   There are shades of gray all over.   Are you telling me the same technology and approach for FarmVille is appropriate for Doom 3 or Call of Duty: Black Ops?   I don’t hear the game companies complaining about supporting 5 or 6 platforms, let alone 2 or 3.

Of course, the desktop pulled in automatic updates, notifications and sync from the web in many ways to bring those strengths across too, so it’s not totally a one-way street.

But it’s not a one-lane street either.


The future of personal communication? – Original post August 16, 2010

Apple may be laying out a foundation of pretty serious evolution in personal communications right under our noses with Facetime.

Facetime is an open spec.  Always good for adoption.  It’s supported by one of the hottest consumer cell phones out there.   It’s data-based, but also rooted in SIP.  Now, if the rumour sites are right, the latest incarnations won’t only go to phone numbers, but to email addresses associated with devices.

Big deal, you say?  Think it through for a moment.  If you have multiple devices, they all register to a single email address.  You have a universal number now.  Any connection to that email address via Facetime (voice, video, whatever) alerts every device it’s associated with (or defaults to the most recently active one, or to a user preference), regardless of location.
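This one-address-to-many-devices behavior is already standard in SIP (RFC 3261): multiple devices REGISTER as contacts under a single address-of-record, and a proxy can fork an incoming INVITE to all of them. A minimal sketch of the message flow (the address and device hostnames here are purely illustrative, not anything Apple has published):

```
;; Two devices register under the same address-of-record
REGISTER sip:example.com SIP/2.0
To: <sip:alice@example.com>
Contact: <sip:alice@iphone.example.net>     ; device 1

REGISTER sip:example.com SIP/2.0
To: <sip:alice@example.com>
Contact: <sip:alice@ipad.example.net>       ; device 2

;; An incoming call to the one universal address...
INVITE sip:alice@example.com SIP/2.0

;; ...is forked by the proxy to every registered contact.
;; Whichever device answers first takes the session; the
;; proxy CANCELs the transactions to the others.
```

Whether Apple’s implementation works exactly this way is speculation on my part, but the plumbing to do it has been in the SIP spec all along.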

Ok, so that’s Instant Messenger on many devices.  Again, big deal you say.

SIP can of course associate a phone number in there as well, and the phone has its number in there, but let’s theorize something.

Let’s say, for theoretical argument’s sake, that Apple replaces AIM as the MobileMe transport with this “universal iContact system”.  Anyone on it with an iChat client or Facetime device (iPod, iPhone, iPad, Mac (again, theoretically, we’re forecasting)) can connect to anyone on any of their devices with a voice, video or even plain text or data session at any time.  SMS goes bye-bye, as it’s just a typed text chat line to the Facetime address.  You have a new unlocked iPhone 4, you pick up a local data plan wherever you travel, and all your “calls” are forwarded automatically at local data rates.   Roaming charges go bye-bye.

Now let’s say Apple creates a SIP gateway that associates the iContact ID with a phone number they provide, sort of like a SkypeIn number or other “real” phone number in a voice system.   When a call goes to that number, it gets forwarded as a translated Facetime voice chat to any and all of your associated Facetime iContact devices, and you pick it up just like a GrandCentral number (Google Voice now).

Apple has replaced present-day mobile phone numbers with a universal number, added in video calling, text chatting and voice calling, and crunched it all down to data streams on a standardized open protocol.  The cell companies are finally reduced to data plan sellers (which is all they *should* be charging us for today anyway; the rest is restricted, repackaged data at ridiculous markups (read: SMS)), and the roaming charges that get travellers so irate are gone, without losing contact, thanks to local SIM cards and phone numbers everywhere.  Now it’s starting to look like a big deal.

Deploy VoIP and messaging as a primary avenue, with a bunch of features everyone thought were in the future 20 years ago (video calling), on a device that’s already creating a critical mass.   Lump all that on top of the ability for other device manufacturers to jump on the bandwagon with compatible services and offerings, and we finally get connected in a quality way, a universal way, and a fair way, and it happens soon.

It may even be happening now.

Bottom-up Outsourcing – original post July 27, 2009

I happened upon this little tidbit on my blog backlog. The unconventional James Governor taking a whack at outsourcing as done from the trenches: James Governor’s Monkchips’ Give Every Developer a $5k Outsourcing Budget

Some of my past compatriots may recall the idea I had worked up that didn’t go quite this grassroots, but was a variation on the theme that might appeal to slightly more conservative innovators.

The suggestion was that projects be done to understand and explore outsourcing for the company, and to build the skills to manage and use it in the people who would become the internal leads. The developers would examine the stream of tasks coming through, and the projects assigned to them or of value to them (which might be their own projects). They would then propose how to outsource a project or component under their direction and management. These proposals would be collected periodically (or continuously) and assessed for value, risk, and whatever other criteria the company and team saw fit. Selection would whittle them down to a few, the budget would be distributed accordingly (with possibly a round of refinement if the numbers didn’t all add up), and the developers would have skin in the game. They would stand to gain valuable management and communication skills and experience, and would also demonstrate some of their soft-skill capabilities to the company.

Unfortunately, the corporate and technical leadership at the time figured that big projects and minimal oversight were the way to go, so not only did the staff gain near-zero experience with this global toolset, but the outsourcing also had a number of large failures, little learning, and didn’t bring value to the company in any reasonable time frame. Lessons have been learned since, but I still stand by this approach: if you want your team to think of outsourcing as a partnering and supplier-style tool, they need to be involved and committed.

If you’re faced with the opportunity or need, consider a variety of approaches. Also consider strategically how you expect outsourcing to work in your company on a continuing basis, what staff skills you need to make that happen, and finally what it will take to start that transition. Delegating responsibility and control into the team serves a number of needs and strategic goals if you’re serious about adding outsourcing to your tool arsenal.

Currently playing in iTunes: Comfort by Jillian Ann