Misguided analyst editorial – update: Called it

Wow.   Rob Enderle has a lot of readers on Computerworld, I expect.   I read the odd article from him.   I had thought he would be offering solid business advice in light of the viral “Comcastic” support call from hell [ http://www.huffingtonpost.com/2014/07/14/the-comcast-call-from-hell_n_5586476.html ].   I stand corrected.

Basically, Enderle shows himself to be a run-of-the-mill CYA sycophant extraordinaire, advocating analytics to essentially create two classes of customer, rather than using it to monitor and improve your customer relationships as a whole.    His original article I’m ranting about is here: Don’t Be Comcast: Use Analytics, Monitoring to Prevent a Viral Disaster – Computerworld

He starts somewhat sanely, with having a list of the biggest customers available to managers so that, as he once did, you don’t cancel a supply contract with someone who happens to be your largest customer.  I get the impression that it wasn’t a healthy business relationship, and may have been grounded more in back-room “you scratch my back, I’ll scratch yours” deals than good business, if a cancellation resulted in that sort of fallout.   Either that, or the company Enderle was working for ALSO wasn’t competitive, and let’s just say what goes around comes around in that case.

Taking social media and using it to monitor when you might have a PR issue on your hands, or to track negative and positive PR, is just good sense.   But that’s not what he proposes.   He takes the idea of “influencers”, people with larger pull in social media and PR, often celebrities or journalists, and suggests having your real-time analytics alert you when they contact the company in support so you can give them extra-special treatment.   Basically, make them an “elite” customer, and screw the rest of us.

You know what social media does then?   Check the hashtag count on Twitter.   You’ll still get #comcastic from all the rest of your customers relating serious issues and problems, and you’ll have a few media celebrities providing positive PR.   It will catch up with you, and if your customer service sucks, I’ll trust my second-cousin’s-friend’s opinion of your shop far more than some privileged A-list celeb’s when deciding where I take my business.

Enderle is a “mover-shaker-fool” looking for quick results and his own rep in a corp rather than actually making your business the leader in its category.   Fix the problem.   Treat your customers correctly and instill that in your employees.   And don’t incentivize them disproportionately against that.   I’ll bet good money that the call rep at Comcast is paid good coin on a “save” of a leaving customer, so he will work his rear end off, to the point shown in that recording, to make that save.   I expect his performance rating and his compensation are so skewed toward making the save that it’s not worth his time to be courteous and walk people through it professionally.   Enderle says the rep should be fired.   If I were the CEO, I would start by looking at how the incentive programs for the call centre are set up, especially in the “customer retention” area.   And adjust the attitudes of the people setting that up.

And you know the irony?   I bet the whole behaviour of grinding so hard to keep a customer (up in Canada, Shaw and Telus do it just as much, though I haven’t run into quite the level of zealousness that was in the viral posting) is based on analytics.  Enderle hasn’t yet learned the lesson that people who actually THINK about analytics and their application have: you still need a goal in mind when you apply them.   Comcast has a goal when it comes to customer retention, as do all these telecom/internet providers.   Keep the customer at all costs, because customer acquisition is very expensive.  The numbers say it’s a lost cause, so go all out.   Any win is a great bonus.   Social media is re-empowering the consumer and making businesses play honest with everyone.   Enderle doesn’t get it.   Make sure you make a better decision than he advocates.

 

UPDATE:  Looks like I called that better than the paid analyst did.   http://venturebeat.com/2014/07/22/comcasts-retention-policies-take-the-blame-for-that-customer-service-call-from-hell/ pretty much outlines what I figured was the core of the issue.   Perhaps new metrics NOT from the accounting department need to be added in?

Snowden we know about but…. who else?

I’ve been following the Snowden revelations.  And commentary by people like Bruce Schneier, as well as responses from the NSA and others in TED Talks.  It’s a complex issue.   If a US agency has ever before succeeded in undermining the entire US economy, I can’t think of it.   But I think this time it will reach that far.

Because of that, Snowden is being vilified.   He should never have spoken up, many say, and the damage he has done to the US and its reputation, and through that to its economy, is treason.   Criminal treason.

He broke the law.  I have no argument there.   Anyone engaging in civil disobedience is acting against the law as a matter of conscience.   Ed Snowden obviously cares a very great deal about what he did, and he is paying a very high price for those actions, but he feels they were worthwhile.   They were not for personal gain.   It was for a principle.  A principle that said the US was honourable, at least to its own citizens. 

I’ve not felt the US is all that honourable to *anyone* since the DMCA and the Patriot Act came in.   The government acts almost entirely to facilitate special interests and powerful elite individuals and organizations, not to maintain the land of unlimited opportunity it was founded to be.   The people who seized those opportunities don’t wish anyone else to threaten their successes.   They don’t want the next generation succeeding them.   Or beating them.   But in all this turmoil, all these differing viewpoints, motivations, and possibilities, there is one question that I haven’t heard yet.

Who else besides Snowden?   Snowden was a rare character with a social conscience, who could see that this was morally wrong and against the law of the land he served and cherished.   More importantly, it was against the spirit his entire nation was founded on: that the government serves the people.

So these programs have existed for a long time.   They have known vectors and methods.   They have catalogued vulnerabilities.   These things are extremely valuable, especially to foreign powers and criminal powers.   This is the backbone of information for major organizations.   These vulnerabilities are the keys to the kingdom.   The knowledge was a top secret weapon.   But the holes in the infrastructure are in all the infrastructure.   Including that of the US, Canada, and all of the other allies of the US.   That’s a really, really valuable thing, as long as the US believes that these vulnerabilities are still its own little secret, and nobody else’s.

Snowden was one relatively minor actor in these organizations.   He had a conscience.   He served his conscience as he saw fit.   But suppose any one of the thousands of others with knowledge of these programs was wronged, or felt they deserved a better result from their own labours?   Suppose they felt they could keep the US operating as they desired and, in good conscience, not leak the information, but instead sell it.   Secretly.   To a foreign power or criminal organization.   There are so many pieces that they would only need to sell a handful, and would probably profit immensely from each one.   Who’s it gonna hurt?   The NSA knows about the vulnerabilities, so it shouldn’t hurt them.   And so what if a few US companies get caught in the crossfire?   The NSA thinks that’s ok.   The US Government thinks it’s ok if they do it.   So spread the wealth a little.

How many of the foreign attacks we have seen have been through these intentionally introduced vulnerabilities?   How many times have advantages been given to hostile powers, to those who would do harm to others with this power?

Snowden did us a favour.   He gave us a shot at stopping all of it.   And hopefully being wary of it happening again.

Who else gave away copies of the keys to the kingdom?

Currently playing in iTunes: Christmas Song by Dave Matthews Band

HTTPS links in Redmine email

We’ve been shifting a few things around in our infrastructure, and one of them was the Redmine server.   Getting it up and running on a custom port using https is all pretty straightforward, but when the email notifications on issues started going out to watchers, they all had http://hostname:port/… in them.   The problem: it was an https server.

Pulling out the default swiss-army-problem-solver of searching the web, the issue is in the Redmine FAQ as well as a number of posted solutions.   Apache isn’t passing through the protocol.   So you add in a magic little Apache RequestHeader line to forward the protocol as https and….

It doesn’t work.   

Then into some hairy mod_rewrite.   Firewall configs cause that to turn into a serious conflagration.   What to do, what to do.

Oh look, there’s a setting IN REDMINE on the general settings page, just under your host name (with custom port if you have one): “http” or “https”.

Set that to https, problem solved.
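For the terminally curious, the same change can be made without clicking through the admin UI.   This is a sketch, not gospel: Redmine stores its general settings in its Setting model, so assuming a standard install you can flip them from a Rails console.   The host name, port, and environment below are made-up examples, not anything from our setup.

```ruby
# Hypothetical sketch: from the Redmine application root, open a console with
#   RAILS_ENV=production bundle exec rails console
# then flip the two settings that notification-email link generation uses:
Setting.protocol  = "https"                      # scheme used in generated links
Setting.host_name = "redmine.example.com:8443"   # example host and custom port
```

In my experience subsequent notifications pick up new settings without a restart, but verify on your own install before trusting it.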

Now, trying to get that into the Redmine documentation as an outsider is a whole other adventure, so in the meantime at least it will be here in the swiss-army-problem-solver of the internet, and if you’re specific enough, you got here with a good ol’ search through DuckDuckGo.com.   Or if you got here through a filter-bubble of a more popular search engine, well heck, I guess my page rank went way up, now didn’t it?   :-)

Happy configuring.

Paid Upgrades and the Mac App Store

There’s been some solid debate recently around Mac App Store pricing and the idea of paid upgrades, continual ownership, and updates.  It’s relevant, and it is a change in revenue model for software companies.  Wil Shipley, always a thoughtful and thought-provoking individual, weighed in with this opinion, largely proposing a way for the Mac App Store (and by inference, I would say, the iTunes iOS app store) to provide for paid upgrades on major revisions of software.   It’s valid from a more traditional point of view on revenue and pricing of software.

A very solid counterpoint was provided by The Mental Faculty in the blog post Paid Upgrades and the Mac App Store.

I believe there is a missing piece there that major upgrades are treated differently, by Apple and others.

My expectation is as follows.   Initial prices for software will drop.   Not to $0.99 or any of that.   Major software is going to cost money.   $10-$30 for a lot of normal apps that might now go for $50-$70.   Those apps will have free updates for their lifecycle.   Then when a major upgrade is released, it’s a *new* application.   Migrating data will be a bit of a pain under sandboxing and application security, but a bit of Dropbox or other transfer ingenuity will alleviate that.

My rationale?    It comes down to essentially a “license” that you sell for the duration of a version, rather than for the duration of a year or the like.   The great part for the customer is the license never expires (unless the software no longer works after an OS or device upgrade).   The upside for the developer is a lower cost of adoption, plus a recurring revenue stream for solid bits of new work.   It is a change in the upgrade and sales model.   The other option is to introduce major uplifts in features as paid in-app purchases, which is possible with what already exists.

I don’t think it’s either full price new versions or free upgrades for life.   There’s a lot of capability in the revenue models Apple has in the app stores they run, and while those models are nowhere near exhaustive, they are, I believe, sufficient to support a wide range of development models and companies.

I expect that the iLife and iWork applications will go this way as well, and as that happens, iCloud is how those apps will move across data and settings between versions.   Mountain Lion will be a new app on the app store, even though it’s an “upgrade”.   The supporting evidence is that Apple showed us long ago the OS was a lower cost upgrade than say Microsoft provides.   This gives incentive for people to upgrade much more readily than a higher price point would, yet everyone pays the same price.

Every sale is an upgrade.   From the previous major version to the current version, or the first version to the latest version.  One price upgrades.  Even if it’s upgrading from nothing to a new customer.

Ads in Apps and Resource Usage

John Gruber linked to a study from Purdue finding that the majority of battery life in free apps goes into supporting and displaying ads.   Rather surprising really, but largely because I hadn’t considered it much on mobile devices.   I usually get paid versions of apps, as I’m just not a fan of ads eating screen real estate.   That said, I use ClickToFlash to limit ads on web pages, and further cull their activity with Little Snitch blocking a lot of tracking sites and Flash-served ad sets.   I’d rather have a way to subscribe, or read inline advertising as Gruber does with Daring Fireball already.   It’s much more effective than the pop-over ads and interstitials I go looking for the close button on before the thing even gets rendered.

I wonder if this sort of study will affect people’s purchasing decisions if it becomes more widespread?   If it does, it will affect developer decisions, and will actually wind up changing some of the ad industry’s approach, in that they too will have to consider efficiency and battery life.   It won’t just be the application developer dealing with the restriction.   The ad is a supporting mechanism, so really it should use no more than, say, 10% of the resources the application uses itself.   Definitely not 3x the resources, at any rate.

In-App Ads Consume Mucho Battery Life:

Jacob Aron, NewScientist:

Up to 75 per cent of the energy used by free versions of Android apps is spent serving up ads or tracking and uploading user data: running just one app could drain your battery in around 90 minutes.

Abhinav Pathak, a computer scientist at Purdue University, Indiana, and colleagues made the discovery after developing software to analyse apps’ energy usage. When they looked at popular apps such as Angry Birds, Free Chess and NYTimes they found that only 10 to 30 per cent of the energy was spent powering the app’s core function.

[…]

(Via Daring Fireball.)

Things are moved, and we are underway!

Apologies for the deluge of reposts as I moved everything over.   The original post dates are in the titles of each should you be interested, and things are backed up and configured as needed at last.

So after a few years of being largely idle, Digital Katana Technologies Ltd. is underway and operating.   Currently we are doing work in software development and coding, as well as architecture and technology consulting.   In the midst of that, some time is finally being allocated to work more consistently on the iOS apps Digital Katana has been designing and exploring over the past year.

The blog posts on technology will start to flow and anytime a software release or other company news comes about, the news will be published here first.

 

Avahi… linker flags to compile the examples – Original post June 1, 2011

So, let’s assume you want to tackle using mDNS, also known as Zeroconf, on Linux to advertise your new service in a modern, portable, discoverable way with no pain on the user’s part.   Simple, just pop over to http://avahi.org and look at the examples.   Compile them, try them out.   If you’re on a Mac, avail yourself of Bonjour Browser to have a look at the services popping in and out of existence as you test.   There may be a Linux zeroconf service browser, but I didn’t find one ready to go.   If you know of one, please add it in the comments!

Wait, you’re saying you can’t get it to compile and link?   Ah.   Yes, there are two normal landmines people step on in Linux development.   The first one can be solved by a general rule of thumb.   You need the avahi-devel packages installed so you can use the headers in the examples and in your own software to build against the avahi API.  On an RPM-based system, yum install avahi-devel will get you there.   Normally, packagename-devel is going to get you these developer libraries.   Avahi is already on most RPM distros, so your software can run out of the box without this install on the target machine.   You just need it for development and compilation of the binary.

Wait, still not working, you say?   Ah.  You’re getting a few pages of “undefined reference” errors?   All to avahi functions?   So you try the link flag -lavahi, as that usually gets it right?   No go.   libavahi doesn’t exist.   So you go googling.   I did all this.   It’s an interpretive pain.  Here’s the magic incantation to build the service publishing example:

gcc avahitest.c -lavahi-glib -lavahi-core -lavahi-common -lavahi-client

That links in all the avahi libraries I could find (and you don’t really need them all, but they are listed here for completeness).   Then it runs and works brilliantly.  If you’re wondering where all these are located, it’s in /usr/lib64 at least on x86_64 Fedora.
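As an aside, avahi ships pkg-config metadata, so if you’d rather not hand-list libraries, a sketch along these lines should work on most distros (assuming avahi-devel is installed; avahitest.c is just the example file name from above):

```shell
# Ask pkg-config for the compile and link flags of the client library:
gcc avahitest.c $(pkg-config --cflags --libs avahi-client) -o avahitest

# And for watching services pop in and out on Linux itself, avahi ships a
# browser of its own, much like Bonjour Browser on the Mac:
avahi-browse --all --resolve
```

avahi-browse lives in a separate package on most distros (avahi-utils or avahi-tools, depending on where you are), so it may need its own install first.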

I wandered through the avahi wiki at some speed, and couldn’t find anything simply listing this time-consuming necessity.  So I’m posting it here in the hopes that a future frustrated searching developer might find just a bit of relief and save themselves a bunch of blind stumbling to little effect.

 

 

 

Currently playing in iTunes: Know Us by Jillian Ann

One platform, two, three, more? – Original Post November 24, 2010

James Governor over at Red Monk (a great bunch of analysts I alternately seem to agree with and take issue with) posted a take on the mobile platform development continuum and HTML5 as an alternative. Take a read and the disagreement I voice below will have a better context.   I’m reposting the reply here because it is a relevant discussion and I leave him the right to do with his site as he will, and that includes having it go away.   So I’ll keep my work where I can get at it.   On my laptop in MarsEdit and on the web on my own blog as well.   :-)

[I feel] James, [is] getting a bit absolutist for an analyst that usually sees how things fit together more completely.   It’s not a winner-take-all end game.   Some apps work better on a local platform, native toolchain, optimized into the hardware, fixed system, no browser barrier.   Usually have a higher development cost than the average web app.   These are, of course, generalizations.

Just look at the Samsung Galaxy response.   Some people like it.   Different form factor.   Different purpose.   An iPad app on a Galaxy?   Mostly it wouldn’t feel right or be a great experience.   The web gives you a “common denominator” approach, and that usually gets offset by a pile of javascript conditionals adjusting for the nuances of browsers.

I tell you, trying to use a mutual fund screening system on the web today was horribly painful.   Keeping state, and the ability to pull things in and out of a spreadsheet app, is just not happening.   The app is older and could be improved with some newer approaches and technology, but really, I want my <em>data</em> to be able to flow back and forth securely between devices; when I’m working, I want it <em>here</em>.   That’s why the browsers keep adding in desktop features.   Because the experience in the browser is, and has been, inferior.

WebGL?  Offline Storage?  Hardware graphics acceleration?   Codec hardware acceleration?   These are all desktop features that get pulled into the browser over time as it trails behind.   It’s not surprising, and it’s not going to change.  Standards move slower than most proprietary innovation.  Web apps are usually for a cheaper and/or broader approach, with a radically different revenue model.

It just baffles me that development gets lumped into a bucket of “proprietary platform” or “web standards”.  Both are two ends of a continuum.   OpenGL is pretty cross-platform in the bulk of the API.   And it’s well defined on experience.   There’s shades of gray all over.   Are you telling me the same technology and approach for Farmville is appropriate for Doom3 or Call of Duty Black Ops?   I don’t hear the game companies complaining about supporting 5 or 6 platforms, let alone 2 or 3.

Of course, the desktop pulled in automatic updates and notifications and sync from the web in many ways to bring those strengths across too, so it’s not totally a one-way street.

But it’s not a one-lane street either.

 

The future of personal communication? – Original post August 16, 2010

Apple may be laying out a foundation of pretty serious evolution in personal communications right under our noses with Facetime.

Facetime is an open spec.  Always good for adoption.  It’s supported by one of the hottest consumer cell phones out there.   It’s data-based, but also rooted in SIP.  Now, if the rumour sites are right, the latest incarnations won’t only go to phone numbers, but to email addresses associated with devices.

Big deal, you say?  Think it through for a moment.  If you have multiple devices, they register to a single email address.  You have a universal number now.  Any connection to that email address via Facetime (voice, video, however) alerts every device it’s associated with (or the most recently live one by default, or by user preference), regardless of location.

Ok, so that’s Instant Messenger on many devices.  Again, big deal you say.

SIP can associate a phone number in there as well of course, and the phone has a phone number in there, but let’s theorize something.

Let’s say, for theoretical argument’s sake, that Apple replaces AIM as the MobileMe transport with this “universal iContact system”.  Anyone on it with an iChat client or Facetime device (iPod, iPhone, iPad, Mac (again, theoretically, we’re forecasting)) can connect to anyone on any of their devices with a voice, video, or even plain text or data session at any time.  SMS goes bye-bye, as it’s just a typed text chat line to the Facetime address.  You have a new unlocked iPhone 4, you pick up a local data plan wherever you travel, and all your “calls” are forwarded automatically at local rates for data.   Roaming charges go bye-bye.

Now let’s say Apple creates a SIP gateway that can associate the iContact ID with a phone number they provide, sort of like a SkypeIn number or another “real” phone number in a voice system.   When a call goes to that number, it gets forwarded as a translated Facetime voice chat to any and all of your associated Facetime iContact devices, and you pick it up just like a Grand Central number (Google Voice now).

Apple has replaced present-day mobile phone numbers with a universal number, added in video calling, text chatting, and voice calling, and crunched it all down to data streams on a standardized open protocol.  The cell companies are finally reduced to data plan sellers (which is all they *should* be charging us for today anyway; the rest is restricted, repackaged data at ridiculous markups (read: SMS)), and the roaming charges that get travellers so irate are gone, without losing contact, by having local SIM cards and phone numbers everywhere.  Now it’s starting to look like a big deal.

Deploy VOIP and messaging as a primary avenue, with a bunch of features everyone thought were the future 20 years ago (video calling), on a device that’s already creating a critical mass.   Lump all that on top of the ability for other device manufacturers to jump on the bandwagon with compatible services and offerings, and we finally get connected in a quality way, a universal way, and a fair way, and it happens soon.

It may even be happening now.

The Apple Design Awards – Original post April 29, 2010

This is a weird one. Apple finally (and rather late really) announced the WWDC 2010 dates and such. And dropped a bit of a shocker on the long-standing Mac developer community.

The Apple Design Awards, long coveted as awards recognizing some of the best in design, performance, and functionality on the Mac, and recently on the iPhone OS (including iPod), are this year only accepting iPhone OS apps and iPad apps. ADAs for apps that haven’t even been shipping a full quarter? And NONE for the Mac. You can’t nominate anything that doesn’t have an app store URL.

This really seems a callous slap in the face of Mac developers. Somebody has to make the Mac good for more than just developing iPhone apps, and these guys do it. But that seems to have been lost on Apple this year.

Looking at the session overview though, I have a theory of sorts on this and what’s coming up.

The sessions are VERY heavy on iPhone OS, mostly iPhone OS 4. There are Mac streams as well, but really mostly specialized streams. There’s also no IT stream listed this year. That’s another big break from last year.

No, I don’t think Apple is bailing on the Mac platform, and no, I don’t think they are going to kill off development on it or on the server OS. What I do think is they wound up with a year of heavy focus on the iPhone and the iPad, and that drained a lot of their engineers away from the rest of the work. Let’s face it, they put a pile of effort into the guts of Snow Leopard, with OpenCL, the LLVM stack, GCD, and an acre of other bits. This year it’s iPhone OS 4 that looks like it’s really chewing up the time, plus they need to get the iPad onto that release as well (I’ll bet that’s to be announced at WWDC this year).

So I’m going to predict that 10.7 is not coming on the usual release cycle. I think WWDC 2011 will be a resurgence on the Mac side, as it will likely be an off-year for iPhone OS in many ways while they enhance 4.0 and start thinking about 5.0. WWDC 2011 will preview 10.7, which will have some of the iPhone OS showing up in touch interface capabilities, possibly with some new hardware in the iMac line with touch screens in the fall, and 10.7 released with those. The level of quality that Apple usually churns out has, I think, actually stretched them on this one, and they can’t keep 3.5 product lines cranking: the Mac, iPhone, iPad, and the 0.5 of the iPod being an adjusted iPhone. So they wind up with sort of an iPhone/iPad/iPod year of WWDC with a minor Mac focus, then (hopefully) a major Mac focus with a minor mobile one, or even more ideally for both groups (though expensive for those with feet in both ponds), they split to have a mobile WWDC and a Mac WWDC.

Then the ADAs could have mobile ADAs and Mac ADAs. And all would be a bit more balanced again in the universe. Maybe even the odd Universal ADA for software like OmniFocus that spans all of them. If all the platform implementations are good enough.