Thursday, December 13, 2012

Architectural Homeopathy

Sometimes when you are in a meeting someone says something brilliant. I had that experience today when someone said 'We have a sick body, can we please stop pretending everyone is a surgeon'.  Her point was simple: historically the company has had a challenge with people having opinions, and critically with decisions being made, without data and without their implementation success being tracked.

I talked about 'Thinking is dead' previously and this is a clear manifestation of that. In the world of IT I'd like to give it a name... Architectural Homeopathy: in other words, people making IT decisions which

  1. Are based purely on an individual opinion on what will succeed
  2. Are not backed by a case as to why it will succeed
  3. Whose implementation success is not measured (beyond 'it went live')
This approach also has another aspect, which relates to comments on and challenges to recommended practices.  A good architectural challenge is to say 'X won't work because we don't work in a centralised way; we need to do Y because we are federated and Y has already been proven to work here'.  The Architectural Homeopathy challenge is 'X won't work, we should do Y', or more likely just 'X won't work'.  No evidence will be offered to support this 'feedback' and no constructive change can be based on it, but if any issues occur the Architectural Homeopaths will say 'You should have done as I said'.

Architectural Homeopaths often build whole careers out of this, constructing approaches based on 'it worked for me' while failing to see the broader elements that drove success, or criticising a proven approach based on a perceived technical flaw while missing the real problem (I'd argue that REST for enterprise integration is a great example of that).  These are the folks who propose grand views of architecture without ever having delivered one into operation themselves or coached others in how to use the approach.

The point here is that architectural decisions need to be defensible based on data; even if a decision turns out to be wrong, that gives you the ability to learn.  Proven approaches, with justifications that can be measured against a set of criteria, are what professionals should strive for.  'Advisors' who promote approaches but cannot demonstrate their success, and more importantly how others have adopted their approach and delivered success, should be treated the way a Homeopath is treated by the medical profession.

I have no objection to the phrase that coding is an art not a science, that it is a creative endeavour and not simply mechanistic.  That is fine, but unlike art its success is quantifiable: it either works as intended or it does not.  What I object to is people promoting architectural approaches (or indeed business approaches) which are based on a series of PowerPoints and opinions, with all the supporting evidence of Homeopathy.  These Homeopaths throw in comments based on ignorance and personal 'belief', disrupt progress, and love to claim that 'it would have been better my way' while never explaining in detail what their way would actually have entailed.

Architectural Homeopathy... get into the sack...

Wednesday, November 28, 2012

The internet was better before...

Over the past few years I've heard lots of complaints along the lines of 'the internet was better before everyone used Facebook', and it got me thinking: much like the strata in rocks, you can determine your geological internet age from the first thing you remember complaining about.  So, moving backwards:

Smartazoic - the modern era (2011 to present)
If you haven't yet complained about any group's arrival on the internet then the odds are that you are part of the latest major influx, characterised by a dominance of mobile usage.

Socialem - the social era (2009-2010)
Socialems are characterised by their complaints that things were much better before everyone started using mobile phones to access Facebook and other social media sites.  These people complain bitterly about Instagram.

Broadem - the broadband era (2003-2008)
This is the era of the major adoption of broadband and the rise of YouTube.  These people tend to complain about the rise of Facebook and other social media sites and don't understand why people can't just be happy with Yahoo and BBC News.  They complain that all these new 'social' users are just adding rubbish to the internet, rather than the managed content of their day.

AOLem - the dialup era (1998-2002)
This is the era of the mainstream pioneers, the people who used AOL and similar services to get online.  People who remember what lag really is, and who complained bitterly about stupid new sites with their insane bandwidth requirements and 'why oh why would anyone want to watch video'.  Their complaint was that the broadem era took away the 'personal' nature of a dial-up modem and also meant that they couldn't use their home connection when they went to their parents' or friends' houses.  The explosion of broadband also meant there were 'too many people' around.

Establishism - the foundations of the WWW (1994-1997)
These are the folks who had access before the dialup era and who complained bitterly as AOLers started coming into Usenet and ruining the internet.  These folks used the WWW, in fact many of them built the WWW, but disdain was reserved for those who had only coped once Mosaic arrived.

Jurassic - Before WWW (1980-1993)
These are the folks who had network connections and lived in FTP, MUDs, telnet and Usenet, and who bemoaned the sudden lack of technical skill required to 'browse' these new-fangled websites.  Where was the fun in something that displayed an image in one go rather than requiring a uuencode/decode process?

So which era do you belong to?  Or are there other classifications and strata that should be used?

Tuesday, November 27, 2012

When to shout, the art of constructive destruction

I've always believed that sometimes teaching is about the stick as well as the carrot, but there are very clear rules on when to use the stick and how to use it.

It's not good enough to start with shouting; that marks you down as an idiot and a prat.  If someone has done something that you don't want, but you have never explained that to them, then it's your fault as the person in authority.

Rule 1: You have to have explained first before you shout

The stick, therefore, should only be used when someone has deliberately gone against advice or guidance.  If people have followed what you said and it didn't work... it's your fault.

Rule 2: It should be obvious to 'the average man on the Clapham Omnibus'

By which I mean that the fault should be obvious to a person of the given level and experience.  If it's a junior person who has made the mistake then shouting is not appropriate.  If it's a self-proclaimed expert who has screwed up then a kicking is in order.

Rule 3: Pick on the leader not the team

If a team of people have screwed up, don't share the blame equally: the leader is accountable to you for the team, so they have to take responsibility for the failure.  Their team will know it's a joint thing and that the leader has taken the heat for them, and this should improve the situation if there are any team dynamic issues.  If you flame the whole team it basically says that you don't accept that they have a leader, in which case you should be managing them all directly yourself.

Rule 4: Be specific, be constructive in your destruction

'You are a moron' is not a constructive statement.  'If you don't explain to me how what you are proposing addresses the two key use cases then I'm going to have to kill you' is constructive destruction.  The point is not to hide the anger but to make clear that your challenge is specific and targeted, and that it gives them the opportunity to respond.

Rule 5: Make it perfectly clear you are pissed off

You should only be doing this when things have got really bad, so you have to underline that it really is bad.  This doesn't mean you have to swear or throw chairs about, but it does mean everyone should leave the room knowing that they are in your bad books and that if they don't buck their ideas up then chair throwing might be in their future.

Rule 6: Give specific ways they can get back into favour
Before the team breaks up, be specific: give them a short time frame in which they can recover the situation.  These need to be actions you can, and will, track over the next few hours or days that show you the team is getting back on track.

Rule 7: Be honest when you get it wrong, congratulate
If it turns out that you were wrong and in fact the team had been going the right way but you didn't have all the information, then apologise and congratulate them.  Shake the hand of the person who stands up and says 'you got it wrong, here is the proof' and deal with it as you should: with humility and possibly donuts.

Rule 8: If things don't change make a sacrifice
If the team keeps working badly and avoiding advice it's time to make a public sacrifice.  This could be kicking them off the project, or even out of their job, or it could be as simple as public humiliation, such as putting a small statue on their desk saying 'Crap Architect' (check with HR first on what you are allowed to do).  Use your company's formal processes to underline it.  The key is that you want everyone to think 'fuck, I don't want that to happen to me'.

Rule 9: Be inconsistently angry
The penultimate rule is not to treat anger and shouting as a 'what happens at level 10' thing.  Sometimes use it as an opening gambit, sometimes as a final twist, sometimes throughout a process.  The point is to use anger and shouting sparingly, and unpredictably, with the team.

Rule 10: Shouting isn't about volume
'Speak softly and carry a big stick': shouting for constructive destruction in a project is not about raising your voice, it's about the impact that an attitude carries.  Talking more slowly, deliberately and quietly can be significantly more threatening than shouting loudly in many circumstances.  The key here is the effect: occasionally you want to give people a jolt and make them change their ideas, so don't think volume, think impact.

Let's be clear, I don't think that shouting and constructive destruction are things to use all the time, but sometimes the stick is required, and these rules help me ensure that I don't just shout 'full of sound and fury, signifying nothing', as Shakespeare put it, but instead direct that anger and use it to save the situation.

Constructive destruction is about tearing down bad behaviours and providing a way to rebuild them in a positive way.  There are lots of touchy feely ways to do that, but sometimes fear is the right way.

Tuesday, August 14, 2012

Java SE - could it be more pointless?

Back in 2007 I gave a presentation on Java whose theme was that Java had won.


On slide 12 I put forward a proposal that I'd also talked about here before: Java needs to be professional and, critically, reduce to a small core on which people can innovate and develop, whether that be in smartphones, desktops, servers, virtual machines, smart cards or anything else that people dream up.  Simply put, Java SE has no point in today's IT landscape as the 'primary' release on top of which everything else is done.

Look at how SAP are pumping new life back into ABAP, and how other languages are exploding, not because they make things better but because they address specific niche concerns.  High-performance computing still reverts to assembly languages, in part because Java's bloat means it can't be tuned down to the core that those developers need.

The question is whether anyone has the guts to make the change, or whether Java just continues on the road to obscurity.

Java needs a revolution, not another poor attempt at intelligent design.

Thursday, August 02, 2012

How to weed out bluffers....

Following up from the concept of thinking being dead I'd like to talk now about one of the biggest challenges in IT.
How do you spot those people who are bluffing?
And here I mean people who don't realise they are bluffing because they are rubbish, or those who are bluffing because they think you are rubbish.  It's related to the challenge of Terry Pratchett Architects (PArchitects?): how do you tell someone who disdains technology through knowledge from one who disdains it through ignorance? The point is this:
95% of people interviewing senior IT people don't understand enough to weed out bluffers
This means that I regularly come across people who hold a senior position based on a level of buzzword knowledge, and a general ignorance of reality, that causes me to step back in amazement at their ability to hold down a job. Normally of course these people are in jobs for 12-24 months and are often contractors, where such variability is almost seen as a benefit rather than an indication of being found out. So here are the top tips on weeding out the bluffers; I'm assuming here that we are at the Architect level and you can already weed out bad developers:
  1. Don't do it in a phase 1 Tech, phase 2 HR interview process - add in a middle phase which is set up by phase 1. 
  2. In Phase 1 ask the following 
    1. When was the last time you coded into production, what language, what platform? 
    2. When was the last time you created a conceptual and logical data model? What platform for? 
    3. What is the difference between Regex, Regular Expressions and Perl in string handling? 
The first two are the real set-up questions... the last one is something that stunned me once when someone I was interviewing kept saying 'I did Regex' and then later on said 'On that project we used Regular Expressions'. I asked him the difference and he said that Regex was a language and Regular Expressions was... a different language. The point here is that you are after an understanding of the languages and platforms in which this person claims some level of expertise.

Let's be clear here: I'm of the opinion that an architect, whether Solution, Enterprise or Business, who claims to sit within the IT domain should still know how the platforms work, and especially should remember how their last platform worked. In the second interview you should include real deep developers and get them to ask real deep developer questions. You aren't looking for 'this person is a bitch ass programmer in language X' but 'not brilliant, but seems to know their stuff generally'. What you are looking to avoid is 'I don't think this guy has ever used X in his life' or similar statements.

Similar approaches, but more abstract, should be used for PMs and BAs, where you should get PMs and BAs who have done similar technical projects to ask the questions. I'm stunned at how many times I meet someone who is a 'Functional' SAP BA and then, when you introduce them to someone who really is a functional expert in that area, they fall to pieces... sometimes not even knowing the acronyms of the SAP modules they claimed to have used ('We did supplier management, not sure what SAP called it'). The point here is that you need a first stage to find out where the bluffer claims to have depth, and then a second stage to expose the hole if it exists.

Bluffers flourish in a world where thinking doesn't exist.

Wednesday, July 04, 2012

Thinking is dead

Anne wrote a reasonable blog a while ago on why SOA was and wasn't dead, but I'd like to go a bit further today and say that, generally, the concept of thinking appears to be dead.  The value of 'thought' and thinking in IT has diminished, in a way that mirrors society at large, to the stage where design, planning, architecture and anything else other than just banging away at a keyboard have been relegated behind opinions and statements presented as fact.

REST was a good example of this.  It was going to be the revolution; it was going to be the way that everything worked.  Two years ago I called bullshit on that revolution and I still say it's bullshit, and the reason is simple.
IT values technologies over thought
So people genuinely, and stupidly, think that how you shift packets from A to B will have a massive impact on how successful IT projects are at delivering their objectives.  What impact has this had on the massive ERP package spend out there?  None at all.  What has impacted that?  Folks like SFDC, because they've shifted the business proposition and in so doing have moved the way business people think about IT systems.

The same goes for Hadoop and Big Data.  The massive growth in information is complemented by an equally large amount of bullshit and a huge absence of critical thinking.  What is the major barrier to things like Hadoop?  'Its lack of real-time ability' I hear people cry.  Really?  You don't think it's that we have millions of people out there who think in a SQL, relational way and who are really going to struggle thinking in a non-relational, non-SQL way?  You don't think the major issue is actually in the acquisition and filtering of that information, and in the construction of the very complex analytics required to ensure that people don't go pattern matching in the chaos and simply find whatever they are looking for?

We are currently seeing a rush to technology in IT departments that is focused hugely on bells and whistles, while the business continues to look at outcomes.  With the business looking at SaaS, BYOD and self-service applications, the importance of thought and discussion in IT is acute. What I often see, however, is statements of 'fact' like 'you don't need integration, it's SaaS' or, even worse, the claim that some specific piece of API-based technology is what matters.

Planning, architecture and design are being seen in parts of IT as bad things, as a mentality develops that somehow basic fundamentals such as TDD, contract design and doing things that are actually proven to work are in some way wrong.  Adoption of unproven technologies is rife, as is the surprise when those technologies fail to deliver on the massively over-hyped expectation, and indeed fail to even come close to delivering at the level of dull old technologies that do the job but don't currently have the twitterati in thrall.

'Experts' in this arena has come to mean 'people who shout loudly', in a similar manner to US politics.  Facts, reason and, worst of all, experience are considered practically a disadvantage for an expert in this environment.  I was recently informed that my opinion on a technology was 'tainted' because I'd used multiple other technologies that competed with it and was therefore 'biased against it'.  I'd also used the technology in question and found that it was, quite frankly, rubbish.  Sure, the base coding stuff was OK, but when I looked at the tooling, ecosystem and training available from its competitors I just couldn't recommend it as something that would actually work for the client.  Experience and knowledge are not bias; thinking, and being critical of new approaches, is not a bad thing; thinking is not dirty.

When I look around IT I see that design is a disappearing skill and that the ability to critically assess technologies is being replaced by a shouty fanaticism that will ensure one thing and one thing only:
IT will no longer be left to IT folks
The focus on shiny technology over business outcomes, and on short-term coding over long-term design, will ensure that IT departments get broken up and that business folks increasingly treat IT as a commodity.

Thinking, design, planning, architecture and being skeptical on new technologies is the only hope for IT to remain relevant.


Wednesday, May 09, 2012

Carbon Travel Tracker

Okay, so now in the iTunes Store is my first attempt at an application that does something actually useful: the Carbon Travel Tracker. I travel a lot, not as much as some but quite a lot more than most, and two questions always came to mind:

1) Just how much do I really travel?
2) What is the Carbon impact of that?
In the spirit of 'if you don't measure it you can't change it', I wanted this application to be really hands off.  I don't want to have to enter all of my travel; I want my phone to stalk me all the time, work out what sort of travel I'm doing, and work out the impact of that.

There is much more about the application on the Carbon Travel Tracker support pages, but here are a few highlights:

  1. Tracks automatically in the background - kick it off and it just runs
  2. Automatically splits travel into sections/journeys and makes an educated guess on the travel type
  3. Allows you to manually override whole sections or sub-sections with the actual travel type
  4. Records your distance traveled in the day, week, month, year and since the app was installed
  5. Records the carbon impact of that travel
  6. Tells you how many times around the world that travel equates to
  7. Tells you how many trees, over the same time period, it would take to make you carbon neutral
The purpose really was to have a map so my kids could see where I've been, and it sort of escalated from there.


So the map required points, points gave me distance and speed, speed & distance gave me travel types and finally that gave me the carbon impact.
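As a minimal sketch of that chain, here is the sort of logic involved in C; the speed thresholds and emission factors are purely illustrative guesses, not the app's actual numbers.

#include <stdio.h>

/* Illustrative sketch of the points -> speed -> travel type -> carbon chain.
   All thresholds and emission factors below are hypothetical. */

typedef enum { WALK, CYCLE, CAR, TRAIN, PLANE } travel_t;

/* Guess the travel type from the average speed of a journey section (km/h). */
static travel_t guess_travel_type(double avg_kmh) {
    if (avg_kmh < 7)   return WALK;
    if (avg_kmh < 25)  return CYCLE;
    if (avg_kmh < 130) return CAR;   /* could equally be a train at this speed */
    if (avg_kmh < 300) return TRAIN;
    return PLANE;
}

/* Carbon impact in kg CO2 for a section: distance times a per-km factor. */
static double carbon_kg(travel_t type, double distance_km) {
    static const double kg_per_km[] = { 0.0, 0.0, 0.17, 0.06, 0.25 };
    return distance_km * kg_per_km[type];
}

int main(void) {
    double distance_km = 450.0, hours = 1.0;  /* one recorded journey section */
    travel_t type = guess_travel_type(distance_km / hours);
    printf("%.1f kg CO2\n", carbon_kg(type, distance_km));
    return 0;
}

The real app obviously has to do the messy part first: cleaning noisy GPS points and splitting them into sections before any of this arithmetic applies.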

So I now know that basically I don't have a Carbon Footprint, I have a Carbon Body Bag.  And, as it says at the top, the app is now in the iTunes Store.





Wednesday, May 02, 2012

Why coding isn't a 1 day thing and why the UK view on education has to change

decoded.co have started quite the PR and education puff around the idea that they can teach you to code in a day.  Or, to be accurate, their site says they aim
To teach anyone code in a day.
Now clearly that sentence doesn't make sense, as coding is a task, something you do, so it should be 'to code' not simply 'code'.  It's like saying 'To teach anyone run in a day', so first off they clearly need to fire their copywriter.

This is why I wrote a series of posts that came out today.

The purpose behind these posts is to point out the idiocy of the idea and the lack of respect that science and engineering really have in the UK.  No one would dream of claiming that you could become an artist in a day (despite the fact that at 13 my bed was WAY messier than Tracey Emin's), but it is acceptable to claim that coding is somehow less challenging and requires only a day to learn.

Let's be clear here: I am a massive supporter of getting decent technical IT education into schools, of raising the profile of technology, and of getting more people involved in coding.  I regularly suggest to people that they should give it a crack, using existing examples as the starting point for learning.  If the intention was to introduce people to coding and open their eyes to the possibilities then I'd be okay with it, but this course doesn't aim to do that; it aims to make people think they are actually coding after a day.

I'm sure some people will read this and think 'ooooh, he would say that because he has a vested interest'. Well, I'd like to put people like that in the same group as those who believe in horoscopes and homeopathy and refer to astrophysicists and doctors as 'vested interests'.  I do this for a living, I did a degree in this subject, and I'll tell you two things:
  1. Natural talent matters in code the same way it matters in art; some folks are just better at thinking like a computer and bending it to their will.  Give me one of those with some basic training over the learn-and-code-by-rote developer with 20 years' experience
  2. Grasping the basics isn't that hard
The latter point is the one I'd like to expand on.  I reckon I could teach anyone to ACTUALLY code, and ACTUALLY understand what their code is doing, in a couple of weeks.  At the end of that they'd know about pointers, memory, algorithms, OO basics etc and be able to write themselves a small program from scratch.  If they turn out to be in the naturally talented group then the program could be quite complex; if not then it will be a directed view of a specific solution.

That is why decoded's idea is so rubbish: not because coding really does take 10,000 hours to do well, but because by reducing it to a day of cut and paste you miss the creativity and the real sense of achievement you get from well-crafted, working code. In the UK this isn't surprising, to be honest, as science and engineering have long been the 'dirty' parts of education, areas it is okay to look down on because they aren't as 'pure' as Art.

The UK is a country where people value a degree in Classics... Latin and Greek.  It's seen as the 'top' degree from Oxford and Cambridge in many circles.  How mental is that in 2012?  How mental would it have been in 1950, in fact?  Despite this ridiculous notion of arts subjects and Classics being 'valuable', and the active steering of kids away from Science, Maths and Engineering from a young age, it's amazing how many world-leading engineers the country has produced and how many Nobel prizes come from this little shore.  The talent is there but the support most clearly is not.

Coding in a day is typical of the lightweight way that British education, media and society treat Science and Engineering: it's something for 'geeks', it's dirty, it's complex, and it's completely okay to disagree with it from a position of total ignorance because this gives us 'balance'.  It is perfectly okay for people to hold views from ignorance on Maths, Science or Engineering, and indeed the set-up of British education is designed to do just that, most especially to ensure that as few women as possible do those subjects.  Much of this dates back to the 'Arts and Crafts' movement of Victorian times, which looked to portray science and engineering in a negative light when compared with the 'purity' of hand-crafted art.

The reality is that Science, Engineering and Maths are what the UK does really well.  Newton changed the world through Mathematics; in the 18th and 19th centuries Britain changed the world through Science and Engineering; and in the 20th century it was Brits who invented many of the things we take for granted today, as well as providing the theoretical basis for modern computing (and the first computer).

Despite all of this it is still not supported or promoted in the way that it should be.  When I travel to Germany, the Netherlands, France or the US I see a different perspective.  Even in the US, where the concept of jocks and geeks is ingrained into the school system, there is still clarity at universities over what achievement is about.

Getting people interested in coding is absolutely something that should be applauded and encouraged but it needs to be done in the same sort of way that people would be taught how to paint, how to write, how to postulate on the foundations of the universe: as a building block, a building block on a long and potentially wondrous journey.

It also needs to be treated with the seriousness that an industry which has revolutionised the planet deserves.  This is the economic powerhouse of the world; it has a bigger impact than any other single industry today.  Ultimately, isn't that something that deserves real government focus and real media focus, rather than coverage more superficial than that dedicated to horoscopes and homeopathy?

Learn to be French in one day

Ever listened to a French Art house film and thought 'I wish I could do that'?  Ever walked through the streets of Paris listening to people and thought 'How do they do that?'.  For too long being French has been thought of as something that you need education for, something that requires rigour, experience and actual learning to achieve.  But no longer.

I am now offering people the chance to learn to be French in a single day.  As part of this course you will
  1. Wear a stripy jumper and put onions around your neck - and you will be French
  2. Shrug and point at things - and you will be French
  3. Refuse to go on with the course and burn a sheep(*) - and you will be French
  4. Use words such as "Bonjour", "Zut alors" and "Bricolage" - and you will be French
At the end of this one day course you will be a fully fledged French person and understand all the principles behind France and be able to declare yourself French(**).  I firmly believe being French is not just for people who were born there or who have lived there and achieved nationality, its for anyone who pays me money to do basic tasks where I can wrap enough crap around it to convince them they've actually achieved something.

At the end of just one day you will understand what it takes to be French, you will realise that being French is something that is actually fundamentally easy and that all French people are just over-complicating it.  There is nothing else that you need to learn to be able to say that you understand what it takes to be French.  YOU will be able to look at the French nation and say 'I know exactly what it takes to be French'.

* additional fee applies
** not legally binding, does not include passport


(in homage to decoded)

Learn to be an actor in one day

Ever looked at a play and thought 'I wish I could do that'?  Ever looked at the Oscar winner for best actor and thought 'How do they do that?'.  For too long acting has been thought of as something that you need training for, something that requires rigour, experience and actual talent to achieve.  But no longer.

I am now offering people the chance to learn to be an actor in a single day.  As part of this course you will
  1. Read stuff out of a book - and it will be acting
  2. Copy famous scenes from films - and it will be acting
  3. Get cast in a Michael Bay film as an attractive woman - and it will be acting(*)
  4. Shout loudly on a stage from a script - and it will be acting
At the end of this one day course you will be a fully fledged actor and understand all the principles behind acting and be able to declare yourself an actor.  I firmly believe being an actor is not just for people with talent, experience and training, its for anyone who pays me money to do basic work where I can wrap enough crap around it to convince them they've actually achieved something.

At the end of just one day you will understand what it takes to be an actor, you will realise that being an actor is something that is actually fundamentally easy and that all those great actors were actually just regular people like you.  There is nothing else that you need to learn to be able to say that you understand what it takes to be an actor.  YOU will be able to look at Spencer Tracy in 'Inherit the Wind' and say 'I know exactly what it takes to do that'.

* limited places


(in homage to decoded)

Learn to be a writer in one day

Ever read Shakespeare and thought 'I wish I could do that'?  Ever read Steinbeck, Dickens, Austen, Pratchett or Proust and thought 'How do they do that?'.  For too long literature has been thought of as something that requires rigour, experience and actual talent to achieve.  But no longer.

I am now offering people the chance to learn to be a writer in a single day.  As part of this course you will
  1. Rewrite a paragraph of Dickens in modern English - and it will be Literature
  2. Use Google Translate to switch Molière from French to English - and it will be Literature
  3. Write a short poem on the subject of being a writer - and it will be literature(*)
  4. Cut and paste text from various writers to make a new story - and it will be literature
At the end of this one day course you will be a fully fledged writer and understand all the principles behind literature and be able to declare yourself a writer.  I firmly believe being a writer is not just for people with talent, experience and dedication, its for anyone who pays me money to do basic copy work where I can wrap enough crap around it to convince them they've actually achieved something.

At the end of just one day you will understand what it takes to be a writer, you will realise that being a writer is something that is actually fundamentally easy and that all those so-called giants of literature were actually just regular people like you.  There is nothing else that you need to learn to be able to say that you understand what it takes to be a writer.  YOU will be able to read Chaucer and say 'I know exactly what it takes to do that'.

* additional fee for beret 


(in homage to decoded)

Learn to be an artist in one day

Ever looked at the Mona Lisa and thought 'I wish I could do that'?  Ever looked at the Turner Prize and thought 'How do they do that?'.  For too long art has been thought of as something that you need training for, something that requires rigour, experience and actual talent to achieve.  But no longer.

I am now offering people the chance to learn to be an artist in a single day.  As part of this course you will
  1. Take a urinal off a wall - and it will be art
  2. Throw paint randomly at a canvas - and it will be art
  3. Fill a cast of Rodin's 'The Thinker' with molten bronze - and it will be art(*)
  4. Trace around famous pencil drawings - and it will be art
At the end of this one day course you will be a fully fledged artist and understand all the principles behind art and be able to declare yourself an artist.  I firmly believe being an artist is not just for people with talent, experience and training, its for anyone who pays me money to do basic copy work where I can wrap enough crap around it to convince them they've actually achieved something.

At the end of just one day you will understand what it takes to be an artist, you will realise that being an artist is something that is actually fundamentally easy and that all those old masters were actually just regular people like you.  There is nothing else that you need to learn to be able to say that you understand what it takes to be an artist.  YOU will be able to look at the Sistine Chapel ceiling and say 'I know exactly what it takes to do that'.

* additional fee applies


(in homage to decoded)

Wednesday, April 25, 2012

Bling and ignorance - how cloudy thinking will screw up IT

Today it was announced that Progress Software are going to 'divest' their BPM and SOA technologies and instead focus on cloud technologies that don't really exist yet. This is indicative of a mentality I see around, so first let's start with some credentials:

1) I worked with Google around a SaaS partnership in 2007 around Google Apps
2) I delivered an SFDC solution in 2008
3) I've worked with Amazon, VMWare and lots of other cloud-based companies

So let's be clear: I think cloud and SaaS can be extremely useful things; the problem comes when people begin to think they are magic. SaaS is a software package, pre-installed and configured, that you can pay for on demand - the change isn't the software, it's the capacity and the charging model. Cloud is an infrastructure play to provide old school tin in a virtualised manner that can be quickly provisioned and paid for on demand. That really is it.

The problem I see is that people take the new bling and forget the old lessons. People say 'I'm using SFDC and it is my customer MDM, I don't need anything else'... in the same way that people said that about Siebel in 2000 and found out in 2002 that it wasn't true. They say 'it's on the cloud so I don't need to worry about architecting for performance, I'll just scale the hardware', as people did around 'pizza box' architectures in 2000. I hear 'we don't need to worry about integration, it's on the cloud'... like people did around SOA/ESB/WS-* in... 2000.

The reality is the opposite. The more federation you have, the more process and control you need. The more information is federated, the more requirements you have for Master Data Management. Cloud solutions are not silver bullets and they are not always simple to integrate with other systems: great on their own, but not good with others. People are rapidly building federated IT estates, with federated data and no practice or process around integration, and that leads to a massive EXTERNAL spaghetti mess, something that makes the EAI problems of the 90s look like a walk in the park.

Or to put it another way... isn't it great how people are making more work for those of us in IT.

Monday, April 23, 2012

Why Silicon Valley is bad for Enterprise IT

99% of IT is done outside of Silicon Valley, in enterprises.  I like the valley, and San Francisco is my second favourite 'new' city (Sydney comes first; London, Rome and Paris remain... well, London, Rome and Paris).  The problem is that 99% of the media coverage of IT is about what happens in Silicon Valley.  This is the equivalent of all coverage of the global car market being about Ferrari.  It's not good for IT, for a few reasons.

  1. A large proportion of Silicon Valley is hype over reality - remember Transmeta? A company whose coverage dwarfed that of ARM for several years, despite ARM clearly being better
  2. It focuses on the 'new' over the 'useful' - lots of good stuff comes out of Silicon Valley, but the focus is always on 'the next new thing'; this leads to lots of new 'wheels' for IT but very little in terms of actual progress.  Is Enterprise IT better today than it was 10 years ago? 20 years ago?  Possibly a bit, with things like Java, but the focus in Silicon Valley today is to INCREASE language fragmentation, thus reducing the benefits.
  3. It's technology over practice - we've known for 40+ years that the real issue is in design; Silicon Valley focuses on the coding, not the practice
  4. It's become a circle-jerk - the tech media, tech companies and VCs all running around in their own little hype cycle, unencumbered by reality.
The impact is that even the older vendors in the valley tend to leap on bandwagons like a politician desperate for votes. So rather than fixing existing technologies or providing you with minor enhancements, you get a bunch of shiny new technologies that don't work terribly well, layered on top of old pieces whose bugs they don't fix.

Let's be clear, there are great things that come out of the valley and some real revolutions have been driven from there, but revolutions have happened in other places too: Walldorf, Cambridge (UK and MA), India, CERN and lots of others.  But what really hasn't happened, and Silicon Valley is the biggest criminal in this crime, is a focus on improving the 'dull' part of IT.  Project failure rates remain high, integration is still a technical job and data quality remains poor.

Unfortunately the current set-up is designed only to hype the new rather than to improve the infrastructure of IT.  Some could argue that this is a great example of where capitalism doesn't produce the greatest good for the greatest number, as the current approach does very little to improve the lot of the majority in IT while enriching a small subset... but I wouldn't say that, as capitalism remains the only system that actually works.

The question therefore is how IT can re-orient away from the valley and away from the hype.  I think the rise of India and China will be key to this.  China and India are focusing on 'being good' rather than 'being shiny': on improvement, and on innovation for improvement, rather than innovation purely for the brand new.  As these economies grow, and their importance to IT grows, maybe we'll see a shift in IT away from the new and shiny towards the actually improving.

Lets hope.

(please note in the above I exclude Apple as they appear to live in a bubble disconnected from the valley ;)

Thursday, April 19, 2012

Information the next Trade War? Data Privacy v CISPA


Monday, March 26, 2012

Why you should Unit Test - an example by MS Office

I'll admit it: when I'm writing code just for myself, to learn stuff, I tend not to unit test.  When I'm in a team environment I insist on it, as the person coding isn't going to be the person maintaining.  Unit testing has lots of benefits in terms of project success, code quality and longevity.  If you think testing first then you tend to think about the solution to the problem a bit more, and that is always a good thing.  I tend to think design by contract over unit testing, but unit testing tends to be the only way to enforce such contracts.

So let's take an example... file compression.  The sort of thing that a single class can easily do.  We've got two functions:

char *compress(char *bytes);
char *uncompress(char *bytes);

Right, now what are our core unit tests?

Well, the first one of course is that if we compress something and then uncompress it, the result must be the same as the original.  The second one is also pretty obvious: the compress method should result in something smaller than the input.  Pretty much the basic contract for such things.
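As a minimal sketch, here are those two tests in C, using zlib's compress/uncompress as the implementation under test (zlib is my stand-in here, not something the post names, and the shrinking check assumes a reasonably compressible input):

#include <assert.h>
#include <string.h>
#include <zlib.h>

int main(void) {
    /* A deliberately repetitive, i.e. compressible, input. */
    unsigned char original[1024];
    memset(original, 'A', sizeof(original));

    unsigned char packed[2048], unpacked[1024];
    uLongf packed_len = sizeof(packed);
    uLongf unpacked_len = sizeof(unpacked);

    /* Test 2: compressing should produce something smaller than the input. */
    assert(compress(packed, &packed_len, original, sizeof(original)) == Z_OK);
    assert(packed_len < sizeof(original));

    /* Test 1: uncompress(compress(x)) must give back exactly x. */
    assert(uncompress(unpacked, &unpacked_len, packed, packed_len) == Z_OK);
    assert(unpacked_len == sizeof(original));
    assert(memcmp(original, unpacked, sizeof(original)) == 0);

    return 0;
}

(Compile with -lz.  A fuller suite would also feed in incompressible input, which is exactly the case where a naive 'reduce file size' can grow a file.)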

Over at Microsoft Office, however, the 'compress' function built into MS Office, at least on the Mac, which is a one-time 'reduce file size' operation, has a slightly different approach.  When you do 'reduce file size' on a presentation it sometimes manages to increase the file size.

That sort of thing should really have been caught in Unit Test.

Thursday, March 22, 2012

Dear Google you've patented my published idea...

Okay, so a while ago I had an idea: that people should blog about ideas they have and tag them as 'prior art' as a way to defeat patents.  Well, today I read an article about Google patenting a location-specific ad service which takes in local information (the weather) to give people targeted adverts. Back in 2008 I had an idea where I talked about temporal information, including the phrase:
The final piece in the puzzle is then time. "Here" Services would need to know not only where "here" is but they would also need to know when "here" is. I don't want to be told about a show that finished yesterday or about one that is six months away. If its 11pm tell me about the Kebab shop, if its 9am tell me where to get Asprin.
Now in this post I also talk about ad servers and some sort of federated deployment model, so arguably Google's great big central implementation is 'sufficiently different' to mount as a patent, but I don't think so. You can find lots of existing work out there on location-specific campaign management, so the only 'difference' is that Google are talking about taking some environmental information into account to direct the advert.  This is something that retailers already do... ever noticed how there are more adverts for BBQs when it's going to be sunny and more adverts for brollies when it's going to rain?
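To show how thin the 'invention' is, here is the whole 'here and when' selection logic as a minimal C sketch; the hours, the weather rule and the adverts are entirely made up for illustration.

#include <stdio.h>

/* Hypothetical 'here and when' ad picker: the 2008 idea was that an ad
   service needs to know when 'here' is, not just where; the patent merely
   adds environmental input such as the weather. */

typedef enum { SUNNY, RAINY } weather_t;

static const char *pick_advert(int hour, weather_t weather) {
    if (hour >= 22) return "Kebab shop around the corner";
    if (hour <= 9)  return "Aspirin at the pharmacy";
    return weather == SUNNY ? "BBQs on sale" : "Umbrellas on sale";
}

int main(void) {
    printf("%s\n", pick_advert(23, SUNNY));  /* late night: kebab shop */
    printf("%s\n", pick_advert(14, RAINY));  /* rainy afternoon: umbrellas */
    return 0;
}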

So what is Google's patent really?  It's the combination of temporal (time/location) advertising with environmental information.  It's incredible that this passes the threshold of being original and not being something that anyone with decent experience could do.

I've had other ideas at other times but haven't been arsed to implement them.  Feel free to be the person who actually takes on the challenge.  The real point of this post, however, is that yet another patent has been granted that shouldn't have been.  It's not about privacy moaning, it's actually about business and economic growth.

What Google are patenting is what a corner shop keeper has been doing for as long as there have been corner shops: looking at who is on the street, looking at the weather, and then picking the window display.  Rewriting that and putting an 'e' in front of it shouldn't be the bar that has to be cleared to get a patent.  Patents should be for genuine ideas that move us forwards, where the creator should be rewarded, not for being the first person to work for a company with enough money to file a patent on the obvious.

Why the iPad is temporary but the screen is forever


Just thinking about the old MSFT case with the DoJ, where a split of the Windows and Office divisions was mooted (but abandoned), and the current rumours that Office will be on the iPad later this year, there is a question about the post-PC revolution...

Will I be able to code?  If I've got Office then that covers 90% of everything I do in my current role.  10% requires some sort of virtualisation solution, but VDI could be acceptable.  This means that for PowerPoint and Word jockeys like me there really is no reason for the company to give me a full laptop.

But I still like to code to relax, and the latest iPad has a larger screen resolution, more memory, more storage, better network connections and a better connection to an external monitor than the laptops I used to code with ten years ago.  This got me thinking: what actually stops Apple releasing Xcode for iPad?  What stops an Eclipse port?  There is already some work around using the browser as the editor, and I think more will be done.

So what do I need to code?

  • A processor, RAM and storage space
  • A big screen, ideally more than one
  • A decent keyboard
So while the iPad could deliver on the processor and RAM, a single decent screen, and Bluetooth to a keyboard, it's actually a bit of overkill.  Why couldn't my iPhone be the processor, RAM and storage?  Why won't that just sit on Amazon or another cloud provider?

The post-PC revolution is currently dominated by PC-mentality devices: integrated boxes of screens and stuff.  Technologies such as AirPlay, however, speak to what the future really is: a device that connects remotely to screens and other elements.  So the future will be about the ultra-lightweight 'iScreen' which connects automagically to your iPhone 12, so you don't need to worry about heat issues or weight, as the screen will just be the screen plus some pass-through circuitry.  The drive will be towards that connected device which you link to your local personal network of things.

Why do I know this?  Because Bill Joy gave a presentation at JavaOne in 2001 that predicted it all.


Monday, March 12, 2012

SaaS v cloud, it's a licensing thing

One of the things I get asked quite a bit is what is the difference between SaaS and cloud solutions and while there are lots of options for me it comes down to a simple question:
How are you licensing it?  
Your SaaS vendors license via users or other capacity metrics.  Things go up, things go down, but you license based on those 'things'.   So for SFDC it's the connected users, for instance; same with Google Apps.  Critically, the thing that you are licensing is a business outcome that works out of the box.  You can log into SFDC or Google Apps straight away and get working; sure, you can extend and customise if you want, but fundamentally you license based on a real-world outcome and are delivered a service that achieves that outcome.

With cloud providers you license based on capacity over time, so bandwidth/hrs, CPU/hrs, storage/hrs, but this is just raw capacity.  The same really applies when you are looking at PaaS: again it's really just about the capacity.  You are buying a slightly higher-order 'box' but it's still dumb until you do things to it.  So you are licensing an IT asset in a utility way.

So with Cloud providers you are licensing a capacity and on its own it does nothing.

So now we come to the final question on the cloud: what if you aren't using PaaS, or you are but you need some extra software?  This is where a cloud can quickly turn from a cloud into just a virtual data centre.  If the additional software on that cloud is licensed based on CPUs rather than CPU usage then you aren't doing 'cloud' anymore; you've just turned your cloud hardware into a virtual data centre where you are back to paying for licenses based on physical sockets (an odd idea in a virtual data centre where contention might be high) and not based on utility.

So in conclusion:

If you are paying for an outcome, and paying for it based on a physical-world business metric (number of trades, shipments, users, etc), then it's SaaS; it's about the service, not the software.

If you are paying for an IT asset based on an IT metric, and paying based on usage related to time... then you are doing cloud.  BUT you cease to be doing cloud once you add software onto that environment which is licensed on a physical IT metric unrelated to time (e.g. sockets/CPUs/cores).
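A toy worked example of the three models in C, with entirely invented prices:

#include <stdio.h>

/* Toy cost comparison of the three licensing styles; every number here is
   invented purely for illustration. */
int main(void) {
    /* SaaS: a business metric (users), independent of time or hardware. */
    double saas = 50 /* users */ * 40.0 /* per user per month */;

    /* Cloud: an IT metric multiplied by time (CPU hours actually used). */
    double cloud = 2000 /* CPU hours this month */ * 0.10 /* per CPU hour */;

    /* Virtual data centre: a physical IT metric (sockets) with no time
       element - the licence costs the same whether the boxes run flat out
       or sit idle, which is why it stops being 'cloud'. */
    double vdc = 4 /* sockets */ * 500.0 /* per socket per month */;

    printf("SaaS %.0f, cloud %.0f, socket-licensed %.0f\n", saas, cloud, vdc);
    return 0;
}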


Monday, March 05, 2012

FUD Farming

FUD Farming: the practice of selling expertise based, primarily or solely, on negative (or FUD) based messaging.

I'd like to introduce a new term into the lexicon: FUD Farming.  Over the years I've seen the following sales pitch used over and over again:
X is a complete and utter disaster, X will lead to the destruction of modern society and ultimately Armageddon.  X doesn't work like it should, you really don't want to use it, none of your competitors are using it and if you do use it how do you know it won't cause everything else that you have to stop working?
FUD as a term is apparently over 90 years old (as is my Gran, happy 92nd birthday for tomorrow Gran) and has a rich and ignoble history in IT.  However, what I've noticed with the wave of social media 'experts' and specialist consultancies is that FUD appears to be the only thing they are selling.  I receive emails with headings like:
Do you know that your employees are sharing sensitive information on Facebook
How competitors are using Twitter to damage your brand
How Social Media is undermining your current marketing strategy
Almost all the messages and lead paragraphs are plain old FUD, and the answer is of course to employ the expert/specialist consultancy who really understands this FUD and can therefore help you. It's this approach which I'd like to christen FUD Farming, which allows us to call such individuals 'FUD Farmers': the individuals whose primary, or indeed only, approach to selling is based on creating FUD, throwing it around as fact and then selling themselves as the solution to a problem that didn't really exist in the first place.  This last point is critical: most of the FUD comes down in the end to 'have a sensible engagement policy that is communicated to your staff' or 'your staff are doing this, you can't stop them, so educate them', and guess what, here is just the $$$ course to do that.  Of course, as a company you knew that already, but thanks to the FUD you've been sold a pup by the FUD Farmer.

Thursday, March 01, 2012

WebSockets v REST (v WS-*) more pointless than Eclipse v Emacs (v vi)

Well, the shiny kids have a new toy... it's WebSockets, and by jiminy if it isn't just the best thing ever.  Actually, to be fair to the people promoting WebSockets, they do appear to be taking a more rational, and less religious, stance than the REST crowd, but the discussion remains pointless.  Mark Little's recent InfoQ post was a good pitch on the debate.

Just as REST v WS-* was pointless, so the WebSockets debate is pretty pointless as well.  It's a new way of shifting information from A to B; it's an 'improvement' over REST and WS-* in some ways and not in others, but it doesn't actually make the job of conceptualising solutions any easier.
It's a rubbish picture but it gets the point over.  REST, WS-*, WebSockets, IIOP, MQ, flying pigeons, rats with backpacks and any other data-shifting technology are about providing the mechanism via which two services can communicate.  It's a required piece, but there the key question is simply 'what is the cheapest way'.  The value is provided by what goes across that mechanism, and to define that you need to understand how the producers and consumers need to interact, and that requires a conceptual model and thinking.
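One way to see why the mechanism is the boring part is a minimal C sketch where the transport is just a pair of function pointers (all names here are mine, purely illustrative).  Swap REST for WebSockets for rats with backpacks and the hard part, the contract, doesn't change.

#include <stdio.h>
#include <stddef.h>

/* The interchangeable part: any mechanism that can shift bytes from A to B. */
typedef struct {
    int (*send)(const void *data, size_t len);
    int (*receive)(void *buf, size_t len);
} transport_t;

/* The hard part: the contract between producer and consumer.  Working out
   that an order needs these fields, and what they mean to the business, is
   the thinking that no transport choice does for you. */
typedef struct {
    int  order_id;
    int  quantity;
    char product_code[16];
} order_t;

static int place_order(const transport_t *t, const order_t *order) {
    /* The same contract rides on whichever mechanism is cheapest. */
    return t->send(order, sizeof(*order));
}

/* A do-nothing transport standing in for REST/WebSockets/MQ/pigeons. */
static int noop_send(const void *data, size_t len) { (void)data; return (int)len; }
static int noop_receive(void *buf, size_t len) { (void)buf; (void)len; return 0; }

int main(void) {
    transport_t mechanism = { noop_send, noop_receive };
    order_t order = { 42, 3, "WIDGET-7" };
    printf("sent %d bytes\n", place_order(&mechanism, &order));
    return 0;
}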

The hardest part of IT remains that conceptual part: the planning, the thinking, and the representing back to the business of what is being done.  REST, WS-*, WebSockets or any other mechanism do precisely bugger all to help get that done.   The question I'd pose is this:

It's 2012 now: what has improved in the last ten years in terms of making systems easier to conceptualise and business concepts easier to communicate, and what has been done to make the translation of that into technology (producer, interaction, consumer) simpler and more straightforward?

From where I'm standing the answer appears to be a whole heap of bugger all. Does WebSockets make this problem much easier, or is it another low-level coding mechanism?  It's of course the latter. When people moan about the walled garden of Apple or 'monolithic' ERP they are missing a key point:
Technical excellence in the mechanism isn't important; it's excellence in delivering the value that counts.
See you in another 7 years for the next pointless debate.

Has cloud lost its buzz?

After doing a presentation the other day I joked to someone from IBM that the WebSphere 'CloudBurst' appliance was one of the silliest names I'd heard: an appliance that was tagged as cloud.  He then informed me that it's not called that anymore but is now the much duller, and more accurate, Workload Deployer.  Now, I'm still not sure why this is a piece of physical kit rather than a virtual machine, but it's an interesting shift in marketing over the last few years that people are now taking the word 'cloud' OFF products.

Now this could be because the definition of cloud is clearer and marketing departments don't want to confuse people, or (more likely) it's because cloud isn't seen as 'shiny' any more and has therefore lost much of its ability to entice clients.  In the latter case, shifting to a more prosaic name makes sense.

This would be a good thing, as it means we've got beyond the peak of the hype cycle and cloud is now becoming an important technical approach rather than the solution to world hunger, or a 'badge' to slap on something to make it more attractive.

Monday, February 27, 2012

How Skype treat me as a transaction not an individual

I blogged at work about how the challenge of omnichannel engagement means you need to treat the customer as an individual, not as a transaction.

Well here is an example of a company doing the exact opposite.

First some background.  At work I tend to use a VPN that takes me out through the Netherlands, but I'm regularly in different countries, and I tend to use Skype.  Now, Skype know a lot of information about me:
  1. My credit card number
  2. My home address (which includes the country)
  3. My email address
  4. All of the locations from which I've connected before
So what happens when I access the site, as a logged-in user, via http://www.skype.co.uk (notice I'm forcing it to use the UK site at this stage, surely another hint...)?

Brilliant, eh?  It's all in Dutch.  So Skype take all the information that they know about me and then totally ignore it in favour of an IP lookup to get a country, and therefore a language.  This page is truly special as the top of it has not a single thing to enable me to shift the language into English.  It certainly makes it more 'fun' when adding credit to my account... did it succeed?  Is that an error message?  How on earth would I know?  It's in Dutch.

Now on my mobile there is a screen for profile that includes language... a useful thing... but something that isn't on the profile on the website, which also declares that anything you put on the profile is going to be shared with the world (not what I want).

So Skype is falling down in several ways.  Firstly, it's not using the internal information it holds to offer me a personalised service: I'm from the UK, I live in the UK, and I don't want my language changed based on the country I happen to be sitting in.

Secondly, its profile configuration is inconsistent between its mobile platform and its web platform.  I have miles more settings on the mobile platform, but I'm unlikely to set them as they will be shared with the world.  I did, however, decide to set the country and the language: much as I love the Netherlands, my Dutch is non-existent.

So I set up the profile (left) and look what I got on the web (right)




So, see the issue?  The country hasn't been picked up and the language isn't even an option; the website certainly didn't use the preferences from my mobile device to give me the language I wanted on the web.

Skype is multi-channel, not Omnichannel.  Clearly Skype don't have a real-time customer information mastering process that keeps their channels in sync, and equally clearly their website doesn't use the customer profile to customise the web experience to the individual; instead it uses the IP address to customise it to the network connection.
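
To be concrete about how simple the fix is, here is a minimal sketch of the precedence logic, in Objective-C to match the rest of this blog.  Every name and the lookup table are invented for illustration; this is not Skype's API.

    #import <Foundation/Foundation.h>

    // Hypothetical lookup; a real site would use a proper locale service.
    static NSString *DefaultLanguageForCountry(NSString *country) {
        NSDictionary *map = @{@"GB": @"en", @"NL": @"nl", @"FR": @"fr"};
        return map[country] ?: @"en";
    }

    // The precedence an Omnichannel site should apply: an explicit profile
    // setting wins, then the home country on the account, and only when you
    // know nothing about the person do you guess from the IP address.
    static NSString *LanguageForUser(NSDictionary *profile, NSString *ipCountry) {
        NSString *explicit = profile[@"language"];
        if (explicit) return explicit;
        NSString *home = profile[@"homeCountry"]; // e.g. from the billing address
        return DefaultLanguageForCountry(home ?: ipCountry);
    }

    int main(void) {
        @autoreleasepool {
            // Me: UK billing address, no explicit language set, connecting from NL.
            NSDictionary *profile = @{@"homeCountry": @"GB"};
            NSLog(@"%@", LanguageForUser(profile, @"NL")); // => en, not Dutch
        }
        return 0;
    }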

Omnichannel is about treating the individual like an individual, independent of channel.  And it's that which requires good real-time, operation-centric MDM.


Friday, February 24, 2012

How politicians could kill SaaS with stupidity

Back when I was doing SaaS a few years ago I raised the Patriot Act as a reason why cloud providers would be setting up in Europe.  The rules, however, appear even worse than I knew, so the Patriot Act now impacts US cloud sales directly: the hosting location doesn't matter, it's the rules that do.

With the US Congress seeming to view China and the rest of the world with concern, and with talk of trade barriers being raised, it isn't hard to see the next four years bringing a real shift in cloud and SaaS adoption; for instance if European or Asian companies suffer publicly as a result of US policy towards their own information, or if non-US legislation (EU data privacy, for instance) makes it impossible to be compliant with both the Patriot Act and the client.

This challenge to US-based vendors could lead to a flight from US shores for many of them, or to arm's-length 'collaboration' agreements with European and Asian providers.  At worst it will lead to a collapse of these cloud providers as their markets are restricted to just the US borders, while truly global players are able to address emerging high-volume markets.

If Congress does pass more draconian legislation, meaning US companies can offer even fewer assurances to non-US organisations, and non-US governments 'retaliate' by strengthening their own data privacy and retention legislation, then we could very quickly find ourselves in a cloud-based trade war, one governed not by tariffs but by policies: adherence to those policies acts as a tax on the cloud provider, and if the provider cannot obey both US legislation and local laws then in effect that provider has been barred from the country.

Trade wars in SaaS and Cloud will be fundamentally different, less about tariffs and taxes and more about policies and laws.  Right now Congress has firmly put itself on a trade war path.

Unfortunately I don't think they realise that.

42 - Or do you understand your Big Data question?

What is the ultimate Big Data question?  Well it is of course the Biggest Question... the question... of Life, the Universe and Everything.... but that poses a problem: in an analytical world do you really understand the question?



The point here is that one of the major challenges with Big Data is that we are moving away from simple SQL-driven questions, 'who are my top ten customers?' or 'how much did I sell last week?', into much more analytical and predictive questions such as 'if I reduce the price of goats cheese, how much more red wine will I sell?'.

This presents a new set of challenges, because analytics can give you a simple answer, '15% more', which leads you to drop the price of goats cheese.  The network effect of that change, however, means less beer and less hard cheese are sold, so now you are overstocked in beer and hard cheese, both of which have a use-by date.  The point is that the question was badly formed but correctly answered.  Greater degrees of abstraction also introduce greater degrees of assumption from those creating the models.  So while the business has asked a small and concise question, 'where do I put the next store?', the model has made certain assumptions that may or may not hold. How are these assumptions shown to the business, and if they are, can the business even understand them?
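
One hedged illustration of what 'showing the assumptions' could even mean: make the model's answer a value that cannot travel without its caveats attached.  A sketch in Objective-C, with every name invented for the purpose:

    #import <Foundation/Foundation.h>

    // An answer that refuses to travel without its assumptions.
    @interface ModelAnswer : NSObject
    @property (nonatomic, readonly) NSNumber *value;       // e.g. @(0.15) for '15% more'
    @property (nonatomic, readonly) NSArray *assumptions;  // plain-language caveats
    @end

    // The answer to 'how much more red wine will I sell?' then arrives as:
    //   value:       @(0.15)
    //   assumptions: @[@"substitution effects on beer and hard cheese ignored",
    //                  @"elasticity estimated from last year's sales data"]

The hard part isn't the data structure, of course; it's getting the mathematicians to state the assumptions and the business to read them.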

Today we already have the problem of chained spreadsheets; in the future the issue of chained analytical models is going to make the connection between the business 'question' and the 'answer' more complex and harder to understand, and put more power into the hands of the mathematicians who prove good at converting abstract questions into good models.  This also means ever more importance will be placed on controlling the definitions of the information going into those models (what is a customer, what is a product, how do you identify them... MDM stuff), because these are the bits the business can control: the core information, the sources and the quality control around them.

Big Data answers - only as good as your understanding of the question.



Thursday, February 16, 2012

How Apple will beat Facebook

Looking at the extension of iMessage to the desktop made me think about how Apple could take on Facebook and win.  Let's see what Facebook have over Apple technically...

  1. Multiple people in one conversation
  2. Broadcast conversations with 'followers'
Now Apple have already integrated Twitter into the iPhone, but let's assume that in the long term the folks from Cupertino want total control.  What do they need to do?
  1. Add a 'message thread' function to iMessage so it's not just a stream between two people
  2. Add the ability to talk in groups
  3. Add the ability to broadcast status updates
Applications can compete easily via some form of multiplayer iCloud sync, or in the same way they already do today via 3rd-party servers.  What more could Apple do than Facebook, though?
  1. Integrate the status update piece into Contacts so before you call you see the status and can see the recent messages
  2. Integrate the group chat dimension by having 'groups' in Contacts (umm almost Circle like)
  3. Provide multi-participant Facetime to switch from iMessage to group comms
The point here is that technically Facebook don't have much that Apple couldn't make standard Mac OS X and, more importantly, iOS functionality.  Indeed much of this would be a welcome, integrated upgrade to those things (rather than a clear market grab like Google+), so people would 'naturally' start using these facilities as they are on their phone/desktop.  This would increase the link to Apple products in those communities (much as BlackBerry used to see).

An added advantage of Apple's approach is that it can remove the sense of a 'big central server' and instead create a more federated view of inclusion than Facebook.  This is liable to increase people's engagement levels, and unlike Facebook, Apple doesn't need to drive revenue via advertising or by selling people; it wants more people on its platform, because those people hand over real hard cash.

Facebook's big risk is that its network ceases to be the cool, and only, place to network and that other social approaches take off.  Apple are ideally placed in the consumer space and have the platform-control mentality to drive this.  iMessage is only the start of the evolution; the question is just how much engagement Apple wants to have.



Monday, February 13, 2012

Why Broadband, Apps and Moore's law will beat Server based HTML 5

The browser has had its day... Apps are going to win.  Now these Apps could be like the Chrome store pieces, developed in HTML5 but with local storage and offline access added, but they will fundamentally be local things.  Why?
  1. Moore's Law isn't going away any time soon. 
    1. In a couple of years we will have Smartphones with quad or even octa-cores, 8GB of main RAM and 256GB of storage... and you are seriously just using that as a browser?
    2. Your TV will have that as well
    3. Your Fridge will be a bit dumber, say dual-core, 8GB storage, 100MB RAM... it's a ruddy Fridge
  2. Connections to the home will be a normal thing
    1. Mobile phone companies will start offering 'VPN to the home' as a standard offering so you can unify your control of internet access
    2. This doesn't require a 'home server' just a VPN link
    3. Your home devices will then be accessible directly or via the cloud
    4. Current 'TV via 3G' offers will be linked back to your home connection
  3. Rich Clients beat thin clients when there is a choice
    1. Look at the App Stores, look at games...
  4. The network is never something to bet on being 'always' there
    1. Do you want a Sat Nav that fails due to network connections?
    2. Do you want a Fridge that turns off because it can't download the right temperature for fish?
  5. The speed of light isn't just a good idea... it's the law.
    1. Ping lag is an issue of immediacy.  Even if processing takes zero time there is still 100ms+ of ping lag to contend with, plus server lag, etc, etc.  Light in fibre covers roughly 200km every millisecond, so a London to US-West-Coast round trip of around 17,000km costs about 85ms before a single byte of work is done.

This isn't a retrograde step; it's actually a great challenge, as what it means is that federation is going to increase.  Social centralisers like Twitter and Facebook are liable to start facing competition from social aggregators which work on federated information served from your devices via your home network.  Cloud providers will remain focused on functionality and capacity, and the blurring of the cloud between the physical and the virtual will be complete; you won't even know whether your TV is running locally or via the cloud... except when it borks... which is why in reality it will run locally.

HTML5 is a great technology, but for it to win it needs everyone to sign up for 'open' on all devices: TVs, mobiles, tablets and motor cars.  Applications are so much the 'thing' that Google are even promoting applications that can be downloaded and run from Chrome, meaning that Chrome isn't really a browser any more but a hosting platform for applications.

Server-side HTML has had its day; the only question now is whether the industry will unite behind a single 'open' client-side approach for applications or whether every platform will have its own approach.  Apple's current success and the Android marketplace seem to indicate the latter.

Server-side HTML - 1991 to 2015.

Why I rewrite rather than re-factor

Refactoring is one of those terms in IT that gets bandied about as a good thing.  Refactoring is meant to be the small, incremental cleaning up of code to restructure it while leaving it functionally the same.  Some folks say that re-writing is different, as it changes that functional contract.  What happens in the field, however, is different...

Refactoring tends to be done in small chunks, incrementally, around a specific area, for instance a single function or class.  The refactoring tries to 'evolve' the code to make it better, and there is some good theory behind it, but regrettably in the field I tend to see refactoring creating an incrementally more complex mess, as the parts are 'optimised' while the whole collapses.  Refactoring in the field is also mostly done when you've got new functionality to deliver, so the idea of functional equivalence goes out of the window; it's now just refactoring as you develop a new requirement, which means the new requirement skews the refactoring.

This reality of refactoring in the field means I often come across code that has gone through several refactors, each tinged by the mentality of the developer undertaking a specific new piece of functionality; the refactors therefore are not making the code better, but making it better for developing that one specific requirement.

For this reason I'm actually a big fan of re-writing and, whisper it quietly, I tend to find it's quicker and produces better code, especially from a maintenance perspective.  Now I could argue that what I do is refactoring, as I'm always rather anal about getting interface designs right and they tend to be rather stable because I put quite a bit of time into them.  The reality, though, is that the body is completely changed behind the facade.

Re-writing works because I now know what I didn't when I first planned the internals.  I know the additional requirements that I've received and the new information classes that I now must operate on.  Recently I had a piece of code on which I had spent a couple of 'refactors' making it 'better', and the code was a bit cleaner and more manageable.   I then came across a bug which was proving rather hard to track (Objective-C, how I loathe you) and some new functionality I wanted to develop was adding complexity into the code... so it was re-write time.

About 60 minutes later there were 120 lines of code (SLOC) where previously there had been 300+; this clean-down had removed the bug (which was to do, as ever, with memory allocation) and added the new functionality.  The new code had been quick to write because I now understood much better what the end goal was, thanks to the requirements that had arrived over a few iterations, and I had a much better grasp of how all the different classes needed to engage.

Functionally my class interface hadn't changed and the calling classes didn't need to change, but it's hard to claim what I did was a simple refactor when I trashed every single line of code and re-wrote from scratch based on the requirements and design.
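
To make that concrete, here is a sketch of what 'rewrite, not refactor' looks like when the facade is stable.  The class and method names are invented, echoing the sort of code in question, not the actual project:

    // EventHistory.h -- the published facade.  The up-front design time went
    // here, and it survived the rewrite untouched, so no caller changed.
    #import <Foundation/Foundation.h>

    @class Event, Location;

    @interface EventHistory : NSObject
    - (Event *)getEventAt:(Location *)location date:(NSDate *)date;
    @end

    // EventHistory.m, version 1: 300+ SLOC, twice 'refactored', one lurking
    // memory-allocation bug.
    //
    // EventHistory.m, version 2: ~120 SLOC written from scratch against the
    // same header; bug gone, new functionality in.  Every line behind the
    // facade was trashed; not one line in front of it moved.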

Refactoring has become a shortcut for 'don't do it well the first time', and the reality of the field does not match the theory in the books.  Too often the phrase is used to hide bad code being badly re-structured (and I'll admit that my first pass was, in retrospect, just that, as I didn't know quite how Objective-C and iOS worked).

It's time to come clean:

Sometimes it's better and quicker to re-write than to keep up the facade of 'refactoring'.


Friday, January 27, 2012

Controlling what kids access - VPN to the Home - the next big thing in Mobile security

I've got kids; currently they are under the age where they get Smartphones and unfettered internet access, but such a day is coming.  Now at home I can set things up so there is a proxy on the WiFi, all content has to be routed via that proxy or it doesn't go out, and I can lock down the proxy so they can't go where I don't want.

However on a Smartphone they get good internet access without me being in control.

Bugger.

Then I got my new internet connection from BT (VDSL, 40Mbps down, 10Mbps up), so I'm putting in a VPN so that when I'm abroad I can still do back-ups etc without having to lug disks around.  Then I realised that I could set it up so my mobile phone used the VPN as well, which means iTunes backup and sync can be done too... double result.

This then made me think: if you combine the VPN with a proxy you can have a controlled connection at all times.  All you need is the ability to add restrictions to the device which force 'always use VPN' (something that isn't supported today) and which prevent the VPN connection identifier from being changed.  This has two key usage scenarios:

  1. For Enterprises it means mobile device internet access can be controlled
  2. For families it means being able to control what your kids access... until they are round at a friend's or buy their own device
It's the latter that interests me at home, obviously, as I do think that as a parent I have a responsibility to control what my kids access and to ensure that I can track things and keep them safe.  This isn't about being overprotective: I wouldn't let them wander the streets on their own or take a train to London on their own, and the internet can be just as, if not more, dangerous than those things.

With the rise of high-bandwidth upstream connections this sort of thing becomes completely feasible; all it needs now is for the mobile phone manufacturers to add the capability to the OS, in the same way they add things like internet blocks or other application blocks today.

High-speed uplink + VPN + Mobile = Personal control of your own internet connection... with the added benefit of NEVER being unencrypted on a public WiFi connection. 

Wednesday, January 18, 2012

IT going backwards - Objective C is 90s retro

I've ranted quite regularly about how Enterprise IT just hasn't really developed in the last 5 years, and my personal task for 2012... learning Objective-C and programming for iOS... has taken my disbelief to another level. Back in 2008 I learnt Python and for me it sucked. Its 'advantage' over the scripting languages of the 80s and 90s was minimal and it had the most hated (for me) of things... indent-sensitive code. Objective-C, however, really has stepped it up a level.

I remember learning Ada, C and Eiffel (along with bits of LISP, Prolog, Assembler, etc) and most of all I remember being confused as to why people like coding in languages like C, where the syntax is terse bordering on emo, over languages like Ada where even non-experts can have a crack. Throughout my career people have claimed the stupid 'fewer characters = language efficiency', which is like saying that Martin Luther King was a crap communicator while a grunting teenager is much more efficient.

But all of this couldn't prepare me for the horror that is Objective-C. Segfaults in 2012? Seriously? Have years and years of exception handling been ignored? No, even better than that... Objective-C has exceptions but you are discouraged from using them; yup, they are there but they are 'resource intensive' so you shouldn't use them.

Second off we've got the wonder of memory management again, although now with 'ARC' the compiler at least automates the retain/release reference counting for you. Yup folks, it's 2012 and Apple have just about caught up with the mid-90s.

All of this is annoying, and rubbish, but it would be nothing if the language had a nice syntax and a logical way of working... but Objective-C is like people looked at Java, C and C++ and then sat down and thought 'how could we make this really suck?'.  Yes, it's the same old .h/.c (or .m in this case) combination of header and code, but even basic things like method calls are made excessively silly.  No simple object.method(param1, param2) for Objective-C... well, not always; sometimes you can use the dot, but not normally... ahh, consistency avoidance, always a great way to get sucky code.  No, in Objective-C you call a method like this:
[instance method:param1 param2:param2]
This means you end up with wonderful code that looks like
[[eventHistory getEventAt:location date:date] calculateDistance:newLocation].doubleValue
Notice that '.doubleValue'?  Yup, when using NSNumber (the object wrapper for doubles) you use the old '.' notation. Perfect, eh?
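
For anyone who hasn't met the syntax, here is how a declaration and its call site line up, using the same invented names as the snippet above:

    // In the header: each colon introduces one named parameter.
    - (Event *)getEventAt:(Location *)location date:(NSDate *)date;

    // At the call site the method name is interleaved with its arguments...
    Event *event = [eventHistory getEventAt:location date:date];

    // ...except when it isn't: NSNumber's doubleValue is reached with plain
    // old dot notation, so both styles live side by side.
    double metres = [event calculateDistance:newLocation].doubleValue;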

Then we have Xcode, an IDE that seems to crash if you do anything it doesn't expect rather than give a warning saying 'Fail: you didn't mean to do that'.  Some bits are nice, like some of the code generation, and some bits, like refactoring, are pretty much up to Java IDE standards of 2001/2002.

The layout model in Xcode is okay, with some nicer bits around chaining screens, but seriously, is it that hard to implement XmForm?  With the multiple display layouts that you get with mobile devices it really would be a cracking layout manager to have.

Then we have the iOS simulator.  It's great, except if you want to simulate locations that Apple hasn't thought of... the 'accuracy' if you use custom locations (for instance if you want to test something using European locations) is 150,000m... or to put it another way, a level that every decent piece of code should ignore.    Application-development-speed-wise I'd clearly be faster in Java, and as a new language I'd say that Objective-C ranks behind C++ in terms of the 'complexity' to learn, and significantly behind both C and C++ in terms of language efficiency.

But, that said, the example code pieces are good, the online manuals are good as well, and I knocked up the second stage of my application on a flight across the Atlantic.  Basically, however, it feels like using C++/Motif with Emacs and the TeleUSE UI builder.  It's 2012; shouldn't it feel like we've progressed?  What it really feels like is some sort of retro homage to the 90s wrapped in a shiny and expensive new package.

From now on I'm only coding for iOS while listening to an iPhone playlist 'Songs of the 90s', it helps get my mind in the iOS Zone.


Monday, January 16, 2012

iPads on planes during takeoff? Hell I'd like to use it in the airport!

People have been asking for iPads, and Kindles, to be usable during takeoff and landing (like pilots' ones are), but for me that isn't a massive deal; yes, I'd like to read my online Economist on the iPad when I'm travelling, and sure, it can be a bit of a pain to have to use old-style paper... but I've got a bigger gripe.

The CBP (Customs and Border Protection) and their mental policies at immigration.  Now, putting aside the normal 'welcome to America' of one bloke for the rest of the world and fifteen for the twenty Americans on the flight, or the ridiculous number of times my passport has to be checked in the UK (THREE TIMES! on this trip), or the questions that sometimes border on the clinically insane... no, my complaint is simple.

I use TripIt for my travel, and it's a great service, but the reality is that for the last 10 years I've not printed out a hotel reservation, for two reasons:

  1. I know where the hotel is
  2. It's on email
Repeatedly on the last few trips to the States it's not been enough to put 'JW Marriott, Miami, FL' or similar; nope, they want the street address, and knowing it's 'on Brickell' isn't enough.  So on each occasion I've done the same thing: pulled out my mobile and been met with...
You can't use that here, it needs to be turned off
What about the iPad, I enquire?  Nope, that is banned as well.  So here we are at an impasse.  It's 2011 and 2012, and thanks to the wonder of technology that has existed for the whole 21st century (and a bit before) I have access to my reservation details on a mobile device without having to print them off. Amazing, eh?  But to the CBP this is a clear and present threat to the United States.

I've been asked, when coming in with my family, to show a paper copy of the hotel booking to 'prove' we have a reservation... seriously?  In the modern era of printers and word processors it's considered a security check to have an EMAIL reservation printed out... rather than actually showing the email?

I saw a guy at the final 'hand in the blue customs paper' check told the same thing about a phone that was just held in his hand; not being used, mind, just held.  The international arrivals area is clearly not somewhere that phoning for a taxi or telling people you've arrived constitutes a massive security risk.

Yes, this is a rant, but seriously, it's 2012 and most people are shifting away from paper onto mobile devices; the CBP should be encouraging this rather than dissuading it.  How about this: a CBP-approved application onto which you load all these details; it then generates a QR code or similar, which gets scanned at immigration, and they get not only the hotel but also the details of your return flight.  Go a stage further and put the questionnaire on the application, and suddenly you've got all the information you need, with no OCR processes, no lost data, and a reduced risk of ID fraud/theft.
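
As a hedged sketch of how little technology that would take: the app just has to serialise the details and hand them to any QR encoder.  The field names and values below are invented for illustration, not a real CBP format:

    #import <Foundation/Foundation.h>

    // Build the payload a QR encoder would turn into the scannable code.
    // All field names and values are illustrative only.
    static NSData *ArrivalPayload(void) {
        NSDictionary *details = @{
            @"hotel":        @{@"name": @"JW Marriott", @"street": @"1109 Brickell Ave",
                               @"city": @"Miami", @"state": @"FL"},
            @"returnFlight": @{@"carrier": @"BA", @"number": @"208", @"date": @"2012-02-20"},
            @"declaration":  @{@"goodsToDeclare": @NO}
        };
        return [NSJSONSerialization dataWithJSONObject:details options:0 error:NULL];
    }

One scan and the officer has everything the blue form holds, already structured.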

Come on CBP, it's 2012; get with the program and face the reality of a mobile world.  Let people have mobile phones (you can even say 'no calls at the desk' like they do in the UK) and maybe even save some time and money, and identify risk better, by automating the paper process.

 


