Jupiter Moonbeam & the Geeks from Cyberspace

Friday, 1 October 2010

I am NOT a geek

I am writing this due to provocation by a number of colleagues who seem intent on labelling me as a Geek. Claiming to be bullied in this manner in a technical organization such as ThoughtWorks may sound implausible, but my public derision finds its root in the rallying call I made (on an internal mailing list): I AM NOT A GEEK.

So why would a respected developer such as myself risk a well-earned reputation with such a perilous statement? And why would I then go on to circulate the claim amongst the wider community of readers on the ThoughtWorks blog roll (I am still on it, aren't I)?

It has been a dark secret that I have been keeping from all but my closest colleagues; only my friends and family have ever really known the truth. Coming out of the closet on an internal mailing list may have been tough, but - apart from the odd comment from the usual insensitive and ignorant few (well, emotional intelligence was never the geeks' strong point) - it has been a liberating experience. No longer do I have to pretend that I actually care about the resolution of iPhones versus HTCs; no longer do I have to feign interest in the difference between the generics implementations in Java and dotnet. And, for the first time in my career, I have felt that I could style my hair in a moderately fashionable manner. The support has been incredible: BAs and QAs and even other devs have quite literally wrapped their arms around me in deep sympathy to confess that they too couldn't work out which pub everyone was meeting at every night, until they twigged that all the socializing occurred within the realms of the World of Warcraft servers.

One cannot make such a claim as I have, though, without some hard evidence. After all, over a decade in the IT industry would suggest that my declaration is fraudulent. But no, the truth is that my participation in the geek subculture is limited at best. I don't like science fiction or fantasy, I find Lord of the Rings far-fetched and ridiculous, I own not a single gadget (no, honestly, not even a fancy phone), I perceive the playing of computer games to be a waste of my life, and I don't like Star Wars (but I want to ride my bicycle). On the other hand, I like being outside - especially if sport is involved - adore music (and no, not Heavy Metal), enjoy the company of others, and I can relate to women. Some may go as far as describing me as a "people person".

To sum it up: once out of the workplace, technology plays a very minor role in my life, except where I can leverage it as an enabler (for typing this blog on, or finding cinema times on the Internet). Be careful not to misconstrue me: I believe technology is a great enabler for many things, from opening up information to social justice. As a power in society it is arguably the single greatest development since the printing press. However, if it all disappeared overnight would I cry? If I had been born fifty years ago would I have struggled to find my calling in life? No, no and definitely no. Hence my assertion: I AM NOT A GEEK.

Please, before you judge my decision wise or foolish, let me impart my reasoning. Firstly, I hope my actions will pave the way for others; but I also believe this selfless act of hara-kiri will be the catalyst to make our industry a better place.

Within our great industry there is a template, an expectation, a stereotype if you like, of what a typical IT worker should be. It sits so deep within the culture that I should not need to draw this caricature in too much detail but, let's throw some words out there: a spotty, socially awkward, coffee-drinking, light-hating, arrogant, emotionally unintelligent male who spits when he talks and is probably wearing a t-shirt - for several days, and smelling a bit - with some witticism along the lines of '/me 0mwz j00' or something equally mystical - possibly even written in Klingon - to the majority of society. For cultural references consult The IT Crowd, that Russian guy in that James Bond film, The Big Bang Theory, or any other computer guy you can think of in any mainstream film or TV comedy.

Before continuing let me state, on the record, that the number of individuals I find in IT who meet this template is ever decreasing. Of course it was considerably greater a decade ago, when they locked us in the basement, but since we've been allowed to occupy floors that let in natural light they have, like Gremlins, bubbled up and exploded (I believe getting them wet was equally dangerous).

Regardless, the stereotype still persists. My concern is that this stereotype is a barrier to the IT industry reaching its full potential. It has a number of negative effects: it deters potentially able people from the industry (women for example, but not exclusively); it normalizes a set of behaviours which are damaging; and it prevents people's development by narrowing their horizons to a limited template.

Yet there are people out there who not only relate to the stereotype but will defend its place within the unique culture of IT. They will state that the attributes needed to be good with technology are exclusive to geeks, and that geeks are therefore the best people to run IT - though looking at the state of the industry I'd say they've not done a very good job, so get out of the way and let someone else have a go. But this is a complete fabrication. Without trying too hard to burst the whole indulgent We Are Special myth - lots of normal people in other industries have these skills in abundance too - the truth is that there are many skills that IT people, and developers in particular, need in order to apply technology in a manner that brings value to the business; skills which are arguably more important and which the stereotypical geek lacks: communication, awareness of business, empathy and understanding for users, prioritization, even the ability to solve problems WITHOUT technology. Of all the key skills, those attributed to the geekier sensibilities are, in modern development, the least critical. Whizzy maths and clever algorithms do not form the heart of the vast majority of software. Thanks to the Agile movement and things like BDD and Domain-Driven Design, the business is at the heart of software development.

Modern software places the emphasis on those other skills that the stereotypical geek doesn't tend to have. Normal people, on the other hand, do. And what's more, plenty of normal people have those problem-solving skills that geeks believed made them stand out - I'm sorry, but attention to detail, an inquisitive nature, analytical skills, logical thinking, problem solving etc. are not, and never have been, attributes exclusive to the introverted middle-class white male plagued by OCD and touched by autism. The simple truth is that non-geeks can and do make great developers, and often even better developers than geeks (in fact some of the worst developers I've worked with in my career have been the geekiest, and the inverse has also been true).

Let me restate this for crystal clarity: you do NOT need to be a geek to be a great developer; you don't even need to be a bit of a geek - not even give in to the odd geeky impulse - and no, no, no, you don't even need to try and fit in and do geeky stuff - 'cos it doesn't make one tiny bit of difference to how good you are as a developer.

In a by-gone age, when geeks were the main consumers of technology, the natural geek instinct of the developer was enough to satisfy any demands the audience might have. One bunch of geeks produced stuff for other geeks to consume, and what the first geeks produced made perfect sense to the other geeks. But that isn't true any more. Geeks are a minority of users. Want evidence? Go and look at how many users are running Linux (no true geek would run Windows): the number is insignificant. The majority of technology in the majority of the world is used by non-geeks.

To write effective software for non-geeks you need people who understand them, and unfortunately geeks tend not to. The things that geeks prioritize are often not what non-geeks prioritize. Remember when everyone derided Macs for being 'simple', and how dumb Apple users were because they couldn't manage more than one mouse button, whilst MS covered every possible surface of the mouse with different ways to click and Opera invented 'mouse gestures' (yes, only a geek could have come up with that)? And then Apple go and produce a music product with only one button and a wheel. What geek would have come up with that idea? No, that required someone who understood how normal people want to interact with a music player (the geek stood back in shock that normal people could rip, manage and sync their CD collections without any guidance, just because they were given an interface that made sense).

Now my tirade is over, my blows made, wounds inflicted, oil spilt and lit, I wish to make my peace. The future of IT and software development needs to be one where the creativity, the passion, the human element of building brilliant software, and the skills and attributes required to achieve this, are encouraged and nurtured regardless of their source, geek or non-geek. The industry must stop holding the geek up as the exemplar and alienating, or degrading, those who do not meet, or do not wish to conform to, this caste. With each generation IT moves further into the consciousness of the mainstream, and in doing so its appeal will grow beyond those great pioneers, the geeks, upon which today's industry is founded.

So here's to an industry which attracts the brightest and the best from the entire psychological and sociological spectrum, and a long overdue farewell to that which limits it to one tiny, specific character with its limited range of skills.

Monday, 26 April 2010

UK politics and the battle between Old and Social Media

Firstly, this post is about politics. Its purpose, though, is not to 'push' any specific political agenda but to examine the role social media has played in the sudden and unexpected rise of the Liberal Democrats, and how that in turn has impacted the 'old' media.

For those of you not in touch with current affairs in the UK: a general election is currently being held, and it appeared that the 13-year majority of the current Labour government would be seriously contested by the opposition Conservative party. Then, on the 15th of April, the UK's first live televised debate between the three main party leaders went out. The result was that the previously marginal Liberal Democrat party were declared the outright winners. All hell broke loose.

The TV debates are an historic first for the UK, but something else is also a historic first: the use of social media. Over in the US, of course, the effective use of social media was credited with a role in Obama's victory. I don't think that fact was ignored by the election teams in the UK, as politicians rushed to pick up their own Twitter accounts and Facebook profiles, but the nature of on-the-ground campaigning hasn't altered quite so dramatically.

Traditionally, in the UK, the mainstream press has held the single greatest influence over elections, and this sudden swing is evidence that their grip is breaking. There is no doubt that the televised debates have been the single biggest influence on the dramatic shift in political ground over the last two weeks; however, there is a real war between social media and the mainstream press. After all, the UK has taken fondly to social media, being second only to the US, with London having a higher percentage of Twitter users than any other city.

Although it is easy for some in the press to dismiss social media users, the intensity of UK usage compared to other countries is no doubt going to have an impact on the end result.

Real evidence of this was available even before the first leaders' debate had finished. Looking at what was trending on Twitter gave a solid prediction of the outcome. Nick Clegg was far in front, with Cameron and Brown just nipping in at the bottom of the top ten terms. Ironically, Clegg was kept off the top spot by the hashtag #iagreewithnick. A few minutes after the debate had finished, Nick Robinson delivered the catchphrase live on the BBC. For Twitter users he was only confirming what they'd known for the last ninety minutes.

It was in the week after the first leaders' debate that the power of social networking was truly demonstrated. After Clegg's surprise win, the right-wing press went into overdrive to discredit the new upstart. What resulted was a simultaneous barrage of front-page smear campaigns across the four main right-wing papers (Daily Mail, Daily Express, Daily Telegraph, Sun). Twitter users reacted instantly with disdainful satire, manufacturing a series of ridiculous claims and blaming them on Clegg. Before lunchtime the same day #nickcleggsfault was the highest trending hashtag. And it stayed there, trumping even #leadersdebate, for several days. Even if the press couldn't pick up on the subtleties of that, the fact that "murdoch lose britain" was trending a few positions behind should have been a clear enough message.

Then the Murdochratic media made its second mistake, when Sky TV's host Adam Boulton questioned Clegg on one of those very front pages. It was outside the rules, and hundreds of tweets popped up within seconds protesting against it. These soon translated into over 100 complaints to Ofcom and, of course, news coverage of the slight. From that moment Twitter had a microscope over Sky's coverage of Clegg, even accusing Sky of deliberately 'cutting' Clegg's shots so he couldn't 'look into the camera' whilst giving Cameron more opportunity to do so. Of course this is pure conspiracy, but it was a small enough seed of doubt that people believed it deserved to be retweeted.

And this seed of doubt had serious consequences. Once Sky announced that YouGov had David Cameron in first place, Twitter reacted with disbelief. Within seconds people were posting that they'd taken part in the YouGov poll and felt it was biased; they were even posting screenshots of the survey to prove it. This led to accusations of 'push-polling'.

And it didn't end there. The day after, the Daily Mail was claiming on its home page that its polls showed Cameron came out on top. But a Twitter user had observed that the Mail had withdrawn its original survey, which had Clegg top, and republished it. The result was a Twitter call for poll-jacking to put the numbers 'right' (which they eventually did).

What is interesting about all the above is how the UK's right-wing press inadvertently waged war with social media users. The press forced people into camps: with us or against us, and to be against us was to be with Clegg. And so social media users reacted in the same way they had against what they perceived as the mainstream bullying and manipulation of Simon Cowell, when they campaigned to deny him the Christmas number one by replacing it with the Rage Against The Machine anthem "Fuck You I won't do what you tell me". Except this time they are reacting against the right-wing press, and Clegg is their Rage. Evidence? Well, a Facebook group called "We got Rage Against the Machine to #1, we can get the Lib Dems into office!" has already collected over 150,000 members - which is more than any political party.

Why did they make such an error? Well, perhaps it's because the press like to dismiss social networking as something for 'young people', and 'young people' don't read newspapers and, so they thought, don't vote. But, thanks to social networking, a kick-back against the 'old' media and some dramatic TV coverage, it looks like all that might change.

Of course the truth will out on May 6th when people turn out to vote. The right-wing press still hold considerable influence over the British public, and let's not forget that social networkers are a specific, though not insignificant, demographic. But, though it's hard, look behind the politics and you'll see, just like the X Factor Christmas single, a clear theme from the 'internet generation': they won't be dictated to by the traditional media and now, thanks to social media, they can quickly mobilize and push back. Perhaps the press should take warning from the Rage campaign, because when they do push back, they push back en masse, and they push back hard.

Friday, 9 April 2010

nPower and my UX nightmare

I saw npower's new Wallace and Gromit ads for their energy monitoring device. Apparently just ask and you'll get one. Brilliant. After lots and lots of web searching and scratching of heads I found out that the way to do it is to register for paperless billing (which I was already on) - this is, of course, once I'd worked out how to log in (tip: tiny link, top left corner, in a sea of red). More confusion, and then I found you just have to edit your details (it took me a couple of reads of the clearly written English to work this out):
To request a FREE smartpower electricity monitor, set up paperless billing, or activate other smartpower options, simply add or edit your account details now 
First, what's wrong here? Run an ad campaign and then make it hard for existing customers to follow through? You'd have at least thought that all you have to do is log into your account and see a picture of Wallace and Gromit inviting you to click a big Apply for Smart Power Monitor button? No? Well, obviously the marketing guys at nPower don't think so.

But it gets worse. I try to edit my details and the page blows up with a load of error messages about missing information. But there are only checkboxes; the stuff that's missing is your account details? Huh? There's no place to enter them either (and why do I need to enter something you should already know?). So I try to re-register for paperless billing (I think I'm smart here: I'll disable it then re-enable it). This time the fields it moaned about not being filled in are there, and blank, and apparently I have to refer to the bill npower sent me for the details (even though they are all on the previous page, next to the link I clicked to get here). But it blows up, this time horrendously, logs me out and gives me a 'sorry, our system appears to be experiencing problems'. You don't say.

So I email npower to ask them to sort it out. I explain clearly that the website keeps blowing up every time I try to register for the monitor, and ask them if they could just sort it out for me behind the scenes. A few days later I get this reply:
I have lokked into this for you and customer do have to apply for the smart power monitor on line please follow the information below to apply ofyou still have issues please call our web support team direct on 08451663443.  
Did they read my message? Did they read theirs before sending it? Can they read? They certainly can't write. Great customer service, nPower. I feel really valued. Well done.

Thursday, 11 March 2010

A Thoroughly Modern Developer

The world of development has changed rapidly over the last decade or so.  Thanks to Agile, great tools such as TDD, O/R mapping, dynamic and functional languages and a million other little things, the way companies approach development is changing.  A bright new future awaits where only the true veterans wince as the CEO discusses going waterfall when promoting the merits of the new hydroelectric generator.

Inevitably, as development changes, so developers will need to adapt. The stereotype of the socially awkward, green-screen-loving, mouse-hating hacker who wears black-now-grey jeans and t-shirts stating "127.0.0.1 is where the heart is" for a week at a time, and mumbles through the pizza crumbs that drop in piles from his beard onto his ever-rounding belly, is not going to cut it in this new world. Future projects will be run on the basis of success, and that means you can't forgive someone's shortcomings because they are a "code wizard".

So what sort of developer does cut it at the beginning of this new decade? What sort of skills are you going to need? Well, funny you should ask…


Domain Knowledge

Or, knowledge of the business you are working for. Systems are more complex than ever and businesses increasingly rely on them. Regardless of the general mistrust of IT, it has moved right into the heart of business, providing the engine without which many businesses couldn't survive.
To build successful systems The Thoroughly Modern Developer has a thorough knowledge of the business's intention and the value being delivered; it isn't good enough to rely on a BA and a Development Manager to 'translate' business speak into dev syntax.

Eric Evans goes into this in detail in Domain Driven Design.  Designing and building a system is a collaborative effort between the domain experts and the developers to create a common model (or a ubiquitous language).  If you don't understand the business how can you model it?  But it goes further than that; if you don't understand the business value how can you deliver it?


QA

Back in the old days, testing meant running up the app, clicking a couple of times and then waiting a month or two before the testing team raised a list of bugs for the junior devs to pick up and fix. It took XP to change our attitudes to this. TDD meant we wrote unit tests and verified our systems with at least some code. Agile put QAs at the heart of the development process and bugs were fixed at the end of every iteration, but the 'throw it over the wall' principle was still there, just with a shorter wall.

The Thoroughly Modern Developer takes responsibility for her own quality; she cares more about meeting the acceptance criteria in a bug-free fashion than anything else. This makes the QA's role even more critical, as they must continuously guide and help the dev, but they will no longer be reduced to simply checking the work has been done.


Usability

The average developer's idea of building something usable is akin to [the car Homer designed]. For some strange reason even the simplest of tasks, such as getting a column of text boxes to line up, seems to be a feat of incredible endurance.

But usability is crucial. The first developer I ever worked for told me: "The users don't care about how clean and beautiful your code is, they never see it, but the smallest spelling mistake on the UI and they're on the phone."

The Thoroughly Modern Developer builds systems with usability in mind from the start. Sure, she's no expert - but she knows enough simple rules to get by - so she works closely with the UX person to ensure what is being produced is usable, not just functional.


Polyglot

The Thoroughly Modern Developer does not define her role or skillset around a single language (or worse, a single toolkit - e.g. 'ASP.NET Developer').

She is language agnostic, with experience of a number of different languages across different paradigms (OO, functional, dynamic etc.), and her level of understanding goes beyond syntax.
The Thoroughly Modern Developer will choose the best tool for the job or circumstance. Throw her a language she's never worked in and she has no issues picking it up. Put her on a project where she's expected to work in two or more different languages and she isn't fazed.

In Code Complete, Steve McConnell talks about programming "into" a language rather than programming "in" a language. The Thoroughly Modern Developer does the former.


Value Driven

Of every new feature, every request, every line of code, the Thoroughly Modern Developer asks the same question: how does this deliver value, or what's the value of doing this? She's obsessed; she keeps going on about it; it's almost as if it's all she cares about.

Which it is of course.  To the Thoroughly Modern Developer value is the sole purpose of her job.


A People Person

Oh yes, it's that horrible phrase, one that causes many devs of old to run away and hide behind a wall of cabinets filled with specification documents. The Thoroughly Modern Developer, on the other hand, likes people, gets on with people, can talk to people. She doesn't need 'Relationship Managers' or 'Business Interfacers'; put her in a room full of real people and she'll hold her own, without spitting when she talks or snorting Beavis and Butt-Head style when someone uses any word, or collection of words, bearing a vague resemblance to bodily functions.

Why is the Thoroughly Modern Developer such a people person? Because she understands that in order to build quality software that delivers business value she needs to talk to people - all different sorts of people, all the time. Whether it's to find out whether the button should say Save or Create, or to explain to non-technical people why it took longer to integrate the zobertron with the phlargbleg initiator (of course the Thoroughly Modern Developer would never have come up with those names, but she's still got to get along with the old skool) so that the client can be confident.

People are what make a software project successful, and if you can't do people you can't do software.


Facilitation

The Thoroughly Modern Developer often finds herself in the middle of difficult and complex situations.  Because she wants to get the system right she has to raise difficult questions about the way the business works.

The Thoroughly Modern Developer needs basic facilitation skills. She needs to be able to lead a group of people through creative and difficult exercises. To get the right answers you have to keep people on track, resolve conflicts, remove distractions, know when to call time-out, and get them to make a decision.


Has "other" interests

For athletes, cross-training (training in your non-core sport) is an essential technique for ensuring you excel in your core discipline. This is no less true for intellectual disciplines, and even more true for creative ones (artists, writers and musicians have known for centuries the importance of pursuing other arts - think Da Vinci). If your entire existence is writing software then you are greatly narrowing your reference points and are more likely to suffer from boredom or stagnation. For example, many prominent developers have blogged on the strange relationship between development and music (as a failed musician I entirely concur).

Personally, I have found that long distance sports have allowed me to strengthen and develop a lot of essential development skills: focus, pace, general discipline; not to mention the health benefits that keep my brain active and my energy levels high.

Not only does it benefit your work, it makes you a more interesting person, which is always useful when talking to 'real' people like the users. So do yourself a favour: when you get home, do something that doesn't involve the computer.


Understands that technology isn't important

The Thoroughly Modern Developer has a healthy cynicism towards technology. If something can be done without technology that's her preference and she'll push for it.  She actually wants to write less software; complex clever gadgetry and features fill her with a great sense of foreboding.

If she were a developer at Timpsons (who have no centralized till system), she'd be resolute in resisting all efforts to introduce one. She only cares about technology if it offers real benefit: if it provides genuine value or is essential to the business or user.


She's great, is the Thoroughly Modern Developer. She's so awesome people high-five her every time she gets up to make a cup of tea. And yet she's so humble with it. If only I could be just like her (sigh).

Wednesday, 10 March 2010

Fear of the new (how TDD and DI encourage bad practice)

When I first started doing TDD back in the mid-noughties it changed the way I programmed dramatically. It took me a while to get it at first - how can you test one unit? - until I discovered the tricks of dependency injection. Now my code was shaped in loosely-coupled wonderfulness and I felt invincible. I thought I had reached the pinnacle of OOP. Everything suddenly became so easy: it was well tested, easy to change, easy to wrap, swap implementations, decorate, whatever. I was throwing out the best design patterns with such ease that I barely needed to refer to the Gang of Four any more.

I went on in this vein for a good year or so. And as the millennium's first decade entered early evening and its bright lights began to fade, I took a step back and looked at my code. My god, I thought, this is terrible, what the hell have I been doing? And it was terrible: hundreds of interfaces named somebody's Manager, Persister, Service (oh so many classes called Service), Validator etc. etc., each with a single namesake class implementing it. There were tens of classes whose purpose seemed to be to co-ordinate these little buggers - for they were little, ten or so lines each (oh how I thought I was doing a grand job) - and they would hand off to other similar classes which co-ordinated other classes, and it was turtles all the way down. And in amongst all of it there were about a dozen classes that had real names like Customer and Account (but even they had interfaces) and, aye, how the sting of memory pains me to this day, they only had getters and setters (it's the shame, you never get over the shame).

It was with great pain that I realized that for the last year of my professional life I had been producing procedural code, with all the idealistic misguided joy of a card-carrying Russian worker pouring cheap cement to build pointless roads. I knew the culprit alright: TDD and DI. I had gone decoupling and unit-testing mad. The result was an irrational fear of the new keyword. As far as I was concerned the new keyword was banished to the dark depths of containers, a clumsy language feature never to be used again. But what I had instead was a dependency graph made up almost entirely of stateless singletons. A simple Ruby script could have replaced every injected dependency with calls to static methods, and sure, the application would have lost its testability and would no longer be decoupled, but essentially, at its essence, and in terms of structure, it would be the same. TDD and DI had simply given me a fancy way of decoupling static classes.

The new keyword is a wonderful thing. It lies at the heart of OOP. Its sole objective is to create a new instance of an object. If you don't use new you can't create new objects, and if you can't create new objects you can't write object-oriented code. The other fundamental thing about objects is that they encapsulate behaviour and state. Singletons don't. Singletons are procedural, regardless of whether they are loosely coupled and injectable. Real OO classes have private fields (the fewer the better) and methods which do things based on those private fields - sometimes based on arguments passed in, but always, always in the context of those private fields. And the only way, the one single true way, you get values into those private fields is via the constructor, and that means using the new keyword. Let's put it clearly: 'new ValidatedCustomer(customer).IsEmailAddressValid' is OOP; 'customerValidator.Validate(customer).IsEmailAddressValid' is procedural.
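To make the contrast concrete, here is a minimal sketch in Java (the snippets above are C#-flavoured, but the idea is the same; all class and method names here are illustrative, not from any real codebase):

```java
// A data-bag Customer: only getters and setters, no behaviour of its own.
class Customer {
    private String email;
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}

// Procedural style: a stateless, injectable "service" that operates on
// data handed to it. Loosely coupled and testable, but not OO.
class CustomerValidator {
    public boolean isEmailAddressValid(Customer customer) {
        String email = customer.getEmail();
        return email != null && email.contains("@");
    }
}

// OO style: state goes in through the constructor (via new) and the
// object answers questions about its own private field.
class ValidatedCustomer {
    private final Customer customer;

    public ValidatedCustomer(Customer customer) {
        this.customer = customer;
    }

    public boolean isEmailAddressValid() {
        String email = customer.getEmail();
        return email != null && email.contains("@");
    }
}
```

Both versions compute the same answer; the difference is where the knowledge lives - in a free-floating singleton, or in an object that owns its state.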

Now I was writing code, using TDD and DI, that was object-oriented. But my fear of the new was still there. In order to maintain the injectable, loosely-coupled gorgeousness I had begun to do a horrible thing: I started injecting factories everywhere. Sure, it was an improvement, but there was still something horribly smelly going on. After all, the factories were essentially static: singletons whose sole purpose was to delegate to the new statement. I mean, there's single responsibility and then there's craziness!
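The factory detour looked something like this - a Java sketch with invented names, showing an interface and a one-line class whose entire reason to exist is forwarding to new:

```java
// The domain object itself: plain state and behaviour.
class Account {
    private final String owner;
    Account(String owner) { this.owner = owner; }
    String owner() { return owner; }
}

// The smell: an interface plus a trivial implementation that exists
// only to delegate to the new keyword.
interface AccountFactory {
    Account create(String owner);
}

class DefaultAccountFactory implements AccountFactory {
    public Account create(String owner) {
        return new Account(owner); // the factory's whole job
    }
}

// The consumer injects the factory rather than simply writing
// new Account(owner) itself - "testable", but pure ceremony.
class AccountOpener {
    private final AccountFactory factory;
    AccountOpener(AccountFactory factory) { this.factory = factory; }
    Account open(String owner) { return factory.create(owner); }
}
```

Three types and an injection point, all to avoid typing new in one place.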

So I started doing something I thought was really bad: I created objects in my classes! But I wanted to keep the loose coupling and all of that wonderfulness. This required a dramatic shift in thinking. Remember when you first did TDD and how it really hurt because you had to structure programs in a different way? Well, it was like doing that all over again. Every time I started doing something I had to spend half an hour thinking: how do I make this work? I know I'm right, but how?

I learnt a few lessons:
  • Not all classes are about interactions; sometimes they are about results.  Mocking everything out for the sake of it doesn't gain you anything.  Use state-based testing to assert the end results, not interaction testing to assert what is happening to achieve those results.  
  • It's OK for your collaborators to new up something if you ask them, so you can say fileSystem.NewFile(name) if you need to.  
  • Collaborators don't always have to be passed in the constructor: they can be passed as arguments to methods as well, so you can say new File('myfile.txt', 'Some text').SaveTo(fileSystem).  
  • If a class is unavoidably all boilerplate and co-ordination, consider using integration tests; after all, mocking the interactions of boilerplate code doesn't tell you anything useful at all.  
  • The container shouldn't contain everything; it should contain the things that generally require configuration or are likely to change: databases, filesystems, web services etc. Rarely do core domain classes or business principles need to be 'switchable'.
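The second and third lessons might look like this in Java (the File and FileSystem names are hypothetical, echoing the examples above):

```java
import java.util.HashMap;
import java.util.Map;

// The collaborator interface; a test double can implement it in memory.
interface FileSystem {
    void write(String name, String contents);
}

class InMemoryFileSystem implements FileSystem {
    final Map<String, String> files = new HashMap<>();
    public void write(String name, String contents) {
        files.put(name, contents);
    }
}

// The value object is newed up directly; the collaborator arrives as a
// method argument rather than through the constructor.
class File {
    private final String name;
    private final String contents;
    File(String name, String contents) {
        this.name = name;
        this.contents = contents;
    }
    void saveTo(FileSystem fileSystem) {
        fileSystem.write(name, contents);
    }
}
```

A state-based test then news up a File, saves it to an in-memory file system and asserts on the end result, rather than mocking the interaction.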

Tuesday, 12 January 2010

Desktop Mashup

The concept of mashups generated a leap forward for web applications. By exposing data and functionality in an open and standard way, other sites could use it, mashed up with data and functions from different sites, to provide services beyond the imagination of the original owners. It represented a significant shift in thinking: that data and functionality are more powerful - and arguably more valuable - if applications outside your own can leverage them.

This shift in thinking is not limited to the web; desktop applications can also leverage these ideas. Locally running rich-client desktop applications can be enabled to expose their features to other local, or even distributed, applications.

In fact, in Unix, this concept has been around for decades in the very simple, yet immensely powerful, form of software pipelines. By chaining one program's output (via stdout) into another program's input, a second program, with its own distinct functionality, can then manipulate the result. This simple technique has enabled Unix developers and administrators to solve an enormous range of problems from a small set of tools.

Between these two extremes sits the idea of desktop mashups. But instead of following the pull nature of the web, or the push nature of software pipelines, desktop applications can take advantage of the event-driven nature of GUI systems and thus create a rich desktop experience. Separate programs, with distinctly different functional offerings, can leverage each other to provide functionality beyond each other's original intentions.

For clarification, consider this simple example: you have two separate applications, one which rips DVDs and another which catalogues them ready for playback (e.g. iTunes). When the DVD-ripping application completes, it sends a message to say a new movie is available. The cataloguing application - which has no direct knowledge of the first - receives the event, interrogates the new movie and adds it, and its metadata, to the catalogue.

Of course this is nothing more than Event Driven Architecture, which is hardly a new concept. The move to mashups comes when we create new applications from the existing apps' functionality. So, in our simple example, we create a new application which fetches film reviews from IMDb. When the user inserts a DVD for ripping, it fetches the review so the user can decide whether to go ahead. When the user does rip the DVD, the review application updates the catalogue application so the review becomes part of the movie's metadata.
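A minimal sketch of the idea in Java (the event bus and application names are hypothetical): publisher and subscriber share only a topic name, never a direct reference to each other.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// A tiny in-process event bus.
class EventBus {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    void publish(String topic, String payload) {
        for (Consumer<String> handler : subscribers.getOrDefault(topic, new ArrayList<>())) {
            handler.accept(payload);
        }
    }
}

// The catalogue subscribes; it has no knowledge of the ripper.
class Catalogue {
    final List<String> movies = new ArrayList<>();

    Catalogue(EventBus bus) {
        bus.subscribe("movie-ripped", movies::add);
    }
}

// The ripper just announces the event when it finishes.
class DvdRipper {
    private final EventBus bus;
    DvdRipper(EventBus bus) { this.bus = bus; }
    void finishRipping(String title) { bus.publish("movie-ripped", title); }
}
```

A third application - the review fetcher - could subscribe to the same topic without either existing program changing at all.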

Enterprises are ripe for desktop mashups. A significant number of problems for users stem from the lack of integration between applications, whether they are different internal apps or third-party ones. Another advantage, especially in large enterprises, is that development teams can focus on developing small applications that do their jobs well, or on integrating with legacy apps, and mashing them together rather than struggling to develop all-in-one god apps.

Wednesday, 9 December 2009

Immutable Wrapper

I am a huge fan of immutables, especially for value objects. Even in clumsy static languages like C# and Java that seem to throw every obstacle in your way to deter you from using them I find the benefits outweigh the costs.

Sometimes, though, a mutable makes sense - whether because of the restrictiveness of a framework or the paradigm of the language. But what if we don't want to lose the power of immutables behind the scenes?

Introducing The Immutable Wrapper Pattern!

The Immutable Wrapper is a wonderfully simple pattern because there are only two things you need to do:
  1. Create a class with only one mutable field.
  2. Wrap the immutable's functions.
Here's an example. You've got a copy-on-write list implementation, but you're binding it to some GUI control that gives you a big headache if you try to point it at a different data source.

Here's your immutable:
  class CopyOnWriteList
  {
      private readonly IEnumerable<string> items;

      public CopyOnWriteList() : this(new List<string>()) { }

      private CopyOnWriteList(IEnumerable<string> items)
      {
          this.items = items;
      }

      public CopyOnWriteList Add(string item)
      {
          // copy-on-write: never mutate, return a fresh copy with the new item
          return new CopyOnWriteList(new List<string>(items) { item });
      }
  }
Now wrap it:
  class WrappedCopyOnWriteList
  {
      private CopyOnWriteList list = new CopyOnWriteList();

      public void Add(string item)
      {
          // the single mutable field: replace it with the new immutable instance
          list = list.Add(item);
      }
  }
There, it couldn't be simpler. So simple in fact I don't know what else to say. Enjoy!

About Me

My photo
West Malling, Kent, United Kingdom
I am a ThoughtWorker and general Memeologist living in the UK. I have worked in IT since 2000 on many projects, from public-facing websites in media and e-commerce to rich-client banking applications and corporate intranets. I am passionate about, and committed to, making IT a better world.