Monday, December 16, 2013

Tampere Goes Agile 2013



I have a two-year-old tradition of going to TGA with a friend of mine. It's usually the only trip to Tampere each year for either of us. So I thought: since going there is a tradition, why not write a blog summary of the event as well.

I was really tired this year after only a few hours of sleep the night before, so that may have biased me, but I didn't feel quite the same good vibes and sense of community as last year. I suspect the venue (Tampere-talo) had something to do with it. The place is ultra clean, and the areas between presentations weren't really conducive to ad hoc discussion.

So here are the presentations that I saw and what inspired me.

Note that I didn't see all of the presentations, and who knows what amazing things I missed. Not being mentioned here doesn't mean your favourite presentation wasn't great.


Science!


Laurent Bossavit: The Art of Being Wrong. How good are your guesses and estimates? How well calibrated are you, and do you ever check your assumptions after the fact? These were the questions Laurent brought up. I've written about this same topic earlier on this blog, only with more focus on testing.

Laurent is the author of a book I intend to read, The Leprechauns of Software Engineering. Superficially it looks very similar to What We Actually Know About Software Development, and Why We Believe It's True, which I read earlier this year. I believe the topic is important, but Wilson's and Oram's book was more like a collection of academic papers demanding more academic research into software engineering. Good and great, but honestly not a very useful read for a practising engineer like myself. So I'm really looking forward to Laurent's book and will return to the subject once I've read it.


Lean!


The next interesting presentation was by Marko Taipale. He presented last year as well, with a very similar talk about the Lean Startup methodology. This time Marko drilled deeper, specifically into how his own company uses Lean Startup tools to develop its business models. Lean thinking and the Lean Startup movement are a huge topic and I won't go much deeper here. Instead, I'll just provide a link to Marko's presentation.

OK, I will say one thing about the Lean Startup movement after all. It's starting to attract real criticism, which is always a good sign: it means it's being used, and people have opinions and hopefully even hypotheses about whether and how well it works. To balance things out a bit, read for example some good critique by Dan Norris. I would love to see Marko Taipale and Dan Norris in a panel discussion about Lean Startup!


Contracts!


I'm a freelance contractor myself, and have been for about three years now. Even before that I was a consultant for ten years, doing hour-based contracting for various companies. Antti Kirjavainen from Houston Inc gave a very thought-provoking and important presentation about the current state of man-hour-based contract work and why it sucks so much for everybody.

I'm hereby officially giving the Best of TGA 2013 prize to Antti! I might return and write a whole blog entry about the topic, but for now, here is Antti's presentation.

The way contracting dynamics are built right now will end up hurting everyone in our industry at some point. Delivering warmed seats instead of identifiable value, and treating each developer as a seat warmer whose hourly price needs to be minimised, is a doomed model and has to change at some point.

I was honestly too tired to have a chat about this with Antti at TGA, but I will definitely do so at some point if I get the chance.


#noestimates


Nice to have a word for what I've done intuitively as a team lead whenever there has been no pressure for strict Scrum or another Methodology! Henri Karhatsu gave a very inspiring presentation about a real-life project he had been leading, transforming the process from poorly working traditional agile planning into a much more streamlined custom process without estimates.

I got a few new ideas to try out from Henri's presentation. I really can't say if my next project will use estimates or not. Sometimes they are good. Sometimes they aren't that useful. There is no predefined rule on how to determine which is the case beforehand.

What I really liked about Henri's presentation was its advocacy of thinking for yourself. This worked for him, but might not work for you - at least not if copied blindly.

What most ticks me off about Methodologies in software engineering is that they are essentially attempts at externalising thinking: "Read this book and follow the steps and you won't have to think for yourself. We have solved this for you."

Do what makes your developers happy and results in the correct business need being fulfilled

This needs to be fine-tuned and figured out in every company and in every project of any significance. That's just how it is. Learn to live with it. Or hire a Henri to figure these things out for you :)


And finally...


Many thanks and hugs to the wonderful people who organised this free (!) event, and to the sponsors who paid for it all!

My sincere suggestion for next year is that the event shouldn't be entirely free anymore. Apparently over 60 people who had registered never showed up. Make it €50 a pop and people might not reserve a seat just in case they feel like going.

Thursday, April 11, 2013

Thoughts on Apotti (Ajatuksia Apotista)

This post was originally written in Finnish because of its topic.



The Apotti project has sparked an unprecedented amount of discussion, both in Finnish IT circles and in the general public. Possibly for the first time, the public conversation has turned to how problematic and expensive publicly funded IT systems are.

I have been following with interest the Facebook group that has formed around the topic. The quality of the discussion is high, and both IT and healthcare professionals take part in it.

The purpose of this post is to collect the themes that underlie the discussion - themes that unfortunately often drown in details and fine-grained argumentation. I'll try to lift the problem to a level where we can let go of circular arguments (example: "agile methods would help - we can't use them because X, Y, Z"). I'll also try to bring a few new concepts and ideas into the discussion that might help readers structure the topic in their minds.


What does good software development look like?

To have something to compare Apotti's way of working against, I'll first present my own best view of how a development model that produces working, sustainable and beautiful software is organised. I'll list properties such a model could have: the more of these are in place in a single undertaking, the more reliably it will produce valuable results.


  1. Ecosystems instead of hierarchical governance.
  2. Early and frequently repeated exposure to reality.
  3. A balanced distribution of risk between the parties.
  4. A software architecture that supports point 1.

Look at what Apple, Google, Facebook and other new-generation technology companies are doing, and you'll understand. Are there particular reasons why public-sector IT projects couldn't work this way? Surely a thousand reasons. I just don't believe any of them survives closer scrutiny.



Ecosystems instead of hierarchical governance

There is an elephant in the room of public-sector IT projects. Everything else gets discussed, but I have yet to see criticism of the fact that these projects are run in a completely top-down fashion. At most, people demand better expertise at the top of the ivory tower.

A better direction would be to minimise the amount of governance and focus on being able to react quickly to stimuli.

In my view, the way to get there is to create a small and highly expert organisation with a mandate from a high enough level to "do whatever it takes to ensure success", so that doing things well doesn't become impossible.

Such an organisation owns all the code, infrastructure and so on that the project produces. In particular, it is responsible for the core found at the heart of every well-built information system. In operating systems it's called the "kernel"; in a patient information system it is the optimised data model for patient data, the database that implements it, and the logic connecting the two. This core is protected by interfaces that the organisation owns and whose development and evolution it is responsible for.

Every ecosystem, then, needs a core. Apple has the AppStore. Linux has the Linux kernel, governed by Linus Torvalds. On top of such a core you can build all kinds of extensions, user interfaces and applications - but in such a way that no single part can break the whole.
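The core-plus-interfaces idea could look, in the smallest possible sketch, something like the following. Every name here is invented purely for illustration; nothing comes from Apotti, Epic or any real specification - the point is only that vendors program against a contract the core organisation owns, never against the core's internals.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// The core organisation owns this contract; vendor extensions build
// against it and never see the internal data model behind it.
interface PatientRecordCore {
    void storeObservation(String patientId, String observation);
    List<String> observationsFor(String patientId);
}

// A toy in-memory core. The internals (here a plain Map) can change
// freely without breaking any extension, because only the interface
// above is public to the ecosystem.
public class CoreSketch implements PatientRecordCore {
    private final Map<String, List<String>> store = new HashMap<>();

    public void storeObservation(String patientId, String observation) {
        store.computeIfAbsent(patientId, k -> new ArrayList<>()).add(observation);
    }

    public List<String> observationsFor(String patientId) {
        return store.getOrDefault(patientId, Collections.emptyList());
    }

    public static void main(String[] args) {
        PatientRecordCore core = new CoreSketch();
        core.storeObservation("p1", "blood pressure 120/80");
        System.out.println(core.observationsFor("p1").size()); // prints 1
    }
}
```

In this kind of setup the tendered extensions depend only on `PatientRecordCore`, which is exactly what lets small vendors participate without any of them being able to break the whole.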

In a model like this, a small core organisation is enough; the rest of the work can be put out to tender. Best of all, it can be tendered to small companies too, so you're not stuck with two or three vendors who may compete with each other but hardly differ in the quality or price of the end result (read: bad and expensive).

Add a well-made user interface standard on top of this, and there shouldn't be big differences in usability between the subsystems.


Early and frequently repeated exposure to reality

The recent, newsworthy stumbling of the electronic prescription system is an excellent example. The project has been developed or prepared in some form since the 1980s. In 2013, as it is gradually being taken into use, it turns out the system doesn't quite meet the needs of reality.

No. The right answer to this challenge is not an even more detailed and thorough specification project. Nor is the answer contracts, drafted by an army of lawyers, that forbid vendors from making mistakes.

Software that is exposed to reality only once it is finished is like a person exposed to life only at the age of 18. Complex systems (people, software) require evolution to become functional.

This is the single most important thing that agile development has brought (back) to software development.

Reality is so complex that a system which works with it must stumble first in order to become functional.

And that is exactly what is now happening with the electronic prescription. The problem is just that the exposure comes too late. Moreover, the system - and especially its governance - is too rigid to adapt to these "surprises". Instead, we now have to wait a year and a half(!) for a change that is utterly trivial from the end user's point of view.



The parties must have a balanced distribution of risk

English has an apt saying: "to have skin in the game". That is what this is about.

Public-sector IT projects in particular, and public administration in general, have a huge problem in this respect. Mistakes, even serious ones, usually carry no consequences for those who make them. On the other hand, success isn't rewarded either. This is the dark side of our civil service system.

In a free market, both mistakes and successes get punished and rewarded. A continuous evolution is under way, constantly producing better solutions to real problems.

The public-sector employees working in the core-owning organisation I mentioned earlier should absolutely have skin in the game. Substantial financial rewards if the project succeeds; the whole team fired if it fails. That's the caricature, of course; compensation experts surely have more nuanced models.


The software architecture must support building an ecosystem

A monolithic purchase like Apotti (if we are still buying the Epic system) does not meet this requirement. Identifying, designing and building a solid core is extremely demanding. Buying one is even more demanding. I honestly can't even imagine how such a thing could be bought "ready-made".

As a shot from the hip, I would consider something like a dedicated cloud architecture for Finnish healthcare IT. This cloud would then run the cores, around which working components could be produced on the free market. The Taltioni service has made commendable progress in this direction.

Of course, governance and oversight are needed here too. But this still isn't rocket science. Apple reviews and approves every application before letting it into the AppStore. The problem can certainly be solved, if we focus on solving the problem rather than expanding the bureaucracy.

Perhaps it would pay to move the management of public IT outside purely public-administration bureaucracy, if it looks like things simply cannot be made to work? A workable solution would be a state-owned non-profit company, staffed not by civil servants but by professionals motivated with proper incentives.

Big Bang

Finally, a few words on a topic that has been discussed at least in the Facebook group and deserves wider attention.

The structure of Finnish public administration, and of healthcare in particular, is extremely fragmented. For such a small country, the number of actors and of different processes, management styles and special requirements is staggering. This problem features heavily in the arguments used to defend the current inefficient and expensive operating culture.

Yes, processes must change, and an information system cannot fix bad ways of working. At best, a system can only make more efficient the way of working and the process that are followed anyway.

In my view, Apotti is attempting a Big Bang: a lot of change at once, driven from the centre. Perhaps the thinking is that only a sufficiently big and grandiose behemoth of a project can tear open the trenches of bureaucracy. Maybe, but the results so far don't look very impressive.

A much more manageable approach would be to build Apotti's core system, and a limited set of applications on top of it, together with one small hospital district (or even a single institution) that is ready for change. Preferably in such a way that the future end users are involved from the start, ensuring the implementation is exposed to reality.

Once there is one success, negotiations with the other units can begin. At this point, I believe, raw business acumen is required - and possibly even pressure from the legislature - so that exceptions to the system, and especially to its core, are not made for every hospital district and backwoods health centre. Either you use the shared system and make your processes compatible with it, or you are on your own.


Friday, November 9, 2012

Of TDD and Unit Testing

Inspired by an amazing talk by Greg Wilson titled What We Actually Know About Software Development, and Why We Believe It's True, I will no longer accept truths about software development without citation!

It's amazing to realise, after soon 13 years of doing software professionally, that the things we take as truths in this profession are often not much more than opinions.

So the first thing I asked Google was: is there proof that unit testing and TDD really result in better software? The most promising result was a research paper by Boby George and Laurie Williams: An Initial Investigation of Test Driven Development in Industry.

The results seem to back up the claims of TDD (and by extension, I'd assume, unit testing): an 18% increase in software quality with only a 16% increase in development time compared to the control group (and the control group mostly didn't write any unit tests, even afterwards, so the increase in development time is probably skewed).

So based on this research I'd have to admit to being wrong about TDD. But... yes, of course I have a but. The researchers themselves admit that the software developed was trivial (200 LOC, a bowling game) with static, completely known requirements. To me this is a very serious problem with the study. All that can be concluded is that TDD works in optimal situations.

Things that this study ignores are:

  1. Granularity of unit tests. In large software there are multiple layers where tests could be written. Are we to assume always the lowest possible level (i.e. the class)? If not, what is the optimal granularity?
  2. Complexity and volatility of requirements. How well would the TDD approach work in highly complex work with shifting requirements, where fast feedback cycles and the ability to change are crucial (i.e. the software will never make it into production if the correct requirements cannot be harvested via fast feedback loops)?
  3. Software size. Test code is often overlooked when measuring the size of the overall codebase. In large refactorings, test refactoring often (in my personal experience) takes most of the time. Is there a limit after which the amount of test code starts to hamper development speed enough to matter?
Since I believe in functional, automated black-box testing via interfaces or the user interface, to me it would be much more interesting to compare software developed with rigorous TDD and unit tests to software developed with tests similar to those the researchers in the TDD study themselves used to determine the quality of the solutions. More specifically, it would be interesting to find answers to these questions:
  1. Which approach yields faster development cycles with a mature code base and shifting requirements?
  2. Which approach survives large and small refactorings better, and how big is the difference in development effort when refactoring?
  3. Is there a difference in overall software quality?
Also, what struck me as surprisingly dogmatic was this quote from the paper: "The industry standard for coverage is in the range 80% to 90%, although ideally the coverage should be 100%" (the paper quotes this from Steve Cornett).

If you've ever worked on a large real-life project of any significance, you'll realise how absurd this is. Getting test coverage to 100% with white-box unit tests requires an enormous, very fragile unit test suite that basically casts your software in cement. Refactoring is of course always possible, but I imagine the psychological barrier to a large refactoring in such a system is very high indeed.

I know what you're saying now: but isn't that just the point? That the unit tests give you enough security to do refactorings? Sure, but don't black-box functional tests do just the same? In fact they do much more: they make sure your software works identically from the point of view of the outside world, which is what matters.
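The contrast can be sketched with the same bowling-game example the study used. The class and its scoring rules below are my own minimal illustration, not code from the paper; the point is that the check at the bottom drives the game only through its public API (roll/score), so the internal representation can be rewritten freely without a single test changing.

```java
import java.util.ArrayList;
import java.util.List;

public class BowlingGame {
    // Internal representation - free to change under a black-box test.
    private final List<Integer> rolls = new ArrayList<>();

    public void roll(int pins) {
        rolls.add(pins);
    }

    // Standard ten-pin scoring: strikes and spares earn bonus rolls.
    public int score() {
        int score = 0;
        int i = 0;
        for (int frame = 0; frame < 10; frame++) {
            if (rolls.get(i) == 10) {                            // strike
                score += 10 + rolls.get(i + 1) + rolls.get(i + 2);
                i += 1;
            } else if (rolls.get(i) + rolls.get(i + 1) == 10) {  // spare
                score += 10 + rolls.get(i + 2);
                i += 2;
            } else {                                             // open frame
                score += rolls.get(i) + rolls.get(i + 1);
                i += 2;
            }
        }
        return score;
    }

    // Black-box check: a perfect game (12 strikes) must score 300,
    // no matter how the internals are implemented.
    public static void main(String[] args) {
        BowlingGame game = new BowlingGame();
        for (int i = 0; i < 12; i++) {
            game.roll(10);
        }
        System.out.println(game.score()); // prints 300
    }
}
```

A white-box suite pinned to the `rolls` list would have to change the moment you, say, switch to per-frame objects; the check above would not.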

I have zero research to back my hypothesis that a much better overall result could be achieved with these ingredients:
  • Automated black-box functional tests for interfaces and the user interface, with the test-code-to-production-code ratio optimised (another thing I'm surprised to never see mentioned).
  • Unit tests for the obvious hot spots in the codebase: difficult algorithms that lend themselves to easy unit testing.
  • Minimising accidental complexity and CDD (Curriculum Driven Development) - this is slightly off topic, but I like to keep reminding people.
In fact, I find it very surprising that functional black-box tests aren't among the first things done in a new project. Yet I keep finding projects where people write 1000-line unit tests for single Wicket components that themselves have half that amount of code. Yes, you'll find out if a change in the component breaks something, but you're still utterly clueless as to whether your actual user interface works as expected as a whole.

I'm starting to think there's a psychological angle here as well. It's about responsibility, and limiting it: "My job is to make this single component. I did it and wrote a unit test to prove it. Now I'm done; I don't need to think about the software as a whole."

That may sound cruel, but I have observed it to be true much more often than I'd like to admit.

I keep coming back to psychology so often these days that I'm starting to think of starting the study of Software Development Psychology :)




Sunday, October 14, 2012

Tampere Goes Agile 2012

Honestly, I haven't taken much part in the Finnish agile movement in recent years. Yesterday was an exception, though, as I went with a good fellow to the TGA yearly conference.

The venue is in the middle of the old Finlayson factory complex and brought back memories from the time I first started studying in Tampere 13 years ago. The conference building was the very same one where I attended business English classes :)

The theme of the conference this year was "building the right thing", a great topic that has been missing from the agile mindset for too long. I've often found myself wondering in the middle of projects: "Why should we strive for 90% test coverage or a perfect Scrum process if it's not a given that the original requirements are sane?"

I'm not going to go through all the talks of the conference; instead I'll just pick a few topics and talks that inspired me the most.

Ideas

Lots of ideas were floating around, and lots of talk about how to turn ideas into business. By far the best talk came from Marko Taipale. Marko is a very inspiring speaker with loads of real-life, hands-on experience. It was also Marko's talk that once again confirmed my own thoughts on how common and bountiful ideas are.

Everybody has ideas. Another talk, given by Ralf Kruse, demonstrated this well. He got the audience to sputter out really cool ideas by the dozen during his session, which was more like a hands-on exercise in creating an initial product backlog out of literally anything. He picked a random guy from the audience, walked through the daily routines of his life (a university student's), and showed how really cool-sounding business ideas can be spawned from just about any aspect of a person's life.

Being part of a small startup myself, I've had a lot of discussions about ideas. The traditional approach, and also some kind of natural instinct, is to keep your ideas hidden - of course for fear that someone will steal your great idea. This, as I've come to realise, is nonsense. There are so many ideas everywhere that people and companies are drowning in them.

Back to Marko's presentation, which was basically about how you iterate from a set of business ideas expressed in a standard format to a viable, scalable business. He laid out a very clear and simple method to follow. I loved the clarity of his thinking. Of course it's not new stuff; most of it was introduced by the Lean Startup movement. But as with agile in general, it's one thing to know stuff and another thing entirely to apply that knowledge successfully in the real world.

For me, the lesson I took home was that ideas are good - they are the fuel a startup needs - but you should not get stuck with your ideas. Most importantly, ideas need to be put out there and tested for viability. My favourite quote from Marko (sorry if I'm misquoting): "The facts are not inside your building."

As long as you just keep fiddling with stuff in your office, all you have are guesses. Personally, I'm not up for coding something for months or a year based on a guess. Not anymore, at least.

Software artistry

This is a topic probably as old as the software industry itself. I found Sam Aaron to be an inspiring speaker as well. He has basically (and very literally) combined art and programming. His presentation bore the title "Hyper Agile", which made me chuckle. But essentially his talk was about approaching programming from an old-fashioned craftsmanship direction: first building your own tools, which then let you create beautiful and artful software for your personal pleasure.

I have mixed feelings about this. I have a background in the arts (music), and Sam's message appeals to that side of me. My pragmatic self wasn't as thrilled. The question that kept popping into my mind was: "How do you create larger software that requires coordinated effort by a team, or even several teams, if every developer has their own 'light saber' that is ever so slightly different from everybody else's?"

Still, I'm definitely going to try Sam's pet projects, Emacs Live and Overtone, which he used to give a really cool live music performance. Basically, Emacs Live + Overtone is a programmable sound interface driven with Clojure.

And that brings us to...

Clojure

Now, I know I've already written briefly about Clojure and basically dismissed it. However, I'm always willing to change my opinions in the light of new evidence, which was kindly provided by the very talented guys at Metosin. They were also sponsoring the event and serving some really fancy coffee (being allergic to caffeine, I can't give a better description, sorry).

The guys from Metosin showed me some production-quality Clojure code from a Real Project, which was a first for me. Looking at their stuff briefly, I was pleasantly surprised: it actually looked readable, not just clever guys being too clever for their own good (as seems to be the standard with Scala). Much respect for that.

There's also a YouTube video of these guys demonstrating how to do stuff with Clojure and Eclipse; it's here

So, as for me, the jury is back out to judge Clojure again. I installed a Clojure plugin into my IDEA, and so far it's looking rather nice. Still, I have to say that part of the reasoning for jumping from Java to Clojure was based on arguments that are not that objective.

Let me explain. If you claim Clojure is better because it doesn't have a 7-level architecture with a relational database, OR mapping, Spring, Wicket and tons of other frameworks, I would argue that you've simply decided to use all this crap with Java - that's not the fault of Java itself. I've written quite a bit about frameworks vs. tools before, and all of that applies here.

The problem with Java isn't the language, in my opinion. It has always been the culture, and the fact that huge corporations like IBM, Oracle and SAP have ruined Java for everybody who lacks a very clear vision of what NOT to use when doing stuff with Java.

I'm in the process of writing a multi-part article on how Java can be used to create equally light and flexible architectures. Stay tuned!

Still, I'm giving Clojure a go, as I love learning new stuff. Expect a report on my adventures on this blog in the future!

Summary

I had tons of fun at TGA. The organisers were super cool and the speakers were all very good. Thanks for this amazing event!

Also, shout-outs to the guys from Nitor Creations and all the other colleagues I stumbled upon. Looking forward to chances to work with all of you!

Saturday, September 1, 2012

JVM languages and why I'd like Kotlin to succeed

Feeling a bit more technical today, I decided to write about JVM languages - those that are not Java, of course.

Does anyone remember when .NET came out and the Java platform was said to be inferior because it doesn't support multiple VM languages like .NET does? In an odd twist of fate, the JVM now seems to be the only real polyglot virtual machine where the non-platform-native languages are actually in widespread use.

For quite some time now I've been searching for a Java better than Java. Without luck so far, I have to admit. I doodle for a while with a new language and then eventually return to Java, for reasons that are actually quite important for real-life development:

  • Java has superior IDE support, especially with IntelliJ IDEA
  • Java's language constructs are simple enough to allow really intelligent IDE support that sometimes makes the tools feel almost like magic
  • The core libraries are, despite some ugly corners, concise and high quality
  • JPA
  • Readability of the code
  • A simple type system with static typing
There are many more things, but these are the ones that generally drive me back to Java.

Jython

In some ways Python is my favourite language syntax: very short and elegant, yet still readable (if you write it properly), and very beginner-friendly.

I did some programming with NetBeans back when it had built-in Jython/Python support, and it was a very pleasant experience.

It's only once you get into things like XML processing, concurrency, persistence and pretty much anything that requires coding against non-core libraries - or, god forbid, frameworks - that things start to get pretty bad pretty fast.

This is of course mostly related to standalone Python, not Jython. 

At the end of the day, what stops me from doing any large development with Jython/Python is the lack of static typing. It's fine for scripts, but I'm just not willing to write unit tests for all the things a compiler will check for free (actually, the IDE does most of the compile-time checks these days, so you don't even have to compile to get the warnings).
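To make the point concrete, here is a trivial sketch (my own example) of the kind of check you get for free from a static compiler. The commented-out line would be rejected at compile time in Java; in a dynamically typed language the same mistake only surfaces at runtime, so you'd need a test to catch it.

```java
import java.util.ArrayList;
import java.util.List;

public class TypeCheckDemo {
    // The signature is the contract: callers can only pass strings,
    // and the compiler enforces this at every call site.
    static int totalLength(List<String> words) {
        int total = 0;
        for (String w : words) {
            total += w.length();
        }
        return total;
    }

    public static void main(String[] args) {
        List<String> words = new ArrayList<>();
        words.add("static");
        words.add("typing");
        // words.add(42);  // does not compile - no unit test needed for this
        System.out.println(totalLength(words)); // prints 12
    }
}
```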

Scala

After I gave up on Python/Jython, I found Scala. A statically typed language with actor-model concurrency, type inference and functional goodies sounded amazing! I bought the green-and-white book, read it for weeks, and was very impressed.

Scala is a language, I feel, that caters to the computer-science-laboratory types among us. It is a very academic language, and I imagine many of its proponents are regular visitors to Lambda the Ultimate.

I still think Scala is a great programming language. I just don't see much value in its added complexity for the development tasks most programmers face daily. The syntax is partly almost mathematical in notation, and thus very short and concise, but also almost impossible to understand at a glance - and glanceability is a very important quality for any source code that will be read, not just written.

To give an example, read this and explain within 15 seconds what this piece of code means:

type Out[C[_], B] = C[_ <: B]

(It declares a type alias Out that, given a type constructor C and a type B, stands for C applied to some unknown subtype of B - an existential type.)

It also has to be said that the tools and frameworks for Scala don't really seem to be made with real-life development in mind. My experience was that APIs are broken without backwards compatibility and without a second thought, and most libraries and frameworks have two-level versioning, meaning a separate version per Scala version.

If I worked in a CS lab, I might think Scala was great; since I don't, my advice is to steer clear of it.

Clojure

Lisp on the JVM! I very much understand the appeal of Lisp, and it's a very good language for teaching how programming languages work in general, since it is a programmable programming language itself.

But to do something serious with Clojure? No thanks.

Groovy

Groovy gets much less credit than it's due, in my opinion. Sure, it's a dynamically (or should I say optionally) typed language with not-so-great runtime performance, but I've rarely been as productive as when doing a project with Grails.

I really like the simplicity of Groovy's syntax. The closures (lambda expressions) are also simple to read and understand, although possibly not as capable as those in Scala or Clojure.

If I had to pick a non-Java JVM language for my next web project, it would be Groovy and Grails. Still, Groovy doesn't satisfy my need for static typing and flawless IDE support.

JRuby

I remember it was 2005, at the JAOO conference, and the creator of Ruby on Rails was presenting his brand-spanking-new framework himself. Everybody was touting Ruby as the next big language, and I suppose it became pretty big.

I never understood why. I understand the idea of RoR, but I think Grails implemented it much better. Ruby as a language isn't much in my humble opinion.

I haven't followed the Ruby camp for years now, to be honest. But when I looked into it a bit, I saw massive problems with the runtime lacking basic scalability and concurrency features we take for granted on the JVM. The attitude of Ruby programmers also left me perplexed: it was like looking backstage at an amateur theatre rehearsal.

JRuby might be an OK idea, but not a very interesting one. I reserve the right to change my opinion given new evidence, though.

Kotlin

I have high hopes for this language. For one, it's being created by JetBrains, the maker of IDEA, the best IDE available for Java (in my opinion). I think they have the correct approach to developing a new general-purpose language.

A freely quoted list of reasons why the language was initiated (from their FAQ):
  • Java compatibility
  • Compiles as fast as Java (Scala, for example, compiles very slowly, which has been reported as a problem in larger projects)
  • Safer than Java (NPEs are gone for good)
  • More concise than Java, with type inference, closures, operator overloading, mixins, extension functions, etc. (but without going absolutely mad like Scala)
  • Simpler than Scala (a huge advantage in my opinion)
With these foundations, I'm expecting a lot from this language, and I hope it gets popular and that they manage to reach the 1.0 milestone.
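To make the feature list above a bit more concrete, here is a small sketch of what null safety, type inference and extension functions look like in Kotlin syntax (based on the language's own documentation; note that Kotlin hasn't reached 1.0 yet, so details may still change). The `greet` and `shout` functions are my own invented examples, not from any official source:

```kotlin
// Nullability is part of the type system: String? may hold null, plain String may not.
fun greet(name: String?): String {
    // The compiler forces us to handle the null case; the ?: ("Elvis") operator
    // supplies a fallback, so NullPointerExceptions are caught at compile time.
    return "Hello, " + (name ?: "stranger")
}

// An extension function: adds shout() to String without touching its source.
fun String.shout(): String = this.toUpperCase() + "!"

fun main(args: Array<String>) {
    val message = greet(null)   // type inference: message is a String
    println(message.shout())    // prints "HELLO, STRANGER!"
}
```

Compare this with the equivalent Java: a null check you can forget, an explicit type declaration, and a static utility method instead of `shout()` reading like a member.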

If there is one JVM language I can see overtaking Java in my daily work, it's Kotlin.






Monday, August 13, 2012

Of introversion, Steve Jobs, offices and agile

What do Pixar, Tom DeMarco, Steve Jobs and 37signals have in common?

For one, they are all mentioned in a book I've been listening to while driving lately: Quiet: The Power of Introverts in a World That Can't Stop Talking by Susan Cain.

OK, so what does her book have to do with software development, you may be asking. On the surface, not much. The book is about the differences between extroverted and introverted personality types. Introverted persons prefer to observe, think and analyse, and generally enjoy solitude more than crowds.

As a sweeping generalisation, it could be said that software engineers tend to be introverted - like myself, for example. In fact, as an even more sweeping generalisation, it could be said that many occupations that benefit from creativity are populated by introverts.

Let's take me for example. I don't like socialising with people much. I hate large crowds and especially loud places where I have to raise my voice. I never talk just to hear my voice and I think meetings are mostly unnecessary venues for extroverted people to have audience for their monologues. Or arguments if there is more than one dominant person in the meeting. I'm mortally afraid of public speaking and doing things in front of an audience. To get work done optimally, I need peace and quiet and freedom.

Now imagine that someone asked me: "What would be the worst possible place to work you can imagine as an introverted person?".

Off the top of my head I came up with a list of these attributes:

  • No privacy
  • Having to share my working space with many other people
  • No freedom to arrange my working environment
  • Lots of obligatory meetings where there's no real discussion of things that matter to my personal work
  • Demanded presence, but no freedom to engage in ad hoc conversations
  • Noise that I can't avoid
  • Interruptions I can't control
Sadly, and not surprisingly, I just ended up describing the average open office environment most of us have to work in these days. This is bad. Really bad.

For a while at the beginning of the 00s, it seemed like things were getting better in at least one regard: you could escape the horrendous office spaces and work remotely. Then the agile boom swept the industry and, oh boy, look at what's being implemented by literally everybody, even government IT projects: mandatory 100% on-site presence. All the other fancy stuff of agile is just too complicated or too cumbersome to do, but hey, let's make the bastards all sit in a 20-square-meter space with absolutely no privacy and we'll reap the benefits of agile!

In all honesty, I'm avoiding high-profile agile projects these days for this reason alone. Or, if I'm forced into one, I'm quite inclined to simply break the rules and hope I'll be forgiven because of how much more productive I am than the other people in the team, who obediently sit in less space than zoo animals are given and desperately try to get some privacy for actual work by listening to music through insulated headphones.

I would love to see a reform in agile project culture that would relax the fundamental attitude regarding this. The Linux kernel wasn't and isn't made by people sitting in the same room. The World Wide Web wasn't created like that either; in fact, it was created for just the opposite purpose.

A very interesting subtopic of Cain's book is that creative teamwork, contrary to common belief, isn't more productive than working alone. Great results have most often been achieved when team members have been able to work by themselves and then share and collaborate online, at a distance!

And here, of course, is where Pixar headquarters and Steve Jobs come into this story. If I had to work 100% of my working time at an office, that is where I would like to work. Pixar headquarters is usually mentioned because of the huge main atrium that provokes unplanned, ad hoc sharing of ideas. But just as important, probably more important, is the fact that besides this atrium, where you can go and meet people when you feel like it, the creative employees have private space where they do the actual work. I think, for many reasons, this is optimal, and the closer you can get to it, the better.

But office space is really expensive! It is. So why not have just the atrium and some space for the people who really want to be at the office most of the time? And with the saved rent, give your employees a small budget to decorate their home offices; buy them a good chair and a good desk. Then have one day a week when most people meet at the office atrium and have a good lunch. Let them share war stories from the projects they're working on and talk about new ideas. Anything but making them sit and watch a slideshow by a manager about manager stuff.

But wasn't the point of being in the same room that you could just lift your head and ask a teammate when you wanted to know something? The funny thing is, my personal observation is that the physical distance between two people has much less to do with this happening than their willingness to communicate in the first place.

Here's where I wholeheartedly agree with 37signals founder Jason Fried. Use good online communication tools that enable passive communication, and then try to create a working atmosphere where this passive communication is encouraged. A good way to do this is to have a team spread across locations so thoroughly that no one involved feels like she's in a position to expect the others to come to her physically to share information.

Oh, I almost forgot about Tom DeMarco. I mentioned him because I've read his book Peopleware many times. He's been saying this same thing since the 1980s, and we've only gotten worse. The working environment is crucial for people who do creative work, especially when they are introverts - like many people in creative arts and crafts are.




Positive effects of project crisis

A while ago I had the pleasure of being the lead developer/lead designer of a mid-sized software project. When I entered the project, it had been running for about six months already, with very high goals and a general attitude of "we're making the perfect thing here".

Time for a confession: I find projects like the one described above distressing. Weird, huh? Most programmers see these projects as "dream jobs". A seemingly unlimited budget, finally a place to use all of the cool new frameworks and tricks you've seen the rockstar developers use. Maybe even do some coding in Clojure, right? And when wearing my programmer's hat tight on my head, I agree.

I'm sure there are happy endings for such fairy-tale projects, but my own experience says otherwise. The crash with business reality seems almost unavoidable. At some point the project's budget shows up on the radar of the evil executive who has no understanding of the beauty of Clojure, nor of how important test coverage over 90% is. What he is thinking is: "OK, we're pouring a ton of money into that project, but the ROI is either unknown or fairly small. Either the ROI needs to go up or the expenses need to go down, preferably both."

And so the budget cutting begins. I've experienced it many times now, but it always feels just as bad. Quite often the whole project is eliminated, and possibly code worth months of hard work is thrown out, basically meaning you could have spent the last months at home playing Skyrim.

So, having been through this numerous times during my career (at one point I had a period of three years of development without a single line of code going into production), I've become almost obsessive about avoiding it, for the very simple reason that I really hate creating software that ends up in the garbage bin.

But back to the project I was leading a while ago. As you can probably guess by now, the budget was cut. Not completely, though, and with a chance that if the project could produce something of business value with the remaining budget and a really tight schedule, it might go into production. What we software engineers want is to get our babies into production, so everybody in the team went into a kind of "lifeboat" mode where everything unnecessary was thrown away and all our efforts were focused on producing business value as soon as possible.

This is, of course, a very crude simplification of the situation to suit my needs for this blog entry. Against the odds we succeeded: the project survived and went into production, and the story continues. But the interesting bits are on the table now, so let's investigate a bit.

Essentially what happened was that the project's life was threatened, but with a chance to survive if real value could be produced quickly. As a result, the project went through a metamorphosis and changed into a very focused, ultra-slim (yes, yes, "lean" is the word to use, I know) creature that could produce business value fast and on a very small budget.

As a joke to a colleague working on the project, I once said: "Maybe we should establish a software development method that deliberately drives projects into a brick wall right at the beginning, to get rid of all the fluff from the get-go". The thought was really hilarious at first, but it's been bugging me ever since.

So here's what I'm going to do the next time I'm leading a project that's starting either from scratch or on a new release: after an initial backlog and roadmap have been produced, I'm going to conduct a workshop with the stakeholders and have them go through the exercises of "what if the budget is cut in half" and "what if the budget is reduced by 75%". The items remaining on the backlog are the ones that will be done. And with the correct focus from the beginning, it's possible to have the desired code quality, because there will be less code to begin with.

I suppose you could call the exercise an "emergency evacuation rehearsal" or something like that. In essence, it's an attempt to defend against Parkinson's law, which in software development could be expressed as:
A software project always expands so as to fill the budget available for its production
Because let's face it, "Curriculum Driven Development" is a large problem in our industry. And even when that isn't the case, without proper focus even developers with good intentions will produce loads of unnecessary features and code.

The best thing a software project can produce is valuable features, but the second best thing is to avoid producing features that aren't valuable. Let's rephrase this into a "law" of mine that I try to live by:
Best code is unwritten code