Digital asses in the computing industry
Ever noticed how academic asses are analog and industrial asses are digital? It's legitimate to not know whether P equals NP,
or to not know what x is if x*2=y but we don't know y, for that matter. But it isn't legitimate to not know how many cycles,
megabytes or – the king of them all – man-months it will take, so numbers have to be pulled out of one's ass.
The interesting thing is that the ass adapts, that the numbers pulled out of this unconventional digital device aren't pure
noise. Is it because digital asses know to synchronize? Your off-by-2-months estimation is fine as long as other estimations are
off by 5. But it's not just that, there must be something else, a mystery waiting to be discovered. We need a theory of
computational proctology.
Ever noticed how painful the act of anal estimation is for the untrained, um, mind, but then eventually people actually get
addicted to it? Much like managers who learn that problems can be made to go away by means such as saying a firm "No", without
the much harder process of understanding the problem, not to mention solving it? Anal prophecy is to the technical "expert" the
same raw enjoyment that the triumph of power over knowledge is to the manager. "Your powers are nothing compared to mine!"
There once was a company called ArsDigita (I warmly recommend the founder's blog and have his Tenth Rule tattooed all over my psyche), a name I tend to
misread as "ArseDigital" – a tribute to an important method of numerical analysis and estimation in the computing industry.
Nice post, buddy. Yes, it often gets irritating with
managers who are foolish enough to assess resource strength by
intuition rather than by facts and figures. One thing I do to satisfy
myself is not to compare him and me at time 't'. Instead I do that
comparison at age 'a'. You would feel much better if you did the same.
Well, it's not like anyone has much choice: either you have enough
information or you don't (except when one ignores information out of
addiction to the way of doing without it).
The nice thing about a work plan – which I've experienced first hand –
is the transition, or maybe ascension, from the ridiculous to the
sublime.
You write a work plan because you need to write a work plan, and
everyone involved knows it's completely ludicrous, with all of the time
estimates completely made up.
But somehow, you start working according to this plan, and your
deadlines and demands from those involved are based on the completely
arbitrary time estimates that you all made up.
I suspect there's a moral here for life in general.
Life as ascension from the ridiculous to the sublime? An optimistic
outlook.
Yossi wrote: "Your off-by-2-months estimation is fine as long as
other estimations are off by 5."
But this is as it should be, isn't it? Guesstimating too far in
either direction will result in either late fees or errors in planning
and resource allocation. If the error is smaller you will trust its
source more, right?
Sure, though I meant the relative error, not the absolute. An
asstimation which is off by 2 years is also fine as long as other
asstimations are off by 5; though the "errors in planning and resource
allocation", as you've politely put it, far exceed those resulting from
an error of 2 months, it's still someone else's ass that absorbs most
of the consequences; hence the importance of syncing.
Eleven years ago, I had a manager-boss whose #1 qualification was
that his father had been a middle manager for a big oil company.
The first clue that he shouldn't have been in that position was that
he promoted to assistant manager the two people who had been in the
department for the least time, and whose cluefulness was directly
proportional to that time.
Not long after he fired me (my behavior threatened to expose his
ignorance), he told his boss that the department could develop a battery
of tests for a new system module in two weeks.
When word of this got to the rank-and-file, the rank-and-file
requested, and got, a meeting with the manager's boss, the VP of the
division. They informed the VP that, in order to get such tests
developed in two weeks, they would have to work seventeen hours a day,
seven days a week.
The manager was gone the next day.
Unfortunately, that was just one symptom of how bad things there had
become. A year later, the company was sold to a Dutch corporation,
which eventually fired everyone and moved the chattel to Amsterdam.
The success of the maneuver with the VP indicates that things weren't
completely rotten; a pity the place went under nonetheless.
The fact that the maneuver was necessary at all outweighs any
success it accomplished.
Well, I wasn't there, and the fact is that they did go under quite
soon, so I can't really say anything beyond what I think is the general
rule: there is no organization that doesn't err (as in promoting
someone to a position where he is incompetent). The only difference is
that some organizations tend to correct their errors (as in firing him,
or putting him into a situation where he chooses to quit) and some
don't. I'm somewhat pessimistic regarding the chances of the existence
of flawless meritocracies, though there can be an overall meritocratic
trend.
My experience after decades of Waterfall followed by five or six
years of Agile is that _nobody knows_ how long a given development task
is going to take, if it's going to take longer than a day or two, and
frequently not even then. _Every_ estimate is a lie, period. There is
_no way to tell_ how long it's going to take from the beginning, and one
very important reason–although not the only reason–is that at the
beginning, nobody really has any idea what "it" is. Oh, people will
swear up and down that they know _exactly_ what "it" is, and they'll
wave three-hundred-page documents around, but those are lies too.
What's lovely is how pointing this out will only get you in trouble.
The rewarded reaction is to pull an asstimate with a straight face, and
without a trace of shame; and when it turns out wrong, explain with the
same straight face and an equal lack of shame how that couldn't be
predicted.
Yup. The thing to do is to set a final condition and then repeatedly
add the next most important single-iteration feature (as defined by the
product owner) until the final condition is reached. The final condition
might be a cutoff date, or it might be a budget limit, or it might be
the point where the value of the next most important feature is less
than the value of the time and money that will be required to develop
it. The final condition might change over the course of the project. It
almost certainly will _not_ be "When the project is complete," because
by the time the rubber meets the road everyone will have different
definitions of "project" and "complete."
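The loop described above – keep adding the next most valuable feature until a final condition fires – can be sketched in a few lines. This is only an illustrative toy, under made-up assumptions: the `Feature` record, its `value` and `cost` fields, and the greedy value-first ordering are all hypothetical stand-ins for whatever a real product owner would decide iteration by iteration.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    value: float  # estimated value to the product owner (hypothetical units)
    cost: float   # estimated days of work (an asstimate, of course)

def plan(backlog, budget_days):
    """Greedily take the next most valuable feature until a final
    condition is reached: the budget is exhausted, or the next
    feature's value no longer exceeds its cost."""
    done, spent = [], 0.0
    # product owner's priority: most valuable first
    for f in sorted(backlog, key=lambda f: f.value, reverse=True):
        if spent + f.cost > budget_days:  # final condition: budget limit
            break
        if f.value <= f.cost:             # final condition: not worth it
            break
        done.append(f.name)
        spent += f.cost
    return done, spent

backlog = [Feature("login", 10, 3), Feature("reports", 4, 5),
           Feature("search", 8, 4)]
print(plan(backlog, budget_days=10))  # -> (['login', 'search'], 7.0)
```

In practice the final condition (and the backlog ordering) would be re-evaluated every iteration rather than fixed up front, which is exactly why "when the project is complete" never works as the stopping rule.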
Anybody who thumps a two-inch-thick requirements binder and says,
"We're going to have this project done in twenty-four months" is lying
and headed for disaster.