Resources, Scope, Quality, Time. Those are the dimensions of your project. Memorize those words and work from them every time you report status, and every time you think about what to do, or not do. And make sure you control at least two of them in any project.
Today I'm thinking about FourVariablesUnderStress. -- RonJeffries
We separate out quality just because cutting it is the sneaky way of reducing scope, sabotaging the project privately while looking good publicly. This isn't a good thing. I'm not convinced, OTOH, that raising quality expands scope. For sure, the XP Testing practices provide known and high levels of quality while actually speeding up development. But it seems to me that above reasonable levels of quality, the scope curve is pretty flat, and that it's more a matter of which quality practices one uses. Discussion, please. --RonJeffries
The "time people scope" triangle seems to me to be a more powerful metaphor for stability than a square.. -- DaveSmith
I, too, prefer the triangle as a metaphor. But the hidden tradeoff is to lose quality while appearing to make time/people/scope. If I had to drop something to get a triangle, I'd drop Resources, since they're generally more fixed anyway. Maybe we should think of it as if changing Resources moved you to a different tradeoff triangle for Time/Scope/Quality. --RonJeffries
What about a tetrahedron (i.e. a three-sided pyramid) as a metaphor: Four corners and still proportional trade-offs? There is a nice extension to this metaphor: When you look flat on one of the triangular sides of a tetrahedron, the fourth vertex is hidden! --HaskoHeinecke
Wouldn't it have to be an asymmetric tetrahedron, and probably one with curved faces? That is, the relationships between the vertices are not simple linear proportionalities.
In fact, now that I think about it some more, aren't we really talking about a region of a four-dimensional vector space with... no, I've thought about it too much now. Sorry.
Don't argue for it, just do it. Producing internal quality as you go along is not a time spender, it's a time saver. RefactorMercilessly is just one of our coding practices. We keep our nest clean because it makes us go faster. When we failed to do this, we saw the problem first in our LoadFactor, i.e. we were producing working functionality more slowly. --RonJeffries
While I certainly do agree with Ron, I also pushed myself to look at things from marketing/sales/management's perspective, and discovered that ImplementationDoesTooMatter to them as well.
Smalltalk and ENVY facilitate RefactorMercilessly--you get rapid incremental recompiles, and distributing the results to the team is easy. Unfortunately, many of us are working in C++ environments. Given the current state of C++ tools, particularly in the PC world, frequent refactoring in large, ongoing C++ projects has a high cost (for a number of reasons, including compile times, the increased need for and difficulty of merging parallel changes made while the refactoring is underway, the need to jigger dynamic casts, etc.) It's a lot harder to "just do it" and hide the refactoring from interested onlookers. --DaveSmith
With lower internal quality, changing and extending the software will be more time-consuming and hence increase TimeToMarket. Now, does marketing want to sacrifice TimeToMarket? -- AndersChrigstrom
They want to defer paying TechnicalDebt until after the next release. It's their version of XP. You Aren't Gonna Need That Internal Quality, so postpone it indefinitely. -- DaveHarris
That's interesting. Why then is it right when we (development) do it and wrong when they do it? Is it wrong when they do it? Two conflicting XP processes? Need to integrate them, take a larger view? -- SimonMichael
The problem with the business types practicing YouArentGonnaNeedIt is that this concept only works in conjunction with RefactorMercilessly. These two practices are yin and yang. When one practices YouArentGonnaNeedIt without RefactorMercilessly, one produces a lot of redundancy and unnecessary complexity. And how does marketing practice RefactorMercilessly? -- RobMandeville
Thing is, three weeks without keeping your house clean is long enough to start slowing down. Seriously, I wouldn't raise the issue with marketing any more than I'd raise the issue of team formatting style: it's technical's responsibility. And if it was raised, I'd just tell them that the approach we use has been shown to let us go faster than the alternatives. If they don't like it, they can get someone else. -- RonJeffries
I've seen some folks talk about 5 or 6 variables (more like "dimensions") rather than just 3 or 4: Cost, Quality, Schedule, Scope, Staffing, & Resources (sometimes staffing & resources are lumped together) -- BradAppleton
We lump cost, staff, and resources under Resources, and break them out under that heading. Similarly, we break out unit, functional, and parallel tests under Quality. Etc. --RonJeffries
Anything more than two dimensions will confuse pointy-haired managers. ;-> Ask them how they want to trade off Time (Schedule), Cost (Staffing & tools), and Features (scope). Quality is not negotiable. As Schedule and Staffing are usually fixed, the only option is to vary Features / Scope -- which can be very effective. -- JeffGrigg
Perhaps quality should not be negotiable, but often it gets squeezed. If a team (I'm talking about the whole team: supervisors, customers, whatever) continues to mention it in the negotiations, then it may help members to point out to each other when and where the Quality squeeze is creeping in. -- EricHerman
Quality should be negotiable. If you make an FP error in an Ariane5, you lose a few billion euros. The same mistake in a space shuttle kills people. If you apply space shuttle quality standards to commercial software, then Microsoft has cornered the market before you reach release 1.0. You might say that increased quality speeds up development, but only to a point. The graph has a U shape. Once you're on the wrong side of the U, you need to quantify and justify the costs and benefits of the additional rigor. -- DaveWhipp
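A toy model of that U shape, just to make it concrete (the cost function and numbers are invented for illustration, not DaveWhipp's):

 # Illustrative only: defect cost falls as rigor rises, while the cost of
 # applying rigor grows roughly linearly, so the total has a minimum.
 def total_cost(rigor, defect_cost=100.0, rigor_cost=10.0):
     return defect_cost / rigor + rigor_cost * rigor

 for r in (0.5, 1, 2, 3, 4, 6):
     print(r, round(total_cost(r), 1))
 # 0.5 205.0 / 1 110.0 / 2 70.0 / 3 63.3 / 4 65.0 / 6 76.7
 # Up to roughly rigor 3 the extra rigor pays for itself; beyond that you are
 # on the wrong side of the U and the added cost needs its own justification.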
What you've discovered is that, at a certain point, Reliability and Performance cease being part of Quality, and start becoming part of Scope.
In many places I visit, they consider scope fixed, and feel they can only move the schedule or add people. It takes much work on my part to introduce them to the notion that they can hold schedule and reduce scope. -- Alistair
ScottAmbler in the ProcessPatternsBook says that there are four "Estimating Factors" (Figure 4.6), but I think they can easily be reduced to the Four Variables shown above.
cf. the BalloonStory
However, Quality is not independent of Time and Scope. If you reduce Quality, eventually development slows to a snail's pace as the programmers struggle with bugs and bad code, and so Time must be extended or Scope reduced.
More scope takes more time, higher quality in the same time reduces scope, more programmers change scope/time, etc. None of the variables is independent. Wasn't suggesting that they were.
Most current theory proposes an inverse relationship between quality and time and resources. Simply put, crappy code takes more time, effort, and manpower to get working. Scope is increased with good quality because of the reduction in time, effort, and manpower. What are the arguments contrary to this reasoning? -- WayneMack
Exactly! You can trade the other three variables off against each other by decreasing one of them to keep the other two steady or improve them. E.g. decrease scope to keep within budget and time, or increase time to maintain budget and scope. However, the opposite holds for quality. If you decrease quality, you soon increase the time requirements, reduce the scope, go over budget etc.
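A trivially small worked example of trading the first three against each other while Quality stays fixed (story points and velocities are made-up numbers):

 # Made-up numbers: 30 story points of scope at a velocity of 10 points/week.
 scope_points = 30
 velocity = 10                      # stands in for Resources
 print(scope_points / velocity)     # 3.0 weeks

 print(20 / velocity)               # cut scope to 20 points: 2.0 weeks
 print(scope_points / 15)           # raise velocity to 15: 2.0 weeks
 # Quality offers no such simple dial: lowering it feeds back into velocity
 # rather than trading off linearly.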
What seems to be missing from this discussion is that what the four variables add up to, in a non-linear way, is efficiency (or effectiveness, or something like that). It is easy to see: by getting lazy, you can lower your scope, expand your schedule, and so on, without affecting the other variables. Maybe this is not mentioned because productivity is a sore spot for software developers. In non-Agile projects, mistrust tends to develop between management and developers; developers have to put in 50, 60, 70 hour weeks to try to prove that they are not lazy, because real programming productivity is so hard to measure.
A slightly off-topic example that comes to mind is NASA. When Daniel Goldin announced his "Faster, Better, Cheaper" policy, it was a thinly veiled admission that the agency was a bloated, inefficient bureaucracy, and that he intended to turn it into a lean, mean organization. (I'm taking "Better" in this context to mean both quality and scope.) (The failure of three Mars probes in a row underscores the point made earlier on this page that quality is the easiest variable to sweep under the rug.) Just the thought of trying to increase all the variables simultaneously...
In his book ExtremeProgrammingExplainedEmbraceChange, Kent Beck calls Quality a "strange" variable (sorry, I don't have the book on hand, so I can't provide the exact quote). He called quality "strange" because he was able to save time and increase quality by doing test first. It seems true enough, but he doesn't explain why. He leaves it a mystery. It's easy to explain in terms of efficiency. If you increase internal quality of the software, you are helping yourself code, and thus you increase your efficiency. And any time you change your efficiency level, you can make an apparent violation the conservation of the four variables.
(I say "internal" quality because I don't think that external quality (on the user interface, that is) has the same effect. That should actually slow you down.)
Do the variables add, or multiply?
With respect to your NASA example, apparently after the series of spectacular Mars mission failures the internal joke became "Faster, better, cheaper - pick two." The triangle reborn. -- MichaelPrescott
I favor the punchier variant: "On time, on budget, on Mars - choose two." -- Anonymous
I prefer to think of it as 3 variables:
Effective Resources, Schedule, Quality.
Effective Resources is a measurement of how fast you can get work done -- it's basically the team's velocity (XP style).
Schedule is what is desired to be delivered when; it's a combination of Scope and Time, and is basically the "requested velocity".
Regarding the 3-variable solution above, Kent Beck's "strange" Quality variable can be explained very simply: Effective Resources are increased by the better internal quality of the software.
Same mechanism as the original author's, but with a different model.
I picture it working like this, for iteration N
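(A minimal sketch of one way that per-iteration feedback might look -- the scaling rule and the numbers are illustrative assumptions, not the original author's:)

 # Assumption for illustration: the velocity available in iteration N is the
 # raw team capacity scaled by the internal quality left behind by iteration
 # N-1, and skipping refactoring erodes quality a little each iteration.
 raw_capacity = 10.0      # points per iteration with a clean code base
 quality = 1.0            # 1.0 = clean; lower = accumulated design debt

 for n in range(1, 6):
     velocity = raw_capacity * quality
     print(f"iteration {n}: velocity {velocity:.1f}")
     quality *= 0.9
 # Velocity drifts from 10.0 down to 6.6: Effective Resources shrink as
 # internal quality is spent.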
Someone describes the four variables as an IronTriangle? where "quality" is in the middle and can't be changed, the reason being that sacrificing quality means sacrificing schedule. I find DesignDebt to be the clearest way of understanding this: sacrifice quality for schedule and you've just taken out a high-interest loan on your source code. You'll be making interest payments in the form of increased cost of change from now until you improve quality again.
I've come to understand that DesignDebt is high interest. I've had the opportunity to lead a few projects that were fanatical about EliminateDesignDebt, so I've seen first-hand what happens when you let even a little bit intrude. The costs occur immediately, within a few days, and quickly outpace the time savings I got from sacrificing quality.
By the way, when I say "quality" above, I'm talking about internal quality--the quality of the design, automated build, revision control system, and other technical issues. External quality seems like a scoping issue to me.
--JimShore
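A toy compounding illustration of that "high-interest loan" (the savings and the drag rate are invented numbers, not Jim's):

 # Assume skipping cleanup saves 2 days today, but every later day of work in
 # that area takes 25% longer because of the mess.
 savings = 2.0
 drag = 0.25

 paid, day = 0.0, 0
 while paid < savings:
     day += 1
     paid += drag
 print(day, paid)   # after 8 working days the interest has eaten the savings,
 # and from there on the loan only costs more until the quality is restored.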
Anyone involved in embedded software development (especially of high-priced equipment) knows that quality is negotiable. Calling back devices or sending out mechanics is very expensive. That's why quality is usually more important in embedded software development (the NASA example) or software that could endanger people (medical applications). Quality is a variable, although it's usually pretty fixed within a department.
-- Wilco
I see the relationship between the 4 variables differently. The system takes four inputs (each of which can obviously be broken into component parts): Scope, Quality, Time, Resources.
The system adds Scope and Quality to get a total, and takes away Time and Resources. The result has to be zero or the project won't work. This is generally true; some things are fixed and others can budge. In an XP environment Quality is fixed: no bugs are acceptable. That means that to reduce time or resource use you must reduce scope.
There are essentially two sides to this idea: Scope and Quality make up one side, and Time and Resources make up the other.
One problem comes when you have a setup as follows. There is a set quality, scope, and time, but the sum isn't 0. To fix it the manager throws resources at the problem to make it go away. The problem with this approach is that all the inputs are bell-curved, so if you add a disproportionate amount of time or resources, a law of diminishing returns kicks in after a certain point. Adding 200 developers to a project because it is due in a week is obviously not going to fix the problem; having one developer and giving them 2 years will not fix the same problem either.
Another problem is that if you add quality or scope to the project, you get an exponential result: the more quality I demand, the more the time and resources need to increase in response. For example, I have a program with a given degree of complexity -- let's say it helps control the space shuttle -- where the degree of acceptable error is orders of magnitude lower than for commercial software, so the time and resources must increase disproportionately. This means every time your boss says "I need twice as many features" you have to say "We need four times as many staff and four times as much time, and you can't just add time or just add staff".
Hmm, that seems to be how it would work to me.
-- RichardKroon?
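A back-of-the-envelope rendering of the balance and the nonlinearities described in the comment above (the common "effort unit", the overhead factor, and the quadratic coupling are all invented for illustration):

 # Zero-sum balance: work demanded (Scope + Quality) versus capacity bought
 # (Time + Resources), in a made-up common unit.
 scope, quality = 8, 4
 time_budget, resources = 6, 6
 print((scope + quality) - (time_budget + resources))   # 0 -> the plan closes

 # Diminishing returns: each extra developer adds a bit less than the last,
 # per the usual communication-overhead argument.
 def effective_capacity(developers):
     return sum(1.0 / (1.0 + 0.1 * i) for i in range(developers))
 print(round(effective_capacity(5), 1))    # ~4.2
 print(round(effective_capacity(10), 1))   # ~7.2, not twice the output

 # Superlinear demand: assume doubling features at the same rigor roughly
 # quadruples effort, matching the "twice the features, four times the staff
 # and time" claim above.
 def effort(features, rigor):
     return (features * rigor) ** 2
 print(effort(10, 1.0), effort(20, 1.0))   # 100.0 400.0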
This page mirrored in ExtremeProgrammingRoadmap as of April 29, 2006