I remember, twenty-five years or so ago, attending college, studying Computer Science, and hearing about some new graduate courses being offered in something called 'Software Engineering.' This was back in the mid-1980s, when most college 'Computer Science' (if you can call it that) curricula consisted mostly of courses on how to build COBOL programs (mostly reporting) so students could become gainfully employed as 'fat, dumb, happy COBOL programmers' (tongue in cheek) for some large business.
I had not completed my undergraduate degree, but I had gone through all the undergraduate Computer Science courses I was interested in, and Software Engineering sounded more interesting than filling my schedule with liberal arts courses (so much for the well-rounded student). I had no idea what I was in for. I just wanted to know how to get my code to compile so I could make a living doing so. We studied the works of the pioneers of Software Engineering (Booch, Jacobson, and others) and read Brooks' 'The Mythical Man-Month.'
Computer Programming started out as simple ones and zeros (and still is, and always will be, at the lowest level), but abstractions upon abstractions were created to make programmers more efficient. I remember typing machine-language programs into my first computer (a Commodore VIC-20) and being excited to learn assembly language, where an instruction could produce ten lines or more of machine code. Woo-hoo! And then higher-level languages were developed that produced more and more lines of assembly and machine code.
Programmers produced more and more machine code for the same or less effort. This certainly seemed like an improvement in efficiency, but each abstraction created another layer between programmer and machine, and another opportunity for increased complexity and errors. Instead of 'programming' a machine, programmers were writing code in an IDE that generated code for a compiler, which generated code for an interpreter, which generated code for a linker, eventually producing code for virtual machines/runtime engines that then generated the code for the machine. Programmers had gone from programming for a specific machine/chip architecture to writing code without even knowing what architecture it would be deployed on.
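To make that expansion concrete, here is a tiny sketch (in Python, purely for convenience; the function is invented for this illustration): a single line of high-level source becomes a list of lower-level instructions the programmer never writes by hand, and those instructions run on a virtual machine rather than on any particular chip.

import dis

# One high-level function with a single meaningful line of source.
def total_price(quantity, unit_price, tax_rate):
    return quantity * unit_price * (1 + tax_rate)

# Disassembling it shows the handful of bytecode instructions that one line
# becomes; the interpreter then translates those into many more machine
# instructions for whatever architecture it happens to be running on.
dis.dis(total_price)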
Sure, this made it possible to do more and more, but the more that could be done, the more applications demanded, and software development started to take incredible amounts of time and money. This was well documented in Fred Brooks' monumental 1975 collection of essays, 'The Mythical Man-Month,' which drew on his experience developing the IBM OS/360 operating system.
Software had become too complex for an individual or a small group of individuals to successfully deliver a working system without some formal organization and processes. The problem was studied by the early pioneers of software engineering, and many possible methods and 'standards' were proposed and published for the development of software. I wonder if this was the same process for the genesis of other engineering disciplines. Were buildings, bridges, and roads once constructed with little planning or thought until the value of engineering was realized? At some point, the risk, complexity, and cost of building something become so great that engineering must be employed. Software was no different in this respect.
I became fascinated with the field of Software Engineering and read everything I could find on the subject (mind you, this was before the personal computer or the Internet was commonplace; I actually had to go to a physical library and locate physical books. Imagine!).
I was inspired to head out into the working world and show businesses how to 'engineer' software so it could be delivered promptly and correctly! Of course, 'prompt' had a different meaning back then. I remember one early project I worked on where development was scheduled for five years, with no coding to begin until the second year, after a full year of analysis and requirements gathering and definition. 'Waterfall' hadn't become a dirty word yet, and requirements changing after the one and only 'requirements' phase was considered very bad and often unacceptable. 'Freezing' requirements permitted the programmers to design and implement a system that met the desired requirements.
As a programmer I was pretty happy with this. I knew all the requirements I needed to meet before I started coding, and I knew exactly when the system was done. Life was good for me. Not so much for the stakeholders/clients, the people we were building the systems for in the first place. Of course we were implementing a set of 'frozen' requirements in order to be able to complete a system, but the real-world requirements never stayed frozen. The needs of the users and their systems were always changing, and by the time we delivered the system, many of the features were already obsolete and there was an immediate need for a new one.
So the engineering analogy broke down for software development. We needed to employ software engineering in order to successfully produce reliable systems, but these systems were irrelevant before they were finished. I don't imagine this happens a lot in other engineering disciplines. The requirement for a bridge is to permit travel from one point to another, and it is unlikely this requirement will change during the actual construction of the bridge (over a long enough time the capacity requirements may change, requiring the bridge to be expanded, but that is nothing like the dynamic nature of software).
So after all those years of attempting to define a software engineering discipline, maybe there really was no such thing. The good news was that in the intervening years, hardware and software development tools had evolved greatly, and software could now be built much more easily and quickly. When I first started programming, we had to flowchart and hand-code everything on paper before converting it to punch cards and submitting it to a time-sharing system. We literally, physically took the deck of cards to an operator at a computer center. Jobs were run overnight and the output was available the next morning. At most, you got one compile a day. With the evolution of computers and software, we could compile (and eventually run without even compiling) and test our software over and over again, very quickly.
So maybe the whole existence of software engineering was a myth, and technology advances had eliminated the need for a formal process to produce reliable software? Okay, so the users/stakeholders must be happy now, right? Well, no. Since real-world requirements are always changing, programmers are constantly scrambling to re-gather requirements and alter the system's implementation in the middle of the process. The users still never get the functionality they need before it is obsolete, and the requirements become so complex and dynamic that it is nearly impossible to catalog them and implement a reliable, working system.
Sometimes I think, "What do I care? I'm making a living whether the user ever gets what they need or not." But of course this is highly unsatisfying (for the users and for the software developers). What if a building engineer spent their life designing structures, but none of their designs were ever completed because the needs of the structures were ever-changing? How satisfying would that be?
Programming is such a paradox. It is beautiful and highly rewarding in the power of what can be accomplished and created, yet cruel in that nothing is ever truly 'code complete' (no disrespect to Steve McConnell or his excellent book). We end up using terms that have little or no meaning, like 'code complete.' I don't know how many times I have been asked if something was 'code complete.' Well, in my opinion, nothing is ever code complete. Even if a system has absolutely no bugs and 100% unit test coverage with all tests passing, it is only complete in that it meets requirements that are no longer relevant. And of course it is not practical to spend the time getting complex business systems into this state (it would take so much time and money, there would be no point). So it is acceptable (or is it?) to release, and even charge great sums of money for, software that is known to fall short of meeting all user needs and to contain defects.
In summary, software development became so complex and quality so poor that it was necessary to define a new engineering discipline. However, software quality was no better and customers' needs were still not met. Successfully 'engineering' software just wasn't possible. But even though hardware and tools have improved immensely, the needs of business and software are still growing and changing so rapidly that software cannot be reliably produced without some formal process.
Is a new solution needed? Object-oriented programming was supposed to offer the additional abstraction and modularization of code to make it fully robust, reliable, and testable. But like all abstractions, it has some holes, or 'leaks,' as some have grown accustomed to saying.
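To illustrate what a 'leak' can look like, here is a minimal, hypothetical sketch (the class, names, and numbers are invented for this example, not taken from any particular framework): an object that presents remote configuration as if it were a plain local dictionary, until latency and network failures seep through the interface anyway.

import random
import time


class RemoteConfig:
    """Looks like a dict, but every access hides a network round trip."""

    def __init__(self, values):
        self._values = values

    def __getitem__(self, key):
        time.sleep(0.05)                 # simulated network latency
        if random.random() < 0.1:        # simulated transient network failure
            raise ConnectionError(f"lookup of {key!r} failed")
        return self._values[key]


config = RemoteConfig({"timeout": 30, "retries": 3})

# The caller's code reads naturally, as if the data were local...
start = time.time()
try:
    total = config["timeout"] * config["retries"]
    print(f"computed {total} in {time.time() - start:.2f}s")
except ConnectionError as exc:
    # ...but latency and failures 'leak' through the dict-like interface,
    # and the caller has to handle them anyway.
    print(f"abstraction leaked: {exc}")

The calling code cannot really ignore where the data lives or how it gets there, which is exactly the detail the abstraction was supposed to hide.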
The whole agile movement was/is directed at quickly meeting the ever-changing needs of users, but can we say the process successfully produces reliable software that fully meets users' needs? By what measurement? If the number of bugs and missing features is below an acceptable threshold (whatever that may be)? Should I forget about chasing 'code completion' and settle for 'close enough'? But what is close enough? Is close enough for me close enough for you? For the users?
Wouldn't it be great if we really did have a science/discipline to accurately measure the completeness and correctness of software, and the ability to achieve 100% completeness and correctness? Or is this unachievable perfection? I know some say test-driven development achieves these goals. And I do like and recognize the value of unit testing, but aren't these tests just verifying that some code does what the author intended, which may or may not be complete and/or correct?
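As a concrete (and entirely invented) sketch of that point: the function, the tests, and the 'real' requirement below are hypothetical, but the tests pass, the coverage is complete, and the software can still be wrong.

import unittest


def shipping_cost(weight_kg: float) -> float:
    # Author's intent: flat $5 for anything up to 1 kg, $2 per kg after that.
    if weight_kg <= 1.0:
        return 5.0
    return 5.0 + 2.0 * (weight_kg - 1.0)


class ShippingCostTest(unittest.TestCase):
    def test_light_package(self):
        self.assertEqual(shipping_cost(0.5), 5.0)

    def test_heavy_package(self):
        self.assertEqual(shipping_cost(3.0), 9.0)


# Both tests pass and every line is covered, yet the business may have wanted
# partial kilograms rounded up before pricing -- a requirement the author never
# knew about, so no test ever checked it.
if __name__ == "__main__":
    unittest.main()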
There still is a need for Software Engineering (or whatever we call it). Too much of our lives depends on software for us to continue producing it with such immature and inaccurate processes.
Of course, this is all based on my own personal experience. Maybe others are out there producing complete and accurate software systems that meet all user needs, and everybody is 'living happily ever after.' I'm skeptical of this, though. What is your opinion/experience? Is there still a need for software engineering? Are current software engineering methodologies adequate, or is yet another paradigm shift in order? I'd love to hear what others 'in the trenches' of software development believe.
Monday, January 18, 2010