APPENDIX:  A Brief History of Software and Software Engineering

 

You are probably curious about why and how the profession of Software Engineering arose.  It is therefore useful to examine briefly the evolution of software provisioning and, especially, the growing awareness of the need for the structured approach that Software Engineering provides.

While the forerunners of computing are often cited as the work of Charles Babbage and Ada Lovelace in the 19th century, as well as the development of Hollerith punched-card equipment in the early part of the 20th century, it is the developments starting in the 1940s that provided the starting point for modern computing.  These early computers (some electromechanical) were programmed in a very primitive manner: by switches, by punched paper tape, or by some form of plug-board wiring that directed the computer’s operation.  The application of these early machines focused upon numerical computation of mathematical functions, and the programs were quite small.

A major breakthrough occurred when the notion of stored-program computers was proposed by John von Neumann in the latter years of the 1940s and was implemented in, among other machines, the first commercial computer produced in the United States, the Univac I.  At this stage, while numerical computation was still important, the focus began to shift to data processing: being able to store and process large quantities of data held on magnetic tapes.  In fact, the first Univac I delivery in 1951 was to the United States Bureau of the Census, where the focus was the processing of population and production data as well as the production of statistics.

During the 1950s a variety of stored-program computers were produced and marketed by several companies in the United States, Europe, the Soviet Union and Japan.  Programming these early computers was still a challenge, since machines were programmed at the machine-instruction level.  A major breakthrough occurred in 1951 when Grace Murray Hopper introduced a means of compiling program code for numerical computations; she called the program that did the compiling a “compiler”.  This made it possible to reuse program code in an effective manner, in particular the code for mathematical functions.

During the 1950s a number of programming languages and compilers evolved, both for numerical computation (for example, Fortran) and for the increasingly important area of data processing, which resulted in COBOL, a language that even today remains the most pervasive programming language for data processing.  Further, in the early 1960s, the utilization of computers for physical process control started to evolve, and eventually languages emerged that provided a higher-level means of programming these devices.

As the higher-level languages provided a means to construct significantly larger applications, and as the use of computers became more pervasive, it rapidly became clear that developing and sustaining large suites of programs presented an enormous intellectual and management challenge, as described by Fred Brooks [1]. During the 1960s it was recognized that there was a “software crisis”, as indicated in the following quotation:

“The major cause of the software crisis is that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem.”

     Edsger W. Dijkstra, “The Humble Programmer” (1972)

 

Thus, as computing technology advanced and as larger and larger software suites evolved during the 1950s and 60s, it became evident that a more structured approach to software was required.  While the term Software Engineering had been used by some authors in the mid 1960s, it was at a NATO-sponsored conference in Garmisch, Germany in 1968 that the need for a Software Engineering profession was firmly established [2]. The conference was attended by international software experts, who agreed upon the need to define best practices for developing and sustaining software systems, grounded in the application of an engineering approach.

There are a variety of brief definitions of software engineering; for example, the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE), respectively, define the profession directly in line with the 1968 Garmisch intentions:

“Software engineering is the study and application of engineering to the design, development, and maintenance of software.”

“the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.”

As the need for Software Engineering evolved, advances in hardware technology in the 1960s and 1970s provided both more powerful large computers and a range of smaller so-called minicomputers. A major hardware advance occurred in the mid 1970s with the development of large-scale integrated circuits.  Certainly, the availability of inexpensive microprocessors, large primary and secondary memories, graphics processors and communication via networks, including the Internet, have been game changers, leading eventually to desktop computers and today’s laptops, mobile phones and tablets, as well as the pervasive use of embedded systems in a variety of products.

So, in addition to the early application areas of numerical computation, data processing and process control, the software business has expanded to provide a wide variety of new application products such as e-mail, chat, games, voice recognition, advanced graphics, smart phones, robotics, mobile communication, intelligent embedded devices, and so on.  Further, more recent developments have led to new infrastructure facilities in the form of Cloud computing, the Internet of Things and Cyber-Physical Systems. All of these developments have had a radical effect upon our society.  While providing many new possibilities, the “software crisis” that Dijkstra pointed to has definitely intensified, and we live in an era that raises important socio-technical issues as well as fundamental concerns related to safety, security and integrity.  Thus, there is a definite need to improve our capability to develop and sustain high-quality software.

Based upon this brief history of software evolution and the advances in hardware leading to new software challenges, you now have an idea of why the Software Engineering profession evolved, but there is much to do.  It has become clear that programming and coding are only one aspect of software engineering.  As you will discover in reading this book, there are many problems (technical, organizational, managerial and social) that need to be dealt with in the provisioning and sustainment of high-quality software.   The presentation of Software Engineering through the lens of Essence in this book provides a generic, yet substantive, definition of the Software Engineering profession, going far beyond the typical single-sentence definitions.

In the challenging environment that has evolved, there have been many discussions of the best-practice approach for dealing with the multiplicity of factors involved in providing sustainable, high-quality software systems.  Is a prescriptive engineering approach or the more agile practice approach to be preferred? Or is it some combination of both?  So, in continuing this brief software history, focus is placed upon how various approaches to software engineering evolved.

Some of the important Software Engineering historical developments are as follows:

  • During the late 1960s to 1970s, the most popular approaches were based on the structure of programs, emphasizing a function-data paradigm: basically, a software system had two parts, a program part and a data part, where the program part was organized to process a flow of data.  One of the most popular was Jackson Structured Programming (JSP), introduced by Michael Jackson [3]. In another important development, Douglas Ross developed SADT (Structured Analysis and Design Technique), which also emphasized the flow of data (for program structures) as well as the flow of information and processing activities [4].
  • During this period, an approach to organizing work called the “waterfall method” evolved, in which activities were related to a sequential design process in which progress is seen as flowing steadily downwards (like a waterfall) through phases such as conception, initiation, analysis, design, construction, testing, production/implementation and maintenance.
  • During the same time period, in the telecommunications business, another program structure approach based on the component paradigm was applied: a system is a set of components interacting by sending messages to one another. Ivar Jacobson (an author of this book) was the original developer of this new approach. His colleague Göran Hemdal refined it and implemented it with programming language support and firmware to provide components “all the way down” to executable code.  In 1976 Ericsson AB used this approach and produced the AXE telecommunication switching system, which became the world’s leading switching system and resulted in one of the largest commercial success stories ever in Sweden. At the same time, the visual language of the approach inspired the development of an international standard in telecommunications called SDL (Specification and Description Language).  Sequence diagrams showing interactions among components, one of Ivar’s ideas, were adopted in SDL.
  • During the 1980s and into the 1990s, object-oriented programming (with languages such as Simula, which had already appeared in the late 1960s, as well as Smalltalk, Eiffel, Objective-C, C++, Ada and Java) became mainstream, and a large number of approaches to building object-oriented systems gained popularity.  Object-orientation took the idea of components to a level of much finer granularity (everything was an object), so that the system was viewed as a huge set of objects where program behavior was realized by the objects interacting with one another (the function-data and object paradigms are contrasted in a short code sketch following this list).
  • Eventually (in the late 1980s and 1990s) the component idea and the object idea were merged: components became exchangeable packages, while objects were the executable elements creating the functional behavior.
  • During this period, an important abstraction used to model the requirements of a system, called Use Cases, was introduced by Ivar Jacobson [5].
  • A more comprehensive set of abstractions was provided in the Unified Modeling Language (UML), a standard for the design of software-intensive systems that was adopted by the Object Management Group in 1997.  The three original developers of UML were Grady Booch, Ivar Jacobson and James Rumbaugh [6].
  • An international standard for Software Life Cycle Processes, namely ISO/IEC 12207, was developed in the early 1990s and was followed by the development of the ISO/IEC 15288 standard for System Life Cycle Processes, in recognition that a software system always exists in a wider system context.  One of the authors of this book, Harold “Bud” Lawson, was the architect of this system standard.  More recently, Ivar Jacobson and Bud Lawson have contributed to and edited a book that provides various perspectives on Software Engineering in the systems context [7].  The 12207 and 15288 standards are being harmonized in recognition of the strong relationship between Systems Engineering and Software Engineering [8].
  • The most popular method for software development was the Unified Process (UP), which was provided as the Rational Unified Process (RUP) in 1996.  Ivar Jacobson was a father of RUP, but many other people contributed to its development, in particular Philippe Kruchten and Walker Royce [9]. RUP was based upon some proven best practices still in use today, such as use cases, components and architecture.  However, RUP was too large, too prescriptive and very difficult to adopt widely.
  • Around the year 2000, as a counter-reaction to RUP and other document-heavy approaches, a new movement arose, which promoted light and flexible methods combined with modern ideas for teamwork and fast delivery with feedback.  Agility became the word of the day, and agile methods became the way forward.  The most influential contributions in this movement were:
    • Extreme Programming, with User Stories and Test-Driven Development (TDD), introduced by Kent Beck [10], [11].
    • Scrum by Ken Schwaber and Jeff Sutherland [12].
  • Now, in the current decade of the 2010s, new methods for scaling agile have emerged, such as:
    • SAFe (Scaled Agile Framework) introduced by Dean Leffingwell [13].
    • DAD (Disciplined Agile Delivery) introduced by Scott Ambler [14].
    • LeSS (Large Scale Scrum) introduced by Craig Larman and Bas Vodde [15].
    • SPS (Scaled Professional Scrum), also by Ken Schwaber [16].
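
To make the paradigm shift traced in the list above more concrete, the following minimal sketch is offered in Python (the record-processing task, names and functions are invented purely for illustration; they are not drawn from JSP, AXE or any other system named above).  The first half follows the function-data paradigm, with a program part organized as functions processing a flow of data records; the second half models the same task in the component/object style, as objects interacting by sending messages to one another.

    # Function-data paradigm: the program part is a pipeline of functions
    # that processes a flow of data records (the data part).

    def read_records():
        # Stand-in for reading a flow of (name, amount) records.
        return [("alice", 120), ("bob", -30), ("carol", 75)]

    def validate(records):
        # Keep only records with non-negative amounts.
        return [r for r in records if r[1] >= 0]

    def print_report(records):
        for name, amount in records:
            print(f"{name}: {amount}")

    print_report(validate(read_records()))

    # Component/object paradigm: the system is a set of objects that
    # interact by sending messages (here, method calls) to one another.

    class Account:
        def __init__(self, name, amount):
            self.name = name
            self.amount = amount

        def is_valid(self):
            return self.amount >= 0

    class Reporter:
        def report(self, accounts):
            for account in accounts:
                # The Reporter object sends a message to each Account object.
                if account.is_valid():
                    print(f"{account.name}: {account.amount}")

    Reporter().report([Account("alice", 120),
                       Account("bob", -30),
                       Account("carol", 75)])

The behavior of the two halves is identical; what changes is the unit of structure.  In the first, the flow of data organizes the program; in the second, responsibility is distributed among interacting objects, foreshadowing the component systems and object-oriented methods described above.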

While these approaches have provided structure and discipline for producing software products, the number of methods and practices has exploded.  And, while there are successes, there are far too many failed and very expensive software endeavors.  Many methods have become “religions” amongst their enthusiastic creators and followers.  In fact, the popularity of methods seems to work more like the fashion industry, where differences are not well understood and are artificially magnified.  Further, there is a lack of objective evaluation, comparison and validation of the methods and the practices of which they are composed.

So, given the multiplicity of methods that have arisen, a vital question is: WHAT method should YOU (and the team of which you are a member) learn to utilize?  Further, how well do the method and its practices provide the multiple capabilities required to perform effective team-based software engineering?

Amid this churn of best practices and methods, a new community woke up.  This community, called SEMAT (Software Engineering Method And Theory) and founded by Ivar Jacobson, Bertrand Meyer and Richard Soley, wanted to stop the madness of throwing out everything that organizations and their teams had established only to constantly start over with new methods.   As a result of the diligent efforts of a group of software engineering professionals, the Essence of Software Engineering evolved as a method- and practice-independent approach that has become an Object Management Group standard [17].

This new approach will certainly provide a basis for re-founding the profession of Software Engineering [18].

So now you should have an appreciation of how software, and in particular software engineering, have evolved.  Essentializing software engineering, as presented in this book, has provided, for the first time, a means of unifying the multiple perspectives of software engineering.

References

  1. Frederick Brooks, “The Mythical Man-Month”, Addison-Wesley, 1975.
  2. NATO, “Software Engineering: Report on a Conference Sponsored by the NATO Science Committee”, Garmisch, Germany, 7-11 October 1968, edited by Peter Naur and Brian Randell.
  3. Michael Jackson, “Principles of Program Design”, Academic Press, 1975.
  4. Douglas Ross, “Structured Analysis (SA): A Language for Communicating Ideas”, IEEE Transactions on Software Engineering, SE-3(1), pp. 16-34, 1977.
  5. Ivar Jacobson, Magnus Christerson, Patrik Jonsson and Gunnar Overgaard, “Object-Oriented Software Engineering: A Use Case Driven Approach”, ACM Press/Addison-Wesley, 1992.
  6. Grady Booch, Ivar Jacobson and James Rumbaugh, “The Unified Modeling Language User Guide” (Second Edition), Addison-Wesley, 2005.
  7. Ivar Jacobson and Harold Lawson (editors) “Software Engineering in the Systems Context”, Systems Series, Volume 7, College Publications, Kings College, London, 2015.
  8. ISO/IEC/IEEE 15288, “Systems and Software Engineering - System Life Cycle Processes”, International Organization for Standardization/International Electrotechnical Commission, Geneva, Switzerland, 2002, 2008, 2015.
  9. Philippe Kruchten, “The Rational Unified Process: An Introduction”, 3rd Edition, Addison-Wesley, 2003.
  10. Kent Beck, “Extreme Programming Explained: Embrace Change”, Addison-Wesley, 1999.
  11. Kent Beck, “Test-Driven Development: By Example”, Addison-Wesley, 2003.
  12. Ken Schwaber and Jeff Sutherland, “The Scrum Guide”, https://www.scrumguides.org/docs/scrumguide/v1/scrum-guide-us.pdf
  13. Dean Leffingwell, “Scaling Software Agility: Best Practices for Large Enterprises”, Addison-Wesley, 2007.
  14. Scott Ambler and Mark Lines, “Disciplined Agile Delivery: A Practitioner's Guide to Agile Software Delivery in the Enterprise”, IBM Press, 2012.
  15. Craig Larman and Bas Vodde,  “Scaling Lean & Agile Development: Thinking and Organizational Tools for Large-Scale Scrum”, Pearson Education, Inc., 2008.
  16. Ken Schwaber, “Scaled Professional Scrum”, https://www.scrum.org/professional-scrum-certifications/scaled-professional-scrum-assessment
  17. OMG, “Essence Specification”, http://www.omg.org/spec/Essence/Current, 2014.
  18. Ivar Jacobson and Ed Seidewitz, “A New Software Engineering”, Communications of the ACM, Vol. 57, No. 12, 2014.