Essential systems analysis

Essential systems analysis is a methodology for software specification, published in 1984 by Stephen M. McMenamin and John F. Palmer, for performing structured systems analysis based on the concept of event partitioning.[1]

The essence of a system is "its required behavior independent of the technology used to implement the system".[2] It is an abstract model of what the system must do without describing how it will do it.[2]

The methodology[1] proposes that finding the true requirements for an information system entails developing an essential model of the system, based on the concept of a perfect internal technology composed of the following (illustrated in the sketch after this list):

  • a perfect memory, which is infinitely large and infinitely fast, and
  • a perfect processor, which is infinitely powerful and infinitely fast.
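
The sketch below is only an illustration, not part of the published methodology (which uses models and diagrams rather than code): it shows how a planned response to an external event can be specified as if memory and processing were unlimited, so that only the required behavior appears. All names (EssentialMemory, respond_to_order, the order event) are invented for the example.

```python
# Illustrative only: essential models are diagrams, not code. The point is
# that the "perfect internal technology" assumption lets the analyst state
# required behavior with no capacity, performance, or implementation detail.

class EssentialMemory:
    """An idealized data store: treated as infinitely large and instantly accessible."""

    def __init__(self):
        self._store = {}            # no files, databases, or capacity limits modeled

    def remember(self, key, value):
        self._store[key] = value    # essential (stored) data, independent of technology

    def recall(self, key):
        return self._store.get(key)


def respond_to_order(memory, event):
    """A planned response to one external event ("customer places order"),
    written as if the processor were infinitely fast: only the business
    rule is stated, with no concern for how it would be implemented."""
    memory.remember(event["order_id"], event)
    return {"order_id": event["order_id"], "status": "accepted"}
```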

Edward Yourdon later adapted it to develop modern structured analysis.[3]

The main result was a more systematic way to develop data-flow diagrams, the most characteristic tool of structured analysis: under event partitioning, the analyst lists the external events to which the system must respond and models one planned response for each event.
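
A minimal sketch, assuming nothing beyond the event-partitioning idea described above: an event list is tabulated and each event is mapped to one data-flow-diagram process with its incoming and outgoing data flows. The event names and flow names are invented examples, and real essential models are drawn as diagrams rather than generated by code.

```python
# Hypothetical illustration of event partitioning as a recipe for building
# data-flow diagrams: one response process per external event, with the
# data flows taken directly from the event list.

from dataclasses import dataclass


@dataclass
class Event:
    name: str        # external event produced by the system's environment
    stimulus: str    # incoming data flow that signals the event
    response: str    # outgoing data flow the system must produce

event_list = [
    Event("Customer places order", "order details", "order confirmation"),
    Event("Customer cancels order", "cancellation request", "cancellation notice"),
]

# Derive one DFD process ("bubble") per event.
dfd_processes = [
    {"process": f"Respond to: {e.name}", "inputs": [e.stimulus], "outputs": [e.response]}
    for e in event_list
]

for process in dfd_processes:
    print(process)
```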

Essential analysis, as adopted in Yourdon's modern structured analysis, was the main software development methodology until object-oriented analysis became mainstream.

References

  1. McMenamin, Stephen M.; Palmer, John F. (1984). Essential systems analysis. Yourdon Press. ISBN 978-0-917072-30-7.
  2. Yourdon, Edward (2006). Just enough structured analysis. Ed Yourdon.
  3. Yourdon, Edward (1989). Modern structured analysis. Englewood Cliffs, N.J.: Yourdon Press. ISBN 0-13-598624-9. OCLC 17877629.