The Phase II comprehensive is the gatekeeper to the Master's degree
here at Irvine. It consists of four essay questions in six hours
based on a reading list of 62 papers and 2 books. The PhD defense is
simpler by comparison: it demonstrates mastery of your subject in
depth, while this exam is the last checkpoint of your breadth and
coursework.
If I pass, I have to finish four term papers held over from terms
past -- and then, finally, the road ahead will be what I make of it;
the research is all that's left. It means I have to face into the
Munchkin well and see what I can make of it. If I fail, or take a
low-pass, it won't be the end of the world, but it'll be six more
months of studying and hand-wringing.
That's possible -- I feel I know so much less after studying.
Recalling bibliographic entries, dates, names, titles from memory --
to say nothing of theories, details, models, systems, processes! --
seems too far gone now. My software has been optimized for open-book
testing since Caltech's Honor Code took hold in '91.
In fact, one of my final readings brought home how much I've
forgotten. The original 1968 NATO report founding 'software
engineering' already pointed at Christopher Alexander's 1964
architectural manifesto. Search the 'Net, and you'll find a strident
paper linking the two from exactly four years ago:
>This paper is an introduction to Design Patterns and a discussion of
>how Alexander's and his colleagues' ideas took root in the distant
>soil of Computer Science... Alexander's ideas found a comfortable
>home because many designers already had proposed convergent schemes
>(Archetypes, any of the systems mentioned in Lea's paper), and
>because software design has always been in an intense soul-searching
>mode to find an established discipline to look up to. In 1968,
>Software Engineering was coined by the faction inspired by civil and
>electrical engineering, with its statistical controls, standardized
>models, and absolute faith in correctness. Only recently has a new
>generation advocated adding more art to our pseudo science by
>advocating Software Architecture. As a professional metaphor, it
>works on several levels: as a client-services model, as an artistic
>pretension, as a central coordination point for the "construction"
>process, and as a belated concession that there is some nameless
>quality that separates elegant software from the rest.
http://xent.ics.uci.edu/%7Ekhare/Alexander.htmld/
Sigh. I knew so much more then. Facts only interfere with learning, you know...
In the midst of this idiotic, pointless (and, oh alright, I'll admit
it: entertaining) flamewar on the list right now, I'd like to borrow
a little of your collective best wishes on the battlefield tomorrow.
After all, there's more than one FoRK on the reading list itself :-)
Rohit
----------------------------------------------------------------------
http://www.ics.uci.edu/~rohit/rk-p2
Foundations
Dijkstra: Go To Considered Harmful
[Dij68] E.W. Dijkstra. "Go to Statement Considered Harmful" (letter to
the Editor), Communications of the ACM, 11(3):147-148, March 1968.
Go To forces us to analyze the dynamic behavior of the system rather
than playing to human strength (static, textual analysis).
Definition of the program spreads out in space (program text) and time
(execution states). Go To makes them incomparable, since there is no
definite point in between each statement.
As of 1968, it was NOT argued in terms of unclear preconditions --
that was a later rationalization. But there was an operational
translation technique: Jacopini proved the superfluousness of the
goto statement in 1966.
McCarthy invented the case statement, which has reemerged in
principled form for exception handling.
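Jacopini's construction is easy to sketch: any goto-ridden flowchart can be simulated by a single loop dispatching on an explicit program-counter variable. A minimal illustration (the gcd example and its labels are mine, not from the paper):

```python
# Bohm/Jacopini-style goto elimination: each labeled block becomes a
# state, and one while loop dispatches on a program counter -- trading
# textual clarity for dynamic bookkeeping, which is exactly
# Dijkstra's complaint.
def gcd(a, b):
    pc = "test"
    while pc != "halt":
        if pc == "test":
            pc = "loop" if b != 0 else "halt"
        elif pc == "loop":
            a, b = b, a % b
            pc = "test"
    return a

print(gcd(48, 18))  # 6
```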
Hoare69: Axiomatic Proof
[Hoa69] C.A.R. Hoare. "An Axiomatic Basis for Computer Programming",
Communications of the ACM, 12(10):576-583, October 1969.
Introduced pre and post conditions. Inductive proof. Mapping axioms to
programming language constructs.
Footnote: Validity of three integer arithmetics rules: halt, fix to
floor/ceil of range, and modulo wrapping all work; just pick one.
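The {P} S {Q} idea carries straight over into executable assertions. A sketch (the quotient/remainder example follows Hoare's own illustration; the Python rendering is mine):

```python
# Hoare triple {P} S {Q} rendered as runtime checks: precondition on
# entry, postcondition on exit. A real proof discharges these
# statically; asserts only spot-check them per run.
def integer_divide(x, y):
    assert x >= 0 and y > 0                 # precondition P
    q, r = 0, x
    while r >= y:                           # invariant: x == q*y + r and r >= 0
        q, r = q + 1, r - y
    assert x == q * y + r and 0 <= r < y    # postcondition Q
    return q, r

print(integer_divide(17, 5))  # (3, 2)
```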
Wirth71: Refinement (8-Queens)
[Wir71] N. Wirth. "Program Development by Stepwise Refinement",
Communications of the ACM, 14(4):221-227, April 1971.
8-Queens. Start with the 2^64 possible board subsets and reduce the
state space through several heuristics (preselection). Using the
formal specification for insight (data structure, enumeration order).
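The refinement bottoms out in something like this (my sketch, not Wirth's Algol): placing exactly one queen per column (preselection) cuts the 2^64 subsets down to 8^8 candidates, and checking rows and diagonals as each column is placed prunes the rest.

```python
# Stepwise-refined 8-queens: cols[i] is the row of the queen in
# column i, so one-queen-per-column is built into the data structure
# itself -- the representation does part of the specification's work.
def queens(n=8):
    solutions = []
    def place(cols):
        col = len(cols)
        if col == n:
            solutions.append(tuple(cols))
            return
        for row in range(n):
            # no shared row, no shared diagonal with earlier columns
            if all(row != r and abs(row - r) != col - c
                   for c, r in enumerate(cols)):
                place(cols + [row])
    place([])
    return solutions

print(len(queens()))  # 92
```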
Liskov & Zilles: Spec Tech / ADTs
[LZ75] B.H. Liskov and S.N. Zilles. "Specification Techniques for Data
Abstraction", IEEE Transactions on Software Engineering,
SE-1(1):7-19, March 1975.
An eminently readable defense of 1) abstraction/ADTs and 2)
then-nascent specification techniques. Presented a stack using
V-graphs, Vienna DL, and algebra (those were the abstract ones; the
next are operational): stack machine, algebra, axioms.
Criteria: Formality, Constructability, Comprehensibility, Minimality,
Range, Extensibility.
Footnote: the same Zilles from Adobe I knew at W3C…
DeRemer & Kron: MILs for Programming-in-the-Large
[DK76] F. DeRemer and H.H. Kron. "Programming-in-the-Large versus
Programming-in-the-Small", IEEE Transactions on Software Engineering,
SE-2(2):80-86, June 1976.
Argues that scale is a difference in quality, not quantity: need new
languages to address new concerns (mirrors CKI88 in a way).
Introduced MIL 75, a Module Interconnection Language. Belief that
LPSs (languages for programming-in-the-small) do a different job than
MILs. Requires a hierarchical uses
relationship ("system tree"). Analogy to conventional 'linkers'.
Enforced hiding/accessibility rules.
Leaves dynamism -- evolving interconnections -- for future
research@@??
Useful for identifying cohesion and coupling -- not just which
components use each other, but which don't.
Ghezzi: The Blue Book
[GJM91] C. Ghezzi, M. Jazayeri and D. Mandrioli. Fundamentals of
Software Engineering. Prentice Hall, Englewood Cliffs, NJ, 1991.
The best part of the book is the bibliographic notes. Anyway, the
only thing really worth taking home is the comprehensive list of
internal/external, product/process ilities:
* Correctness (specification agreement)
* Reliability (MTBF on 'valid' input)
* Robustness (reaction to 'invalid' input)
* User Friendliness (tied with Ease of Use)
* Verifiability (separate from Testability)
* Maintainability
* Reusability
* Portability
* Understandability (tied with Ease of Learning)
* Interoperability
* Productivity
* Timeliness
* Visibility
Parnas93: Out-of-bounds Predicates are False, Dammit!
[Par93] D.L. Parnas. "Predicate Logic for Software Engineering", IEEE
Transactions on Software Engineering, 19(9):856-862, September 1993.
Key is distinguishing all the [Par]s -- the Blue book does that well.
This is fine acerbic Parnas, rubbing everyone's face in perceived
errors only he can see.
It *is* a blow for common sense, though… key point: conciseness
is a hugely important criterion for practical formality, potentially
even more than well-foundedness for analysis.
Brooks: Silver Bullet
[Bro95] F.P. Brooks. The Mythical Man-Month, 25th Anniversary Edition.
Addison-Wesley, Reading, MA, 1995.
Accident/essence. Old bullets: HLL, Timesharing, IDEs. Probably not:
Ada, AI, verification, graphical langs, OO, environments. Maybe: Reuse
(buy), requirements, incremental dev, design training.
Why? The classic refrain that SW is invisible and malleable -- and
expected to be continually changed.
After 20 years, what holds: conceptual integrity, the architect.
What doesn't: the open-book team (Parnas was right);
plan-to-throw-one-away; user-testing moves to the head of the
waterfall.
"where is next November?"
Requirements & Specifications
Peterson: Petri Nets
[Pet77] J.L. Peterson. "Petri Nets", ACM Computing Surveys,
9(3):223-252, September 1977.
The basics. Petri wrote his thesis in Germany in '62. You can add
token colors, values; add preconditions to transitions. Petri nets are
tractable for analysis, but face the same ultimate decision-power
limits as FSMs or anything else sufficiently expressive.
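The basic token game fits in a few lines. A minimal sketch (the place and transition names are mine, not Peterson's):

```python
# Minimal Petri net: places hold tokens; a transition is enabled when
# every input place has a token, and firing consumes one token from
# each input and deposits one in each output.
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking[p] >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    assert enabled(marking, transition)
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] += 1
    return m

# toy producer/consumer: ready -> produce -> buffer -> consume -> done
marking = {"ready": 1, "buffer": 0, "done": 0}
produce = (["ready"], ["buffer"])
consume = (["buffer"], ["done"])
marking = fire(marking, produce)
marking = fire(marking, consume)
print(marking)  # {'ready': 0, 'buffer': 0, 'done': 1}
```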
Heninger: A-7 Requirements (SCR)
[Hen80] K.L. Heninger. "Specifying Software Requirements for Complex
Systems: New Techniques and Their Application", IEEE Transactions on
Software Engineering, SE-6(1):2-13, January 1980.
500-page reverse-engineered spec. Textual markup of inputs, outputs,
conditions. Critical importance of specification in maintenance and
evolution. Rigor paid off in detecting errors. *External* observables
only.
Wing: Specifier's Guide
[Win90] J.M. Wing. "A Specifier's Introduction to Formal Methods",
IEEE Computer, 23(9):8-24, September 1990.
Student of Liskov's, went to CMU.
A guide to types of specification languages: Behavioral and
Structural. Abstract Data Types: Z, Vienna VDM, Larch for a symbol
table. Concurrency: Temporal logic, CSP, Lamport's transition axioms
for unbounded async buffer.
The <Syn, Sem, Sat> abstraction of specifications. Specificands@@
LW98: Temporal-logic inference of scenarios
[LW98] A. van Lamsweerde and L. Willemet. "Inferring Declarative
Requirements Specifications from Operational Scenarios", IEEE
Transactions on Software Engineering, 24(12):???-???, December 1998.
The kind of paper that gives European CS its reputation. There's an
idea in there, but damned if they don't bury it under stultifying
formalism (for more of the same, see ILL75).
Collect example/counterexample scenarios. Partition by domains of
concern (safety, security, performance, etc). From each slice extract
candidate temporal models, then merge to preserve/eliminate the
behavior from the tentative formula. Then generalize, prompt further
questions. Claims lots of case studies, but little dramatic evidence
of what the tool can do.
Meyer ("the Eiffel guy"): Formalism is good
[Mey85] B. Meyer. "On Formalism in Specifications", IEEE Software,
2(1):6-26, January 1985.
"The benefits of formal specification". Surfacing errors in even the
simplest line-breaking problem: Wirth's was incomplete; Goodenough
and Gerhart overdetermined theirs (in natural language). Formal spec
is indeed more concise, more verifiable.
7 Sins: Real contribution is a checklist of all the myriad ways
natural language fails: Noise, Silence, Overspecification,
Contradiction, Ambiguity, Forward reference, Wishful thinking.
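Even a toy rendering of the line-breaking problem shows how executable postconditions flush out the Silence and Overspecification sins. A sketch under my own assumed spec (greedy fill, words preserved in order, no line over maxpos unless a single word already exceeds it) -- not Meyer's actual formalization:

```python
# Greedy line filler with its spec written as postconditions. Writing
# the asserts forces the questions natural language glosses over:
# what about a word longer than the line? trailing spaces? order?
def fill(words, maxpos):
    lines, cur = [], []
    for w in words:
        if cur and len(" ".join(cur + [w])) > maxpos:
            lines.append(" ".join(cur))
            cur = []
        cur.append(w)
    if cur:
        lines.append(" ".join(cur))
    # postconditions: words preserved; no overlong line unless unavoidable
    assert [w for l in lines for w in l.split(" ")] == list(words)
    assert all(len(l) <= maxpos or " " not in l for l in lines)
    return lines

print(fill("the quick brown fox jumps".split(), 10))
```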
Kemmerer: Integrating Formal Methods / SRT
[Kem90] R.A. Kemmerer. "Integrating Formal Methods into the
Development Process", IEEE Software, 7(5):37-50, September 1990.
Secure Release Terminal. 1 Kloc, developed through successive
refinement of Ina Jo specs. Start with the abstract terminal, delay
all possible design decisions (e.g., screens, lines appear way after
"review document"). (Is he still at UCSB?)
Design
Grady: OO in Ada
[Boo86] G. Booch. "Object-Oriented Development", IEEE Transactions on
Software Engineering, SE-12(2):211-221, February 1986.
The most basic sort of introduction: joint operation/data
encapsulation: but it's more than an ADT because of inheritance (not
strictly required). (@@ anything else?). Decomposes a cruise control,
first functionally then into objects (@@agents?). Usual claims, a la
modularity: increases changeability through localization. Applied as a
development style for using Ada packages.
Parnas79: Extension and Contraction
[Par79] D.L. Parnas. "Designing Software for Ease of Extension and
Contraction", IEEE Transactions on Software Engineering,
SE-5(2):128-138, March 1979.
Minimal subset/minimal increment style of development. What is mutable
should be hidden (aka module-secret, also due to Parnas).
The virtual machine construct: modules add commands to the basic
machine set. (but basic term due to Dij68, the THE multiprogramming
system)
Risk factors: information spread; uses chains; modules too large.
Module use must become strictly hierarchical to be tractable. Uses
relation establishes hierarchy. "circular" reference is resolved by
slicing/sandwich.
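Checking that a uses relation really is a hierarchy is just cycle detection over the module graph. A sketch (the module names are hypothetical, mine):

```python
# Parnas's rule: "uses" must form a hierarchy, i.e. a DAG. A cycle
# means some module has to be sliced ("sandwiched") into two levels.
# Depth-first search with three colors finds any back edge.
def find_cycle(uses):
    color = {}  # 0 = unvisited, 1 = on current path, 2 = done
    def visit(m):
        color[m] = 1
        for n in uses.get(m, []):
            c = color.get(n, 0)
            if c == 1:                  # back edge: circular "uses"
                return True
            if c == 0 and visit(n):
                return True
        color[m] = 2
        return False
    return any(color.get(m, 0) == 0 and visit(m) for m in uses)

# hypothetical module graph: core and io use each other -- not allowed
print(find_cycle({"ui": ["core"], "core": ["io"], "io": ["core"]}))  # True
```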
Parnas86: Faking it
[PC86] D.L. Parnas and P.C. Clements. "A Rational Design Process: How
and Why to Fake It", IEEE Transactions on Software Engineering,
SE-12(2):251-257, February 1986.
The ideal is strictly forward-determined: reqts, spec, design, inter-
and intra-module design, coding. Faking it is going back and
maintaining the required document even though you hop back and forth
between stages. Why? Because the ideal process is fundamentally
disconnected from reality. (but in the paper trail, the only
difference should be extra pages documenting design choices not
taken).
Write from the perspective of an outsider. Must be implementation
independent, live reference document, involve users.
FGNR92 (Redmiles): Knowledge-Based SE/ Critics
[FGNR92] G. Fischer, A. Girgensohn, K. Nakakoji, and D.F. Redmiles.
"Supporting Software Designers with Integrated Domain-Oriented Design
Environments", IEEE Transactions on Software Engineering,
18(6):511-522, June 1992.
CatalogExplorer, Explainer, Modifier. Multiple perspectives. Chaining
inferences. Critics. Agendas. Charting domain, but also for other
KB-SE's. Hard-to-memorize content-free title…
Integrated LISP-style environment, much influenced by
Interlisp/Symbolics Document Examiner tricks for DWIM and
hyperdocumentation. One of the few examples in the phase II of AI
applied (Simon was in favor; Parnas argued against?).
Perry & Wolf: Foundations of SW Architecture
[PW92] D.E. Perry and A.L. Wolf. "Foundations for the Study of
Software Architecture". ACM Software Engineering Notes, 17(4):40-52,
October 1992.
Crap. Architecture = Form, Elements, Rationale. Silly compiler example
(sequence, vs shared-store concurrent). No real understanding of
'materials' analogy. Fatal confusion of 'style'. Erosion and Drift.
They invert style and architecture: I'd argue that their two examples
are *different* architectures. @@arrogance in defining a 'field'.
The true foundations are rooted in field study, hence patterns.
Vitruvius instead, if they read any real architecture: firmness,
commodity, and delight. Real benefit may be the *professional* model
of practicing architects (& their studio-driven training).
Song & Osterweil: Design-Method Comparisons
[SO92] X. Song and L.J. Osterweil. "Toward Objective, Systematic
Design-Method Comparisons", IEEE Software, 9(3):43-53, May 1992.
Crap. It describes how to build a base framework/metalanguage to study
the problem… yet still manages to avoid saying anything about
Jackson System Design/Booch/Parnas@@/Structured Design/etc…
I asked why it's even in the phase II: it was the first attempt to
bring *any* systematization to the task. In fact, it could be (and was) applied
to other methodologies and indeed revealed missing details. The
meta-meta-process, after all, is isomorphic to the scientific method.
Validation
DeMillo, Lipton, & Perlis: 'Social Proofs'
[DLP79] R.A. DeMillo, R.J. Lipton and A.J. Perlis. "Social Processes
and Proofs of Theorems and Programs", Communications of the ACM,
22(5):271-280, May 1979.
Could be seen as opposing Dijkstra’s testing-can-only-prove-bugs!
view, because proofs are merely socially constructed, too. Automation
failed Russell; it will fail SE. Mathematicians pour lots of
institutional and personal capital into this dialogue.
[there is rarely continuity in verification: unit test doesn’t
scale to the integrated whole. Inspiration for the opensource
movement?: is it easier to review 300 lines than a thousand?
Can’t trust a million-line proof, either (link to 4-color
theory)]
DeMillo and Lipton are real testers: they (?invented?) mutation
testing. Perlis wasn't (Tarjan's student, as in the Lipton-Tarjan
planarity test).
Weyuker: Axiomatizing Test Data
[Wey86] E.J. Weyuker. "Axiomatizing Software Test Data Adequacy". IEEE
Transactions on Software Engineering, SE-12(12):1128-1138, December
1986.
Static text: branch adequacy. Dynamic: monotonicity, composition, etc.
Point is to be able to analyze the ideal test sets for "closely
related" programs. Potentially speaks to the utility of mutation
testing.
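Mutation testing itself is easy to demonstrate: seed a small syntactic fault and ask whether the suite notices. A toy sketch (the function and the `<` to `<=` mutation operator are my own examples, not from the paper):

```python
# Mutation testing in miniature: a test suite is adequate only if it
# "kills" (fails on) small syntactic mutants of the program. Here the
# mutant relaxes a strict comparison.
def run_suite(fn):
    return fn(-1) is True and fn(0) is False and fn(1) is False

src = "def is_neg(x):\n    return x < 0\n"
ns = {}
exec(src, ns)
assert run_suite(ns["is_neg"])        # original passes the suite

mutant_src = src.replace("<", "<=")   # seeded fault: < becomes <=
ns2 = {}
exec(mutant_src, ns2)
killed = not run_suite(ns2["is_neg"])  # the x == 0 case catches it
print("mutant killed:", killed)
```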
GHM87 (Harlan Mills): Modules
[GHM87] J.D. Gannon, R.G. Hamlet, and H.D. Mills. "Theory of Modules".
IEEE Transactions on Software Engineering, SE-13(7):820-829, 1987.
The Box. Commuting square: (RepFun o ConOp) = (AbsOp o RepFun).
"Denotational semantics" on the abstract side. Modular proof (opposes
DLP79). Their "module" is an ADT. No side effects allowed.
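The commuting square can be spot-checked mechanically. A sketch for a stack-like ADT (the representation choices are mine: concrete list with top last, abstract tuple with top first):

```python
# GHM-style correctness check for one operation: mapping the concrete
# state up through the representation function must agree with doing
# the abstract operation on the already-mapped state, i.e.
# RepFun(ConOp(c)) == AbsOp(RepFun(c)).
def rep_fun(concrete):           # list (top last) -> tuple (top first)
    return tuple(reversed(concrete))

def con_push(concrete, x):       # concrete push: append at the end
    return concrete + [x]

def abs_push(abstract, x):       # abstract push: new top at the front
    return (x,) + abstract

c = [1, 2, 3]
assert rep_fun(con_push(c, 4)) == abs_push(rep_fun(c), 4)
print("square commutes")
```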
Clarke (Richardson): Lattice of dataflow path selection criteria
[CPRZ89] L.A. Clarke, A. Podgurski, D.J. Richardson and S.J. Zeil. "A
Formal Evaluation of Data Flow Path Selection Criteria", IEEE
Transactions on Software Engineering, 15(11):1318-1332, November 1989.
Formalization of dataflow paths across a control flow graph G with def
and uses/predicate, uses/compute to classify a dozen different
selection techniques. All-paths, all-def, all-use, all-… Point:
software engineering means bringing a formal basis to all phases.
Meta-validation of testing support tools/processes. Good example of
checks-and-balances upon the literature
Young & Taylor: Taxonomy of Fault Detection
[YT89] M. Young and R.N. Taylor. "Rethinking the Taxonomy of Fault
Detection Techniques", in Proceedings of the 11th International
Conference on Software Engineering, pp. 53-62, Pittsburgh, PA, May
1989.
The old static-vs-dynamic split is useless except for
stage-of-applicability. Folding and Sampling. The peak of the pyramid
is infeasible. Hybrid techniques reach the interior of the simplex;
exact ranking along the perimeter, ranked by power/cheapness:
Testing: exhaustive, path coverage, mutation, Branch, Node
Validation: auto-proof, reachability, dataflow, structural
Atlee & Gannon: State-based Model-checking of Event-driven Requirements
[AG93] J.M. Atlee and J.D. Gannon. "State-Based Model Checking of
Event-Driven System Requirements", IEEE Transactions on Software
Engineering, 19(1):24-40, January 1993.
@@translating SCR models into CTL models (a hardware FSM language)
with a series of tricks (enumerate all the 'unchanged' SCR inputs,
represent modes as states with predicates on transitions.
Weaken/strengthen exit conditions with loops back to self@@). Use CTL
model checker to find @@deadlocks? Broken assertions? @@map back onto
SCR modes naturally?
[ILL75] (Luckham): Auto-Verification (???)
[ILL75] S. Igarashi, R.L. London and D.C. Luckham, "Automatic Program
Verification I: A Logical Basis and its Implementation", Acta
Informatica, 4:145-182, 1975.
Three caballeros from SAIL write a very early MLISP program to read
in simplified Pascal, annotated with entry/exit assertions, and
mechanically map the assertions via Hoare/Wirth's Pascal semantics to
spit out 'final' versions of corresponding verification propositions
(then to be passed to Luckham's eventual theorem prover).
Point: we once had dreams of automation. Examples ranged from
factorial (recursive and procedural) up to a mini-compiler.
Debra: TAOS
[Ric94] D.J. Richardson. "TAOS: Testing with Analysis and Oracle
Support", in Proceedings of the 1994 International Symposium on
Software Testing and Analysis, pp. 138-153, Seattle, WA, August 1994.
Vision of specification-based testing integrated into the development
environment and development process. Tests could be automated,
generated by random grammars up through custom harnesses; then checked
against oracles (as simple as diff against known-good output to
'interpreted' specifications). Metapoint: testing is only as
effective as human testers' vigilance. ProDAG to visualize what parts
of the
program tests exercise. @@metrics??
Reliability & Safety
Goel: Reliability Models
[Goe85] A.L. Goel. "Software Reliability Models: Assumptions,
Limitations, and Applicability", IEEE Transactions on Software
Engineering, SE-11(12):1411-1423, 1985.
Mini statistical primer on models of stochastic processes -- Gaussian
(large N) and Poisson (rare event) distributions. Also choose model of
'decay': there's no 'wearout' like hardware, but errors are flushed
out sooner than later. Expose assumptions about the timing of error
fixes.
Result: clarify various definitions of reliability, once you step at
all away from Dijkstran Manichaeism. MTBF seemed to be his
abstraction of choice.
Leveson: Safety
[Lev86] N.G. Leveson. "Software Safety: What, Why, and How", ACM
Computing Surveys, 18(2):125-163, June 1986.
Lots of vivid examples. Safety is like security, in that hard
invariants must be maintained -- except hazardous conditions must be
avoided rather than provided. But safety is traded off against
reliability (and robustness): safe thing may be to fail-safe at the
slightest provocation, for example. Risk displacement, if not
elimination.
Point: different development processes. Zero-defect safe systems need
a minimal number of chefs; open-source zoo better for security
software.
Don't forget definitions of fault (causes failure) and hazard
(?unanticipated? conditions leading to failure).
Traditional SE doesn't work well: often no chance of run-time
patching, and failures too harmful. Hazard analysis by fault-trees,
propagating risks back to modules. Common mode analysis: think what
faults could affect several modules. Cites RTL, timed Petri Nets.
Hints at ergonomic/UI failures.
Integrity: need for self-diagnostics and 'malicious' threats.
Solution techniques much as for formal program proof -- but 1) must
scale up and 2) then-seen as especially socially urgent in the Cold
War/SDI age of nuclear threats/CPSR activism. [Parnas, again,
acerbically out front…]
Paper says little about additional network threats. @@the exam really
ought to include Neumann…
Environments
Teitelman & Masinter: Interlisp
[TM81] W. Teitelman and L. Masinter. "The Interlisp Programming
Environment", IEEE Computer, 14(4):25-33, April 1981.
Pros and cons of sustained expert activity in a single-language
system/environment. Version control, masterscope cross-references,
do-what-I-mean, multilateral extensibility, file packages.
Experimental programming. Evolving environment.
Was started in 1966, and was popular by 1974. Awarded the ACM
"Software System Award" in 1994.
PS83: Transformational Programming
[PS83] H. Partsch and R. Steinbrueggen. "Program Transformation
Systems", ACM Computing Surveys, 15(3):199-236, September 1983.
Extensive review of different programming environments which use
transformation steps. @@some examples??@@
A lot of then-interesting techniques seem to be in optimizing
compilers now. Mostly helps with 'accident', imho, except to the
degree it's a selection from a reuse library of "paradigmatic"
(archetypal) programs.
Reiss: Field
[Rei90] S.P. Reiss. "Connecting Tools Using Message Passing in the
Field Environment", IEEE Software, 7(4):57-66, July 1990.
Broadcast message server to achieve same level of user-friendliness
and integration as the PC. Pure-text marshalling of
requests/notifications (printf/scanf, in fact). Message filtering was
also based on text.
Remarkable results: he actually built most of the environment he
envisioned -- and he got network access "for free". Became commercial
Softbench product from HP (grudgingly credited).
Arcadia
[Kad92] R. Kadia. "Issues Encountered in Building a Flexible Software
Development Environment: Lessons Learned from the Arcadia Project", in
Proceedings of ACM SIGSOFT '92: Fifth Symposium on Software
Development Environments, pp. 169-180, Washington, DC, December 1992.
Crosscutting issues: heterogeneity, components, type systems,
dynamism, concurrency, event-based integration. Just memorize the
whole damn paper.@@
Reps & Teitelbaum: SynthGen
[RT84] T. Reps and T. Teitelbaum. "The Synthesizer Generator", in
Proceedings of the ACM SIGSOFT/SIGPLAN Software Engineering Symposium
on Practical Software Development Environments, pp. 42-48, Pittsburgh,
PA, April 1984.
If it's anything like the earlier Cornell Program Synthesizer project
(RT84 is also described in a sidebar to Field), it's a pure ADT
editor, with a mixed mode for typing in "expressions" directly (which
are immediately parsed and checked), but otherwise limited to filling
in slots (see PS83).
TN92: Definition of Tool Integration
[TN92] I. Thomas and B.A. Nejmeh. "Definitions of Tool Integration for
Environments", IEEE Software, 9(2):29-35, March 1992.
* presentation: appearance and behavior, interaction model
* control: provision, use
* data: interoperability, exchange, consistency, nonredundancy, sync
* process: process step, event definition
Tichy: RCS
[Tic85] W.F. Tichy. "RCS-A System for Version Control", Software -
Practice and Experience, 15(7):637-54, July 1985.
The usual. The technical difference was storing reverse deltas (and
forward deltas for branches), optimizing the 'usual case' of checking
out the head. Well-integrated. Tichy also advocated AI's role in SE,
but that doesn't bear on this paper.
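The reverse-delta trick: keep the head revision in full plus deltas that reconstruct each older revision, so the common checkout costs nothing. A sketch with Python's difflib (real RCS uses ed-style line diffs, not this):

```python
# Reverse-delta storage in miniature: the repository holds only the
# newest revision plus a delta that rebuilds the previous one from it.
# Checking out the head is free; older revisions pay per step back.
import difflib

def reverse_delta(new, old):
    # an ndiff delta from which 'old' can be recovered
    return list(difflib.ndiff(new, old))

def apply_delta(delta):
    # restore(…, 2) reconstructs the second sequence (here: 'old')
    return list(difflib.restore(delta, 2))

head = ["line 1", "line 2 revised", "line 3"]   # stored in full
prev = ["line 1", "line 2", "line 3"]           # stored as a delta
delta = reverse_delta(head, prev)
assert apply_delta(delta) == prev
print("recovered previous revision from head + reverse delta")
```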
Process
Harlan Mills: Cleanroom
[MDL87] H.D. Mills, M. Dyer and R.C. Linger. "Cleanroom Software
Engineering", IEEE Software, 4(5):19-25, September 1987.
"Pencil and paper were good enough for my grandfather, dammit!"
Reported significant gains in quality by forcing all design work to
proceed directly before writing a line of code and firing up a
machine. Overblown analogy to defect-prevention through controlled
environments in semiconductors. An intellectual (and commercial)
heritor of Fag76.
Osterweil: Processes are Software, too
[Ost87] L.J. Osterweil. "Software Processes Are Software Too", in
Proceedings of the 9th International Conference on Software
Engineering, pp. 2-13, Monterey, CA, March 1987.
It's process all the way down. Keep refining until you have rules so
small they can be fully programmed for automated enactment. Put
programmers in their place -- call them like subroutines when
creativity is called for.
Double indirection of process programming: indicates new languages,
etc. APPL/A@@
Boehm: Spiral Model
[Boe88] B.W. Boehm. "A Spiral Model of Software Development and
Enhancement", IEEE Computer, 21(5):61-72, May 1988.
Call Off Your Old Tired Documents: the real reason we write them is to
manage risk, so surface that requirement and go after successive
elaboration -- and early detection of high-risk parts. @@generate,
evaluate, implement, plan cycle repeated as needed.
TRW success story, but where else? Tied to lots of USC output on cost
models, but not clearly dominant in industry at present.
[BFG93] Fuggetta & Ghezzi: SPADE
[BFG93] S. Bandinelli, A. Fuggetta and C. Ghezzi. "Software Process
Model Evolution in the SPADE Environment", IEEE Transactions on
Software Engineering, 19(12):1128-1144, December 1993.
Wow. And they wonder why process didn't take off… A stupefyingly
complex process model masquerading as a simple one -- in the interest
of reducing everything to nodes and arcs (so that all higher-order
concepts like 'type' would be fully mutable), they kept extending and
extending Petri nets into ER nets until they ablated off any
possibility of analysis assistance -- and for no gain in
comprehensibility.
SPADE process engines fork off activities as tokens enter new tasks,
but unlike other systems, can be paused, have their models scrambled,
and restarted. Offers concurrency after an initial boot loader.
There is a very general-purpose distribution language struggling to
get out here. Whether it makes the process engineer's life easier is
an open question.
Bolcer & Taylor: Endeavors
[BT96] G.A. Bolcer and R.N. Taylor. "Endeavors: A Process System
Integration Infrastructure", in Proceedings of the IEEE Computer
Society Fourth International Conference on the Software Process, pp.
76-89, Brighton, UK, December 1996.
Just the usual promotion of its PlasticMan extensibility: unlike its
predecessors, almost every detail has an open interface (hence is
replaceable). This paper per se does not discuss any sample processes
of significance (except Endeavors-in-Endeavors).
Fagan: Inspection trumps Walkthrough
[Fag76] M.E. Fagan. "Design and Code Inspections to Reduce Errors in
Program Development", IBM Systems Journal, 15(3):182-211, 1976.
Walk-throughs are too informal -- and too much under the control of
the original coder -- to show the kind of sustained results (20-25%
lifecycle reductions) of Fagan's formal interventions. Culture is key:
trained moderators and retribution-free enlightened management. 2-4
person teams, armed with checklists of likely errors (tuned to error
distributions from the same project's earlier days, ideally) do their
homework, then get together to focus solely on finding errors (not
solving them).
The use of metrics for management -- for costing, predicting lines of
code, and improving personal quality -- all presage the SEI CMM a
decade later.
A fine example of modernist 70's graphic design: IBM Systems Journal
has always been slicker than the academic stuff (and IEEE, in
particular, feels stuck in 1958!)
Kaiser: Marvel/Oz
[KFP88] G.E. Kaiser, P.H. Feiler and S.S. Popovich. "Intelligent
Assistance for Software Development and Maintenance", IEEE Software,
5(3):40-49, May 1988.
Marvel's rule-based forward and backward chaining connects basic
software process tasks. Some emphasis on namespace management: it can
attempt to infer the module of interest and, say, automatically check
it out, lock the whole module, analyze, compile, and test as it's
edited.
Wolf & Rosenblum: (Balboa?)
[WR93] A.L. Wolf and D.S. Rosenblum. "A Study in Software Process Data
Capture and Analysis", in Proceedings of the Second International
Conference on the Software Process, pp. 115-124, Berlin, Germany,
February 1993.
An observer with a form rode shotgun with a build manager (the hub of
activity). Coupled with the automated data collection on the process,
they ran several inference rules to measure times for periodic (DO)
and nested events (editing code, leaving voice mail, etc). The results
of automatic classification identified hotspots in the architecture.
@@
Reuse
Boehm & Scherlis: Megaprogramming
[BS92] B.W. Boehm and W.L. Scherlis. "Megaprogramming", in Proceedings
of the DARPA Software Technology Conference 1992, pp. 63-82, Los
Angeles, CA, April 1992.
Now-completely-conventional arguments for reuse-driven development.
The buzzword never caught on. The economic arguments are
straightforward: it costs 10% extra to design for reuse in this
project; 30% in this family; 50% for external users. But reused code
is 10x cheaper-- if you don't have to poke around inside.
4 necessary features: product-line approach, domain-specific
architectures, reuse tools, *a management role for reuse*.
5 elements: architecture, descriptions, component construction,
composition, interchange
Hinted at IPR issues for a component marketplace.
Krueger: Reuse
[Kru92] C. W. Krueger. "Software Reuse", ACM Computing Surveys,
24(2):131-184, June 1992.
Survey paper of factors inhibiting and supporting reuse since 1968. @@
Cognitive design
8 levels: HLL, scavenging, components, schemas (templates), appgens,
4GL, transformational (spec->impl), architectures.
Footnote: the McIlroy marketplace proposed at Garmisch *really
succeeded* -- we *do* have libraries for maths, graphics, IO, etc. --
at the level of granularity he discussed, it completely came to pass.
Batory: GenVoca
[BG97] D. Batory and B.J. Geraci. "Composition Validation and
Subjectivity in GenVoca Generators", IEEE Transactions on Software
Engineering, 23(2):67-82, February 1997.
They have a model for composing what I'd call the innards of a
component. The real contribution is not just their toolset for
specifying program-parts and grammar-driven synthesis of 'stacks', but
in the design-rule checking. They have a shallow model that declared
bitmaps of 'attributes' required or expressed by layers of the stack;
since values are propagated, there can be conformance 'at a distance'.
The remaining semantic compatibility is assumed to lie in the grammar.
To me, it's very similar to categories/protocols/subject-oriented
programming, though Batory takes such pains to differentiate their
mechanism. Shorn of the code-generator, their model of components is
indeed a collection of perspectives, projecting different interfaces
(OLE Monikers) to different requestors.
The realm model just plain doesn't make sense.
Measurement
Curtis: Measurement in SE
[Cur80] B. Curtis. "Measurement and Experimentation in Software
Engineering", Proceedings of the IEEE, 68(9):1144-1157, September
1980.
@@validity of a mathematical construct: understand statistics before
you use it (to measure and control the process, of course).
Types of things we measure: control structures, interconnectedness,
size. Introduced Halstead's software science [bad@@], McCabe
cyclomatic (indep-paths) metric [good@@]
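McCabe's number is just independent paths through the control-flow graph: V(G) = E - N + 2P, or equivalently decisions + 1 for structured code. A sketch over Python source with the stdlib ast module (my choice of which nodes count as decisions):

```python
# McCabe cyclomatic complexity approximated as (decision points + 1),
# counted over a Python AST. DECISIONS is my assumed node list; a full
# implementation would build the control-flow graph and use E - N + 2P.
import ast

DECISIONS = (ast.If, ast.For, ast.While)   # assumed decision nodes

def cyclomatic(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(n, DECISIONS) for n in ast.walk(tree))

src = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2:
            return "odd"
    return "even"
"""
print(cyclomatic(src))   # 4: three decision points + 1
```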
(Selby): Empirically Guided SW Dev
[SPSB91] R.W. Selby, A.A. Porter, D.C. Schmidt, and J. Berney.
"Metric-Driven Analysis and Feedback Systems for Enabling Empirically
Guided Software Development", in Proceedings of the Thirteenth
International Conference on Software Engineering, pp. 288-298, Austin,
TX, May 1991.
Amadeus helps identify the 80/20. Supports multiple metric paradigms:
SEI process maturity level
Basili/Weiss goal/question/metric
Selby/Porter classification paradigm
etc
Collection agents triggered on wallclock, on APPL/A events. Scripting
language for filtering. Classification tree for predicting. Scalable?
Evolvable (process)? Organization-specific calibration.
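The classification-paradigm piece -- predicting fault-prone modules from past metric data -- reduces to a calibrated decision tree. A stub with made-up thresholds (Amadeus' trees were organization-calibrated; these numbers are purely illustrative):

```python
# Sketch of the classification-tree idea: flag likely fault-prone modules
# from simple metrics. Thresholds are invented for illustration; the real
# system calibrated them per-organization from historical data.
def fault_prone(metrics):
    """metrics: dict with 'size' (LOC) and 'changes' (revision count)."""
    if metrics["changes"] > 10:
        return True                  # heavily-churned modules
    if metrics["size"] > 500 and metrics["changes"] > 3:
        return True                  # large and non-trivially changed
    return False

print(fault_prone({"size": 800, "changes": 5}))   # True
print(fault_prone({"size": 120, "changes": 2}))   # False
```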
Gould: Usability Checklist
[Gou88] J.D. Gould. "How to Design Usable Systems" (excerpt), revision
of Chapter 35 in M. Helander (Ed.), Handbook of Human Computer
Interaction, pp. 757-779 and 784-789, North-Holland, 1988.
See my review for Grudin's class, usable-systems-gould. @link to
Brooks' conceptual integrity. Write the manual first.
Early focus on users, empirical study, iterative, integrated.
Capability Maturity Model
[PCCW93] M.C. Paulk, B. Curtis, M.B. Chrissis, and C.V.Weber.
"Capability Maturity Model, Version 1.1", IEEE Software, 10(4):18-27,
July 1993.
What can one say? It's the CMM. They're wrapped up in their own world,
dancing angels on the head of SEI all…
Process capability, performance, maturity: Initial, Repeatable,
Defined, Managed, Optimizing
Not scalable down; gap between it and PSP [Humphrey@@?]. Not based on
empirical data. No guarantee of product quality -- ISO 9000 all over.
Basili/Selby/Hutchens: the big paper with the little template
[BSH86] V.R. Basili, R.W. Selby, and D.H. Hutchens. "Experimentation
in Software Engineering", IEEE Transactions on Software Engineering,
SE-12(7):733-743, July 1986.
A survey of experimental designs for software engineering, and a
survey of some classic results in the field to date. @@results@@
4 phases: definition/planning/operation/interpretation
[motivations] [objects] [purpose]
[design] [measurement]
[preparation] [execution] [analysis]
[domain] [extrapolation] [impact]
Roots of UM College Park's current reputation were sown long ago. NASA
SEL.
UI / CSCW / Hypertext
Brad Myers: UIST sota
[Mye93] B.A. Myers. "State of the Art in User Interface Software
Tools", in H.R. Hartson and D. Hix (eds.), Advances in Human-Computer
Interaction, volume 4, Ablex, 1993, pp. 110-150.
UI is hard because: iteration, concurrency, real-time, robustness,
untestable, high coupling, complexity, language support.
Specification techniques: state-transition, cf grammars, events,
declarative, constraints. Frameworks. Automatic IDL-driven generation.
Graphical: prototypers, cards, IBuilders, prog-by-demo.
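The oldest of those specification techniques, state-transition, is just a table from (state, event) to next state. A toy drag-and-drop dialogue (my example, not one from Myers' survey):

```python
# State-transition specification of a dialogue: the UI's legal behavior
# is one lookup table; illegal events are simply ignored. The drag-and-
# drop states here are illustrative.
TRANSITIONS = {
    ("idle",     "press"):   "pressed",
    ("pressed",  "move"):    "dragging",
    ("pressed",  "release"): "clicked",
    ("dragging", "move"):    "dragging",
    ("dragging", "release"): "dropped",
}

def run(events, state="idle"):
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)  # ignore illegal events
    return state

print(run(["press", "move", "move", "release"]))   # dropped
print(run(["press", "release"]))                   # clicked
```

This also shows why the technique struggles with real UIs: concurrency and modes multiply the state space fast.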
Krasner& Pope: Model-View-Controller
[KP88] G.E. Krasner and S.T. Pope. "A Cookbook for Using the
Model-View-Controller User Interface Paradigm in Smalltalk-80",
Journal of Object-Oriented Programming, 1(3):26-49, August 1988.
@@detailed implementation notes (e.g. view invalidation scheduling)
but little of fundamental novelty.
Notification is a responsibility of the Controller object, which
essentially manages a list of listeners in a global
(Object-class-variable) mapping dictionary.
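That dependents mechanism boils down to a global mapping from an object to its registered listeners, with the model broadcasting on change. A minimal sketch (Python standing in for Smalltalk; my naming, not the cookbook's):

```python
# Sketch of MVC change propagation: a global dependents mapping (standing
# in for Smalltalk's Object-class-variable dictionary) from a model to its
# views; model.changed() broadcasts update() to every dependent.
DEPENDENTS = {}   # model -> list of views

class Model:
    def __init__(self, value=0):
        self.value = value
    def set(self, value):
        self.value = value
        self.changed()               # announce the state change
    def changed(self):
        for view in DEPENDENTS.get(self, []):
            view.update(self)

class View:
    def __init__(self):
        self.shown = None
    def update(self, model):
        self.shown = model.value     # re-render from the model's state

m, v = Model(), View()
DEPENDENTS.setdefault(m, []).append(v)
m.set(42)
print(v.shown)   # 42
```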
Curtis-Krasner-Iscoe: SW Design Process for 'Large Systems'
[CKI88] B. Curtis, H. Krasner and N. Iscoe. "A Field Study of the
Software Design Process for Large Systems", Communications of the ACM,
31(11):1268-1287, November 1988.
'exceptional designers', nee 'superconceptualizers' (in "software
process models under the lamppost", an earlier draft). Argued that
domain knowledge -- as embodied in key personnel (coalition) -- was a
dominant factor, trumping even process.
Layered model of processes suitable for individual, team, project,
organization.
3 problems: thin knowledge, fluctuating requirements, communication
breakdown
Grudin: History of CSCW
[Gru94] J. Grudin. "CSCW: History and Focus", IEEE Computer,
27(5):19-27, May 1994.
See review in irvine/classes/hci and 221.
Roots in several fields. US small systems/commercial focus; European
theoretical/large-organization focus. Traditional 3x3 of (fixed,
defined, arbitrary) x {place, time} matrix classification. Evolution
over the years outward from personal systems to small-group and inward
from organization-wide IS/IT to departmental servers.
Halasz: Dexter HM
[HS94] F. Halasz and M. Schwartz. "The Dexter Hypertext Reference
Model", Communications of the ACM, 37(2):30-39, February 1994.
[1]http://wwwis.win.tue.nl/2L690/cgi/get.cgi/siegel/dexter.html
storage/runtime/within-component virtual machine. Very overdetermined
in light of the Web: genuinely aimed at a subset known as 'real'
hypertext with 'real' link models, 'real' security, 'real' consistency
enforcement, etc.
Jarczyk:Hypermedia Design Rationale ??
[JLS92] A. Jarczyk, P. Loeffler and I.F. Shipman. "Design Rationale
for Software Engineering: A Survey", in Proceedings of the 25th Annual
IEEE Computer Society Hawaii Conference on System Sciences, pp.
577-586, January 1992.
Models of argumentation: IBIS, Toulmin (and variants), @@other
rhetorical models. As reified into support software (implicit argument
that it should converge with hypermedia systems). Some experience
reports.
Interoperability
MHO96 (Heimbigner/Osterweil): Multilingual integration
[MHO96] M.J. Maybee, D.M. Heimbigner and L.J. Osterweil.
"Multilanguage Interoperability in Distributed Systems", in
Proceedings of the 18th International Conference on Software
Engineering, pp. 451-463, Berlin, Germany, March 1996.
The Q IPC system. A great example of evolutionary narrative. First: a
small wrapper above ONC RPC: essentially multi-lingual stubbers over
standard XDR/RPC model. V2: multithreading required abandoning
select() for signals (and thus implemented in separate process spaces
from Ada, in particular) to break synchrony-blocking cycles. V3:
further exposing it as a general message interface, breaking the
dependency link of 'procedure' call. @@
Link to a fun Heimbigner rebuttal to CORBA based on negative
experiences at Colorado in the references.
Purtillo: Polylith
[Pur94] J.M. Purtillo. "The Polylith Software Bus", ACM Transactions
on Programming Languages and Systems, 16(1):151-174, January 1994.
The abstraction helps defer all configuration/topology/"connector"
issues to design (and hopefully, eventually, run) time. Ideally suited
for scientific distributed computing, to experiment with new mappings
to processors/threads.
Handled multilanguage marshalling if you asked it to@@
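The bus idea in one picture: components talk only to named ports, and a separate configuration step binds ports to implementations, so topology stays deferred. A sketch (my names, not Polylith's actual API):

```python
# Sketch of the software-bus abstraction: components address named ports;
# a configuration step binds each port to an implementation, which could
# later be rebound to a different process or processor without touching
# the component. Illustrative only -- not Polylith's interface.
class Bus:
    def __init__(self):
        self.bindings = {}
    def bind(self, port, handler):       # configuration/design-time step
        self.bindings[port] = handler
    def send(self, port, *args):         # components know only port names
        return self.bindings[port](*args)

bus = Bus()
bus.bind("square", lambda x: x * x)      # could rebind to a remote service
print(bus.send("square", 7))             # 49
```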
Understanding
Weiser: Program Slicing
[Wei84] M. Weiser. "Program Slicing", IEEE Transactions on Software
Engineering, SE-10(4):352-357, July 1984.
Applied the 'obvious' insight that you want to focus only on the
statements that could possibly include the fault. Take a (set of)
variable of interest and subset the block of code.
Minimal slices are uncomputable in general (Turing-complete), but hey,
his entire research project could be run in 30 seconds on a modern
Alpha -- why don't we have his tool in today's debuggers?
One answer: modern OSes make any system call equivalent to havoc().
But still, there's lots of 'real algorithms' still being
developed… or is it that pointer arithmetic screws things up (and
Weiser was too polite to rub that in) -- but then shouldn't Java be
revitalizing interest here (or is that what metamata's about already?)
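The core of the insight fits in a dozen lines for straight-line code: walk backwards, keep a statement iff it can affect the criterion variable. A sketch (mine; Weiser's algorithm works over the full control-flow graph and handles control dependence, which this ignores):

```python
# Minimal backward-slicing sketch over straight-line assignments: keep
# statement i iff its target is (transitively) needed by the criterion.
# Ignores control flow and aliasing -- the parts that make real slicing
# hard, especially with pointer arithmetic.
def backward_slice(stmts, criterion):
    """stmts: list of (target, set_of_used_vars); returns kept indices."""
    needed = {criterion}
    keep = []
    for i in range(len(stmts) - 1, -1, -1):   # walk backwards
        target, uses = stmts[i]
        if target in needed:
            keep.append(i)
            needed.discard(target)
            needed |= uses                    # now need whatever it read
    return sorted(keep)

prog = [
    ("a", set()),          # 0: a = input()
    ("b", {"a"}),          # 1: b = a + 1
    ("c", set()),          # 2: c = input()   (irrelevant to d)
    ("d", {"b"}),          # 3: d = b * 2
]
print(backward_slice(prog, "d"))   # [0, 1, 3] -- statement 2 sliced away
```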
Factoids
flowchart [Goldstine & von Neumann 47]
annotated flowchart proof [Floyd67]
survey of testing methods' role throughout lifecycle [ABC82]
Even proven programs require testing [Goodenough & Gerhart 75]
'Termination condition' (added to 'partial correctness') [Dij76]
7 Myths of FM: perfection, only by proving, only for nukes, hard,
expensive, opaque, req training, nobody uses [Hall90]
Butler Lampson: "all problems in Computer Science can be solved by
another level of indirection."
"A distributed system is one on which I cannot get any work done
because a machine I have never heard of has crashed."-- Leslie Lamport
Lotus Notes diffusion requires cultural alignment and training.
[Orlikowski 92]
Agents must earn competency and trust by machine learning [Maes 94]
Human performance model [Card, Moran, and Newell 83]
C2 [TMA+96]
Testing shows the presence of bugs, not their absence [DHL72]
definition of abstraction? @@
Weg92: inherent incompatibility between reactiveness and deductive
reasoning -- OO call chains not tractable analytically.
Alexander64 was cited as early as Naur68!
Taylor's "Important Things To Consider"
=========================================
* So what
* Who cares
* Technical contribution
* Critical aspect or key problem solved
* Enabling technologies
* Useful legacies
* Scalability
* Performance
References
1. http://wwwis.win.tue.nl/2L690/cgi/get.cgi/siegel/dexter.html