Software Engineering CSC 326
This 19-page set of class notes was uploaded by Jaden Jakubowski on Thursday, October 15, 2015. The notes belong to CSC 326 at North Carolina State University, taught by Laurie Williams in Fall. Since its upload, it has received 33 views. For similar materials see /class/223817/csc-326-north-carolina-state-university in Computer Science at North Carolina State University.
Agenda
- Exam (through refactoring)
- Object-Oriented Design Metrics
- Economics of Software Quality

THE UNIVERSITY OF UTAH

Basic Code Metrics

Lines of Code
- More lines of code means more maintenance. Of two programs of equal functionality, it is better to have fewer lines of code.
- Philosophy: consistent with the refactoring philosophy, we want code to be properly abstracted, less complex, and readable.

Number of Classes
- To a point, more classes are preferred over fewer classes.
- Avoid a "god class" or "blob" class.

Complexity
- Weighted Methods per Class: essentially the number of methods. TogetherSoft weights for complexity based on the number of parameters.
- Historically, more methods was considered more complex; however, this is not consistent with refactoring recommendations.
- Where do you stand on this with the vending project, based on what you have done with your code?

Cohesion
- The degree to which the tasks performed by a single module are functionally related; the object represents a single object-oriented concept.
- Cohesiveness of methods within a class is desirable, since it promotes encapsulation.
- Lack of cohesion implies the class should probably be split into two or more subclasses.
- Any measure of disparateness of methods helps identify flaws in the design of the classes.
- Low cohesion increases complexity, thereby increasing the likelihood of errors during the development process.

Coupling
- A measure of the interdependence among modules: modules A and B are coupled if module A calls a routine or accesses a variable in module B.
- Excessive coupling is detrimental to modular design and prevents reuse.
- The larger the number of couples, the higher the sensitivity to changes in other parts of the design, and the more difficult maintenance becomes.

Encapsulation
- Encapsulation means that all that is seen of an object is its interface, namely the operations we can perform on the object.
- Attribute Hiding Factor: a measure of the proportion of attributes that are invisible from other classes
or objects.
- Method Hiding Factor: a measure of the proportion of methods that are invisible from other classes or objects.

Inheritance
- Depth of Inheritance Tree (DIT): the maximum length from the class node to the root (parent) of the class hierarchy tree, measured by the number of ancestor classes. In cases involving multiple inheritance, the DIT is the maximum length from the node to the root of the tree.
- Trade-off: deep trees create a conceptual-integrity problem (hard to understand, so more complex) but offer greater reuse.
- Number of Children: the number of direct descendants (subclasses) of each class. Classes with a large number of children are considered difficult to modify and usually require more testing because of the effects of changes on all the children. They are also considered more complex and fault-prone, because a class with numerous children may have to provide services in a larger number of contexts and therefore must be more flexible.

Six Tests for Evaluating a Design
1. Data Connectedness: Can you traverse the network of collaborations to gather all the information you need to deliver the services?
2. Abstraction: Does the name of the object convey its abstractions? Does the abstraction have a natural meaning and use in the domain?
3. Responsibility Alignment: Do the name, main responsibility statement, data, and functions align?
4. Data Variations: Does the design naturally handle all the sorts and shapes of data it will encounter?
5. Evolution: How many classes need to change for a normal system change?
6. Communication Patterns: Are there oddly shaped runtime communication patterns (e.g., cycles)?

Economics of Software Quality
- What kind of data do you think should be collected in order to determine whether the software process is working, and whether the software is of high quality?
- What are some reasons it is good to collect and analyze data on our software process?

What We Need to Know
- Identify places where the process
worked and did not work.
- Compare actual development with defined objectives and plans.
- Identify problem areas and improvement needs:
  - How does the actual development compare with what was planned?
  - What lessons were learned from the experience?
  - Should different criteria be used in the next project?
  - What can we improve, and why?
  - What problems were found that need corrective action?
- Measurement-based feedback.

Quality Management: Procedures Are Filters
- Example: 200 lines of code at 100 defects/KLOC means 20 defects enter code review; on a good day the review will get 50% of the defects out, leaving 10 defects (50 defects/KLOC).
- Any process step will never get them all out. If a process step can get half of them out: if you start with 20, 10 will remain; if you start with 100, 50 will remain.
- You cannot test or inspect quality in; you need to build it in.

Review Yield
- Yield: the percentage of defects in the design or code at the time of the review that were found by that review.
- Example (process yield):
  Yield = 100 x (defects removed before compile) / (defects injected before compile)
- Essential data: the process phase at which each defect was injected, and the process phase in which it was found.
- Phase escapes: defects injected before or during a phase, not found before or during that phase, and found later.

Hours to Find a Defect

  Reference        Inspection   Test    Use
  Ackerman         1            2-10
  O'Neill          0.26
  Ragland          0.2
  Russell          1            2-4     33
  Shooman          0.6          3.05
  van Genuchten    0.25         8
  Weller           0.7          6

Defect Data
- Figure: average fix time (minutes) for 664 C++ defects, by defect type (syntax, interface, function, name/type, assignment).
- Figure 9.10, "Average Fix Time": fix time by phase (design review, code review, compile, test, use).

Economics of Defect Removal (No Inspections)
- Experienced software engineers normally inject 100 defects/KLOC; half are found by the compiler.
- 50,000 LOC at 50 defects/KLOC entering test yields 2,500 defects.
- In a large program, finding a defect in test takes 5-10 hours. At around 2,500 x 8 hours/defect, that is 20,000 programmer-hours, or more than 10 person-years.
- Trying to do this in 3 months? 5 people could do it, working days, nights, and weekends, in 18 months (55.5 hours/week each). What if they stick to their original 3-month schedule?

Redo the Same Example (With Inspections)
- 2,500 defects at a 70% average yield: find 1,750 defects at 0.5 hours per defect, or 875 hours.
- Now 750 defects remain to be found in test at 8 hours per defect, or 6,000 hours.
- Total of 6,875 hours vs. 20,000 hours: 8 months of testing at 43 hours/week by 5 people, a savings of 10 months.

Cost of Poor Quality
- Failure costs: the costs of diagnosing a failure, making necessary repairs, and getting back into operation (compiling until it can compile; testing until all test cases run successfully).
- Appraisal costs: the costs of evaluating the product to determine its quality level (design and code reviews, inspection times).
- Prevention costs: the costs associated with identifying the causes of defects and the actions taken to prevent them in the future (prototypes, process-improvement meetings, causal analysis).

Cost-of-Quality (COQ) Measures
- Failure COQ = 100 x (compile time + test time) / total time
- Appraisal COQ = 100 x (design review time + code review time) / total time
- Total COQ = Failure COQ + Appraisal COQ
- Appraisal as a % of total quality costs = 100 x Appraisal COQ / Total COQ
- A/FR (appraisal-to-failure cost ratio) = Appraisal COQ / Failure COQ
Something Important to Know
- The change process is probably your most error-prone activity.
- The percentage of defect fixes that themselves generate defects is rarely less than 20%; generally it is much higher.
- When organizations recognize this, they can bring the rate down to 5% or less.
- Why do you think this is?
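The 20% figure can be made concrete with a simple model (my own illustration, not from the slides): if every fix injects a new defect with probability p, the expected number of fix cycles for N initial defects is the geometric series N(1 + p + p^2 + ...) = N / (1 - p):

```python
def expected_total_fixes(initial_defects, bad_fix_rate):
    # Each fix injects a fresh defect with probability bad_fix_rate,
    # so expected fixes = N * (1 + p + p^2 + ...) = N / (1 - p).
    return initial_defects / (1 - bad_fix_rate)

# A 20% bad-fix rate turns 100 defects into 125 expected fix cycles;
# cutting the rate to 5% brings that down to about 105.
print(expected_total_fixes(100, 0.20))  # 125.0
print(expected_total_fixes(100, 0.05))
```

Under this model, lowering the bad-fix rate from 20% to 5% removes most of the rework compounding, which is consistent with the slide's claim that recognizing the problem pays off quickly.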