32.5 NEW MODES FOR REPRESENTING INFORMATION

Over the past two decades, a subtle transition has occurred in the terminology that is used to describe software development work performed for the business community. Thirty years ago, the term data processing was the operative phrase for describing the use of computers in a business context. Today, data processing has given way to another phrase—information technology—that implies the same thing but presents a subtle shift in focus. The emphasis is not merely to process large quantities of data but rather to extract meaningful information from this data. Obviously, this was always the intent, but the shift in terminology reflects a far more important shift in management philosophy.
When software applications are discussed today, the words data and information occur repeatedly. We encounter the word knowledge in some artificial intelligence applications, but its use is relatively rare. Virtually no one discusses wisdom in the context of computer software applications.

[Figure 32.1: An "information" spectrum. Data: no associativity. Information: associativity within one context. Knowledge: associativity within multiple contexts. Wisdom: creation of generalized principles based on existing knowledge from different sources.]
Data is raw information—collections of facts that must be processed to be meaningful. Information is derived by associating facts within a given context. Knowledge associates information obtained in one context with other information obtained in a different context. Finally, wisdom occurs when generalized principles are derived from disparate knowledge. Each of these four views of "information" is represented schematically in Figure 32.1.
"Wisdom is the power that enables us to use knowledge for the benefit of ourselves and others."
Thomas J. Watson

To date, the vast majority of all software has been built to process data or information. Software engineers are now equally concerned with systems that process knowledge.2 Knowledge is two-dimensional. Information collected on a variety of related and unrelated topics is connected to form a body of fact that we call knowledge. The key is our ability to associate information from a variety of different sources that may not have any obvious connection and combine it in a way that provides us with some distinct benefit.

2 The rapid growth in data mining and data warehousing technologies reflects this growing trend.
To illustrate the progression from data to knowledge, consider census data indicating that the birthrate in 1996 in the United States was 4.9 million. This number represents a data value. Relating this piece of data with birthrates for the preceding 40 years, we can derive a useful piece of information—aging "baby boomers" of the 1950s and early 1960s made a last-gasp effort to have children prior to the end of their child-bearing years. In addition, "gen-Xers" have begun their child-bearing years. The census data can then be connected to other seemingly unrelated pieces of information: for example, the number of elementary school teachers who will retire during the next decade, the number of college students graduating with degrees in primary and secondary education, and the pressure on politicians to hold down taxes and thereby limit pay increases for teachers.
All of these pieces of information can be combined to formulate a representation of knowledge—there will be significant pressure on the education system in the United States during the first decade of the twenty-first century, and this pressure will continue for more than a decade. From this knowledge, a business opportunity may emerge: a significant opportunity to develop new modes of learning that are more effective and less costly than current approaches.
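The progression this example traces can also be sketched in a few lines of code. The snippet below is purely illustrative: apart from the 4.9 million birthrate cited above, every figure, name, and threshold is hypothetical, chosen only to show isolated facts (data) being associated within a single context (information) and then across contexts (knowledge).

```python
# Illustrative sketch of the data -> information -> knowledge progression.
# All values except the 1996 birthrate cited in the text are hypothetical.

# Data: isolated facts, no associativity.
births_millions = {1956: 4.2, 1966: 3.6, 1976: 3.2, 1986: 3.8, 1996: 4.9}
teacher_retirements_next_decade = 700_000   # hypothetical
new_education_graduates_per_year = 50_000   # hypothetical

# Information: associate facts within one context (birthrates over time).
def birthrate_trend(births):
    years = sorted(births)
    return "rising" if births[years[-1]] > births[years[0]] else "falling"

# Knowledge: associate information drawn from different contexts
# (demographics, retirements, graduation rates) to reach a conclusion.
def education_pressure(births, retirements, graduates_per_year, horizon=10):
    shortfall = retirements - graduates_per_year * horizon
    if birthrate_trend(births) == "rising" and shortfall > 0:
        return "significant pressure on the education system is likely"
    return "no unusual pressure indicated"

print(birthrate_trend(births_millions))            # information
print(education_pressure(births_millions,
                         teacher_retirements_next_decade,
                         new_education_graduates_per_year))  # knowledge
```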
The road ahead for software leads toward systems that process knowledge. We have been processing data for 50 years and extracting information for almost three decades. One of the most significant challenges facing the software engineering community is to build systems that take the next step along the spectrum—systems that extract knowledge from data and information in a way that is practical and beneficial.