System Analysis vs System Design
Analysis = figuring out business needs; Design = deciding how to build a system that can fulfill those needs
Logical System Design components
1. Proposed system's DFDs and ERDs 2. Synchronized data and process models (CRUD Matrix)
Physical System Design components
1. Candidate System Solutions Matrix 2. Feasibility Analysis Matrix 3. Physical DFDs 4. Input/Output Design
CRUD Matrix definition and rules
Table where rows=entities/attributes (RED) from ERD, columns=elementary processes (GREEN) from DFDs, cells=access level (C=Create, R=Read, U=Update, D=Delete). EVERY entity needs at least one C, R, U, and D for completeness
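Below, a minimal Python sketch (entity and process names are hypothetical) of one way to store a CRUD matrix and check the completeness rule above:

```python
# Minimal sketch: a CRUD matrix as a dict keyed by (entity, process),
# plus a check of the completeness rule that every entity gets at
# least one C, R, U, and D. Entity/process names are hypothetical.
crud = {
    ("Customer", "Register Customer"): "C",
    ("Customer", "Verify Customer Credit"): "R",
    ("Customer", "Update Customer Address"): "U",
    ("Customer", "Purge Inactive Customers"): "D",
    ("Order", "Place Order"): "CR",
}

def completeness_gaps(matrix):
    """Per entity, which of C/R/U/D no process ever performs."""
    seen = {}
    for (entity, _process), ops in matrix.items():
        seen.setdefault(entity, set()).update(ops)
    return {e: set("CRUD") - ops for e, ops in seen.items() if set("CRUD") - ops}

print(completeness_gaps(crud))  # {'Order': {'U', 'D'}} -- Order is never updated or deleted
```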
Data and Process Model Synchronization
1. ONE data store in process models for each entity in data model 2. Sufficient processes in process model to maintain the data
Three system acquisition strategies
1. Custom-Built Solution 2. Packaged Software (COTS = Commercial Off-The-Shelf) 3. Outsourcing (NOT an option for this class)
Custom-Built PROS and CONS
PROS: Latest tech, cutting edge, custom fit, builds in-house skills. CONS: Never perfect the first time, requires highly skilled people (hard to hire/retain), high risk, long hours
Packaged Software PROS and CONS
PROS: SPEED - quick installation, good for common needs, already tested. CONS: One size fits all, may require changing business processes, users must change work habits
Outsourcing rules
CONS: Compromising confidential info, losing control, losing in-house skills. NEVER outsource what you don't understand
Candidate Solutions Matrix requirements
Must evaluate 3 CANDIDATES. Most critical elements: software tools, application software, method of processing data. Compare candidates on: portion computerized, benefits, servers/workstations, software tools, application software, data processing method, output/input/storage devices
Five types of feasibility
1. Operational (Functional + Political) 2. Schedule (Deadline/Urgency/Completion) 3. Technical (Technology maturity/expertise) 4. Economic (NPV, costs/benefits) 5. Legal (Litigation risks)
Operational Feasibility
Functional = degree to which the candidate benefits the organization and how well the system works. Political = how well received by management, end users, and the organization. Weight >25% when urgency or core business is important
Schedule Feasibility definitions (EXAM)
Schedule = TIME. Deadline = how late can FINISH (completion date). Urgency = how urgent to START (start date). Weight >25% when client has deadline. Better to deliver late and working than on-time and broken!
Technical Feasibility issues
1. Is solution practical/mature enough? 2. Do we have needed technology? 3. Do we have technical expertise? Weight >25% when client wants high-end technology
Economic Feasibility
Bottom line of many projects. Tangible benefits (quantifiable): fewer errors, reduced expenses, increased sales. Intangible benefits: improved goodwill, morale, timely info. NPV = PWB - PWC (present worth of benefits minus present worth of costs). Weight >25% when client is a small business/nonprofit/startup
THE GIFT for Economic Feasibility (EXAM)
ALL candidates have identical revenue stream, so PWB equal for ALL. Base scores on PWC ALONE. LOWER PWC = HIGHER score. Use $50/hr for IT expert with packaged solutions
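A toy worked example with made-up cost figures, using a simple proportional scoring scheme (the course may prescribe a different mapping): since PWB is identical across candidates, rank on PWC alone and give the lowest-cost candidate the top score.

```python
# Toy example (made-up figures): PWB is equal for all candidates, so
# score on PWC alone -- the cheapest candidate gets 100, others score
# proportionally lower. E.g., a packaged solution's PWC might include
# 40 hours of IT-expert installation at $50/hr = $2,000.
pwc = {"Candidate 1": 80_000, "Candidate 2": 50_000, "Candidate 3": 65_000}
cheapest = min(pwc.values())
scores = {name: round(100 * cheapest / cost) for name, cost in pwc.items()}
print(scores)  # {'Candidate 1': 62, 'Candidate 2': 100, 'Candidate 3': 77}
```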
Legal Feasibility
Will system result in litigation? Antitrust laws (Microsoft case), copyright violations, illegal contents
Feasibility Analysis Matrix weights
Divide 100% among the 4 criteria. Any weight >25% needs an explanation; stay below 45%. Scoring: 0-100 points per criterion; final ranking = weighted average. Winner determines the implementation strategy
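A small sketch of the weighted-average ranking, with hypothetical weights (summing to 100%) and scores:

```python
# Hypothetical weights (sum to 100%; any weight >25% explained,
# none above 45%) and per-criterion scores of 0-100.
weights = {"Operational": 0.30, "Technical": 0.25, "Schedule": 0.20, "Economic": 0.25}
scores = {
    "Candidate 1": {"Operational": 85, "Technical": 90, "Schedule": 60, "Economic": 70},
    "Candidate 2": {"Operational": 70, "Technical": 80, "Schedule": 95, "Economic": 100},
}

def weighted_total(candidate_scores):
    return sum(weights[c] * candidate_scores[c] for c in weights)

for name in sorted(scores, key=lambda n: weighted_total(scores[n]), reverse=True):
    print(name, weighted_total(scores[name]))
# Candidate 2 wins with 85.0 vs 77.5 -- its solution drives the implementation strategy
```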
Physical DFDs purpose and source
Technical blueprint for implementation phase. Take Level 1 Logical DFD and augment with technology of selected candidate solution. Don't redo data stores
Physical Processes naming format
Action Verb + Object Clause : Implementation Method. Example: Verify Customer Credit (QuickBooks). Types: 1. Manual (indicate WHO) 2. Software (COTS package name OR language for custom) 3. Hardware (PC/server). COBOL = code inherited from the current system
Physical Data Flows naming format
Data Flow Name : Implementation Medium. Example: Customer Order (Paper). Describe the nature of the data transmitted; for flows across a network, indicate the file transfer protocol (FTP)
Physical Data Stores naming format
(File, Database, or Table) Name : Implementation Method. Example: Purchase Orders (Oracle). Represents: database table, computer file, tape/media backup, temporary file/batch, non-computerized file
Physical External Entities
Carried over from logical DFD UNCHANGED (outside scope, not subject to change)
KISS principle for Input Design
Keep It Simple Stupid. Goal: capture accurate information simply and easily, as it happens, at the source. Benefits: improves processing time, reduces cost/errors, improves morale
Input Design data capture rules
Capture only VARIABLE data. Do NOT capture data that can be calculated (e.g., don't input Extended Price if you have Quantity × Price). Source documents: include instructions, minimize handwriting (use checkboxes), sequence top-to-bottom and left-to-right
Input Validation Controls
1. Completeness checks (all required fields present; an asterisked field blocks progress until filled) 2. Format checks (right data type, e.g., Student ID can't contain letters) 3. Limit/range checks (e.g., 9 digits for SSN, 10 for phone) 4. Combination checks (valid relationships between fields, e.g., area code matches zip code)
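A sketch showing all four control types applied to a hypothetical form; the field names and the ZIP-to-area-code lookup are made up for illustration:

```python
# Sketch of the four validation control types on a hypothetical form.
def validate(form):
    errors = []
    # 1. Completeness: every required (asterisked) field must be present.
    for field in ("student_id", "phone", "zip_code", "area_code"):
        if not form.get(field):
            errors.append(f"{field} is required")
    # 2. Format: Student ID must be digits only (no letters).
    if form.get("student_id") and not form["student_id"].isdigit():
        errors.append("student_id must be numeric")
    # 3. Limit/range: phone number must be exactly 10 digits.
    if form.get("phone") and len(form["phone"]) != 10:
        errors.append("phone must be 10 digits")
    # 4. Combination: area code must be plausible for the ZIP code
    #    (toy lookup; a real system would use a full mapping).
    zip_to_area = {"10001": {"212", "646"}, "90210": {"310"}}
    allowed = zip_to_area.get(form.get("zip_code", ""))
    if allowed and form.get("area_code") not in allowed:
        errors.append("area code does not match zip code")
    return errors

print(validate({"student_id": "12A45", "phone": "5551234",
                "zip_code": "10001", "area_code": "310"}))
```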
Input Implementation Technologies
Basic: keyboard, mouse, touchscreen, sound/speech. Automatic: OMR, POS terminals, bar codes, OCR, magnetic ink. Biometric: fingerprint, retina scan (capture characteristic, digitize, compare). Electromagnetic: RFID, EZPass. Smart Cards: massive info storage, contains microprocessor/memory/battery
GUI Controls - Text Box and Radio Button
Text Box: rectangular with caption, use when values unlimited or no meaningful list. Radio Button: small circle with text, groups for limited predefined MUTUALLY EXCLUSIVE values, partially filled when selected. Example: FR, SO, JR, SR
GUI Controls - Check Box and List Box
Check Box: Yes/No value (Male/Female, Smoker/Not). List Box: large number of textual/graphical choices, use for large predefined mutually exclusive set
GUI Controls - Drop-Down List
Large predefined choices with rectangular field and downward arrow. PROS: simplifies menu bar. CONS: no visual clue of choices, user doesn't know features are there, may obstruct view
GUI Controls - Combo Box and Spin Box
Combo Box: allows direct entering (text box) OR selecting from list (list box), Example: Font size. Spin Box: text box with two buttons, increases/decreases by unit, use when values sequenced predictably, Example: number of units
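A minimal Tkinter sketch pairing each control with the kind of value it suits; the labels and value lists are illustrative:

```python
import tkinter as tk
from tkinter import ttk

root = tk.Tk()

# Radio buttons: small, predefined, mutually exclusive set (FR/SO/JR/SR).
year = tk.StringVar(value="FR")
for label in ("FR", "SO", "JR", "SR"):
    tk.Radiobutton(root, text=label, variable=year, value=label).pack(anchor="w")

# Check box: a single yes/no value.
smoker = tk.BooleanVar()
tk.Checkbutton(root, text="Smoker", variable=smoker).pack(anchor="w")

# Combo box: pick from a list OR type a value directly (e.g., font size).
ttk.Combobox(root, values=[8, 10, 12, 14, 18, 24]).pack(anchor="w")

# Spin box: values that increase/decrease predictably by one unit.
tk.Spinbox(root, from_=1, to=99).pack(anchor="w")

root.mainloop()
```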
Domain of an attribute
Defines what values attribute can take. Numbers: range {min-max}, precision. Text: max size. Memo: no restrictions. Date: MMDDYYYY. Time: HHMMT. Yes/No. Value Set: table of codes and meanings
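A sketch of declaring domains and checking values against them; the attribute names and value set are hypothetical:

```python
# Sketch: declaring attribute domains and checking a value against them.
domains = {
    "quantity":  {"type": int, "min": 1, "max": 999},   # number: range {min-max}
    "last_name": {"type": str, "max_size": 30},         # text: max size
    "status":    {"type": str, "values": {"A": "Active", "I": "Inactive"}},  # value set
}

def in_domain(attr, value):
    d = domains[attr]
    if not isinstance(value, d["type"]):
        return False
    if "min" in d and not (d["min"] <= value <= d["max"]):
        return False
    if "max_size" in d and len(value) > d["max_size"]:
        return False
    if "values" in d and value not in d["values"]:
        return False
    return True

print(in_domain("quantity", 1000), in_domain("status", "A"))  # False True
```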
Output Design goal and principles
Goal: present accurate, timely, easy-to-understand information. Designer: understand report usage, manage information load, minimize bias
Three types of outputs (EXAM)
1. External: leave system to trigger actions (purchase orders, receipts) 2. Turnaround: collects data and comes back, reenters as input (warranty cards) 3. Internal: stay in system for management reporting (detailed, summary, exception)
Detailed Reports
Present info with little/no filtering, historical, serves as audit trail. Examples: transcripts, medical records, work orders, tax forms. Could be required by government
Summary Reports
Categorize info for managers who don't want details, indicates trends/problems, uses charts/graphs. Example: sales by region/product line
Exception Reports
Filter data, report only exceptions to condition/standard. Examples: budget vs actual cost, invoices over 90 days overdue
Output Formats
Tabular (columns of text/numbers). Zoned (text/numbers in designated areas). Graphic (graphs/charts). Bar charts: individual figures/comparisons. Column charts: variation over time. Pie charts: parts to whole. Line charts: trends over time. Scatter charts: correlation. Narrative: reports, business letters
Three phases of Systems Implementation
1. Procurement Phase (identify products, solicit/evaluate/rank vendors, award contract, establish integration) 2. Construction Phase (build/test system) 3. Delivery Phase (install databases, acceptance test, conversion, training)
Procurement Phase Steps 1-2
Step 1: Research technical criteria and options. Functionality = the major criterion (how long the device functions without breaking). Deliverable: list of potential vendors/products. Step 2: Solicit proposals. RFQ = product already decided on, quotes sought from several vendors. RFP = capabilities defined first, competitive proposals solicited
Procurement Phase Steps 3-6
Step 3: Validate vendor claims (don't take at face value, eliminate if doesn't meet mandatory requirements). Step 4: Evaluate and rank (criteria BEFORE evaluation, like feasibility matrix). Step 5: Award contract, debrief losers. Step 6: Establish integration requirements
Construction Phase activities
Build/test networks (messiest part), databases (unpopulated structure), install/test COTS packages, write/test new programs. Roles: Network Designer/Engineer, Database Specialist, Applications Programmers, Application Testers/Systems Analysts
Testing sequence (EXAM)
1. Stub Testing: incomplete individual modules, substitute code for undeveloped sub-modules 2. Unit Testing: completed individual modules 3. Integration Testing: all coded modules as integrated unit 4. Systems Testing: programs work as total system 5. Acceptance Testing (Alpha/Beta/Audit)
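A tiny illustration of stub testing: a stub stands in for an unwritten pricing sub-module so the order module's own logic can be tested first (names are hypothetical):

```python
# Stub testing sketch: the pricing sub-module isn't written yet, so a
# stub stands in for it while the order module's own logic is tested.
def get_unit_price_stub(product_id):
    """Substitute for the undeveloped pricing sub-module: fixed value."""
    return 10.00

def order_total(quantity, product_id, price_lookup=get_unit_price_stub):
    return quantity * price_lookup(product_id)

# Unit-level check against the stub; integration testing later swaps in
# the real pricing module and retests the assembled modules together.
assert order_total(3, "SKU-1") == 30.00
print("stub test passed")
```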
Two types of documentation
1. System Documentation: help programmers understand/update/maintain system, should NOT be left until end 2. User Documentation: help end user operate system, time built into project plan, online becoming predominant. Online advantages: easier searching, multiple formats, less expensive
Delivery Phase - Populate Databases and Acceptance Test
Populate: use existing data from old system, look at historical info (CART), can copy old to new. Roles: Application Programmers (write extraction programs), Data Entry Personnel. Acceptance Test: test all software/programs work together
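A minimal sketch of an extraction program, with hypothetical schemas and in-memory SQLite databases standing in for the old and new systems:

```python
import sqlite3

# Extraction-program sketch: copy customer rows from the old system's
# database into the new system's table. In-memory SQLite databases and
# hypothetical schemas stand in for the real old/new systems.
old = sqlite3.connect(":memory:")
new = sqlite3.connect(":memory:")

old.execute("CREATE TABLE customers (cust_no INTEGER, cust_name TEXT, phone_no TEXT)")
old.execute("INSERT INTO customers VALUES (1, 'Ada', '5551234567')")

new.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, phone TEXT)")
rows = old.execute("SELECT cust_no, cust_name, phone_no FROM customers").fetchall()
new.executemany("INSERT INTO customer VALUES (?, ?, ?)", rows)
new.commit()
print(f"migrated {len(rows)} customer rows")
```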
System Acceptance Test - Three Levels (EXAM - KNOW BOTH NAMES)
1. Verification Testing (Alpha Testing): user tests with SIMULATED data, tests for errors/omissions 2. Validation Testing (Beta Testing): live environment with REAL data, tests systems performance, peak workload, backup/recovery 3. Audit Testing: certifies system FREE of errors, ready for operation
Four Conversion Types
1. Direct/Abrupt: old system terminated, new one starts on a specific date (usually the start of a business period). Fastest and cheapest, but big trouble if it doesn't work; no fallback 2. Parallel: both operated simultaneously. Costs money and time but provides a fallback; causes frustration (users don't see why they're running both when only one will survive) 3. Location: convert one geographic location first, then farm out to other sites 4. Staged/Phased: convert one phase/slice at a time; requires the new system to work with old components. Example: ERP
Train System Users
Provide training/documentation for smooth transition. System owner must approve release time. PEOPLE!