For the business impact analysis, it is advisable to collect data through questionnaires, interviews, or workshops, which are in many ways group interviews. Additional data can be gathered from documents and research, but only to support or supplement data gathered through direct contact with business subject matter experts. The reason is fairly obvious: only those who actually perform various business functions can assess the criticality of those functions. You could read documents all day long and never get a clear picture of what's truly mission-critical and what's merely important. Therefore, you should rely primarily on questionnaires, interviews, and workshops for this segment of your data gathering. Let's look at methodologies for each of these three approaches.
Questionnaires can be used to gather data from subject matter experts (SMEs) in a fairly efficient manner. Though it takes time to develop a highly useful questionnaire, SMEs' responses will be consistent, focused, and concise. They can fill out the questionnaires regarding their business units, business functions, and business processes at a time that is convenient for them (within a specified timeframe), thereby increasing the likelihood of participation. On the downside, questionnaires that are sent out may be ignored, pushed aside, or forgotten. To generate a timely and meaningful response to your team's questionnaire, you can create a methodology that will increase your response rate.
First, it's important to design the questionnaire appropriately. If it's full of useless questions, or if it's visually confusing or overwhelming, you'll decrease your response rate. The questionnaire should be clear, concise, easy to understand, and fast to fill out. If you can use a Web-based questionnaire that records data in a database, so much the better: you can send out reminders with a link to the questionnaire as frequently as needed. With a paper-based questionnaire, there's a lot of moving of paper and an increased likelihood that the paper will be misplaced, lost in a pile, or simply thrown out.
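If you do opt for a Web-based questionnaire backed by a database, tracking each department's progress makes targeted reminders straightforward. Here is a minimal sketch; the schema, table name, and e-mail addresses are hypothetical illustrations, not anything prescribed by this chapter:

```python
import sqlite3

# Hypothetical sketch: a questionnaire backend that tracks per-department
# progress so reminders go only to non-responders, and that supports the
# save-and-resume behavior described above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE questionnaire (
        department    TEXT PRIMARY KEY,
        contact_email TEXT NOT NULL,
        status        TEXT NOT NULL DEFAULT 'not_started'
                      CHECK (status IN ('not_started', 'in_progress', 'complete')),
        last_saved    TEXT  -- timestamp of the last partial save
    )
""")
conn.executemany(
    "INSERT INTO questionnaire (department, contact_email, status) VALUES (?, ?, ?)",
    [("Sales", "sales-sme@example.com", "complete"),
     ("Finance", "finance-sme@example.com", "in_progress"),
     ("Operations", "ops-sme@example.com", "not_started")],
)

# Reminder list: everyone who hasn't yet submitted a complete questionnaire.
pending = [row[0] for row in conn.execute(
    "SELECT contact_email FROM questionnaire "
    "WHERE status != 'complete' ORDER BY department")]
print(pending)  # ['finance-sme@example.com', 'ops-sme@example.com']
```

The `status` column is what lets a reminder e-mail go only to the departments that still owe you data, rather than nagging everyone.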
It's also important to explain the purpose of the questionnaire to the participants in a manner that helps them buy into the process. Focus on what's in it for them, not for you. They probably don't care that you need this data, but they will care that this data could help prevent some problem in their jobs. Ideally, you should hold a kickoff meeting where the questionnaire is introduced, its purpose is clearly articulated, and the process for completing it is explained. For example, you might let people know that the questionnaire is available at a particular location, that it takes a total of three hours to complete per department, but that it can be completed in segments and the questionnaire-in-progress can be saved for later completion. You should let people know who the contact person is if they run into problems and when the questionnaire must be completed.
If your company is the type that likes to have a bit of fun in these kinds of meetings, you can also announce small prizes that will be awarded to departments or individuals who complete theirs correctly first, who are most thorough, and so forth. Be careful, though: you don't want to leave the impression that this is a race to the finish (where important details can be lost) or that "cute" answers are appropriate. You can, however, announce that any SME who submits a complete and thorough questionnaire by the deadline will be entered into a drawing for a prize such as a portable music player, a new cell phone, or dinner for two at a nice restaurant. Sometimes small incentives to do the right thing can go a long way in getting people to participate in the manner expected and needed. Considering how vital this particular data is to your entire BC/DR plan, it's usually worth a small investment to get people to participate appropriately, if this type of activity fits your corporate culture. Be sure to provide information on how respondents can get assistance with the questionnaire -- either from a technical standpoint (if it's an electronic or Web-based questionnaire) or an administrative standpoint. If they don't understand exactly what a question means, who should they contact? How should they contact them? What is the contact person's e-mail, location, phone number, and work hours? Providing this information up front ensures you don't inadvertently create roadblocks for yourself.
Finally, let the team know how they'll learn about the results of the questionnaire. Most people dislike spending time filling out a form only to never hear about it again. If they are willing to take the time needed to provide this data, there should be some reciprocity. For example, if this data is all pumped into a database, a report on each respondent's data could be provided back to them for verification. Once the data is reviewed by your team, there may be additional questions. Respondents should be told, in advance, about the process for following up with them regarding their responses to the questionnaire.
Once questionnaires are completed, you and your team should review them to ensure they are complete. In some cases, you may choose to create a process whereby certain questionnaires are followed up by an interview. This might be in the case of the most critical business functions or where questionnaire data indicates there may be confusion, conflict, or incomplete data. Any follow-up interviews should follow a specific format as well so that targeted data can be collected.
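One way to make that completeness review systematic is to codify the triggers for a follow-up interview. The sketch below illustrates the idea; the field names, criticality labels, and the 24-hour threshold are hypothetical examples, not values from this chapter:

```python
# Hypothetical sketch: flag completed questionnaires that warrant a follow-up
# interview -- missing answers, or data that suggests confusion or conflict.
REQUIRED_FIELDS = {"function_name", "criticality", "max_tolerable_downtime"}

def needs_follow_up(response: dict) -> bool:
    """Return True if a questionnaire response should trigger an interview."""
    # Incomplete data: any required field absent or left blank.
    answered = {k for k, v in response.items() if v not in (None, "")}
    if REQUIRED_FIELDS - answered:
        return True
    # Possible conflict: a mission-critical function claiming a long
    # tolerable outage is worth clarifying in person.
    if (response["criticality"] == "mission-critical"
            and response["max_tolerable_downtime"] > 24):
        return True
    return False

responses = [
    {"function_name": "Order entry", "criticality": "mission-critical",
     "max_tolerable_downtime": 4},
    {"function_name": "Payroll", "criticality": "mission-critical",
     "max_tolerable_downtime": 72},   # conflicting data: flag it
    {"function_name": "Newsletter", "criticality": "",
     "max_tolerable_downtime": 168},  # incomplete: flag it
]
flagged = [r["function_name"] for r in responses if needs_follow_up(r)]
print(flagged)  # ['Payroll', 'Newsletter']
```

Whatever rules you choose, writing them down (in code or on paper) keeps the "interview or not" decision consistent across reviewers on your team.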
If your team has decided that data will be gathered through interviews, you'll still need to create a questionnaire-type document that provides the interviewers with a consistent set of questions. Free-form or informal interviews will yield inconsistent data across the organization, leaving you with a wide array of meaningless data. Develop a questionnaire and use it as the basis of the interview process; each interview should follow a predefined format, and the questions asked of each respondent should be the same. Develop a question sheet from which the interviewer will work, along with a corresponding data sheet onto which the interviewer can record responses. Look for ways to speed up the interview process. For example, avoid a ten-element rating scale in which every number has its own verbal label -- 1 as NEVER, 10 as ALWAYS, with eight other word/number combinations in between. That is cumbersome for the interviewer to describe and almost impossible for the interviewee to remember. If you use a numeric scale, you might say, "On a scale of 1 to 10, with 1 being never and 10 being always, how often would you say you access the CRM database on a telephone sales call?" This sliding scale works because the respondent does not have to remember ten different descriptions -- what did three mean again? The danger, however, is that with a range of 10, each respondent will interpret the scale differently. Instead, you might use a three-element scale without numbers: "How often do you use this system during a telephone sales call? Never, sometimes, or always?" That's much easier for the respondent to remember and evaluate, and it's more likely to generate consistent responses across all respondents.
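A fixed three-element scale is also easy to enforce at data-entry time, so an interviewer can't record an off-scale answer. A small sketch, assuming a hypothetical data-entry helper (not something this chapter specifies):

```python
from enum import Enum

# Hypothetical sketch: constrain recorded interview answers to the fixed
# three-element scale so data stays consistent across interviewers.
class Frequency(Enum):
    NEVER = "never"
    SOMETIMES = "sometimes"
    ALWAYS = "always"

def record_response(raw: str) -> Frequency:
    """Normalize a spoken answer; reject anything that isn't on the scale."""
    try:
        return Frequency(raw.strip().lower())
    except ValueError:
        raise ValueError(
            f"Off-scale answer {raw!r}; re-ask using never/sometimes/always")

print(record_response("Sometimes"))  # Frequency.SOMETIMES
```

Rejecting an answer like "usually" or "7" at entry time forces the interviewer to re-ask the question in the scale's own terms, which is exactly the consistency the predefined format is meant to buy you.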
Our goal is not to go into the pros and cons of various data gathering methods, but to point out that there are unintentional problems you can build into a questionnaire or survey that can skew your results. If your organization has a group that develops market surveys or questionnaires, you may ask them to review your questionnaire before rolling it out. They might spot something you missed and help you gather better data. We all know the output is only as good as the input, so making sure your data gathering methods are clean will help on the other side of this assessment process.
Once an interview is conducted, the data needs to be reviewed and verified by the interviewee. Due to the nature of an interview, it's possible one of the people (interviewer, interviewee) misunderstood the question or response. Therefore, once the data is prepared, it should be reviewed by the interviewee before being finalized. You want to avoid having the interviewee rehash their previous responses, but you do want to provide an opportunity for additional insights and information that clarify previous responses. Follow-up interviews, if needed for clarification, should be scheduled as quickly after the initial interview as possible so that the data, response, and topic are still fresh in the interviewee's mind.
Data collection workshops can be an effective method of gathering needed data. If you choose this method, you might still create a questionnaire so that you can be sure you cover all the required data points. Identify the appropriate level of participating personnel and gain agreement as to participants. Choose an appropriate time and place for the workshop, and ensure the appropriate amenities will be available (whiteboards, refreshments, etc.). Develop a clear agenda for the meeting and distribute it, in advance, to participants. Identify the workshop facilitator and clearly define his or her role in the process. Identify workshop completion criteria so the facilitator and participants are clear about what is expected, what the required outcomes are, and how the workshop will conclude. The facilitator's job is to ensure the workshop objectives are met, so these objectives must be clearly articulated before the workshop begins. Develop or utilize an appropriate process for dealing with issues during the workshop so that participants stay on topic and focused on the key objectives. Some companies use the concept of a "parking lot," where issues are written on note cards and collected, or written on sticky notes and posted on a whiteboard or an empty wall. Use an issue tracking methodology that allows you to stay on topic while still making note of issues, and identify the method you'll use for addressing those issues that cannot be (or should not be) resolved during the course of the workshop. Finally, ensure that the results of the workshop are written up and well documented, and that participants have the opportunity to review them for errors and omissions before they are finalized.
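The parking-lot technique can be mirrored in a simple tracking structure so parked issues aren't lost when the workshop ends. A sketch with hypothetical issue entries (the structure and field names are illustrative only):

```python
from dataclasses import dataclass, field

# Hypothetical sketch: an electronic "parking lot" -- park off-topic issues
# as they arise, then route the unresolved ones to owners after the workshop.
@dataclass
class ParkingLot:
    issues: list = field(default_factory=list)

    def park(self, raised_by: str, issue: str) -> None:
        """Note an issue without derailing the workshop agenda."""
        self.issues.append(
            {"raised_by": raised_by, "issue": issue, "resolved": False})

    def unresolved(self) -> list:
        """Items that must be addressed through the post-workshop process."""
        return [i["issue"] for i in self.issues if not i["resolved"]]

lot = ParkingLot()
lot.park("Finance SME", "RTO definition unclear for quarter-close")
lot.park("Operations SME", "Vendor SLA data not yet available")
print(lot.unresolved())
```

Whether the parking lot lives on sticky notes or in a list like this, the point is the same: every parked item gets an explicit path to resolution after the workshop, which is part of documenting the results completely.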
Use the following table of contents to navigate to chapter excerpts.
Business Continuity and Disaster Recovery for IT Professionals
Home: BIA for business continuity: Introduction
1: BIA for business continuity: Overview
2: BIA for business continuity: Upstream and downstream losses
3: BIA for business continuity: Understanding impact criticality
4: BIA for business continuity: Recovery time requirements
5: BIA for business continuity: Identifying business functions
6: BIA for business continuity: Gathering data
7: BIA for business continuity: Data collection methodologies
8: BIA for business continuity: Determining the impact
9: BIA for business continuity: Data points
10: BIA for business continuity: Understanding IT Impact
11: BIA for business continuity: BIA for small business
12: BIA for business continuity: Preparing the BIA report
About the book
Business Continuity Planning (BCP) and Disaster Recovery Planning (DRP) are emerging as the next big thing in corporate IT circles. With distributed networks, increasing demands for confidentiality, integrity and availability of data, and the widespread risks to the security of personal, confidential and sensitive data, no organization can afford to ignore the need for disaster planning. Business Continuity & Disaster Recovery for IT Professionals offers complete coverage of the three categories of disaster: natural hazards, human-caused hazards and accidental/technical hazards, as well as extensive disaster planning and readiness checklists for IT infrastructure, enterprise applications, servers and desktops – among other tools. Purchase the book from Syngress Publishing
About the author
Susan Snedaker, Principal Consultant and founder of Virtual Team Consulting, LLC, has over 20 years' experience working in IT in both technical and executive positions, including with Microsoft, Honeywell, and Logical Solutions. Her experience in executive roles at both Keane, Inc. and Apta Software, Inc. provided extensive strategic and operational experience in managing hardware, software, and other IT projects involving both small and large teams. As a consultant, she and her team work with companies of all sizes to improve operations, which often entails auditing IT functions and building stronger project management skills, both in the IT department and company-wide. She has developed customized project management training for a number of clients and has taught project management in a variety of settings. Ms. Snedaker holds a Master's degree in Business Administration (MBA) and a Bachelor's degree in Management. She is a Microsoft Certified Systems Engineer (MCSE), a Microsoft Certified Trainer (MCT), and has a certificate in Advanced Project Management from Stanford University.