Boston: Butterworth-Heinemann; updated by the editor, Elsevier. This text is concerned with the VA of a PPS, but the concepts can be applied to cyber protection, personnel protection, and overall security protection at a facility or across an enterprise.
For clarity, throughout this text the term enterprise includes organizations, companies, agencies, governments, or any other entity with the need to manage security risks. The term asset includes people, property, information, or any other possession of an enterprise that has value.
It is important to differentiate security from safety when discussing a VA. Safety is defined as the measures (people, procedures, or equipment) used to prevent or detect an abnormal condition that can endanger people, property, or the enterprise. These conditions include accidents caused by human carelessness, inattentiveness, and lack of training, or other unintentional events.
Security, on the other hand, includes the measures used to protect people, property, or the enterprise from malevolent human threats. This includes civil disturbances, sabotage, pilferage, theft of critical property or information, workplace violence, extortion, or other intentional attacks on assets by a human. A good security VA will consider safety controls because some safety measures aid in detection and response to security events (sprinklers will fight fires regardless of the cause), but some attacks require additional detection and response capability.
For example, a disgruntled employee can sabotage critical manufacturing equipment and reduce production to a significant extent. Without security controls, it could be difficult to determine quickly enough whether this is an intentional act of sabotage and prevent a significant loss of revenue. Risk management is the set of actions an enterprise takes to address identified risks and includes avoidance, reduction, spreading, transfer, elimination, and acceptance options.
Good risk management programs will likely include a combination of these options. Risk avoidance is accomplished by removing the source of the risk; for example, a company may choose to buy a critical component from another company, rather than manufacture it. This removes the production line as a sabotage target. Risk reduction is achieved by taking some actions to lower risk to the enterprise to reduce the severity of the loss. This is the goal of many security programs: lower risk by implementing at least some security measures.
Risk can also be spread among multiple locations, perhaps by having similar production capability at more than one enterprise facility. Then, loss of capability at one site may be managed by increasing production at the other locations. Another example of risk spreading is the distribution of assets across a large industrial facility. By separating the assets, fewer assets may be at risk during any given adversary attack.
Risk transfer is the use of insurance to cover the replacement or other costs incurred as a result of the loss. This is an important tool in many security systems. Risk acceptance is the recognition that there will always be some residual risk.
The key is to knowingly determine an acceptable level, rather than unwittingly accepting it. In security risk management, these decisions are based on the consequence of loss of the asset, the defined threat, and the risk tolerance of the enterprise. A trade-off analysis must be performed to ensure that the dollars spent on physical security provide a cost-effective solution to security issues. If other risk management options provide equal or better results at lower cost, the use of a PPS may not be justified.
Security is only one facet of risk; therefore, it must be considered in the context of holistic risk management across the enterprise, along with other categories such as market, credit, operational, strategic, liquidity, and hazard risks. The relationships among risk management, risk assessment, and VA are shown in Fig. Risks across an enterprise must be managed holistically, and those identified as above an acceptable level must be addressed.
VA is one of the constituent pieces of security risk assessment and is used to support risk management decisions. To frame the relationship between risk assessment and risk management, consider definitions provided by Kaplan and Garrick, who state that in risk assessment, the analyst attempts to answer three questions: What can go wrong? What is the likelihood that it would go wrong? What are the consequences?
The answers to these questions help identify, measure, quantify, and evaluate risks. Then, risk management builds on risk assessment by answering three further questions: What options are available? What are their associated trade-offs in terms of costs, benefits, and risks? What are the impacts of current management decisions on future options?
The answer to the last question provides the optimal solution. Total risk management results from this process, where total risk management is defined as a systematic, statistically based, holistic process that builds on formal risk assessment and management by answering the two sets of questions and addressing the sources of system failures. A security risk assessment is the process of answering the first three questions using threat, likelihood of attack, and consequence of loss as their benchmarks.
A thorough security risk assessment would consider risks in the component parts of a security system (cyber, executive, transportation protection, etc.). Most facilities or enterprises routinely conduct risk assessments of their security systems to verify that they are protecting corporate assets and to identify areas that may need additional attention.
These assessments are defined differently for different enterprises, but in general they include consideration of the likelihood of a negative event (in this case, a security incident) and the consequence of that event. The Design textbook ended with a description of risk assessment and provided a formula that can be used to calculate risk, using qualitative or quantitative measures.
That discussion, with one addition, is repeated here. Security risk can be measured qualitatively or quantitatively through the use of the following equation:

R = PA × [1 − (PI × PN)] × C

where R is the risk of a successful adversary attack, PA is the probability of attack, PI is the probability of interruption, PN is the probability of neutralization, and C is the consequence of loss. R ranges from 0 to 1, with 0 being no risk and 1 being maximum risk. PA can be difficult to determine, but generally there are records available to assist in this effort. This probability ranges from 0 (no chance at all of an attack) to 1 (certainty of attack). When the probability of attack cannot be estimated, PA is set to 1; this is called a conditional risk. This does not mean there will absolutely be an attack, but that the probability of attack is unknown or the asset is so valuable that it will be protected anyway.
This approach can be used for any asset, but it is generally reserved for the most critical assets of a facility, where the consequence of loss is unacceptably high, even if PA is low. PN can include a range of tactics from verbal commands up through deadly force. The appropriate response depends on the defined threat and consequence of loss of the asset. The consequence term C is a normalizing factor, which allows the conditional risk value to be compared with other risks across the facility.
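As an illustrative sketch, the risk equation can be computed directly. The function name and example probabilities below are invented for illustration; the sketch assumes the common formulation R = PA × [1 − (PI × PN)] × C, in which PI (probability of interruption) and PN together describe PPS effectiveness.

```python
def security_risk(p_attack, p_interrupt, p_neutralize, consequence):
    """Conditional-on-attack PPS effectiveness is PE = PI * PN; risk is
    the chance an attack occurs and is NOT stopped, scaled by the
    normalized consequence of losing the asset (C in the 0-1 range)."""
    p_effectiveness = p_interrupt * p_neutralize
    return p_attack * (1.0 - p_effectiveness) * consequence

# Ordinary risk: a moderately likely attack against a fairly effective
# PPS protecting a maximum-consequence asset.
risk = security_risk(p_attack=0.3, p_interrupt=0.9,
                     p_neutralize=0.8, consequence=1.0)

# Conditional risk: PA is set to 1 because the likelihood of attack is
# unknown, or the asset is valuable enough to protect regardless.
conditional_risk = security_risk(1.0, 0.9, 0.8, 1.0)
```

Note that a perfect PPS (PI = PN = 1) drives the risk to zero regardless of PA, which is why the equation rewards response capability as well as detection.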
A consequence table of all events can be created that covers the loss spectrum, from highest to lowest. By using this consequence table, risk can be normalized over all possible events. Then, limited PPS resources can be appropriately allocated to ensure that the highest consequence assets are protected and meet an acceptable risk.
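A consequence table like the one described can be sketched as follows. The loss events and dollar values are hypothetical; normalizing against the worst-case event puts every C on a common 0 to 1 scale.

```python
# Hypothetical loss events with raw consequence values (e.g., dollars).
raw_consequences = {
    "loss of trade-secret process data": 50_000_000,
    "sabotage of production line": 10_000_000,
    "theft of finished goods": 250_000,
    "vandalism of perimeter fence": 5_000,
}

# Normalize against the worst-case event so C falls on a 0-1 scale,
# letting conditional risk values be compared across the facility.
worst = max(raw_consequences.values())
consequence_table = {
    event: value / worst
    for event, value in sorted(raw_consequences.items(),
                               key=lambda kv: -kv[1])
}
```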
Note that this equation introduces the use of a new term: the probability of neutralization (PN). This was discussed only briefly in the Design book, because many facilities do not have an immediate response to security events. It is included here because response is a part of VA at all facilities. Using probabilistic risk assessment is more formal, scientific, technical, quantitative, and objective when compared with risk management, which involves value judgment and heuristics and is more subjective, qualitative, societal, and political.
Ideally, the use of probabilities is based on objective likelihoods, but in security it is common to use more subjective likelihoods based on intuition; expertise; or partial, defective, or incomplete information. This is important because these are major sources of uncertainty, and uncertainty is a major element of risk. Additionally, these measures can reduce the credibility of the security risk assessment for senior management, who are used to seeing documented data in standard analysis models.
In security systems, this uncertainty is even larger than normal, owing to the lack of dependable data. An additional use of the risk equation is that the security risk life cycle can be viewed in context. When considering security systems and the attack timeline, the attack can be broken into three discrete phases: preattack, which is the time the adversary takes to plan the attack; the attack phase, when the adversary actually shows up to attack the facility, and the attack has started; and postattack, when the adversary has completed the attack, and the consequences of a successful attack occur.
If the problem is approached this way, each term in the equation is of primary importance during different phases of the attack. As such, PA is most useful during the preattack phase. This is where intelligence agencies and deterrence have their biggest effect. Intelligence agencies gather information concerning threats and provide assessments about their likelihood of attack. These agencies may even develop enough information to disrupt an attack by collecting enough legal evidence to arrest the adversary, through tips from inside sources, or by alerting targeted enterprises, allowing them to increase security protection.
All of these activities will have an effect on PA. Heightened security responses to intelligence assessments indicating potential attacks on Citibank and the stock exchange in New York, and the World Bank in Washington, DC, are examples of preattack influences. If a quantitative approach is used, the PA and C terms can be calculated using historical data and consequence criteria, respectively. In a qualitative analysis, these terms can be represented by levels such as high, medium, or low.
This determination is based on the capability of the threat and the consequence of loss of the asset. If the likelihood of attack is high, but the consequence is low (think about shoplifting at one store in an enterprise), the problem to be solved is easier than if both PA and C are high. This, however, ignores the cumulative effects of shoplifting across the enterprise.
Many thefts of low-value items can add up to a high overall impact, and this is part of the analysis. There are times when either approach is appropriate, and the choice should be driven by the consequence of loss.
This is based on the assumption that assets with a higher consequence of loss will attract more capable and motivated adversaries (threats), which in turn will require a PPS that is correspondingly more effective. Qualitative analysis uses the presence of PPS components and adherence to PPS principles as system effectiveness measures. A quantitative analysis uses specific component performance measures derived from rigorous testing to predict overall system effectiveness.
At any given facility, either or both techniques may be used, depending on the consequence of loss of the asset. Relative value of PPS components based on expert opinion is another form of analysis of system effectiveness; however, the outcome depends heavily on the knowledge and experience of the expert. This section ends with a definition of terms that are used in risk assessments, particularly with respect to the probability of attack by an adversary. These are the proper definitions of these terms; some enterprises may use them differently.
Probability is a number that is, by definition, between 0 and 1 and may or may not be time dependent. As an example, the probability of snow on any given day in Ohio can be expressed as such a number. This is discussed further in the next section. Although probability of attack is routinely cited as a threat measure, it is important to note that there frequently is not enough data to support a true probability. For example, there is no statistical data to support the probability of terrorist attacks.
This is a good example of high-consequence, low-probability events and the use of conditional risk. For some assets, the consequence of loss is unacceptably high, and measures are taken to prevent successful adversary attacks, regardless of the low likelihood of attack. Frequency refers to how many times an event has happened over a specified time and is also called a rate.
Annual loss exposure is an example of a frequency often used in security risk assessments. Likelihood may be a frequency, probability, or qualitative measure of occurrence. This is more of a catchall term and generally implies a less rigorous treatment of the measure. Haimes has written a thorough discussion of risk modeling, assessment, and management techniques that can be used in security or general risk applications.
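As a minimal sketch of the frequency concept, annual loss exposure can be computed as an incident rate multiplied by an average impact. The function name and numbers below are hypothetical, not from the source text.

```python
def annual_loss_exposure(incidents, years_observed, avg_loss_per_incident):
    """Frequency (a rate) times impact: the expected loss per year."""
    rate_per_year = incidents / years_observed  # how often it happens
    return rate_per_year * avg_loss_per_incident

# e.g., 12 thefts recorded over 4 years, averaging $2,000 per incident.
ale = annual_loss_exposure(incidents=12, years_observed=4,
                           avg_loss_per_incident=2000)
```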
In any discussion of quantitative security system effectiveness, the subject of statistics of security performance arises. To many, statistics is a subject that arouses suspicion and even dread; however, there are a few fairly simple concepts that form the basis of statistical analysis of security effectiveness.
Most of these concepts are related to the possible outcomes of a security event. A security event occurs when a security component (people, procedures, or equipment) encounters a stimulus and performs its intended task, for example, when something, such as a human or small animal, enters the detection envelope of an intrusion sensor.
There are four possible outcomes of this event: a human-size object is presented and detected (a success); a human-size object is presented and not detected (a failure); a smaller-than-human-size object is presented and ignored (a success); or a smaller-than-human-size object is presented and detected (a failure). The successes and failures are related such that when a human-size object is presented, there are two complementary results, and when a smaller-than-human-size object is presented, there are also two complementary results. This fact is used later in the discussion. Sensors are the example used here, but this principle applies to any of the probabilities used in this text: the success or failure of a PPS component or the system in performing its intended task can be measured.
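The four outcomes can be tallied from a series of sensor events. The trial data below are hypothetical; the point is that each event falls into exactly one of four bins, and each stimulus type yields a complementary pair.

```python
from collections import Counter

def classify(valid_target: bool, alarmed: bool) -> str:
    """Map one sensor event to one of the four possible outcomes."""
    if valid_target:
        return "detection" if alarmed else "missed detection"
    return "false/nuisance alarm" if alarmed else "correct rejection"

# (valid_target, alarmed) pairs from a hypothetical series of events:
# 10 human-size presentations and 20 smaller-than-human presentations.
trials = ([(True, True)] * 9 + [(True, False)] +
          [(False, False)] * 18 + [(False, True)] * 2)
outcomes = Counter(classify(v, a) for v, a in trials)

# Complementary pairs: the two human-size results sum to the number of
# valid-target trials; the small-object results sum to the remainder.
assert outcomes["detection"] + outcomes["missed detection"] == 10
```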
Most statistical analysis of security performance is based on these four possible outcomes. The rate at which a sensor successfully detects objects is described as the detection rate. For example, if a sensor successfully detects a human-size object 9 times out of 10 events, the detection rate for that group of 10 events is 0.9. This is a statistic but is not yet a probability. The detection rate can be turned into a probability when coupled with a confidence level, which is established based on the number of events that are analyzed; the more data available, the greater the confidence there is in the probability.
This is easily understood when considering a common example. If a person tosses a coin once and the outcome is heads, it would be unwise to assume that every coin toss will result in heads. If the experiment is continued to include additional trials, the confidence in the estimate of the likely results is even higher.
At this point the rate can be estimated with some statistical confidence, and this estimate is a probability. In other words, a probability is an estimate of predicted outcomes of identical trials stated with a confidence level. In reality, when designing performance tests, a confidence level is chosen that requires performance of a reasonable number of trials. It is not the intent of this section to teach readers how to calculate the statistics of security component effectiveness, but to familiarize them with the terminology and underlying concepts as applied to a PPS.
For example, if a metal detector is tested by carrying a gun through it 20 times and it detects all 20 times, the probability of detection can be calculated at a specified confidence level. Using this confidence level, the probability calculated for the metal detector based on the 20 trials is lower than the observed detection rate of 1.0.
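The calculation behind this example can be sketched for the all-successes case: an exact one-sided lower confidence bound on the detection probability is the value p that solves p^n = 1 − confidence. The function name is illustrative, and the 95% level below is an assumed choice, not one stated in the text.

```python
def lower_bound_all_successes(n_trials: int, confidence: float) -> float:
    """Exact one-sided lower confidence bound on detection probability
    when every one of n trials succeeded: solve p**n = 1 - confidence."""
    alpha = 1.0 - confidence
    return alpha ** (1.0 / n_trials)

# 20 detections in 20 tries does NOT support Pd = 1.0; at an assumed
# 95% confidence level, the data only support roughly Pd >= 0.86.
pd_bound = lower_bound_all_successes(20, 0.95)
```

Running more trials with the same perfect record pushes the supportable bound closer to 1.0, which is exactly the "more data, more confidence" point made above.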
The actual detection rate may be higher, but this is what can be supported given the amount of data collected. Sometimes it is more useful to classify PPS component performance into error rates rather than probabilities. These error rates are the mathematical complement of the success rates, that is, the number of trials minus the number of successes, divided by the number of trials. The error rates are stated as false accept and false reject rates. In the preceding sensor example, not detecting the human-size object is a false accept and detecting a smaller-than-human-size object is a false reject.
This example is used to show that these are the same possible outcomes; however, error rates are seldom used when describing the performance of detection sensor devices. Error rates are much more useful when characterizing the performance of entry control devices, particularly when evaluating the performance of biometric identity verification devices. These devices measure some biological feature, such as a fingerprint, to verify the identity of an individual. In this case, false acceptance of a fingerprint from someone who should not be allowed into a security area and false rejection of someone who should be allowed to enter a secured area are useful ways to view the data.
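A minimal sketch of these error rates, using hypothetical biometric test data; each rate is the complement of the corresponding success rate for its population of trials.

```python
def error_rates(genuine_trials, genuine_accepts,
                impostor_trials, impostor_accepts):
    """False reject rate (FRR): authorized users wrongly denied.
    False accept rate (FAR): impostors wrongly admitted.
    Each is the complement of the corresponding success rate."""
    frr = (genuine_trials - genuine_accepts) / genuine_trials
    far = impostor_accepts / impostor_trials
    return far, frr

# Hypothetical fingerprint-reader test data: 200 authorized attempts
# (194 accepted) and 500 impostor attempts (1 accepted).
far, frr = error_rates(genuine_trials=200, genuine_accepts=194,
                       impostor_trials=500, impostor_accepts=1)
```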
Other factors of interest in security component evaluation include discrimination and susceptibility to noise. Discrimination is the ability of a component to distinguish the defined stimulus from other, similar stimuli; often, this is beyond the technical capability of the device. In the preceding sensor example, a human-size object may or may not have specific characteristics that allow the sensor to discriminate between a human and a human-size animal like a small deer or large dog.
When the sensor does not have the ability to discriminate between stimuli of equal magnitude, another statistic, the nuisance alarm rate (NAR), is used. A nuisance alarm is caused when the sensor detects an object that is of sufficient magnitude but benign in nature. Anyone who has had a belt buckle cause an alarm in an airport metal detector has experienced a nuisance alarm (assuming that person was not also carrying a gun!).
The sources of nuisance alarms are easy to identify when the alarm is assessed by direct human observation or by viewing an image using a video camera. Understanding the causes of nuisance alarms is important in both design and analysis of a PPS. When nuisance alarms are frequent, human operators eventually discount alarms and may not pay sufficient attention to a real alarm when it occurs.
Some technologies are also susceptible to noise. Noise in the sensor includes sound, electromagnetic, or even chemical sources. This noise can be present in the background or be internal to the system.
Whenever a sensor alarms on external or internal noise, this is defined as a false alarm. False alarms also reduce system effectiveness. Indeed, false alarms can further erode confidence in the PPS because there is no observable alarm source present.
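The distinction between nuisance alarms (a real but benign source) and false alarms (no observable source) can be sketched as simple rate bookkeeping over an assessed alarm log. The log below is hypothetical.

```python
def alarm_rates(alarms, observation_hours):
    """Split assessed alarms into valid, nuisance (real but benign
    source), and false (no observable source, e.g., sensor noise),
    then express nuisance and false alarms as hourly rates."""
    counts = {"valid": 0, "nuisance": 0, "false": 0}
    for cause in alarms:
        counts[cause] += 1
    nar = counts["nuisance"] / observation_hours  # nuisance alarm rate
    far = counts["false"] / observation_hours     # false alarm rate
    return nar, far

# One day of hypothetical assessed alarms at a perimeter sensor.
log = ["nuisance"] * 6 + ["false"] * 2 + ["valid"] * 1
nar, far = alarm_rates(log, observation_hours=24)
```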
Throughout the discussions of security component performance in this text, it is important to remember that the four possible outcomes of any event are considered. This information, together with the concepts of discrimination and susceptibility to noise, forms the basis of almost all security component performance evaluation. Combined with defeat analysis (which is discussed in other chapters), the full picture of PPS effectiveness emerges. The evaluation techniques presented in this text use a system performance-based approach to meeting the PPS objectives.
Recall that the primary functions of a PPS are detection, delay, and response (see Fig.). Each of these functional subsystems is described in the following chapters and includes a description of both quantitative and qualitative methods of evaluating PPS components at a facility. Quantitative techniques are recommended for facilities with high-consequence loss assets; qualitative techniques can be used if there are no quantitative data available or if the asset value is much lower.
It is important to determine before the start of the VA whether a qualitative or quantitative analysis technique will be used. This ensures that the VA team collects the appropriate data and reports the results in a form that is useful for the analysis. When performing a VA, the general purpose is to evaluate each component of the PPS to estimate its performance as installed at the facility. Once this is done, an estimate of overall system performance is made. The key to a good VA is accurately estimating component performance.
When using a quantitative approach, this is done by starting with a tested performance value for each component and adjusting it to reflect conditions at the facility. As in the Design textbook, this process provides the framework for conducting a vulnerability assessment. Although frequently not part of the VA, the protection objectives must be known before evaluating the facility.
For qualitative analysis, performance of each component is degraded based on the same conditions, but the performance of the device is assigned a level of effectiveness, such as high, medium, or low, rather than a number. In addition, component performance must be evaluated under all weather conditions and facility states and considering all threats. The following sections introduce the various stages and activities of a VA. Before a VA can be performed at a facility, a certain amount of preliminary work must be done to plan and manage the VA so that the effort runs smoothly and meets its objectives.
The use of common project management principles and techniques provides a structure and a well-accepted method of approaching the technical, administrative, and business aspects of a VA. At a high level, a project can be broken into three major stages: planning, managing the work, and closeout. Projects, by definition, have a defined start and end date. There is a point in time when the work did not exist (before the project), when it does exist (the project), and when it does not exist again (after the project).
Many VA projects start with an initial customer contact, perhaps as a follow-on to existing work, a reference from another person or business, or as a result of a marketing activity. Project planning starts with understanding what the customer wants. This stage of the project normally involves meetings with the customer to discuss what problems they are having or want to avoid, to understand why they are motivated to do this now, and to discover any specific constraints they may have.
Defining the project includes determining the scope of work, as well as what needs to be done, over what period of time, and the cost of the final product. The project scope should state the project objectives and major constraints, such as dollars available or time to complete. Generally, the project is defined in a master document, statement of work, contract, memorandum of agreement, or some other equivalent document. This master document is usually supplemented by a requirements document, which is a summary of the technical specifications or performance required for the delivered product.
After the project has been approved, the customer has sent funding, the project team has been identified, and other administrative issues are set, the actual work can begin. Managing the project includes providing customer support; following the project plan; resolving major and minor project issues on a timely basis; and keeping the project on schedule, within budget and scope, and performing as expected.
All of these aspects of the project are organized so that communication between the project leader and the customer and the project team occurs regularly, project risks are managed, product quality maintains an acceptable level, and all project metrics are monitored and remain in compliance. Project closeout can be broken into three areas: financial, administrative, and technical. Financial closeout of the project provides a final accounting of all project costs. Administrative closeout tasks include collecting all project documentation, storing it in an archive, destroying drafts or working papers that are no longer needed, returning any customer-owned equipment or documents, and verifying that all sensitive information is properly marked and stored securely.
The technical closeout of the project can include a project closeout meeting, a lessons learned review, and a closeout report to the customer or internal management. The use of good project management principles, tools, and skills will help scope, define, manage, and complete a successful VA project for both the VA provider and the customer. A combination of project planning and management techniques minimizes the effects of inevitable project problems and provides a framework to work through major project hurdles.
The functional responsibilities of a VA team require a project leader and appropriate subject matter experts (SMEs). Many VA teams will use only a few personnel to serve these roles. Each team member may perform multiple functions, but all appropriate functions must be performed for a thorough VA, and the major roles and responsibilities of the VA team must be clearly assigned.
All members of the VA team should understand their roles and responsibilities, including what information or activities they are expected to contribute to the overall assessment. Before starting the VA, it is helpful to have kick-off meetings with the project team and the customer. The project team kickoff meeting is meant to familiarize all team members with the project scope, deliverables, schedule, and funding and to answer any questions.
This meeting is also the time to start planning the VA. An overview of the facility layout, geography, weather, and operations can be presented, along with any information concerning threats and targets. Usually, the tools that will be used in the analysis are known, but if not, this is a good time to initiate discussion about appropriate analysis tools. It can be useful to summarize all of the known information about the project and facility in a VA team guide. This guide serves as a means of communicating information to all project team members and as a living document that captures facility information.
The guide is a reasonably detailed description of the planned activities but does not need to be extremely lengthy. It is expected that some portions of the guide will be common to all VAs and some portions will be unique to a specific facility. The team guide should include the background of the VA, how it will be conducted, team assignments, logistics, and administrative details. Whatever the scope of the VA, a variety of site-specific data are required to complete the planning phase of the VA.
This information is necessary to plan and carry out the VA in the most efficient and least intrusive manner. The more of this information that is known before the team gets to the facility, the easier and faster the VA will be, thus limiting the cost and duration of the team visit. Typical information required includes drawings of the facility, number of employees, operational hours, locations of critical assets, existing PPS equipment, weather conditions, on-site personnel contact information, and location of a workspace for the VA team.
If known, this information is included in the team guide. Another important aspect of the VA project is a briefing to senior management of the facility that will be evaluated. The better the purpose of the VA is communicated to management, the easier the evaluation will be, with few objections to team activities. This briefing should be clear about the goals and objectives of the VA, how it will be used, when it will be completed, and how the results will be communicated. For some facilities, senior management will receive the report directly.
For others, the report may be submitted to another group, who will then distribute the results to the facility. Once the VA team arrives on site, it may be necessary to have a kickoff meeting to explain the VA to lower level facility personnel. Senior management, facility points of contact, the facility security manager, operations and safety representatives, the entire VA team, and other stakeholders should be invited to this briefing. If facility management has already heard a briefing on the project, they may not attend, although it is probably good practice to invite them or a representative.
Every effort should be made to provide a kickoff meeting at the start of the VA, but if this is not possible, the project leader should be prepared to brief managers and staff at the facility before collecting data in each of the functional areas. To successfully complete a good VA, it is critical that protection system objectives are well understood. These objectives include threat definition, target identification, and facility characterization.
Each enterprise defines VA and risk assessment differently, and as a result some facilities may not have defined threats or identified assets. At facilities where the threat and assets have not already been defined, this task must be included as part of the VA project, although this is generally part of a risk assessment.
This will likely add cost and time to the project; therefore, it is critical to understand this before finalizing these details in the project plan. Knowing the threat is one of the required inputs to a VA because the threat establishes the performance that is required from the PPS. We would not evaluate a PPS protecting an asset from vandals the same way we would for a system protecting an asset from terrorists.
By describing the threat, the assumptions that were made to perform the assessment are documented and linked to the decisions that are made regarding the need for upgrades. As such, threat definition is a tool that helps facility managers understand how adversary capabilities impact asset protection and helps PPS designers understand the requirements of the final PPS.
In addition to threat definition, the VA team must also have an understanding of the assets to be protected at the facility. As with threat definition, some customers do not have their assets identified and prioritized before performing a VA. In this case, this must be accomplished before performing the VA. There are three methods for target identification: manual listing of targets, logic diagrams to identify vital areas for sabotage attacks, and use of consequence analysis to screen and prioritize targets.
After threats have been defined and assets have been prioritized, a considerable amount of information is available. This information can be combined into a matrix that relates probability of attack, threat level and tactic, and consequence of loss of assets.
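To make the idea concrete, such a matrix can be held in a simple data structure and sorted so that high-consequence, high-probability pairings rise to the top. This is only a sketch; the threats, probabilities, and consequence ratings below are invented for illustration.

```python
# Illustrative threat/asset matrix: each row pairs a defined threat
# (with its tactic and estimated probability of attack) against an
# asset and its consequence-of-loss rating. All entries are hypothetical.
threat_asset_matrix = [
    # (threat, tactic, probability_of_attack, asset, consequence_of_loss)
    ("vandal", "forced entry", 0.30, "perimeter fence", "low"),
    ("insider thief", "stealth", 0.15, "server room", "high"),
    ("terrorist", "vehicle bomb", 0.01, "main building", "very high"),
]

def highest_priority(matrix):
    """Rank rows so high-consequence, high-probability pairs come first."""
    scale = {"low": 1, "medium": 2, "high": 3, "very high": 4}
    return sorted(matrix, key=lambda row: scale[row[4]] * row[2], reverse=True)

for row in highest_priority(threat_asset_matrix):
    print(row)
```

Ranking by the product of consequence weight and attack probability is one common screening heuristic; an enterprise may weight the factors differently.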
The major part of a VA is facility characterization, which consists of evaluating the PPS at the facility. The goal of a VA is to identify PPS components in the functional areas of detection, delay, and response and gather sufficient data to estimate their performance against specific threats. The PPS is characterized at the component and system level, and vulnerabilities to defeat by the threat are documented. Data collection is the core of PPS characterization; accurate data are the basis for conducting a true analysis of the ability of the PPS to meet its defined objectives.
Accuracy, however, is only one of several factors to consider. The data gathered must be appropriate to the purpose and scope of the VA, and the quantity and form of the data must be sufficient based on available resources and desired confidence in the results. A facility tour is usually conducted early in a VA. During the initial facility tour, the VA team begins to gather information regarding the general layout of the facility, the locations of key assets, information about facility operations and production capabilities, and the locations and types of PPS components.
Review of key documents and review of selected records are two important PPS characterization activities. These are useful in evaluating the effectiveness of a PPS and may begin during the planning phase of a VA. This step of a VA will also include interviews with key facility personnel. Interviews are critical to clarify information and to gain greater insight into specific facility operating procedures.
Interviews with personnel at all organizational levels are recommended. Testing is the most valuable data collection method for estimating how well PPS elements actually perform. Evaluation testing can determine whether personnel have the skills and abilities to perform their duties, whether procedures work, and whether equipment is functional and appropriate. Evaluation tests include functional, operability, and performance tests. Functional tests verify that a device is on and that it is performing as expected.
Operability tests verify that a device is on and working. Performance testing is the characterization of a device by repeating the same test enough times to establish a measure of device capability against different threats. A sensor, for example, is tested many times using crawling, walking, and running modes and under day, night, and varying weather conditions to fully characterize its probability of detection (PD) and nuisance alarm rate (NAR).
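As a sketch of how performance-test data reduce to a PD estimate, detections can be tallied per test condition and divided by the number of trials. The trial counts and conditions below are hypothetical.

```python
# Estimate probability of detection (PD) from repeated performance tests.
# A sensor is exercised `trials` times under one condition (e.g., walking
# at night); `detections` successes give the point estimate PD = d / N.
# All counts below are invented for illustration.

def estimate_pd(detections, trials):
    """Point estimate of PD for one test condition."""
    if trials <= 0:
        raise ValueError("need at least one trial")
    return detections / trials

# Results grouped by test condition (intruder mode, time of day).
tests = {
    ("crawling", "night"): (42, 50),
    ("walking", "day"): (49, 50),
    ("running", "day"): (50, 50),
}

for condition, (d, n) in tests.items():
    print(condition, round(estimate_pd(d, n), 2))
```

Because PD varies with intruder mode and environment, the estimates are kept per condition rather than averaged into a single number.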
Because performance tests are fairly rigorous and require many repetitions over a period of time, they are generally impractical during a VA. Performance testing is typically performed in a laboratory or nonoperational facility. One of the goals of the VA team before any system analysis is to identify the various facility states that can exist at the facility.
A VA is used to establish vulnerabilities at a facility at all times of the day and at all times of the year. As such, the team must understand the various facility states, so they can determine if the PPS is more or less vulnerable at these times.
If the team does not identify these states and determine system effectiveness during each of them, the VA will be incomplete and may lead to a false sense of protection. Examples of facility states include normal operating hours, nonoperational hours, a strike at the facility, emergencies such as fires or bomb threats, and shift changes.
Once all project planning is complete and protection objectives are understood, the VA team is ready to visit the facility and start collecting data. The detection function in a PPS includes exterior and interior intrusion sensors, alarm assessment, entry control, and the alarm communication and display subsystem, all working together. Intrusion detection is defined as the discovery of a person or vehicle attempting to gain unauthorized entry into a protected area, made by someone who can authorize or initiate an appropriate response.
An effective PPS must first detect an intrusion, generate an alarm, and then transmit that alarm to a location for assessment and appropriate response. The most reliable method of detecting an adversary intrusion is through the use of sensors, but this can also be accomplished by personnel working in the area or the on-site guard force. Exterior sensors are those used in an outdoor environment, and interior sensors are those used inside buildings.
Intrusion sensor performance is described by three fundamental characteristics: probability of detection (PD), nuisance alarm rate (NAR), and vulnerability to defeat. These three characteristics depend heavily on the principle of operation of a sensor and the capability of the defined threat. An understanding of these characteristics and of a sensor's principle of operation is essential for evaluating the intrusion sensor subsystem at a facility.
Different types and models of sensors have different vulnerabilities to defeat. Sensors can be defeated by spoofing or bypass, and consideration of these attack modes is part of the VA. Exterior sensors are grouped into three application types: freestanding, buried-line, or fence-associated sensors. Interior sensors are grouped as boundary penetration, interior motion, and proximity sensors. Exterior perimeters are generally found only in high-security applications such as prisons, military bases, research facilities, and critical infrastructure sites.
If exterior sensors are not in use at a facility, this is an implicit indication that assets are low value or that the expected threat is low and no evaluation is necessary. The overall evaluation of exterior sensors will include attention to details such as sensor application, installation, testing, maintenance, NAR, and performance against the expected threats. If the threat is able to cut, climb, or bridge fences, this must be considered during the VA.
The goal of exterior sensor evaluation is to provide an estimate of sensor performance (PD) against defined threats, along with supporting notes, pictures, and observations that support this estimate. This will help establish the baseline performance of the overall PPS and, if that performance is not acceptable, will identify opportunities for upgrades. Factors that cause performance degradation include a high NAR and ease of defeat of the sensor through bypass or spoofing.
Interior sensors are used to aid detection of intrusions inside buildings or other structures. Unlike exterior sensors, interior sensors are commonly used at all types of commercial, private, and government facilities. Just as with exterior sensors, several factors contribute to overall sensor performance. The most common interior sensors are balanced magnetic switches, glass-break sensors, passive infrared (PIR) sensors, interior monostatic microwave sensors, video motion detectors, and combinations of sensors, usually PIR and microwave, in dual-technology devices.
Interior boundary penetration sensors should detect someone penetrating the enclosure or shell through existing openings (doors, windows, and ventilation ducts) or by destroying walls, ceilings, and floors. Early detection gives the response team more time to arrive. Volumetric detection uses sensors to detect an intruder moving through interior space toward a target.
The detection volume is usually an enclosed area, such as a room or hallway. Most interior volumes provide little delay other than the time required to move from the boundary to the target. Common sensors used for volumetric sensing are microwave and passive infrared. Point sensors, also known as proximity sensors, are placed on or around the target to be protected. In a high-security application, point sensors usually form the final layer of protection, after boundary penetration sensors and volumetric sensors.
Capacitance proximity, pressure, and strain sensors are commonly used for point protection, but a number of sensors previously discussed as boundary penetration and volumetric sensors are readily applicable to point protection. Use of technology is not the only means of sensing intrusions into a facility or area. Employees working in the area, guards on patrol, and video surveillance are other commonly used techniques.
These may be effective against very low threats, but testing has shown that they will not be effective against more capable threats or when protecting critical assets. Humans do not make good detectors, especially over long periods of time. The lack of firm criteria for what constitutes an adversary intrusion, the difficulty of recognizing one in time to prevent the attack, and safety concerns for employees all contribute to this problem. Reliable intrusion sensing is best achieved through the use of sensors, which is also less expensive than hiring guards.
Another weakness of human sensing of intrusions is that people's attention is easily diverted from intrusions, particularly when they are engaged in other activities, such as doing their primary job, answering phones, or assisting visitors. If the defined threat or asset value is significant, sensing through human observation should be heavily discounted during the VA. When evaluating interior sensors, the goal is to determine how well the installed devices will perform against the expected threat.
If sensors are present, there is an implicit expectation that they will be effective in protecting assets. Consideration must be given to the principle of operation of the sensor and its operating environment, installation and interconnection of equipment, NAR, maintenance, and the defined threat. The environment associated with interior areas is normally controlled and is, therefore, predictable and measurable. Consequently, it is possible to evaluate sensors for their performance in a particular environment.
After tours, interviews, and testing are complete, the VA team should document intrusion sensing subsystem strengths and weaknesses. Remember that intrusion detection is just one part of the VA, and the analysis cannot be completed until similar information is collected about the other protection subsystems. This part of the VA concentrates on the probability of detection (PD) for each sensing type: exterior sensors, interior sensors, or sensing by humans. Estimates may be made using qualitative or quantitative criteria.
After an alarm is generated using sensors or human observation, the alarm must be assessed to determine the cause and decide what, if any, response is needed. The detection function is not complete without alarm assessment.
There are two purposes of assessment. The first is to determine the cause of each alarm, which includes deciding whether the alarm is due to an adversary attack or a nuisance source. The second purpose of assessment is to provide additional information about an intrusion that can be passed to responders.
This information includes specific details such as who, what, where, and how many. The best assessment systems use video cameras to automatically capture images. Assessment may also be accomplished through human observation, but this is much slower and not as effective. It is important to differentiate video assessment from video surveillance when conducting a VA.
Alarm assessment refers to direct observation of alarm sources by humans or to immediate image capture of a sensor detection zone at the time of an intrusion alarm. The captured image of the assessment zone can be reviewed to determine the cause of the alarm and to initiate the proper response.
Video surveillance uses cameras to continually monitor all activity in an area, without the benefit of an intrusion sensor to direct operator attention to a specific event or area. Many surveillance systems do not use human operators but record activity on storage media for later review. The most effective security systems will use video assessment, not surveillance, to determine the causes of alarms.
A video assessment subsystem allows security personnel to rapidly determine whether an intrusion has taken place at a remote location. Major subsystem components include cameras, lighting, video transmission equipment, monitors, and recording equipment. At the end of this part of the VA, an estimate of the probability of assessment (PAs) must be provided for use in the system analysis. This probability results from the combined effects of video image quality and resolution, speed of image capture, proper installation and maintenance of all components, and integration of sensor detection zones with camera field-of-view coverage.
The most important factor in assessment subsystem evaluation is to verify that video images containing the alarm event are captured and presented to the operator. The entry control subsystem includes all the technologies, procedures, databases, and personnel that are used to monitor movement of people and materials into and out of a facility. An entry control system functions in a total PPS by allowing the movement of authorized personnel and material through normal access routes and by detecting and delaying unauthorized movement of personnel and material.
Entry control elements may be found at a facility boundary or perimeter, such as personnel and vehicle portals; at building entry points; or at doors into rooms or other special areas within a building. In addition to checks for authorized personnel, certain prohibited items or other materials may also be of interest on entry or exit. For evaluation purposes, entry control is defined as the physical equipment used to control the movement of people or material into an area.
Access control refers to the process of managing databases or other records and determining the parameters of authorized entry, such as who or what will be granted access, when they may enter, and where access will occur. Access controls are an important part of the entry control subsystem. The primary objective of controlling entry to facilities or areas is to ensure that only authorized persons are allowed to enter and to log these events for documentation purposes.
The objective of searching vehicles, personnel, and packages before entry into these areas is to prevent the introduction of contraband materials that could be used to commit sabotage or to aid in the theft of valuable assets. The primary objective of exit control is to conduct searches of personnel, vehicles, and packages to ensure that assets are not removed without proper authorization.
A secondary objective of entry and exit control is to provide a means of accounting for personnel and material entering and leaving the facility. There are several methods an adversary may use to defeat an entry control point, including bypass, physical attack, deceit, and technical attacks. Any or all of these methods may be used by the defined threat, and consideration of this is an important prerequisite to entry control subsystem evaluation. The system can be divided into two areas with regard to performance: online and off-line functions.
Online functions should be treated as a higher priority by the system. These include alarm annunciation, portal access requests, and alarm assessment, all of which require an immediate response to the user. Off-line functions include generation of preformatted alarm history reports or ad hoc database queries. In addition to the system software, the access control software that commands the entry control subsystem hardware and maintains and manages the data and logic necessary for system operation must be evaluated as part of the VA.
In general, the software must receive electronic information from the installed entry control devices, compare this information to data stored in a database, and generate an unlock signal to the portal locking device when the data comparison results in a match. Failure to achieve a successful data match will result in a signal that will not unlock the portal.
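The compare-and-unlock step can be sketched as follows; the badge records, area names, and function names are invented for illustration and are not taken from any particular access control product.

```python
# Sketch of the entry control data-match step: the software compares the
# credential read at the portal against stored records and issues an
# unlock signal only on a successful match. Database contents are
# hypothetical.
AUTHORIZED = {
    "badge-1001": {"name": "A. Carter", "areas": {"lobby", "lab"}},
    "badge-1002": {"name": "B. Singh", "areas": {"lobby"}},
}

def portal_request(badge_id, portal_area):
    """Return True (unlock) only if the badge exists and is valid for
    the requested area; any failed comparison leaves the portal locked."""
    record = AUTHORIZED.get(badge_id)
    if record is None:
        return False  # unknown credential: no unlock signal
    return portal_area in record["areas"]

assert portal_request("badge-1001", "lab") is True    # valid match: unlock
assert portal_request("badge-1002", "lab") is False   # wrong area: locked
assert portal_request("badge-9999", "lobby") is False # unknown badge: locked
```

Note the fail-closed design choice: every path other than an exact, in-area match leaves the portal locked, mirroring the text's requirement that a failed data match must not unlock the portal.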
Many individual entry control technologies are available, as well as many combinations of them that are used in a PPS. In general, these devices are used to control personnel, contraband material, and vehicle entry or exit and include manual, machine-aided manual, and automated operation.
The entry control subsystem uses probability of detection as the primary measure of effectiveness. In the security industry, the terms false accept rate and false reject rate are also used to characterize entry control device performance. The false accept rate is the rate at which unauthorized persons or material are incorrectly granted entry. This is a key measurement of subsystem performance because it represents the probability of defeat of the device.
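A minimal sketch of how these two rates are computed from test counts; the counts below are hypothetical.

```python
# False accept rate (FAR): fraction of unauthorized attempts that were
# incorrectly granted access. False reject rate (FRR): fraction of
# authorized attempts that were incorrectly denied. Counts are invented
# for illustration.

def false_accept_rate(unauthorized_granted, unauthorized_attempts):
    return unauthorized_granted / unauthorized_attempts

def false_reject_rate(authorized_denied, authorized_attempts):
    return authorized_denied / authorized_attempts

far = false_accept_rate(2, 1000)   # 2 of 1000 intruder attempts accepted
frr = false_reject_rate(30, 1000)  # 30 of 1000 valid users turned away
print(f"FAR={far:.3f}  FRR={frr:.3f}")
```

The two rates trade off against each other: tightening a device to lower its FAR (its probability of defeat) typically raises its FRR, which shows up operationally as throughput delays for authorized users.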
The entry control subsystem can be broken into two major categories: personnel control and vehicle control. Contraband material control, such as metal or explosives detection, is a subset of each of these categories. If the central alarm repository is physically located in the control room, it may consist of multiple computers or displays, and the communication system may also move data throughout the repository and control room. Alarm communication has several characteristics that must be considered during the evaluation.
These characteristics include the amount of alarm data, speed of delivery, and high system reliability. The ultimate goal of this subsystem is to promote the rapid evaluation of alarms. An effective control and display system presents information to an operator rapidly and in a straightforward manner. The subsystem also responds quickly to operator commands. The console design should facilitate the exchange of information between the system and the operator, such as alarm reports, status indications, and commands.
A good human interface improves the mechanics of issuing commands and of deciphering the information presented. Thus, the amount of data displayed should be limited to only what is required by the operator.
This is accomplished by making operators more efficient and effective in their duties, thus providing the best protection for the cost of subsystem implementation. An easy-to-use system is much more likely to succeed than an unnecessarily complex one. Factors that contribute to this include time for alarm receipt, time to assess the alarm, ease of system use and control by the operator, and operator workload. The effectiveness of the detection function can be expressed as the product of the probability of detection of the sensor subsystem and the probability of alarm assessment.
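A minimal numeric sketch of this product, with illustrative values:

```python
# Probability of assessed detection for one protection layer: the product
# of sensor detection probability (PD) and the probability that the
# resulting alarm is correctly assessed (PAs). Values are illustrative.

def assessed_detection(pd_sensor, p_assess):
    if not (0.0 <= pd_sensor <= 1.0 and 0.0 <= p_assess <= 1.0):
        raise ValueError("probabilities must lie in [0, 1]")
    return pd_sensor * p_assess

# A sensor that detects 90% of crossings, with alarms correctly assessed
# 80% of the time, yields an effective detection probability near 0.72.
print(assessed_detection(0.9, 0.8))
```

The product form makes the practical point of the surrounding text explicit: a high-PD sensor contributes little if its alarms are poorly assessed, so detection and assessment must be evaluated together.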
This formula can be used qualitatively or quantitatively. The second function of an effective PPS is delay, which slows down the adversary and allows time for the desired assessment and response. This delay is effective only if it follows detection, which can take different forms. The most obvious form of detection is through the use of electronic sensor systems, which relay information back to a monitoring station.
Security patrols conducting scheduled or random inspections may also detect a manual entry attempt, given sufficient time. Increases in adversary task time are accomplished by introducing impediments along all possible adversary paths to provide sufficient delay for a suitable response. In general, estimates of delay times are made using literature searches, actual testing, or approximations based on data from the literature or tests.
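One simple approximation is to total the estimated delay of each barrier along a path and compare it with the response time; a real timely-detection analysis would count only the delay remaining after the first point of detection. Barrier names, delay values, and the response time below are hypothetical.

```python
# Sum estimated delay times along one adversary path and compare the
# total with the guard-force response time. Barrier names and delay
# values (in seconds) are invented estimates.
path = [
    ("perimeter fence", 10),
    ("exterior door", 60),
    ("interior wall", 120),
    ("asset container", 300),
]

def total_delay(barriers):
    """Total estimated delay (seconds) along a path of barriers."""
    return sum(seconds for _, seconds in barriers)

RESPONSE_TIME = 420  # hypothetical seconds for responders to interrupt

delay = total_delay(path)
verdict = "interruption possible" if delay >= RESPONSE_TIME else "response too slow"
print(f"total delay {delay}s vs response {RESPONSE_TIME}s: {verdict}")
```

Note how little the fence contributes relative to the vault-like container, which matches the text's point that delay deep in the facility, behind detection, is what matters.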
The delay time of any barrier depends on adversary tools and the barrier material. Adversaries have the option of using tactics of force, stealth, deceit, or combinations of these tactics during an attack. Delay evaluation during a VA is primarily directed toward adversary tactics of force or stealth; the entry control subsystem addresses deceit. To aid alarm assessment and interruption of the adversary at predictable locations, consideration must be given to installing barriers and detection systems adjacent to each other so that the barrier is encountered immediately after the sensor.
This delays the adversary at the point of an alarm, increases the probability of accurate assessment, and allows for an effective response. Barrier effectiveness is supported through the use of the principle of balance, which ensures that each aspect of a specific barrier configuration is of equal strength.
Penetration means passing through a barrier; defeat, in contrast, is a much broader term, implying that the barrier is no longer effective in delaying the adversary. This distinction is important because it is quite often easier to defeat a barrier via stealth or other means than it is to penetrate it. Most security barriers at industrial facilities are designed to deter or defeat sporadic acts of vandalism, inadvertent entry, or casual thievery. For more motivated or capable threats, however, these traditional fences, buildings, doors, and locks may present little deterrence or delay.
A close examination of the large variety of scenarios and tools an adversary can select to penetrate a given facility will likely indicate that existing barriers do not ensure sufficient adversary delay. Further, if the adversary has not been detected before encountering a particular barrier, or during its penetration, the effectiveness of that barrier will be negligible.
Most conventional barriers such as distance, fences, locks, doors, and windows provide short penetration delay against forcible and perhaps stealthy attack methods that use readily available hand or power tools. Against thick, reinforced concrete walls and other equally impressive-looking barriers, explosives become an effective, rapid, and more likely method of penetration by a determined adversary.
An example is the use of vehicle bombs. In addition, recall that security guards are not an effective delay unless they are located in protected positions and are equipped as well as the adversary. Stealth, cunning, and surprise can be valuable assets to any adversary.
The team must consider unique ways that an adversary team could and most likely would exploit weaknesses in the PPS. One of the often overlooked aspects of a VA is how adversaries can, and will, use existing tools and materials within the facility to achieve their goals.
There are a variety of active or passive barriers that can be used to provide delay, and many are present in the normal course of building construction. Depending on adversary tools and capabilities, these barriers will have different delay times. The location of a barrier also plays an important role in its delay time and effectiveness. A thick concrete wall on the exterior of a building may be susceptible to rapid breaching with explosives.
The same wall, however, when incorporated into an interior underground vault, may provide substantial delay. Typical barriers include fences, gates, turnstiles, vehicle barriers, walls, floors, roofs, doors, windows, grilles, and utility ports.
There are many ways to respond to a security event; the appropriate response depends on the defined threat, the value of the asset, and the use of risk management options other than a PPS at the facility. At any given facility, one or more response strategies may be in use, and this will affect data collection activities accordingly.
In addition to the response strategy, security communication is a critical part of any response function and must also be considered during the VA. The key information collected during the VA relates to two important and interrelated factors. The first is the time it takes for the desired response to be placed into effect; the second is the effectiveness of that response. Both aspects of response are facilitated by reliable communication among the responders and with others.
A related matter is whether there is an immediate on- or off-site response.