Who should be on an insider risk team?

Left to chance, you might never catch an insider red-handed unless you happen to bump into someone leaving the building with a box full of documents. That is where an insider risk team comes in - a group of employees from various departments who create the policies and systems needed to notice when confidential material has left the building.

'Insider risk is a real cybersecurity challenge. When a security professional or executive gets that call that there's suspicious activity - and it looks like it's someone on the inside who turned rogue - the organization needs to have the right policies and playbooks, technologies, and right team ready to go,' said Rinki Sethi, senior director of information security at Palo Alto Networks.

Steve Mancini, senior director of information security at Cylance, takes the disgruntled employee's point of view, indicating that they need to be provided outlets and recourse for their grievances before miscreant actions occur. 'Fellow employees and managers need to be trained to spot the signs of disgruntled employees and given channels to report concerns in a manner that does not judge the potentially disgruntled employee, but instead put the right people in their path to help them resolve whatever grievance they have before it escalates.'

But not all companies are that advanced in spotting what an angry employee might do in retaliation. Policies cover the obvious situations, such as an employee making an inordinate number of photocopies or an alert that fires when a USB drive is plugged into a computer, but it gets tricky with scenarios that are not out in the open for all to see. It is the insider risk team that must think through every hypothetical scenario in order to stay ahead of a disgruntled employee bent on fulfilling a vendetta.
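As a rough illustration of the USB-drive alert mentioned above, the sketch below watches a Linux workstation for newly attached USB storage devices using the pyudev library. The notify_security_team helper is a hypothetical stand-in for whatever alerting pipeline an organization actually uses.

```python
# Minimal sketch: alert when a USB storage device is plugged in (Linux).
# Assumes the pyudev package is installed; notify_security_team() is a
# hypothetical placeholder for a real alerting pipeline (SIEM, email, etc.).
import pyudev


def notify_security_team(message: str) -> None:
    # Placeholder: in practice this would feed a SIEM, ticketing system, etc.
    print(f"[ALERT] {message}")


def watch_for_usb_storage() -> None:
    context = pyudev.Context()
    monitor = pyudev.Monitor.from_netlink(context)
    monitor.filter_by(subsystem="block")  # block devices include USB drives

    for device in iter(monitor.poll, None):
        if device.action == "add" and device.get("ID_BUS") == "usb":
            notify_security_team(
                f"USB storage attached: {device.device_node} "
                f"(model: {device.get('ID_MODEL', 'unknown')})"
            )


if __name__ == "__main__":
    watch_for_usb_storage()
```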

'Insider risk tends to happen less frequently than external threats, but the negative impact can be tenfold. Having the right insider risk team with risk management expertise is a must to assess the situation, pinpoint the culprit and execute your counterattack plan,' Sethi said.

Who should be on this team?

Many security experts made it clear that watching for signs of an insider threat is everyone's responsibility. But in terms of the team's makeup, it should be representative of the entire company.

The team should include the technical IT and Security teams, as well as non-technical stakeholders such as members of the C-suite, the legal counsel and human resources, said Veriato's CSO David Green.

'The latter three will likely be unfamiliar with the fact that traditional security solutions don't always work to prevent insider threats because, first, they are largely focused on perimeter security, and, second, they aren't intended to identify or prevent problems stemming from insiders who are authorized to access sensitive data or systems,' he said. 'But these departments should come together to discuss the various challenges associated with insider threats and establish policies and procedures to prevent and detect them while protecting employee privacy.'

Here's what each department should bring to the table:

C-Suite: An executive should be present because the other departments represented on the insider risk team need executive buy-in to have the authority to establish a risk-based monitoring program and sign off on an Acceptable Use Policy (if one isn't already established); set boundaries for what is and isn't acceptable behavior; and tie the plan to the company's strategic objectives and help outline a security policy.

Legal: The legal team should be present to ensure all employee/user monitoring activities meet any local, state and federal laws. They should also help define what is permissible to monitor, such as email and instant messages, the web sites employees visit, online apps they use or any content they download or print. Recording employees as they log into their bank accounts online could be a legal risk for the company if something happened to the employee's account. Also, since IT might not be permitted to review the activity of higher-level employees, legal will work with the security team to determine which roles within the organization can review which sets of activity.

Human Resources: HR can help create the processes necessary to ensure there is a warranted and documented need for any monitoring, and that the security team is made aware of these issues without breaking any privacy laws. For example, HR might be aware of an employee leaving (a potential risk) or of personal or financial issues that might make an employee high-risk and worth investigating. The HR team (or any other department) would communicate the threat in terms of the predetermined risk level of the position, not the name of the individual employee.

IT / Security: IT - or whoever will be involved in both evaluating possible technology solutions and implementing the selected solution - will provide the other, non-technical team members with context around which users have access to what sensitive data, as well as what's possible when it comes to monitoring activity - all of which will be invaluable when putting the planning and preparation output of this team into practice. Technologies such as user behavior analytics, for example, look at patterns of behavior and do not require inspection of the content of an employee's activity to detect insider threats. User activity monitoring software lets you capture and review the specific actions of an employee's activity, including their emails or texts, if needed. There are versions of both that let you configure the types of activity monitored to align with your organization's goals, with privacy protections woven throughout to address HR concerns.
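To make the pattern-of-behavior idea concrete, here is a minimal sketch of the kind of baseline-and-deviation check a user behavior analytics tool performs: it compares each user's daily count of sensitive-file accesses against that user's own history and flags large deviations. The input format, the five-day minimum and the z-score threshold are illustrative assumptions, not how any particular product works.

```python
# Minimal sketch of a behavioral baseline: flag users whose daily volume of
# sensitive-file accesses deviates sharply from their own history.
# The input shape (user -> daily counts) is an illustrative assumption.
from statistics import mean, pstdev


def flag_anomalies(history: dict[str, list[int]], today: dict[str, int],
                   z_threshold: float = 3.0) -> list[str]:
    """Return users whose activity today is far above their historical norm."""
    flagged = []
    for user, todays_count in today.items():
        past = history.get(user, [])
        if len(past) < 5:          # not enough history to form a baseline
            continue
        mu, sigma = mean(past), pstdev(past)
        if sigma == 0:
            sigma = 1.0            # avoid division by zero on flat baselines
        if (todays_count - mu) / sigma > z_threshold:
            flagged.append(user)
    return flagged


# Example: one user suddenly touches far more sensitive files than usual.
history = {"alice": [4, 5, 3, 6, 4, 5], "bob": [10, 12, 11, 9, 10, 13]}
today = {"alice": 42, "bob": 11}
print(flag_anomalies(history, today))   # -> ['alice']
```

Note that the check looks only at counts of activity, not at the content of what was accessed, which is the privacy-preserving property described above.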

'The risk of malicious activity from the seemingly trusted insider is still an ongoing reality for organizations worldwide. IT can't implement a full insider risk program on its own - or keep one working properly,' Green said.

Each organization needs to establish an 'insider risk' team that specifically addresses the associated challenges - from determining who has (or should have) access to confidential corporate and client data and what each positional 'risk level' should be, to what constitutes inappropriate user behavior, how their activity will be monitored and how the organization will communicate which behavior is acceptable and the ramifications for breaking 'the rules,' he added.

Scottie Cole, network and security administrator at AppRiver, said insider risk teams are vital to an organization's security. However, they don't necessarily have to be made up of dedicated, full-time positions; drawing on a broad spectrum of roles brings the most holistic security perspective.

For an insider risk team to be successful, it takes collaboration across the company, said Shawn Burke, Global CSO at Sungard Availability Services: Procurement for vendor due diligence; Human Resources for screening, internal communication and consequence protocols; and the Risk Committee for overall response strategy. General Counsel and the Chief Compliance Officer are also key stakeholders, as insider monitoring must comply with a spate of new state and national privacy legislation.

Mancini said an effective insider risk team will design controls, take action, provide governance, and investigate. 'Governance and control are critical to an insider risk team. Who will watch the watchers? Audit capabilities must be woven into the process.'

Kennet Westby, president and co-founder of Coalfire Systems, says the insider risk team should also include representatives from any other users/groups with elevated access and privilege, including any vendor management and third-party contracting teams. Others believe the team should include the CISO, CIO, and Risk and Compliance officers.

Steven Grossman, vice president of strategy and enablement at Bay Dynamics, noted that everyone in an organization needs to play a role. 'However the key core players must be comprised of multiple talents that understand user behavior, and the overall landscape of cyber risk. That includes the type and value of applications, hosts associated with those applications, and the vulnerability posture of those hosts and applications. Application security owners who have a deep business understanding of the value and security of the applications under their governance play an essential role on the team. They know whether a seemingly unusual behavior was indeed business justified,' he said.

First, the team should put together policies that allow appropriate access based on business needs and look at tools to safeguard against insider abuse. This entails providing the right level of visibility into insider access and possible deviations.

Not everyone agrees on who needs to be on this team, though. It might just be semantics, but some experts believe the insider risk team's main responsibility is to create policy that the various teams then follow. Other experts see the team as a group that follows up minute by minute to find out where any abnormalities take them.

Hamesh Chawla, vice president of engineering at Zephyr, said insider risk teams should review reports and logs daily to understand what deviations are taking place, and address those deviations immediately with the group to implement a course of action. 'These specialized teams should formulate a crisis plan to mitigate the damage should an insider attack occur and have concrete, appropriate actions against those abuses.'

Javvad Malik, security advocate at AlienVault, breaks the duties down into layers:

Line managers: A first line of defense, they know the employees best, are aware of what tasks they need to undertake, the information they need to access and their overall morale and well-being.

Asset owners: An accurate asset inventory needs to be compiled, the data classified, and owners identified. These asset owners should know what services and users require access to the assets, when downtime is scheduled, and any planned changes. If suspicious activity is detected, the asset owner should be able to validate whether it was malicious.

Legal / HR: Whenever looking into potential insider fraud, it is essential to have legal and HR representation to ensure that no individual rights are being breached and that any investigations are undertaken in a legal manner.

Forensics: Forensics investigators may be needed to undertake a detailed investigation, which could include taking forensic images of devices for legal purposes and to investigate malpractice.

Analysts / SOC: The security operations center (SOC) is the heart of all threat detection within an organization. Working with the involved parties, assets can be identified and appropriate alerts configured. Similarly, behavioral analysis should be a core component of an SOC so they can detect any deviations from normal activity and behavior. They will usually kick off incident response processes by engaging the other responsible parties.
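As a minimal sketch of the asset-aware alerting described in the layers above, the snippet below combines an asset owner's expectations (authorized users, normal hours) with an access event and reports the reasons the event deserves SOC attention. The asset profile fields and example values are illustrative assumptions rather than any particular SOC tool's configuration.

```python
# Minimal sketch: route access events that fall outside an asset owner's
# expectations (authorized users, normal hours) to the SOC / asset owner.
# The AssetProfile fields and example values are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class AssetProfile:
    owner: str
    authorized_users: set[str]
    business_hours: tuple[int, int] = (7, 19)  # local hours: start inclusive, end exclusive


def review_event(asset: AssetProfile, user: str, timestamp: datetime) -> list[str]:
    """Return the reasons this access event should be escalated, if any."""
    reasons = []
    if user not in asset.authorized_users:
        reasons.append(f"{user} is not on the authorized list (owner: {asset.owner})")
    start, end = asset.business_hours
    if not (start <= timestamp.hour < end):
        reasons.append(f"access at {timestamp:%H:%M} is outside business hours")
    return reasons


# Example: a finance share touched by an unexpected user at 02:30.
finance_share = AssetProfile(owner="j.doe", authorized_users={"alice", "bob"})
print(review_event(finance_share, "mallory", datetime(2017, 5, 3, 2, 30)))
```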

A successful insider threat program needs access to data, which should include endpoint, proxy, search history, phone records, and physical access logs if available, said Chris Camacho, chief strategy officer at Flashpoint. 'Being able to understand and ingest multiple sourced data/information is a critical part to enable accurate analysis of who might be at high risk for insider activity. Naturally, an employee's motivation is a critical aspect of why malicious activity could occur and can range from ideology, financial needs and even collusion or extortion of an employee. Access and correlation of the right data sets is paramount but leveraging intelligence analysts, the human factor, is an important piece of the insider puzzle,' he said.

An insider program can also leverage technology such as user behavior analytics (UBA) that would provide a head start to bringing all data together. 'However, in order to make the most use of the tool, someone has to be able to filter out the noise. Having access to data in one platform is a great start but filtering through events and noise is even more critical,' Camacho said, adding that knowing how to find anomalies or patterns that don't make sense is one key function to the beginning of a successful program.

'In short, an insider program should be able to curate data points that reveal a toxic risk score of 'who' might be high concern for malicious activity,' Camacho said.
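As a rough illustration of the correlation and noise filtering Camacho describes, the sketch below combines several per-user signals (proxy, endpoint, badge logs) into one weighted risk score and surfaces only the highest-scoring users for analyst review. The signal names, weights and threshold are assumptions made up for the example, not a real scoring model.

```python
# Minimal sketch: correlate multiple per-user signals into one risk score and
# keep only the users worth an analyst's attention. Signal names, weights and
# the threshold are illustrative assumptions, not a vendor's scoring model.
WEIGHTS = {
    "large_uploads": 3.0,           # proxy: unusually large outbound transfers
    "offhours_logins": 2.0,         # endpoint: logins well outside normal hours
    "badge_anomalies": 1.5,         # physical access at odd times or locations
    "failed_access_attempts": 1.0,
}


def risk_score(signals: dict[str, int]) -> float:
    """Weighted sum of signal counts; unknown signals are ignored as noise."""
    return sum(WEIGHTS.get(name, 0.0) * count for name, count in signals.items())


def triage(users: dict[str, dict[str, int]], threshold: float = 5.0) -> list[tuple[str, float]]:
    """Score every user and return only those above the review threshold."""
    scored = [(user, risk_score(signals)) for user, signals in users.items()]
    return sorted((s for s in scored if s[1] >= threshold), key=lambda s: -s[1])


users = {
    "alice": {"large_uploads": 2, "offhours_logins": 1},       # score 8.0
    "bob": {"failed_access_attempts": 2},                      # score 2.0
    "carol": {"badge_anomalies": 1, "unknown_signal": 99},     # score 1.5
}
print(triage(users))   # -> [('alice', 8.0)]
```

The human analyst still decides whether a high score reflects genuinely risky behavior or a business-justified anomaly, which is the 'human factor' Camacho points to.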

Matias Brutti, a hacker at Okta, said that perhaps the least accounted-for but potentially most important role on the insider risk team is the red team, followed by the more obvious incident response and monitoring teams. A red team is responsible for playing the role of the insider threat while the blue team tries to monitor and defend against those threats. The red team members proactively try to find ways to exfiltrate data, obtain personally identifiable information (PII) and access unauthorized services outside of someone's scope. 'They do this to ultimately help build realistic processes and procedures that will prevent a real attack in the future,' he said.

Exabeam CEO and Co-founder Nir Polak takes a slightly different view on who encompasses the insider risk team. 'We are seeing more companies create insider risk organizations, and often these do not report into the IT security organization. These teams typically include people with police or investigation backgrounds instead of IT skills. This can make sense, as insider attacks are often not technology-based. They use valid credentials with valid access to sensitive information, but with a goal of using that information for invalid purposes. In this environment, forensic and detective skills will be very valuable,' he said.

Team duties

To put it simply, Polak said these teams are put together to create policy that minimizes risk, then select solutions that help implement the policy. They also use the same tools to monitor compliance with that policy. For example, the policy might say that 'employees shouldn't have more access to confidential data than their current job requires,' and the team then implements a program to regularly review access.

'You'd be surprised how often employees accumulate access rights and then never give them up when they move to new projects,' Polak said.
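A minimal sketch of the kind of periodic access review Polak describes follows: compare what an employee can currently reach against what their current role requires and report the excess. The role-to-entitlement mapping and names are assumptions invented for the example.

```python
# Minimal sketch of a periodic access review: flag entitlements an employee
# still holds that their current role no longer requires. The role map and
# entitlement names are illustrative assumptions.
ROLE_ENTITLEMENTS = {
    "finance_analyst": {"erp_read", "expense_reports"},
    "sales_rep": {"crm_read", "crm_write"},
    "engineer": {"source_repo", "build_system"},
}


def excess_access(role: str, current_entitlements: set[str]) -> set[str]:
    """Entitlements held beyond what the employee's current role requires."""
    required = ROLE_ENTITLEMENTS.get(role, set())
    return current_entitlements - required


# Example: an engineer who moved off a finance project but kept ERP access.
print(excess_access("engineer", {"source_repo", "build_system", "erp_read"}))
# -> {'erp_read'}
```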

Kris Lovejoy, CEO of BluVector, has found that where employees and contractors (part of the extended team) have clarity on 'why' security is important, it's much easier to ensure they learn and adhere to the 'how'. 'That said, the flip side of the insider risk team function is to assure policies and processes exist which enable a speedy response when 'rules' are broken.'

Jo-Ann Smith, director of Technology Risk Management and Risk Privacy at Absolute, said the first task for an insider threat team should be to define the various types of risks that exist within each level of their organization. Next, the team should prioritize the risks and implement solutions, which include setting guidelines for interactions with their own direct reports, providing direction on the type of baseline controls that will be required to reduce or mitigate risk, and establishing baseline standards that will enable the company to measure existing risk levels and report on them.

The team should conduct thorough, regular vetting of employees and vendors. This is especially important for personnel who may have exhibited strange behavior or have formal complaints against them, as well as those in positions with privileged access to critical assets and sensitive information, Burke said.

Westby said a team lead should be responsible for establishing and managing the overall effort and reporting to any security, board or audit committees on insider risk. The core team should have an operational lead that is responsible for executing monitoring, testing, incident response and remediation activities. A program architect/designer should lead the development of policies, controls, processes and selection of tools for the program. An analyst should work with team members and their organizations to execute the risk assessment process and reporting. Finally, an oversight lead would help measure performance and ensure compliance.

Chris Gray, practice leader and vice president of enterprise risk and compliance at Optiv Security, said 'Monitor, monitor, monitor. I cannot stress enough how important threat identification is and rapid, effective identification stems from good monitoring processes. If you don't know what right looks like, how can you identify wrong?'

Dottie Schindlinger, governance technology evangelist at Diligent, said the policy should institute a program of training, testing and auditing of the systems/controls. The policy should lead to a procedure that identifies the specific systems and controls in place to help identify, mitigate and manage potential insider risks. The procedure should also explain the process for anyone within the company to report potential insider risks, and the protections available for 'whistleblowers.' Ideally, the policy and its associated procedure should be reviewed, tested and audited at least annually.

'Compliance should own the process, requirements and procedures, as they are the gatekeepers of these areas,' Brutti said. He added that all teams should routinely meet and discuss new scenarios, and keep a matrix that allows the company to map teams to access levels and data that can be reached. From this matrix, the teams can adjust, prioritize and create policies. They can also institute key segregation of duties to disperse the crucial functions of certain processes to more than one person or department.

'This reduces risk and provides understanding for how to monitor and where to invest and obtain a good return on investment when building new monitoring platforms and rules,' he said.
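A minimal sketch of the matrix Brutti describes might map each team to the data classifications it can reach and at what level, so the insider risk team can review and adjust it over time. The team names, classifications and access levels below are assumptions made up for the example.

```python
# Minimal sketch of an access matrix: which teams can reach which data
# classifications, and at what level. Names and levels are illustrative.
ACCESS_MATRIX = {
    # team             : {data classification: access level}
    "engineering":       {"source_code": "read-write", "customer_pii": "none"},
    "customer_support":  {"customer_pii": "read", "source_code": "none"},
    "finance":           {"financial_records": "read-write", "customer_pii": "read"},
}


def teams_with_access(classification: str) -> dict[str, str]:
    """Teams that have any level of access to a given data classification."""
    return {
        team: levels[classification]
        for team, levels in ACCESS_MATRIX.items()
        if levels.get(classification, "none") != "none"
    }


# Example review question: who can touch customer PII today?
print(teams_with_access("customer_pii"))
# -> {'customer_support': 'read', 'finance': 'read'}
```

Reviewing a table like this regularly is also a natural place to spot where segregation of duties is missing.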

Mancini said the duties of an insider risk team will vary based on what you consider insider risk. For Cylance, insider risk comes in several forms (disgruntled employees, spies, unwitting employees, contractor/vendor threats), each requiring different duties to mitigate it. Some risks may be intercepted through continuous risk assessment, procedural channels for airing grievances, sustained employee health and morale programs, appropriate executive messaging around morale, training for managers to spot insider risk among their reports, and technical monitoring of assets and of potentially adverse activity initiated with legitimate privileges within the organization.

'The team mission would be to design, implement and provide oversight for controls to reduce risk based upon these different insider threat profiles. They would provide governance over technology solutions to ensure efficacy but also ensure that employee privacy is protected. They would design, implement, and test the necessary incident response programs customized to address the differences insider risk introduces,' he said.

Yossi Shenhav, co-founder of KomodoSec Consulting, said, 'The first duty of an insider risk team is to do a thorough background search on all employees, old and new, to see if any red flags arise. Then, all employees should be made aware that there is constant, systematic monitoring and restriction of access to sensitive or financial data, so it will be absolutely clear that any improprieties will be intercepted and dealt with swiftly and severely. Lastly, since incidents will still occur by individuals who are intent on violating the law, a subgroup should serve as an incident response team backed by systemic forensics to block the attack and/or minimize the severity of the breach and apprehend the offenders.'
