
Brian Pennington

A blog about Cyber Security & Compliance

Tag: Big data



Top 10 Cybersecurity Predictions for 2015 – an Infographic

Fuelled by cybercrime, cyber warfare, and cyber terrorism, the cost of cybersecurity and risk management will double in 2015.  That’s the bad news.  The good news is there will be a shift to cyber offense that will begin to stem the tide of cyber threats.

(Infographic: Coalfire's Top 10 Cybersecurity Predictions for 2015)

THE MANY FACES OF HACKERS: The Personas to Defend Against

(Infographic from Narus: The Many Faces of a Hacker)

Top 10 technologies for information security and their implications for security organisations in 2014

At the Gartner Security & Risk Management Summit, Gartner analysts highlighted the top 10 technologies for information security and their implications for security organisations in 2014.

“Enterprises are dedicating increasing resources to security and risk. Nevertheless, attacks are increasing in frequency and sophistication. Advanced targeted attacks and security vulnerabilities in software only add to the headaches brought by the disruptiveness of the Nexus of Forces, which brings mobile, cloud, social and big data together to deliver new business opportunities,” said Neil MacDonald, vice president and Gartner Fellow. “With the opportunities of the Nexus come risks. Security and risk leaders need to fully engage with the latest technology trends if they are to define, achieve and maintain effective security and risk management programs that simultaneously enable business opportunities and manage risk.”

Gartner believes the top 10 technologies for information security are: 

1. Cloud Access Security Brokers

Cloud access security brokers are on-premises or cloud-based security policy enforcement points placed between cloud services consumers and cloud services providers to interject enterprise security policies as the cloud-based resources are accessed. In many cases, initial adoption of cloud-based services has occurred outside the control of IT, and cloud access security brokers give enterprises visibility and control as their users access cloud resources.
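
To make the broker idea concrete, here is a minimal Python sketch of a CASB-style policy enforcement point sitting between users and cloud services. All names here (CloudRequest, POLICIES, enforce) are hypothetical, invented for illustration, not any vendor's API.

```python
# Hypothetical CASB-style policy enforcement point (illustrative only).
from dataclasses import dataclass

@dataclass
class CloudRequest:
    user: str
    service: str        # e.g. "dropbox", "salesforce"
    action: str         # e.g. "upload", "download"
    managed_device: bool

# Enterprise policy: which actions are allowed per service, and whether an
# unmanaged (BYOD) device is acceptable for that service.
POLICIES = {
    "salesforce": {"allowed_actions": {"read", "update"}, "unmanaged_ok": False},
    "dropbox":    {"allowed_actions": {"download"},       "unmanaged_ok": False},
}

def enforce(req: CloudRequest) -> str:
    policy = POLICIES.get(req.service)
    if policy is None:
        return "BLOCK: unsanctioned cloud service"      # shadow-IT visibility
    if not req.managed_device and not policy["unmanaged_ok"]:
        return "BLOCK: unmanaged device"
    if req.action not in policy["allowed_actions"]:
        return f"BLOCK: action '{req.action}' not permitted"
    return "ALLOW"

print(enforce(CloudRequest("alice", "dropbox", "upload", True)))
# -> BLOCK: action 'upload' not permitted
```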

2. Adaptive Access Control

Adaptive access control is a form of context-aware access control that acts to balance the level of trust against risk at the moment of access using some combination of trust elevation and other dynamic risk mitigation techniques. Context awareness means that access decisions reflect current conditions, and dynamic risk mitigation means that access can be safely allowed where otherwise it would have been blocked. Use of an adaptive access management architecture enables an enterprise to allow access from any device, anywhere, and allows for social ID access to a range of corporate assets with mixed risk profiles.
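
As an illustration, the sketch below scores risk from context and compares it to the trust already established, stepping up authentication (trust elevation) rather than blocking outright. The scoring weights and thresholds are invented for the example, not drawn from any product.

```python
# Illustrative adaptive access control: balance trust against risk at the
# moment of access; weights below are assumptions for the sketch.

def risk_score(context: dict) -> int:
    score = 0
    if context.get("new_device"):        score += 30
    if context.get("unusual_location"):  score += 30
    if context.get("social_identity"):   score += 20  # e.g. login via social ID
    if context.get("asset_sensitivity") == "high": score += 30
    return score

def trust_level(context: dict) -> int:
    # Authentication strength already demonstrated in this session.
    return {"password": 40, "password+otp": 80}.get(context.get("auth"), 0)

def decide(context: dict) -> str:
    if trust_level(context) >= risk_score(context):
        return "ALLOW"
    if context.get("auth") == "password":
        return "STEP-UP: request one-time passcode"   # dynamic risk mitigation
    return "DENY"

print(decide({"auth": "password", "new_device": True,
              "asset_sensitivity": "high"}))
# -> STEP-UP: request one-time passcode
```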

3. Pervasive Sandboxing (Content Detonation) and IOC Confirmation

Some attacks will inevitably bypass traditional blocking and prevention security protection mechanisms, in which case it is key to detect the intrusion in as short a time as possible to minimize the hacker’s ability to inflict damage or exfiltrate sensitive information. Many security platforms now include embedded capabilities to run (“detonate”) executables and content in virtual machines (VMs) and observe the VMs for indications of compromise. This capability is rapidly becoming a feature of a more-capable platform, not a stand-alone product or market. Once a potential incident has been detected, it needs to be confirmed by correlating indicators of compromise across different entities, for example, comparing what a network-based threat detection system sees in a sandboxed environment to what is being observed on actual endpoints in terms of processes, behaviors, registry entries and so on.
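
A toy version of that confirmation step might look like this: indicators harvested from a sandbox detonation are intersected with what an endpoint actually reports. The field names and sample values are assumptions, not any product's schema.

```python
# Illustrative IOC confirmation: correlate sandbox observations with
# endpoint telemetry; all indicator values below are fabricated examples.

sandbox_iocs = {
    "files":    {"C:\\Users\\Public\\svch0st.exe"},
    "registry": {"HKCU\\Software\\Run\\updater"},
    "domains":  {"evil-cdn.example"},
}

def confirm_on_endpoint(endpoint_telemetry: dict) -> set:
    """Return the sandbox IOCs that this endpoint also exhibits."""
    hits = set()
    for ioc_type, indicators in sandbox_iocs.items():
        hits |= indicators & set(endpoint_telemetry.get(ioc_type, []))
    return hits

laptop_42 = {"files": ["C:\\Users\\Public\\svch0st.exe"],
             "domains": ["cdn.vendor.example"]}
hits = confirm_on_endpoint(laptop_42)
print("confirmed compromise" if hits else "no corroboration", hits)
```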

4. Endpoint Detection and Response Solutions

The endpoint detection and response (EDR) market is an emerging market created to satisfy the need for continuous protection from advanced threats at endpoints (desktops, servers, tablets and laptops), most notably significantly improved security monitoring, threat detection and incident response capabilities. These tools record numerous endpoint and network events and store this information in a centralized database. Analytics tools are then used to continually search the database to identify tasks that can improve the security state to deflect common attacks, to provide early identification of ongoing attacks (including insider threats), and to rapidly respond to those attacks. These tools also help with rapid investigation into the scope of attacks, and provide remediation capability.
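
The core EDR pattern, record endpoint events centrally and query them continually, can be sketched in a few lines. SQLite stands in here for the centralized database; the schema and the example rule (encoded PowerShell launch followed by an outbound connection) are illustrative assumptions.

```python
# Minimal sketch of the EDR pattern; schema and rule are assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (host TEXT, ts INTEGER, kind TEXT, detail TEXT)")

def record(host, ts, kind, detail):   # agent-side ingestion into central store
    db.execute("INSERT INTO events VALUES (?,?,?,?)", (host, ts, kind, detail))

record("ws-07", 1000, "process", "powershell -enc ...")
record("ws-07", 1001, "netconn", "198.51.100.7:4444")

# Analytics pass: hosts where an encoded PowerShell launch is followed by an
# outbound connection within 60 seconds.
rows = db.execute("""
    SELECT p.host FROM events p JOIN events n
      ON p.host = n.host AND n.ts BETWEEN p.ts AND p.ts + 60
    WHERE p.kind = 'process' AND p.detail LIKE '%-enc%'
      AND n.kind = 'netconn'""").fetchall()
print(rows)   # -> [('ws-07',)]
```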

5. Big Data Security Analytics at the Heart of Next-generation Security Platforms

Going forward, all effective security protection platforms will include domain-specific embedded analytics as a core capability. An enterprise’s continuous monitoring of all computing entities and layers will generate a greater volume, velocity and variety of data than traditional SIEM systems can effectively analyse. Gartner predicts that by 2020, 40% of enterprises will have established a “security data warehouse” for the storage of this monitoring data to support retrospective analysis. By storing and analysing the data over time, and by incorporating context and including outside threat and community intelligence, patterns of “normal” can be established and data analytics can be used to identify when meaningful deviations from normal have occurred.
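
A minimal example of establishing “normal” and flagging deviations: a per-entity baseline with a simple z-score test. Real platforms use far richer models over far more data; the threshold and sample numbers here are arbitrary assumptions.

```python
# Illustrative baseline-and-deviation check; threshold is an assumption.
from statistics import mean, stdev

history = [12.1, 11.8, 12.6, 13.0, 12.2, 11.9, 12.4]  # e.g. GB egress per day

def is_anomalous(observed: float, baseline: list, threshold: float = 3.0) -> bool:
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(observed - mu) > threshold * sigma

print(is_anomalous(12.5, history))   # False: within normal variation
print(is_anomalous(48.0, history))   # True: meaningful deviation, investigate
```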

6. Machine-readable Threat Intelligence, Including Reputation Services

The ability to integrate with external context and intelligence feeds is a critical differentiator for next-generation security platforms. Third-party sources for machine-readable threat intelligence are growing in number and include a number of reputation feed alternatives. Reputation services offer a form of dynamic, real-time “trustability” rating that can be factored into security decisions. For example, user and device reputation as well as URL and IP address reputation scoring can be used in end-user access decisions.
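
For example, a reputation feed can be folded into a login decision along these lines; the feed format, scores and threshold are invented for the sketch, though real services expose similar ratings via APIs or downloadable feeds.

```python
# Illustrative reputation-aware access decision; data is fabricated.

ip_reputation = {         # stand-in for a third-party reputation feed
    "203.0.113.9":  15,   # low score = poor reputation
    "198.51.100.2": 92,
}

def allow_login(source_ip: str, min_reputation: int = 50) -> bool:
    # Unknown IPs get a neutral score; tune to your risk appetite.
    return ip_reputation.get(source_ip, 50) >= min_reputation

print(allow_login("198.51.100.2"))  # True
print(allow_login("203.0.113.9"))   # False: block or route to step-up auth
```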

7. Containment and Isolation as a Foundational Security Strategy

In a world where signatures are increasingly ineffective in stopping attacks, an alternative strategy is to treat everything that is unknown as untrusted and isolate its handling and execution so that it cannot cause permanent damage to the system it is running on and cannot be used as a vector for attacks on other enterprise systems. Virtualization, isolation, abstraction and remote presentation techniques can be used to create this containment so that, ideally, the end result is similar to using a separate “air-gapped” system to handle untrusted content and applications. Virtualization and containment strategies will become a common element of a defense-in-depth protection strategy for enterprise systems, reaching 20% adoption by 2016 from nearly no widespread adoption in 2014.
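
Stripped to its essence, the pattern is “handle the untrusted thing somewhere disposable.” The sketch below only illustrates the idea with a subprocess, a scratch directory and a timeout; real containment uses VMs, containers or micro-virtualization, and viewer.py is a hypothetical handler, not a real tool.

```python
# Crude illustration of containment; viewer.py is a hypothetical handler.
import subprocess
import tempfile

def open_untrusted(path: str) -> None:
    with tempfile.TemporaryDirectory() as scratch:   # disposable workspace
        subprocess.run(
            ["python3", "-I", "viewer.py", path],    # -I: isolated mode, no
            cwd=scratch,                             # user site dirs or env
            timeout=30,                              # kill runaway content
            check=False,
        )
    # The scratch directory, and anything the content wrote into it, is
    # destroyed here; nothing persists on the host.
```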

8. Software-defined Security

“Software defined” is about the capabilities enabled as we decouple and abstract infrastructure elements that were previously tightly coupled in our data centers: servers, storage, networking, security and so on. Like networking, compute and storage, the impact on security will be transformational. Software-defined security doesn’t mean that some dedicated security hardware isn’t still needed; it is. However, like software-defined networking, the value and intelligence moves into software.
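
A small sketch of what “the value and intelligence moves into software” can mean in practice: policy expressed as data and rendered into concrete rules by code, rather than configured by hand on dedicated boxes. The rule schema and render target are assumptions for the example.

```python
# Illustrative "security as software": abstract policy rendered by code.

policy = [
    {"name": "web-in", "allow": "tcp/443",  "from": "any",      "to": "web-tier"},
    {"name": "db-in",  "allow": "tcp/5432", "from": "web-tier", "to": "db-tier"},
]

def render_rules(policy: list) -> list:
    """Translate abstract policy into concrete rules for some enforcement
    point (host firewall, SDN controller, cloud security group...)."""
    return [f"permit {r['allow']} {r['from']} -> {r['to']}  # {r['name']}"
            for r in policy]

for rule in render_rules(policy):
    print(rule)
```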

9. Interactive Application Security Testing

Interactive application security testing (IAST) combines static application security testing (SAST) and dynamic application security testing (DAST) techniques. This aims to provide increased accuracy of application security testing through the interaction of the SAST and DAST techniques. IAST brings the best of SAST and DAST into a single solution. This approach makes it possible to confirm or disprove the exploitability of the detected vulnerability and determine its point of origin in the application code.
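
A toy illustration of the IAST idea: an in-app agent instruments a SQL sink, a probe is replayed from the outside DAST-style, and the finding is confirmed only if the probe actually reaches the sink unmodified. Every name here is hypothetical.

```python
# Toy IAST sketch: confirm exploitability by instrumenting the sink.

findings = []                       # populated by the in-app "agent"

def instrumented_query(sql: str):   # wrapper the agent puts around the sink
    if "'--PROBE--" in sql:         # did the raw probe reach the sink?
        findings.append(("SQL injection confirmed", sql))
    # ... real code would now execute the query ...

def lookup_user(name: str):         # application code under test
    instrumented_query(f"SELECT * FROM users WHERE name = '{name}'")

lookup_user("alice'--PROBE--")      # DAST-style probe from the outside
print(findings)                     # non-empty: exploitable, with point of origin
```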

10. Security Gateways, Brokers and Firewalls to Deal with the Internet of Things

Enterprises, especially those in asset-intensive industries like manufacturing or utilities, have operational technology (OT) systems provided by equipment manufacturers that are moving from proprietary communications and networks to standards-based, IP-based technologies. More enterprise assets are being automated by OT systems based on commercial software products. The end result is that these embedded software assets need to be managed, secured and provisioned appropriately for enterprise-class use. OT is considered to be the industrial subset of the “Internet of Things,” which will include billions of interconnected sensors, devices and systems, many of which will communicate without human involvement and that will need to be protected and secured.
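
A gateway or broker for OT devices can be sketched as a whitelist-enforcing message filter, so embedded systems never face raw network input. Device names, commands and the “safe range” below are invented for the example.

```python
# Illustrative OT/IoT security gateway; devices and commands are fabricated.

ALLOWED = {
    "pump-01":   {"read_status", "set_rate"},
    "sensor-17": {"read_status"},
}

def gateway(device: str, command: str, value=None) -> str:
    if command not in ALLOWED.get(device, set()):
        return f"DROP: '{command}' not permitted for {device}"
    if command == "set_rate" and not (0 <= value <= 100):
        return "DROP: value out of safe range"   # sanity-check payloads too
    return f"FORWARD {command} to {device}"

print(gateway("sensor-17", "set_rate", 50))  # DROP: not permitted
print(gateway("pump-01", "set_rate", 50))    # FORWARD set_rate to pump-01
```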

The nightmare of securing your unstructured data in the era of the borderless enterprise

As Big Data and BYOD become the accepted norm this Infographic demonstrates some of the facts about potential data breaches.

The lack of live cyberthreat intelligence could be costing businesses millions

The 2013 Live Threat Intelligence Impact Report from the Ponemon Institute, sponsored by Norse, reveals how 700+ respondents from 378 enterprises define:

  • What “live threat intelligence” is.
  • How global enterprises are using it to defend against compromises, breaches and exploits.
  • The financial damage that slow, outdated and insufficient threat intelligence is inflicting on them.

The key findings were:

  • They spent an average of $10 million in the past 12 months to resolve the impact of exploits.
  • If they had actionable intelligence about cyberattacks within 60 seconds of a compromise, they could reduce this cost on average by $4 million (40%).
  • Those that have been able to stop cyberattacks say they need actionable intelligence 4.6 minutes in advance to stop them from turning into compromises.
  • 60% were unable to stop exploits because of outdated or insufficient threat intelligence.
  • Those not successful in detecting attacks believe 12 minutes of advance warning is sufficient to stop them from developing into compromises.
  • 57% believe threat intelligence currently available to most companies is often too stale to enable them to grasp and understand the strategies, motivations, tactics and location of attackers.
  • Only 10% know with absolute certainty that a material exploit or breach to networks or enterprise systems occurred.

Other findings include:

  • 72% believe that, in order to defend against an attack, it is important or even essential to know the geo-location of attack sources.
  • 69% believe that future attacks are most likely to come from China, but 71% said they were seeing most of their current attacks originating in the U.S.
  • 57% say Advanced Persistent Threats (APTs) are their greatest concern; 54% say they are most concerned about rootkits; 45% say SQL and code injection is their biggest worry.
  • 35% rely on IT security teams’ “gut feel” to determine whether or not an attack will occur.
  • 34% believe that criminal syndicates pose the biggest threat to their enterprise; 19% said state-sponsored attackers were the greatest threat.
  • 9% cannot determine whether or not they are compromised.
  • A wide range of technologies is used to gather threat intelligence, ranging from SIEM to IDS to IAM to Big Data analytics and firewalls. On a one-to-10 scale of effectiveness, only 22% rate these technologies between a 7 and a 10, and 78% rate them between a 1 and 6.

“These findings are startling but not surprising. Enterprises are conditioned to believe that after-the-fact threat intelligence is all that is available, a perception that is leaving them open to compromises and data breaches that are costing them millions,” said Sam Glines, CEO, Norse. “This report makes it clear that enterprises are in need of an advanced level of threat intelligence that shrinks the interval between attack identification and mitigation down to minutes or even seconds if they are to survive the modern-day cyberthreat juggernaut.”

“Ponemon Institute has conducted IT security research for over a decade, and this is one of the first studies that reveals the facts behind the impact that weak threat intelligence is having on organizations,” said Larry Ponemon, founder and chairman of Ponemon Institute. “Anyone who reads this report will come to understand that live threat intelligence must be an integral part of any security strategy.”

To view the report click here.

Big Data Analytics can improve IT Security defences

A new study by the Ponemon Institute, Big Data Analytics in Cyber Defense, confirms that Big Data analytics offers substantial benefits to organisations but adoption is very slow.

The report commissioned by Teradata Corporation contains some interesting results:

  • Cyber-attacks are getting worse but only 20% say their organizations are more effective at stopping them.
  • The greatest areas of cyber security risk are caused by mobility, lack of visibility and multiple global interconnected network systems.
  • 56% are aware of the technologies that provide big data analytics, and 61% say such technologies will solve pressing security issues, but only 35% have them in place today; the rest say big data analytics is in their future.
  • Fewer than half of organizations are vigilant in preventing anomalous and potentially malicious traffic from entering their networks (42%) or in detecting such traffic once it is inside (49%).
  • Big data analytics with security technologies ensure a stronger cyber defense.
  • 82% would like big data analytics combined with anti-virus/anti-malware.
  • 80% say anti-DoS/DDoS would make their organizations more secure.

“While data growth and complexity are explosive factors in cyber defense, new big data tools and data management techniques are emerging that can efficiently handle the volume and complexity of IP network data,” said Dr. Larry Ponemon, Chairman and Founder of the Ponemon Institute, a research “think tank” dedicated to advancing privacy and data protection practices. “These new database analytic tools can bring more power and precision to an enterprise cyber defense strategy, and will help organizations rise to meet the demands of complex and large-scale analytic and data environments.”

(Graphic: Ponemon study release highlights)

Many organisations struggle with in-house technology and skill sets

  • 35% say they have big data solutions in place today
  • 51% say they have the in-house analytic personnel or expertise

Big data analytics can bridge the existing gap between technology and people in cyber defense through tools and techniques that capture, process and refine network activity data and apply algorithms for near-real-time review of every network node. A benefit of big data analytics in cyber defense is the ability to more easily recognize patterns of activity that represent network threats, enabling faster response to anomalous activity.
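
As a rough illustration of near-real-time review, the sketch below keeps a sliding one-minute window of events per node and flags nodes whose activity jumps well above their baseline. The window size, baseline and threshold factor are assumptions for the example.

```python
# Illustrative sliding-window review of per-node activity; parameters are
# assumptions, not derived from the study.
from collections import defaultdict, deque
import time

WINDOW = 60                                   # seconds
events = defaultdict(deque)                   # node -> recent event timestamps

def anomalous(node: str, ts: float, baseline_per_min: float,
              factor: float = 10.0) -> bool:
    q = events[node]
    q.append(ts)
    while q and q[0] < ts - WINDOW:           # expire events outside the window
        q.popleft()
    return len(q) > factor * baseline_per_min

now = time.time()
# Simulate a burst of traffic from one node and count flagged observations.
alerts = sum(anomalous("node-7", now + i * 0.1, baseline_per_min=10)
             for i in range(120))
print(f"{alerts} anomalous observations in burst")   # nonzero -> investigate
```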

“The Ponemon study is a wake-up call,” said Sam Harris, Director of Enterprise Risk Management, Teradata. “Enterprises must act immediately to add big data capabilities to their cyber defense programs to close the gap between intrusion, detection, compromise and containment. When multi-structured data from many sources is exploited, organizations gain a very effective weapon against cyber-crimes.”

Harris said that in the cyber security realm, effective defense means managing and analyzing unimaginable volumes of network transaction data in near real time. “Many security teams have realized that it is no small feat to quickly sift through all of their network data to identify the 0.1% of data indicating anomalous behavior and potential network threats. Cyber security and network visibility have become a big data problem. Organizations entrusted with personal, sensitive and consequential data need to effectively augment their security systems now or they are putting their companies, clients, customers and citizens at risk.”

