Yet Another Proof Point for Network and Endpoint Security Integration

As I’ve mentioned many times in my blog, there is a lot of evidence suggesting a trend toward the amalgamation of endpoint and network security. 

Here’s another recent data point that supports this trend. ESG recently published a new research report titled Network Security Trends in the Era of Cloud and Mobile Computing. The report is based upon a survey of security professionals working at enterprise organizations (i.e., more than 1,000 employees).  ESG asked them: “Is your organization engaged in any type of project to integrate anti-malware and analytics technologies on networks and endpoints?”  Nearly one-quarter (22%) said, “yes, extensively,” while another 39% responded, “yes, somewhat.”

To further analyze the data gathered in this survey, ESG built a scoring system to segment enterprises into three categories based upon their information security resources, skills, processes, etc.:  Advanced organizations, progressing organizations, and basic organizations.  Interestingly, 65% of advanced organizations are integrating endpoint and network anti-malware and analytics technologies “extensively” today.  Based upon this, it is safe to conclude that endpoint/network security integration is rapidly becoming a cybersecurity best practice.

It is also worth noting that advanced organizations make up about 20% of the enterprise population, while progressing organizations account for 60% and basic organizations comprise the remaining 20%.  Given the prevalence of endpoint/network security integration among advanced organizations, it’s likely that this trend will proliferate across the enterprise spectrum to progressing and basic organizations over time. 
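
ESG has not published the internals of its scoring system, but a weighted checklist mapped to population thresholds is one plausible way such a segmentation could work. The attributes, weights, and cutoffs in this minimal Python sketch are invented purely for illustration:

```python
# Hypothetical illustration of a maturity-segmentation scoring model.
# ESG has not published its actual methodology; the attributes, weights,
# and thresholds below are invented purely for demonstration.

SURVEY_ATTRIBUTES = {
    "dedicated_infosec_staff": 3,      # weight of each surveyed attribute
    "formal_incident_response": 2,
    "continuous_security_training": 2,
    "integrated_security_analytics": 3,
}

def maturity_segment(responses: dict) -> str:
    """Score an organization's survey responses and map them to a segment."""
    score = sum(weight for attr, weight in SURVEY_ATTRIBUTES.items()
                if responses.get(attr, False))
    ratio = score / sum(SURVEY_ATTRIBUTES.values())
    if ratio >= 0.8:
        return "advanced"       # roughly the top ~20% of enterprises
    if ratio >= 0.4:
        return "progressing"    # the middle ~60%
    return "basic"              # the remaining ~20%

print(maturity_segment({"dedicated_infosec_staff": True,
                        "integrated_security_analytics": True}))  # progressing
```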

The consolidation of network and endpoint security controls and analytics could carry a few repercussions:

  1. Vendors with the most experience dealing with security analytics, malware, and threat management teams are in the best position for success.  Think Bit9/Carbon Black, FireEye, Guidance Software, IBM, Palo Alto, RSA Security, etc.
  2. Alternatively, traditional AV vendors have a bit of a challenge ahead.  Many organizations think of AV in terms of compliance or endpoint operations.  So when they need additional endpoint security protection, they will likely call in the security “cavalry” (i.e., malware and security analytics gurus) to make product decisions.  AV vendors need to prepare for this with better market education and sales campaigns focused on the SOC.
  3. Network security vendors fall somewhere in the middle of this dichotomy.  Those lacking their own endpoint security technology (Check Point, Fortinet, HP, Juniper) should partner with advanced endpoint security providers or acquire one of the many burgeoning firms in this space. 

While advanced organizations are well along the way with endpoint/network security integration projects, progressing and basic organizations are just getting started.  Security vendors with easy-to-use yet tightly integrated solutions should prosper as the broad market jumps on the integration bandwagon.  MSSPs and professional services players will be especially attractive, as progressing and basic organizations are often understaffed and under-skilled when it comes to cybersecurity.  


Time to Embrace or Terminate National Cybersecurity Awareness Month (NCSAM)

Most people know that October is National Breast Cancer Awareness Month. Far fewer people know that October is also American Archives Month, National Book Month, and Pastors Appreciation Month. 

Oh yeah. October is also National Cybersecurity Awareness Month and unfortunately, few security professionals or industry leaders know about it, and fewer still pay much attention to the designation. 

Now, dissing National Cybersecurity Awareness Month isn’t a universal problem. In fact, it’s sort of a big deal in Washington DC where the month actually begins with a Presidential proclamation. In his proclamation issued on September 30, President Obama declared, “I call upon the people of the United States to recognize the importance of cybersecurity and to observe this month with activities, events, and training that will enhance our national security and resilience.”

The Presidential proclamation is usually followed by a DHS-led event attended by Washington-based industry groups, federal sales teams, lobbyists, and various government cybersecurity wonks. I actually attended the National Cybersecurity Awareness Month kickoff back in 2009. At this event, Janet Napolitano, the Secretary of DHS, announced that the agency would be adding 1,000 cybersecurity professionals to its staff by 2012. Napolitano said:  “This new hiring authority will enable DHS to recruit the best cyber-analysts, developers and engineers in the world to serve their country by leading the nation's defenses against cyber-threats.” 

I remember leaving Washington with a sense of pride about National Cybersecurity Awareness Month and Secretary Napolitano’s bold statement. In 2009 and 2010, I tried to monitor DHS’s progress on this hiring commitment but in spite of my efforts, I never found another published word about how DHS was progressing in its cybersecurity hiring effort. Given the cybersecurity skills shortage, bureaucratic federal hiring procedures, and low federal salaries, I doubt whether DHS fulfilled the Secretary’s promise—but then again, I’ll never know. 

Aside from this personal experience, there are a few other reasons why I’ve become so cynical about National Cybersecurity Awareness Month:

  • Most cybersecurity technology comes from Silicon Valley, not the Beltway, but unfortunately, National Cybersecurity Awareness Month is pretty much a non-entity on the Peninsula. Don’t believe me? Check out the websites of leading cybersecurity technology firms like Check Point, Cisco, FireEye, Fortinet, HP, IBM, McAfee, RSA, Symantec, or Trend Micro. These 10 companies account for billions of dollars in infosec revenue, but you’d never know about NCSAM based upon the marketing rhetoric on their sites. Heck, NCSAM was even absent from Washington insiders like Booz Allen, Leidos, Lockheed-Martin, and Raytheon when I checked their websites at the beginning of the month. How can NCSAM be successful if industry leaders aren’t interested enough to participate? 
  • The “Stop, Think, Connect” message isn’t enough. NCSAM has featured this message (or similar messages) for years. I understand that we need a foundation of basic infosec hygiene, but given the alarming attacks at Home Depot, JP Morgan Chase, and Target, elementary cybersecurity education is no longer enough. We need wide-ranging programs to educate business leaders, federal/state/local legislators, and critical infrastructure providers. Yes, consumers need the right knowledge to protect themselves, but we also need to educate the folks who are responsible for protecting all of us.
  • Few leaders are stepping up. When October comes around, an impressive group of breast cancer survivors make sure to pepper the media with interviews, campaigns, and live appearances to get the message to the masses. In my many years in cybersecurity, I’ve yet to see a similar PR effort around cybersecurity awareness. Special Assistant to the President and Cybersecurity Coordinator, Michael Daniel, should be making the rounds to CNN, Fox News, Good Morning America, etc. Where is he? Beats me. Come to think of it, can anyone point to a person who represents NCSAM or cybersecurity in general? 

To be clear, I am not criticizing the worthwhile programs and organizations that actually promote cybersecurity education and deliver value. That said, these efforts would be just as meaningful if they were done independently of a half-hearted awareness month that few pay attention to.

So here’s where I stand on NCSAM: Before next October 1st, Washington supporters like the National Cyber Security Alliance need to enlist grassroots participation (and money) from the infosec industry and work with ISC2, SANS, ISACA, and others to get security professional organizations more engaged. At the same time, we need our elected officials to increase funding for cybersecurity programs and take these programs to their constituents. Finally, let’s try and get some international participation since there are no borders on the Internet. 

Absent these changes, I suggest we stop pretending that National Cyber Security Awareness Month matters and let other, more committed groups enjoy their month in the spotlight. 


Hyperconverged – Hyper Market Acceleration

Before I jump into the wild world of IT hyperconverged infrastructure, let’s quickly remind ourselves of the benefits ESG has seen from these types of deployments:

Top Five Benefits Realized by Deploying Integrated Computing Platforms

The above research includes additional deployment models beyond hyperconvergence, but the benefits remain relatively the same (Source: ESG Research Brief, Integrated Computing Platform Trends, August 2014). IT simply wants an easy-to-deploy solution that is predictable and simple to manage. And just as we observed through ESG research presented in this infographic, the market was lighting up with hyperconverged solutions, and further spotlight was placed on the market with the announcement of VMware EVO solutions at VMworld 2014.

Here is a quick (and likely incomplete) list of vendors that are busy positioning themselves in the hyperconverged market:

HP: Yes, not an original participant in the VMware EVO announcement, but HP now has an EVO offering set for GA in 2015; add the StoreVirtual solution (perhaps the most widely deployed) to this mix as well.

Maxta: First went to market early in 2014 as a storage solution, but has since pivoted its messaging directly toward the hyperconverged market.

Nimble Storage: Here is another storage vendor that pivoted some of its go-to-market strategy and has teamed up with Cisco UCS to deliver SmartStack.

Nimboxx: I predict these guys are about to get more attention than they have to date. They are KVM-based and call attention to the cost of many of the other VMware-only solutions.

Nutanix: Interesting things happening with its OEM agreement with Dell, but still not 100% clear how it will snap into Dell’s breadth of solutions.

Scale Computing: Another KVM-based solution, but a super simple UI and a focus on the mid-market make these folks worth watching.

SimpliVity: One of the pioneering vendors that re-engineered the storage architecture and delivered a complete solution, but is facing increased market pressure.

VMware: EVO solutions with Dell, EMC, Fujitsu, Inspur, NetOne, and Supermicro. These solutions are going to create further competition and attention in the market once they all GA later in 2014 or early in 2015.

Why does this all matter? Convergence, whether it be full hyperconvergence or better engineering between infrastructure components delivered in a pre-configured, turnkey manner, is here to stay. Traditional and emerging IT vendors are going to have to quickly determine how to stand out from the pack and light up their go-to-market campaigns and sales initiatives. Some vendors in this general market (VCE, for example) are focused on the large enterprise and are staffed with professionals who can carry an enterprise application conversation, while others (Maxta, for example) are still balancing storage capabilities with hyperconvergence messaging.

The next six months matter! Messaging and marketing have to stand out from the adjacent IT vendor participants, and candidly, these vendors need to find ways to shorten sales cycles and get their go-to-market partners involved and incented so they can help transact in this new consumption model. 


How to Protect an EVO RAIL (video series)

VMware’s EVO RAIL is an architecture for a hyper-converged, software-defined data center in a single appliance form-factor … to be delivered by various hardware partners.  But how do you protect that all-in-one solution?

For the next several weeks, ESG will be releasing a seven-part series of ESG Capsules, two-minute video segments in which I’ll talk more about some of the protection possibilities and caveats in an EVO world:

part 1 – Introductory ideas for protecting EVO RAIL (below)

part 2 – Solution Spotlight : VMware

part 3 – Solution Spotlight : EMC 

part 4 – Solution Spotlight : Dell

part 5 – Solution Spotlight : HP

part 6 – BC/DR possibilities

part 7 – Channel considerations

Here’s part 1 on ideas for protecting an EVO RAIL.  Check back here for updated hyperlinks … or follow @JBuff on Twitter to see more of this series.

Thanks for watching


Proofpoint Report Exposes Details about Cybercrime Division-of-Labor and Malware Architecture

One of the more vapid cybersecurity clichés goes something like this: “Hacking is no longer about alienated teenagers spending countless hours in the basement on their PCs. Rather, it is now the domain of organized crime and nation states.” While this is certainly true, it is also blatantly obvious: a platitude that offers no details about why it is true, how these hackers operate differently from teenagers, or what the implications are.

If you want to understand these issues, I strongly suggest that you read a new threat report, Analysis of a Cybercrime Infrastructure, published this week by Proofpoint. The report follows the tactics and techniques used by a Russian organized crime group as it launched an attack on US- and European-based users aimed at stealing online banking credentials.

Reader warning: this report is a tad on the geeky side, using technical terminology like browser plug-ins, droppers, microshells, and static/dynamic injections. Nevertheless, I suggest that readers move beyond these technical points and plough through the report. Even without absorbing the technical depth, readers will come away with a conceptual feel for the strategies and tactics used by the bad guys.

With this in mind, here are a few of my biggest takeaways from the report:

  1. It takes a village to commit a cybercrime. Like the team of crooks recruited to rob a casino in the movie Ocean’s Eleven, organized crime is all about specialization and division of labor. Everyone knows this, but few people can talk about the actual details of who does what. This report does a great job of exploring these kinds of nuances in the cybercrime market. For example, the Russian hacking group at the center of this report purchased lists of administrator passwords from others in order to compromise sites using the WordPress open source content management system. While this group used its own homegrown traffic distribution service (TDS) to direct victims to exploit servers, the report mentions that other cybercriminals provide SaaS offerings for TDS. Finally, the highlighted Russian hacking group didn’t stop at stealing banking credentials; it also leveraged its network of compromised PCs to develop a cybercrime proxy service it then leased to other hackers. So hackers are making money coming and going. 
  2. Hackers look for the path of least resistance. In order to attain a high rate of success, cyber criminals determine which of several exploits to use based upon a profile of a victim’s PC. In other words, my PC may be compromised through a Java exploit while the person sitting next to me may get pwned using an IE vulnerability. The bad guys aren’t wasting time with one-off attacks but rather are sizing up each victim, finding his weaknesses, and then storming through one of several open doors. (A conceptual sketch of this selection logic follows this list.)
  3. Attacks are designed to stay one step ahead of the law. It’s common wisdom that hackers test their malware against all the popular AV software to avoid detection. In this case, the Russian hackers went beyond checking the detection rates of the malicious payload by making sure to steer clear of IP addresses and URLs that might pop up on reputation lists. The bad guys also instrumented their code with “lookout” capabilities. When any AV software starts to detect their exploit, the tool notifies the group immediately. So each time Kaspersky, McAfee, Sophos, Symantec, and Trend Micro catch up, the bad guys figure out a way to disappear again. 
  4. Ease of use is part of the process. Yes, hackers are highly skilled, but they don’t have to be technical savants who can whistle into pay phones at 2,600 hertz. The report displays a multitude of administrator screens that would make sense to any reasonably competent system administrator. In some cases, hacking groups also use ease-of-use administration/operations as a way to differentiate their services from the competition. This also helps cybercrime groups delegate tasks to junior administrators and thus free up talented hackers for more high-value projects.
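
To make takeaway 2 concrete, here is a minimal Python sketch of profile-driven exploit selection. The fingerprint fields, version cutoffs, and exploit labels are all invented for illustration; real kits fingerprint far more attributes and run this logic server-side behind a TDS:

```python
# Conceptual sketch of "path of least resistance" exploit selection
# (takeaway 2). The fingerprint fields, version cutoffs, and exploit
# labels are invented; real kits fingerprint far more attributes.

EXPLOIT_CHAIN = [
    # (predicate over the victim's profile, exploit to serve)
    (lambda p: p.get("java", "") and p["java"] < "1.7.0_45", "java_exploit"),
    (lambda p: p.get("browser") == "IE" and p.get("version", 99) <= 9, "ie_exploit"),
    (lambda p: p.get("flash", "") and p["flash"] < "11.9", "flash_exploit"),
]

def select_exploit(profile: dict):
    """Serve the first exploit whose precondition the victim satisfies."""
    for matches, exploit in EXPLOIT_CHAIN:
        if matches(profile):
            return exploit
    return None  # no open door: serve benign content and stay quiet

print(select_exploit({"browser": "IE", "version": 8}))            # ie_exploit
print(select_exploit({"browser": "Chrome", "java": "1.7.0_25"}))  # java_exploit
```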

To mix metaphors, the Proofpoint report takes the reader “behind the curtain” to understand “how the sausage is made.” Given this, it is a worthwhile – and frightening – read for all cybersecurity participants. On a final note, the Proofpoint report provides a detailed case study of what we white hats are up against. We need to get our act together and prepare our defenses for professional Russian organized crime syndicates like the one described in this report. Alas, too many organizations treat the cybersecurity battle as if they were still facing alienated teenagers in basements.  


Oracle Open World…& the new FS1 SAN (with video)

This year’s Oracle Open World (OOW) was – as ever – huge from just about every measurable dimension. And while the weather is seemingly always lovely (except at SFO, where “flow control” seems to have been the order of the day all September), the city is not available to regular tourists unless they are prepared to pay the stupendous rates that a sold-out town can charge.

Stealing a page from Microsoft when it “got” the Internet (what seems like an eternity ago!), Oracle spent its time at OOW confirming that its flirtation with this cloud thing is a full-blown romance! Of course there were a ton of specific product announcements (a very beguiling new SAN product – the FS1 – being of course what caught my eye! More on that below). But this event was also about the occluded front that often accompanies clouds: that occlusion being the change in role for Larry Ellison and the emergence of the Safra-Mark show (lest there be one more “Hurd-ing Katz” jibe…). The change was managed effortlessly, with Larry revelling in his “lead techy” role. What were the key takeaways? My colleague Nik Rouda and I already commented in our joint blog about Oracle OpenWorld, but here’s a bit more depth in one of our ESG on Location video reports….

While I was of course fascinated by the big picture stuff, my myopia always sets in for the storage stuff. The highlight for this year was the long-awaited arrival of a new flagship enterprise SAN product from Oracle – which, as I mentioned above, is called the FS1. FS stands for “Flash Storage” (or was it FlagShip!?), which is how it was designed and built; although its ability to use that flash storage (for performance, of course) in any percentage mix with HDDs (for bulk, inexpensive capacity), plus its plethora of functions, means that it could just as easily be called “Flexible SAN”.  That flexibility is borne not just of all those standard operational features one has come to expect these days (snaps, thin provisioning, replication, HA, etc.) but is helped by the data/business-focused abilities Oracle has added: not just sub-LUN auto-tiering, but extended QoS abilities and secure system partitioning. The overall package looks like it could be attractive to any enterprise user…but of course Oracle sweetens the attraction for its broader-use customers via close integration – and added features – with its own “red stack” products. 
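
Oracle has not published the FS1’s tiering internals, but the general idea behind sub-LUN auto-tiering is easy to sketch: track access heat per extent and keep the hottest extents on flash. The extent granularity and ranking policy below are invented for illustration:

```python
# Generic sketch of sub-LUN auto-tiering: a LUN is carved into extents,
# access heat is tracked per extent, and the hottest extents land on
# flash. This illustrates the concept only; it is not Oracle's FS1 code.

def retier(extent_heat: dict, flash_capacity_extents: int):
    """Return (flash, hdd) extent sets given per-extent access counts."""
    ranked = sorted(extent_heat, key=extent_heat.get, reverse=True)
    flash = set(ranked[:flash_capacity_extents])  # hottest extents -> flash
    hdd = set(ranked[flash_capacity_extents:])    # the rest -> bulk HDD
    return flash, hdd

heat = {"ext0": 950, "ext1": 12, "ext2": 430, "ext3": 3}
flash_tier, hdd_tier = retier(heat, flash_capacity_extents=2)
print(flash_tier)  # {'ext0', 'ext2'}
```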

The ZS3 has made considerable strides for Oracle in the (mainly) file/NAS world, and this new FS1 has the right stuff to do the same for Oracle’s market share in the (mainly) block/SAN arena.  The storage market is fascinating right now – both in and of itself, and also when viewed against the larger industry backdrop of such things as convergence, big data, and clouds; all of which, we now know, Oracle is in love with!   


Informatica and the Challenge of Data Unification

Informatica is clearly a leader in data integration. In fact, a case could be made for Informatica being the leader in data integration. Since superlatives are not typically part of my lexicon, this represents something of an accomplishment on Informatica’s part. Informatica has been around for just over 20 years and is now driving over $1 billion in revenue. Informatica is unique because it’s the only large leading vendor in the data integration space that is a pure-play integration vendor. This means that Informatica’s future is inextricably tied to how enterprises leverage data. This is a good thing.

When you look at IT, you find that everything is data driven. Solutions and tools differ only by what data they align with and how they put this data to use. The reason we can say this with confidence is that every event is the result of one or more changes in state. As a result, whether or not we choose to formally recognize these changes in state from a data standpoint, they are responsible for initiating IT activities. For a comprehensive discussion of this topic, see ESG’s market summary report on Decision Analytics: Building the Foundation for Predictive Intelligence and Beyond.

For the majority of the last 20 years, enterprises have been entrenched in developing at least one system of record (SoR) to manage their data. Specialization gave rise to multiple SoRs, which drove data warehousing (DWH), master data management (MDM), data integration (DI), data quality (DQ), and enterprise application integration (EAI) needs. Informatica caught this wave and delivered products to address all of these needs.

Now that the web and, more recently, mobility have come of age, there is a transition taking place in application design. The focus is shifting from SoR to system of engagement (SoE). This is a significant shift that involves interactions that are multi-channel, contextual, potentially socially aware, data dependent, and often performed in real time. SoE interactions will also have a distinct bi-directional M2M orientation, meaning that they may follow a variety of interaction patterns including request/reply, pub/sub, and sense/respond. What sets Informatica apart is that it provides explicit support for real-time application integration across all of these interaction patterns.  This is because Informatica brings together data integration, event-driven architecture, data streaming, event processing, and decisioning. The foundation for this is an ultra-low-latency messaging transport – Informatica’s Ultra Messaging (UM) platform. With performance within optimized environments down in the 50-100 ns range, UM is clearly high performance. When you then layer on PowerCenter connectivity, Vibe data streaming (VDS), CEP for real-time data analysis, and RulePoint for decisioning, you have a comprehensive and high-performance solution to SoE data unification needs. I’m choosing to use the word unification purposely because Informatica’s combination of capabilities goes well beyond what we think of when we say data integration. Data unification is a combination of data integration (streaming, aggregation, transformation, and enrichment), analytics, and decisioning set within a real-time framework for processing and management. Although Informatica is being actively pursued by Dell, IBM, Oracle, SAP, TIBCO, and a host of smaller vendors, Informatica currently trumps them on functionality and vision.
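
Since the pub/sub pattern anchors much of the SoE discussion above, here is a minimal in-process publish/subscribe sketch. This is generic Python for illustration only, not Informatica Ultra Messaging’s actual API:

```python
# Minimal in-process publish/subscribe sketch illustrating one of the
# SoE interaction patterns named above. Generic Python for illustration,
# not Informatica Ultra Messaging's actual API.

from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic: str, callback: Callable[[dict], None]):
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, event: dict):
        # Fan-out: every subscriber on the topic receives every event,
        # decoupling producers from consumers. Request/reply, by contrast,
        # pairs one caller with one responder.
        for callback in self._subscribers[topic]:
            callback(event)

broker = Broker()
broker.subscribe("orders", lambda e: print("analytics saw", e))
broker.subscribe("orders", lambda e: print("decisioning saw", e))
broker.publish("orders", {"id": 42, "amount": 99.50})
```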

Informatica’s thorough treatment of data unification ideally positions it to address the next generation of use cases for the Internet of Things (IoT). With an estimated 50 billion devices by 2020 and nearly all of these devices producing and/or consuming data, the future will be far more data-driven, calling for even more capabilities focused on data routing, aggregation, transformation, integration, machine learning, analysis, and unification. There will also be a need for a logical and physical data-specific abstraction layer to manage how data is aggregated, transported, consolidated, and distributed. Although new standards, conventions, terminology, and architectures are needed to move IoT forward, the data-centricity of IoT activities puts Informatica in the center of a significant opportunity. Although Informatica is being fairly tight-lipped about its immediate IoT plans, the direction of the portfolio over the last several years provides a very good foundation for becoming a leader in the data unification needs associated with IoT.


HP – Parsed and Future

So, after a furor of news, we can all settle down now in the knowledge that there will be two HPs. I so wanted one to be called Hewlett and the other Packard! Maybe with a lower-case “i” in front of each name for a contemporary nod and wink to the founders. By the way, if ever you are having trouble remembering which HP is which, they did at least make that easy for us: the ink is in the Inc.

Frankly, I really don't have a lot to add to all the financial excitement: spin-outs seem to be the name of the game right now (think IBM and eBay re PCs/servers and PayPal, respectively). But, hey, when a company splits and still has two “siblings” each north of $50B in revenue, one feels one should mark the occasion. So, farewell, HP; long live HPs. And I don't say that just to be cute: HP is one of a handful of companies where – outside of the day-to-day fisticuffs of sales – even its competitors root for it…it is part of the fabric of IT and indeed of the US.

But what does this split – when it actually happens – mean for the area I focus on…storage systems? In the short (now) to medium (say 2016/7) term, I really can't see that it is going to mean that much. Of course there have been other recent, well-publicized rumors swirling around HP (of the three-letter EMC variety!), but for the sake of this, I am assuming they are just that…rumors. At face value, the HP storage business – which has actually been doing pretty well compared to its big competitors of late – remains just a part of the business. Of course, it is a key element in the Converged Infrastructure that HP has been driving [towards] for some time now, but then again it already was. Now, all the blurb around the logic for the split talks about increased focus, nimbleness, investment, and so on, but I have not seen any major lack of focus or nimbleness (indeed quite the opposite) in the HP Storage ranks of late…and if investment resources were tight (is there anywhere they are not felt to be so!?), it is hard to foresee any significant immediate effect when roadmaps in this business take many years to manifest into GA products. I'm not negative on the change…but I simply don't see a great deal of upside or downside as far as the storage unit and its customers/prospects go. Basically, if you like the existing HP Storage story, then you should feel at least as happy as you were already to deal with it. And if you happen to prefer some other vendor right now, then I wouldn't hold off any decisions expecting dramatic new choices anytime soon.

Like many, I really do wish HP(s) well. Some things are perplexing, to be sure: quite how splitting the company into two leads to extra layoffs (as HP also announced) I fail to grasp…although I assume it is simply an admission that there was [at least seen to be] more to cut in the first place. Aside from the internal organizational streamlining and the financial analysis of the split, the fact remains that HP – however many operating companies or divisions there are – still simply has to execute. In ESG's last storage trends research, one of the questions posed was this: “In general, what would you consider to be the most important criteria to your organization when it comes to selecting a storage vendor/solution?” The number one response (each respondent could check five criteria) was “Total cost of ownership” for 65% of respondents, followed by “Service and support” at 53%. You have to look a long way down the criteria list to get to things like “Existing relationship with vendor” (22%) and “size/financial stability of vendor” (just 15%). In other words, product, value, and service matter a lot….the business card and scale of the vendor much less so. A split HP is no real guarantee of more future success in the storage arena (where it is/was trucking along pretty well), whereas executing against its existing strategy and product roadmap is.                       


Leading Enterprise Organizations Have Established a Dedicated Network Security Group

When an enterprise organization wanted to buy network security equipment a few years ago, there was a pretty clear division of labor.  The security team defined the requirements and the networking team purchased and operated equipment.  In other words, the lines were divided.  The security team could describe what was needed but didn’t dare tell the networking team what to buy or get involved with day-to-day care and feeding related to “networking” matters.

This “us-and-them” mentality appears to be legacy behavior.  According to ESG research on network security trends, 47% of enterprise organizations now claim that they have a dedicated group in charge of all aspects of network security.  Additionally, network security is done cooperatively by networking and security teams at 26% of organizations today but these firms insist that they are in the process of creating a dedicated network security group to supplant their current division of labor. 

As part of its data analysis, ESG built a scoring system it used to segment enterprise organizations into three groups (based upon their infosec skills, resources, and practices):  Advanced organizations (approximately 20% of the total survey population), progressing organizations (approximately 60% of the survey population), and basic organizations (approximately 20% of the survey population). 

When viewed through this segmentation model, the results are telling:  64% of advanced organizations have a dedicated network security group, compared with 50% of progressing organizations and 36% of basic organizations.  Based upon this information, ESG concludes that there is a strong correlation between cybersecurity best practices, infosec maturity, and having a dedicated network security group.

This organizational change makes sense for CISOs and IT organizations but as it gains strength it will impact enterprise information security behavior and the market at large in several ways:

  • Network security will integrate with other infosec components.  In the past, firewalls, IDS/IPSs, and network gateways were grounded in the networking domain.  Now that these systems belong to a network security group, they are being integrated with other cybersecurity technologies like endpoint security and security analytics.  The goal?  Weave network security into an enterprise-class infosec technology architecture. 
  • Large organizations are balancing network performance and security.  In the past, network security controls almost always ran in passive mode, monitoring and alerting on but not blocking suspicious packets.  This strategy was instituted to guard against false positives disrupting critical network traffic, but there seems to be a change in the air.  Many organizations are now automating network security remediation efforts in order to decrease the network attack surface, prevent attacks, and quarantine compromised assets (a simplified sketch follows this list).  Given the financial impact of security breaches, automated remediation will only increase – especially as network security technology gains tighter integration with global threat intelligence. 
  • The network security market opens up.  When the security team’s role was limited to defining requirements, it was easy for organizations to purchase network security equipment from the same people who sold them switches and routers.  Independent network security groups are breaking this historical bond as they look for best-of-breed security efficacy and strong integration with other security technologies across the enterprise.  This doesn’t mean that Cisco and Juniper are out of the game, but it does mean that their relationships with networking buyers may carry less weight in future purchasing decisions.  Yet another reason why Cisco purchased Sourcefire. 
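
As a rough illustration of the automated remediation described in the second bullet, here is a hypothetical Python sketch. The feed format, confidence threshold, and push_firewall_rule() helper are all invented; a real deployment would call its vendor’s actual APIs:

```python
# Hypothetical sketch of threat-intelligence-driven remediation. The feed
# format, confidence threshold, and push_firewall_rule() helper are all
# invented; a real deployment would call its vendor's actual APIs.

QUARANTINE_THRESHOLD = 0.9   # act automatically only on high confidence

def push_firewall_rule(action: str, ip: str):
    print(f"firewall: {action} {ip}")   # stand-in for a vendor API call

def remediate(threat_feed):
    for indicator in threat_feed:
        if indicator["confidence"] >= QUARANTINE_THRESHOLD:
            # High confidence: block automatically to shrink the attack
            # surface and quarantine hosts talking to the flagged address.
            push_firewall_rule("block", indicator["ip"])
        else:
            # Lower confidence: alert only, guarding against the false
            # positives that kept controls in passive mode historically.
            print(f"alert: monitor {indicator['ip']}")

remediate([{"ip": "203.0.113.7", "confidence": 0.97},
           {"ip": "198.51.100.2", "confidence": 0.55}])
```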

The ESG data suggests that network security is moving away from the gear that transports bits and closer to the technologies that protect the bits.  In my humble opinion, that’s a good thing.  As this transition gains strength, it should truly open up the market to network security vendors with more holistic infosec architectural strategies.  Good news for security firms like Check Point, FireEye, Fortinet, McAfee, and Palo Alto Networks.  HP and IBM should also experience a network security renaissance, driven by their network security, security analytics, and professional/managed services offerings.  


Data Protection Appliances are better than PBBAs (video)

Too many folks categorize every blinky-light box that can be part of a data protection solution as a “Purpose-Built Backup Appliance” or PBBA.   But the market isn't just a bunch of apples with an orange or two mixed in; data protection appliances (DPAs) can be apples, oranges, bananas, or cherries — but if you lump them all together, all you have is a fruit salad.

So, let's reset the term to understand the market:

  • “Backup” alone isn't enough — so call the all-encompassing category by what it should be delivering: “Data Protection”
  • And there isn't just one kind of appliance, there are at least four:
    • (real) Backup Appliances
    • Storage / Deduplication Appliances
    • Cloud-Gateway Appliances
    • Failover Appliances

Check out this video to see how I look at Data Protection Appliances or skip to the Video Transcript:

As always, thanks for watching.


Video Transcript

Data protection appliances are better than purpose-built backup appliances

Most folks have heard the phrase purpose-built backup appliance, or PBBA. PBBA is something of an industry-standardized term for those appliances that are built for backups … but the problem is that way too many folks throw any kind of blinky-light box that can help with backups into the PBBA category – and that doesn’t help customers, partners, vendors, or industry influencers.

The single bucket that some people throw PBBAs into isn’t just a bunch of apples and an orange or two – it’s apples, oranges, bananas, and cherries … a whole fruit salad. And then some folks want to compare apples to oranges! Really???

In general, the PBBA term is a little goofy because when one buys an IT appliance, one expects that appliance to do a job. So, to talk about a “backup appliance,” one would expect it to DO backups … but many don’t. Many are deduplication targets, which are awesome – but they don’t DO backups, they make backups better.

So let’s clean it up a bit … from my perspective, there are at least four kinds of data protection appliances (DPAs):

(Real) Backup appliances are “turnkey” solution appliances that include the backup software engine, as well as some amount of storage capacity. The solution is completely contained within the device itself.

Storage/deduplication appliances do not include the backup software engine, but they do offer capacity, typically with compression or deduplication capabilities. A storage appliance must be paired with backup or archive software, or written to directly from a production workload via some other data mover technology. Storage/deduplication appliances are all about adding efficiency to whatever other data protection solution that you are using.
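
To show the efficiency idea behind a storage/deduplication appliance, here is a minimal fixed-block deduplication sketch. Real appliances add variable-length chunking, compression, and on-disk indexes; this Python sketch only illustrates the core space-saving mechanism:

```python
# Minimal fixed-block deduplication sketch: each unique block is stored
# once and referenced by its content hash. Real appliances add variable-
# length chunking, compression, and on-disk indexes; this shows only the
# core space-saving idea.

import hashlib

BLOCK_SIZE = 4096
store = {}   # content hash -> block bytes, each unique block kept once

def write(data: bytes) -> list:
    """Split data into blocks; return the 'recipe' of block hashes."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # duplicate blocks cost nothing
        recipe.append(digest)
    return recipe

write(b"A" * 8192 + b"B" * 4096)   # first backup: 3 blocks, 2 unique
write(b"A" * 8192)                 # second backup: already stored
print(len(store))                  # 2 unique blocks kept on "disk"
```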

Gateway appliances provide “local access” to remote/cloud storage, but again do not include the backup software, nor the complete amount of storage being presented. A gateway may include some amount of short-term cache for buffering or for restoring recent points in time. I like gateways, because they make “cloud” transparently part of the solution anywhere disk solutions can play. The trick there is performance, which means that good gateways will need some kind of magic pixie dust to ensure that the WAN doesn’t get in the way.
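
The caching behavior that makes a gateway feel local can be sketched as a simple read-through LRU cache. The fetch_from_cloud() helper and cache size are stand-ins for illustration, not any vendor’s implementation:

```python
# Read-through LRU cache sketch of the gateway idea: recent objects are
# served locally while everything else takes a WAN round trip. The
# fetch_from_cloud() helper and cache size are invented for illustration.

from collections import OrderedDict

CACHE_LIMIT = 3
cache = OrderedDict()   # object id -> data, kept in LRU order

def fetch_from_cloud(obj_id: str) -> bytes:
    print(f"(slow) fetching {obj_id} over the WAN")
    return obj_id.encode()   # stand-in for real cloud object data

def read(obj_id: str) -> bytes:
    if obj_id in cache:
        cache.move_to_end(obj_id)        # refresh LRU position
        return cache[obj_id]             # fast local hit
    data = fetch_from_cloud(obj_id)      # cache miss: WAN round trip
    cache[obj_id] = data
    if len(cache) > CACHE_LIMIT:
        cache.popitem(last=False)        # evict least recently used
    return data

read("restore-point-1")   # WAN fetch
read("restore-point-1")   # served from local cache, no WAN traffic
```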

Failover appliances include not only a data protection management engine and copies of the production data, but also a hypervisor or other means to actually resume business operations by running VMs or services within the appliance itself. Sometimes the failover is within the box. Other times, it fails over across the cloud.  But that blurs the lines a bit.

And there are probably other categories coming on the horizon, or something that I missed. Heck, there are already a few offerings where the lines get blurry – particularly in how cloud extensions can add significant additional functionality to almost any of these local devices. Each of those DPA categories has different recovery scenarios and metrics (RTOs/RPOs/TCO/ROI). Some will offer more flexibility when only the cloud copy survives, while others require a local appliance to be brought online first – and that can have agility considerations, and also security considerations … meaning that the cloud copy is unusable without the proper appliance head connected to it.

In fact, another way to look at this categorization of appliances is to note that some of these solutions may (or may not) be cloud-extensible, whereas others are truly cloud-native solutions. Meanwhile, from a broader perspective, some devices are meant as turnkey-solutions for providing business and recovery agility, while others are meant to radically improve the efficiency of what you are already doing today. All of which are good goals and have relevance in today’s IT environment.
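
One way to keep the four categories straight is to model them along the axes the transcript uses. The attribute names in this Python sketch are my own shorthand, not an industry schema:

```python
# Small data model of the four DPA categories described above, along the
# axes the transcript uses. Attribute names are my own shorthand, not an
# industry schema.

from dataclasses import dataclass

@dataclass
class ApplianceCategory:
    name: str
    runs_backups: bool       # includes the backup software engine itself
    includes_storage: bool   # ships with protection capacity on board
    fronts_cloud: bool       # presents remote/cloud storage as local
    resumes_workloads: bool  # can run VMs/services for failover

CATEGORIES = [
    ApplianceCategory("(real) backup appliance",  True,  True,  False, False),
    ApplianceCategory("storage/dedup appliance",  False, True,  False, False),
    ApplianceCategory("cloud-gateway appliance",  False, False, True,  False),
    ApplianceCategory("failover appliance",       True,  True,  False, True),
]

for c in CATEGORIES:
    print(f"{c.name:26} runs backups: {c.runs_backups}")
```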

So, what should you take away from this?

  1. What are you trying to solve for … and how much of your existing data protection strategy are you willing to encapsulate in an appliance? That will at least point you to a category of data protection appliances to consider.
  2. Each of those DPA types has some great and unique capabilities that are worth considering. One does not usurp another – and frankly, between them, you likely have the building blocks for whatever modern data protection solution you are looking for … or at least whatever physical elements you need.
  3. Speaking of “physical” … appliances can be physical or virtual. It's much more about ease of deployment and integration than it is about adding another box with blinky lights. Sure, if you add a virtual appliance to a host, there will need to be adequate compute and storage to accommodate it – but that is an engineering consideration.

If you don’t consider those tips, if you just keep calling every blinky-light box that could be used for data protection a “PBBA,” then don’t be surprised if you find yourself comparing apples and oranges to bananas and cherries – without a clear-cut winner from a technical perspective, or even from an ROI/TCO perspective.  In the meantime, check out ESG’s website and portal for more information on my coverage of the broader data protection appliance market … including some highlights on a few of those appliances and their solution categories.

 
