In this, the 12th in the ‘controversial questions’ series, we wanted to hear the thoughts of industry executives regarding the next round of technology restructures.
This is the question we asked:
It seems that there are some technologies that are ripe for confluence. In this instance, I’m thinking of Software-Defined Networking, Cloud Computing, Edge Computing and Zero-trust as a combined wave about to crash over a lot of highly-valued sand castles. Once everything is in the cloud, all anyone will need is a client computer (think Chromebook or Amazon Fire) – local execution (of anything) will be a thing of the past. There are some obvious winners and losers here.
A number of our executives penned interesting introductory thoughts. For instance, Andrew Slavkovic, solutions engineering manager, CyberArk, observes that “Cloud computing advancements present tremendous innovation possibilities. Greater adoption in turn drives more investment, and the boundaries of what is possible are constantly pushed. These conditions are creating the perfect storm for new niche industries to quickly emerge and gain traction. This phenomenon is being replicated in cyber, with traditional, well-understood threat vectors evolving into a cloud-centric environment.”
Mark Jobbins, vice president and Field CTO, Asia Pacific & Japan, Pure Storage continues, “These technologies will no doubt have a significant impact in shaping the future IT landscape. The challenge will be how to balance their influence to meet various business needs while providing agility.”
Is the move to the cloud an unstoppable force? Graham Sowden, general manager APAC, Okta is quite convinced. “Our collective march to cloud adoption is unrelenting, and once everything is in the cloud there is a significant opportunity for the consolidation and simplification of technologies and services into a handful of primary clouds. Over the next five years, we can see the technologies that are fundamental to any organisation’s operations being consolidated into these primary clouds. For example, we might see independent clouds for collaboration, CRM, infrastructure, ERP, and identity.”
BlackBerry’s director of engineering, APJ, Jonathan Jackson agrees – “With the clear benefits of flexibility and efficiency, the widespread shift to the cloud shows no signs of slowing down. But as IT transformation and consolidation continues, productivity and convenience shouldn’t take precedence over security. The cloud era has not only redefined the enterprise perimeter and created new vulnerabilities – it has also created a dependency on constant internet connectivity. These new challenges require intelligent and continuous cybersecurity solutions to protect data wherever it resides, in transit or at rest.
“Cyber-attacks continue to increase in volume and ferocity, prompting organisations to seek ways to not only improve their cybersecurity posture but also simplify their solution stack and reduce demands on analysts. In response, vendors are looking to meet these demands with a complete end-to-end offering – meaning cybersecurity solutions are ripe for confluence.”
Similarly, Matt Cudworth, cloud practice lead / cloud CTO APAC, DXC Technology adds, “The maturation of Serverless Compute and Services – the core of what we often term true Cloud Native – and its impact on the rising tide of compliance standards will have a major impact on the business landscape going forward, creating new challenges and opportunities for enterprises.”
The ‘edge’ is becoming more and more important – we will address this in a major section of this report. But to introduce the topic, Rishi Varma, director, product management for Akamai Technologies tells us, “Network and security technologies are increasingly converging at the edge. This has been in the making for some time and is certainly a positive for the industry at large. This will enable companies to address areas of risk. For example, the advent of Open Banking and the challenge posed by fintechs has forced Financial Services companies to prioritise API security, and regulations like MDBN, GDPR and CCPA have compelled businesses with a customer-facing digital property to seriously address privacy – across every industry.
“This, in turn, will accelerate the development of technologies that will improve personal security and privacy in every aspect of our lives – especially as connected devices and applications become increasingly pervasive in everyday life.
“This will also help businesses more quickly adapt their security posture to a fast-changing threat and business environment. For example, when security-minded businesses had to pivot to remote working during the pandemic, rather than scale their on-premises infrastructure, they leveraged the edge. The distributed nature of edge computing enables businesses to deliver lower-latency applications that are secure, closer to users, and offer better user performance.”
Our major theme here is the approaching confluence of many technologies. George Dragatsis, Hitachi Vantara’s CTO and enterprise presales director for Australia and New Zealand had this to offer. “The confluence of technologies is happening because of a confluence of end user requirements – continuity of business, mobility, resilience, data protection, security, analytics, processing power, accessibility and so on – to enable the workforce to execute tasks in support of business objectives.”
Rod Taubman, managing director, Acclimation continued, “This confluence is driven by the massive economies of scale of hyper-scalers such as Amazon and Google, which now buy up the bulk of the global CPU and RAM supply. These massive barriers to entry, based on buying power alone, set up natural monopolies prohibiting new entrants. The landscape will remain largely the same on this front. Interestingly, the commoditisation of these services may lead to the fight for market share being waged on the front of service orchestration and clever use of user experience interfaces.”
“I agree!” says Tyler McMullen, chief technology officer at Fastly. “It’s interesting to consider how innovations lower down the stack have allowed these trends to come to the fore. Improvements to our lower-level infrastructure and networking such as WebAssembly, XDP, eBPF, SmartNICs, CPU improvements, and so on have allowed greater innovation in areas like SDN, Edge Computing, and Zero Trust in particular. Previously, innovation was limited by the need for specialised hardware, whereas now much of this can be done in software, allowing for much more flexible approaches and quicker time to market. The sentiment that these are all connected spaces and will be powerful in concert with each other is, I think, correct. We’re going to see big changes in the next few years as both these types of products and the underlying infrastructure that enables them become more widely deployed.”
ManageEngine’s vice president Manikandan Thangaraj agrees. “The market is driving a convergence of all security tools and functions under one roof. We’re already seeing security information and event management solutions that combine threat intelligence; user and entity behaviour analytics; security orchestration, automation, and response; and extended detection and response.
“In the future, we may see data loss prevention and cloud access security broker platforms merging into a unified security console. Artificial intelligence, especially machine-learning techniques, will be employed for predictive analysis across different functions of the unified cybersecurity mesh. This will help enterprises improve their security postures by properly configuring the IT security modules they need, streamlining their security operations, and responding to alerts.”
The never-ending circle
Of course, there’s nothing new about the ebb and flow of “preferred solutions” in the IT industry. As the song goes, “everything old is new again.” Bruce Reid, head of service provider partnerships at Vertel describes this phenomenon rather well.
“The IT industry commenced with centralised computing utilising expensive, largely regulated, monopoly-based telecommunications links. With the advent of cost-effective, distributed computing, many centralised compute facilities were replaced by smaller site-based compute facilities. As a result of both deregulation/de-monopolisation of the telecommunications sector on a global scale and the advent of highly scalable, virtualised compute facilities, the tide has changed and there has been a mass recentralisation of computing (albeit spread across multiple physical facilities within the cloud). This reconvergence has been primarily driven by economic factors such as compute cost, avoidance of capital investment, and reduction in facilities operational staff costs.
“This trend will continue, and it is expected that this convergence will be to a small number of global oligopoly cloud providers controlling major virtualised compute facilities and international telecommunications networks. These providers will continue to seek growth and logically this will come from the convergence of related services such as control of edge computing capabilities, and the inclusion of zero trust capability with basic cloud service fabric, to name a couple.”
It would be hoped that any advance in technology would reduce the various pain points experienced by users. Colloquially, we describe systems that reduce this pain as ‘frictionless.’ Daniel Conrad, field strategist at One Identity offers this description. “Some of the technologies that previously applied to data centres are already layered into different types of “as a service” offerings without us realising. Not requiring infrastructure or clients to access the systems has been the norm for a few years, but the way we use these systems will continue to improve, regardless of the platform we use to access them.
“Users have become accustomed to “the app lifestyle” where everything is available on any platform they choose. The challenge for an organisation, company, or government agency is ensuring the users have the right access to the right systems. In this “anything from anywhere” world, the security perimeter is now in the identity. We need to make sure the organisation’s systems, whether in the data centre or running as a service, provide the right access to the right people.”
The plumbing is important
Of course, no matter what the preferred solution, be it on-prem, cloud, edge or something we’ve never previously contemplated, end-users still have to be connected with the services they need. A few of our respondents identified this critical factor.
For instance, Lee Field, director, 5G and enterprise solutions at Verizon Business Group, observes that “We’ve been here before, but now we have high-capacity connectivity options unlike the days of mainframes and terminals or servers with thin clients (anyone remember Sun Ray?). I do believe that we are about to see a significant change in the way we access applications and services and, in turn, this will change the way we think about building secure corporate connectivity.
“This will fundamentally alter the technology landscape. We’re currently in a transition stage as organisations accelerate their move to the cloud or start to transition to cloud-native applications. With ubiquitous 5G on the horizon and the evolution of always-on, always-connected client devices combined with edge compute options hosted in cellular networks, are we about to see a transition from ethernet to cellular?”
Reid adds, “It will be critical that organisations enhance the efficiency and robustness of the telecommunications networks providing connectivity from operational sites to the cloud. It is less likely that major cloud providers will seek to own local telecommunications access networks. Therefore, an ongoing focus by user organisations on ensuring commensurate high-availability, physically diverse access to the cloud will be critical. Users will likely need to ensure that software-defined networking capability is implemented to enable rapid, application-aware re-routing in the event of network failures.”
Conrad continues, “The pieces and layers of technology that enable and secure our systems are being integrated into these solutions, but we need to make sure the identity is protected both from outside exploitation and the users themselves.”
Jason Feist, president of engineering, emerging products and solutions at Seagate Technology draws our attention to the confluence of SDN and how it may impact storage solutions. “Changing gears: NVMe over Fabrics (NVMe-oF), a technology synergistic with software-defined networking, is a gateway into an ecosystem of simplified storage that enables all use cases to leverage a common interface. NVMe interfaces are a way to simplify motherboards, firmware, and software – they provide one way to communicate with storage. By unifying the language, NVMe allows us to build new applications without the complexity that comes with multiple protocols.
“As a result, the barrier to entry into the market is lowered, and corporations get a better balance of speed, capacity, and cost.”
One size won’t fit all
As much as just about every solution provider posits that theirs is the “one solution to rule them all,” this probably doesn’t best serve the actual consumers of the technology. It would seem that the executives agree.
For instance, Gary Dollinger, director of technology at West Monroe suggests that “The belief that everything can survive in the cloud is a fallacy. By overcentralising applications and data, organisations will continue to increase their operational risk of cyberattacks. Imagine a world where a foreign adversary can take out entire financial institutions, utilities, or military structures through pinpointed attacks on infrastructure. In fact, you do not have to imagine too hard since this already happens so often that we are becoming numb when hearing of another attack.”
Jobbins is a little more blunt: “One size will not fit all. It is unlikely that all workloads will move to the cloud as we typically think of it. It is more likely workloads will eventually leverage multiple cloud-like service providers, these being a combination of the hyper-scalers, public cloud providers, private on-premises cloud services, and edge computing.”
Dragatsis continues, “For most enterprise-level organisations, that results in some combination of private cloud, public cloud, managed services configuration. For that hybrid cloud strategy to be a smart one, it’s critical to understand that it looks different for each business.
“It depends on things like level of cloud maturity, application requirements, private cloud and legacy integration requirements, public cloud trajectory, edge computing maturity, usage of cloud services, adoption of advanced technologies and more. There’s no one-cloud-fits-all strategy.”
New uses will drive technical advances
It’s often difficult to know where some new invention or technology will take us. Seagate’s Feist makes a few suggestions.
“Compute will diversify to provide optionality for emerging workloads. Domain-specific architecture optimization will lead to new semiconductor designs and the adoption of purpose-built compute using GPUs, CPUs, DPUs, and FPGA.
“In the storage area, we are enabling the cost-effective storage of all the data that users desire to be analysed by the purpose-built and optimized compute architectures noted above. In the longer term, this will result in innovation on next-gen device types, like ones using DNA data storage.”
A unifying perspective
Similarly it’s important, as Jobbins describes, to ensure that any unexpected but useful advances are facilitated as much as possible.
“The trick will be to create effective ways to link these cloud services so that businesses can utilise these services to their advantage rather than being locked in to one service provider.
“Software-defined networking and data management platforms along with reliable, high speed, low latency networks such as 5G will be the glue to enable this. These, combined with robust security platforms, will create the IT experience of the future.”
The user experience
No matter how wonderful something looks ‘in the lab,’ the users have to feel as though there’s some benefit.
Daniel Comarmond, security software engineer at ThycoticCentrify suggests that “Local execution is already a thing of the past for some organisations, with virtual desktop environments garnering more attention during 2020 to enable people to work from any connected device. But while abstraction of computing resources works for those after a generic operating environment, that’s at odds with ever more powerful and versatile mobile devices that put a richer experience at our fingertips. A case in point – when you work on a Word doc or an Excel sheet, how often do you use the Office 365 web app, or do you instead prefer the fully-featured – and more responsive – Office 365 ‘thick’ app run locally?
“So, while the ideal computing platform for some organisations may be homogenous processing and memory in the cloud – repelling a DDoS attack one minute, then re-allocated to running reports the next – the confluence of technologies should consider the user experience first and foremost. That said, a combination of Infrastructure-as-a-Platform, and agile hypervisors and containers are already going a long way there… it’s just that a multi-layered defence-in-depth security model is rarely budgeted for to adequately protect innovative cloud-computing features that security frameworks and compliance standards may not have had a chance to catch up with yet.”
Don’t forget Security, Privacy, Identity and Trust
They still matter!
Sowden is first cab off the rank, reminding us that, “Identity is the foundational layer that will connect all the primary clouds. It enables choice and flexibility while improving security and mitigating risk in the use of all other technologies. Identity and access management (IAM) technologies have been nominally grouped together under one umbrella for some time, however, they have been pretty fragmented. We are already starting to see the confluence of IAM offerings through acquisitions and product development. This will only continue as CSOs and CIOs seek to consolidate their tech and security stacks.”
Continuing that thought, Slavkovic offers, “Cloud and security practitioners have an opportunity to redefine how this new world is secured, not least because cloud computing captures a tremendous amount of data, which can be turned to our advantage. For instance, from a user perspective, this data can be part of a risk-based AI-driven approach allowing us to analyse user behaviour patterns (based on their role) and only present additional controls depending on how risky their behaviour is – all in real time and in a continuous learning manner.”
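To make the risk-based approach Slavkovic describes concrete, here is a minimal sketch of adaptive access control. All signals, weights, and thresholds are hypothetical – a real system would learn these from behaviour data rather than hard-code them – but the shape is the same: score the session against the user’s baseline, and only add friction when the score warrants it.

```python
# Illustrative risk-based access sketch: score a session against a user's
# baseline behaviour and only demand extra controls when risk is elevated.
# All signals, weights, and thresholds here are hypothetical.

def risk_score(session: dict, baseline: dict) -> float:
    """Return a 0.0-1.0 risk score from simple behavioural signals."""
    score = 0.0
    if session["country"] != baseline["usual_country"]:
        score += 0.4  # unfamiliar location
    if session["device_id"] not in baseline["known_devices"]:
        score += 0.3  # unrecognised device
    if not (baseline["work_hours"][0] <= session["hour"] <= baseline["work_hours"][1]):
        score += 0.2  # activity outside normal hours
    if session["privileged_action"]:
        score += 0.1  # sensitive operation requested
    return min(score, 1.0)

def required_controls(score: float) -> str:
    """Map a risk score to the controls presented to the user."""
    if score < 0.3:
        return "allow"  # low risk: frictionless access
    if score < 0.7:
        return "mfa"    # medium risk: step-up authentication
    return "block"      # high risk: deny and alert

baseline = {"usual_country": "AU", "known_devices": {"laptop-01"},
            "work_hours": (8, 18)}
session = {"country": "AU", "device_id": "tablet-99", "hour": 23,
           "privileged_action": False}
print(required_controls(risk_score(session, baseline)))  # mfa
```

Here an unrecognised device plus out-of-hours activity pushes the score into the middle band, so the user is stepped up to MFA rather than blocked outright – the “only present additional controls depending on how risky their behaviour is” idea in miniature.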
Leonardo Hutabarat, manager, enterprise sales engineering APAC at LogRhythm takes a similar view. “With the new hybrid workplace environment, it is also crucial for security solutions to be smarter and more intelligent, and to differentiate false positives from real incidents. In addition, time will be another critical factor as more environments need to be monitored. It will become necessary to detect an attack faster and more precisely, so organisations can achieve the shortest possible Mean Time To Detect (MTTD) and Mean Time To Respond (MTTR), even as new security environments and new infrastructure appear on the horizon.”
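The MTTD and MTTR figures Hutabarat refers to are straightforward averages over incident timelines. A quick sketch (the incident records below are invented for illustration) shows how a security team might track both:

```python
# Hypothetical incident records: times in hours since each attack began.
# MTTD averages time-to-detection; MTTR averages detection-to-response.
incidents = [
    {"detected_at": 4.0, "responded_at": 6.5},
    {"detected_at": 1.5, "responded_at": 2.0},
    {"detected_at": 9.0, "responded_at": 15.0},
]

mttd = sum(i["detected_at"] for i in incidents) / len(incidents)
mttr = sum(i["responded_at"] - i["detected_at"] for i in incidents) / len(incidents)

print(f"MTTD: {mttd:.1f} h, MTTR: {mttr:.1f} h")  # MTTD: 4.8 h, MTTR: 3.0 h
```

The point of tracking both separately is that they call for different fixes: a high MTTD points at detection coverage and analytics, while a high MTTR points at response workflow and automation.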
Today’s modern security buzz-phrase is ‘zero trust.’ It’s becoming a very valuable concept and Hyder Mohammed, lead enterprise architect – Y Soft, wants to remind us of its relevance in this discussion. “With zero trust cybersecurity solutions, organisations can not only obtain the security they need to protect their resources and data in today’s distributed organisation, they can also realise substantial business benefits. In addition to improving visibility across the enterprise and reducing time to breach detection, enterprises can also reduce the complexity of their security stack, minimise the impact of the security skills shortage, and protect customer data to avoid reputational damage and significant financial losses. Simultaneously, businesses can improve the user experience and facilitate migration to the cloud through the adoption of a zero trust security architecture.
“Zero trust offers more than a line of defence. The model’s security benefits deliver considerable business value: greater enterprise visibility, reduced IT complexity, less demanding security workloads, data protection, a superior user experience, and support for cloud migration.”
Cudworth expands on this theme, “Additionally, the rise of Cloud Native in organisations operating in high compliance environments has created its own set of challenges and opportunities. Compliance is being flipped on its head. The rising tide of local and global regulatory requirements, such as PCI and GDPR, means the response to, and management of, these new challenges will be a key differentiator for business.”
…and Jackson continues, “As the march to the cloud progresses, the winners will understand where their data is stored and how to maintain secure access to it. For productivity and security to remain constant and balanced, new workflows must have security embedded at the forefront. No matter where the data is – security must always reside with the asset and must not be dependent on continuous connectivity. And to combat the risk of a lost or constrained cloud connection, security must be adaptable and cloud-enabled, rather than cloud-dependent. This is intelligent security on the edge.”
Turning our attention to the (probably) insoluble problem of email, Mark Lukie, sales engineer APJ at Barracuda reminds us that “In email protection, classic gateway-based solutions will be augmented by Cloud Email Supplemental Security (CESS) solutions that leverage API access and social graphs/artificial intelligence/machine learning to mitigate messaging-based attacks. We will also see more of these solutions address other collaboration-based solutions such as Zoom, Slack, and Teams. These platforms have become very popular tools for collaboration and are susceptible to the same types of attacks that we see with email.”
“Video imaging and analytics offers another example,” adds Feist. “The tech can securely connect camera, compute, and storage to guarantee the integrity of that info. Thanks to root-of-trust implementations and secure storage infrastructure, it will be very difficult to manipulate that image. Root of trust helps implement zero-trust infrastructure by helping users certify their identity so they can communicate.”
Hutabarat directs our attention to the confluence of work-from-home and cloud adoption. With so many work initiatives now handled remotely and the use of cloud applications the new normal, the traditional ‘castle and moat’ security concept no longer holds. Every organisation is being forced to adopt zero trust best practice, and enterprises must extend Network Detection and Response (NDR) out to the edge and into the cloud itself – whether SaaS or IaaS. As cloud infrastructure expands, detecting suspicious activity and zero-day attacks there, especially on IaaS, will be a focus area for many organisations. Organisations also need greater awareness of their security responsibilities when using IaaS, because detection of, and response to, attacks on the applications running on top of IaaS remains the enterprise’s responsibility.
Lukie also points out that as infrastructure is devolved into software, again we need to be more conscious of security. “Secure Software-Defined Wide Area Network (SD-WAN) technology is an extension of the software-defined networking trend that started in the 2010s. These solutions allow companies to leverage software to optimise traffic across wide area networks. While we have seen standalone SD-WAN solutions in the past few years, many of these solutions are integrated with network security (firewall) providers, and now we are seeing secure SD-WAN solutions become more prevalent. As public cloud adoption increases, many companies have started to leverage Azure, Google, and Amazon backbones as a delivery vehicle for their own WAN. This makes sense as more applications and services are delivered on these cloud services and their points of presence continue to expand worldwide.
“With remote workers and cloud-delivered services becoming a de facto part of every business, solutions that manage access at the highest level of granularity, with the least complexity to set up and manage, will become more popular in 2021. Solutions that implement Zero Trust and other trust-based approaches will be effective mechanisms for managing privileged access to services. Traditionally, customers used complicated solutions like NAC or VPN. In 2021, we will see greater adoption of lightweight solutions that quickly and effectively manage access at the device and individual level. However, organisations need to be aware of the human risk factor in any security environment and provide education to their employees, and to organisations throughout their supply chain, rather than relying on the technology itself.”
Reid continues, “In the context of the move towards converged oligopolistic cloud services provision, national security considerations will become more important, thrown into the spotlight by COVID-19 and other recent international hacking incidents. At a national level, organisations need to be able to ensure the security and onshore storage of corporate and government data. They need assurance that in times of national crisis this critical infrastructure can be protected and controlled to meet domestic commercial and national requirements.”
Is BYOD a dying concept?
Lukie doesn’t see a future for BYOD – the risks are becoming too great. “We will continue to see the revival of corporate-owned devices provided to employees. Zero Trust Network Access concepts will flank traditional network-based VPN access concepts. Ironically, this is a rebirth of NAC concepts from 15 years back in a somewhat different, more modern guise. Email will continue to be the primary attack vector.”
Don’t push me, coz I’m close to the Edge
Jigar Bhansali, VP solution architecture, Software AG APJ raised the issue of Edge computing. “When it comes to edge computing versus cloud computing, it must be said that the two technologies should be viewed as working alongside each other, as one does not eliminate the capacity to use the other. Both have different, critical, and distinguishable roles to play within the IT infrastructure.
“Edge computing is more suitable for time-sensitive data (driving real-time insights), while cloud computing processes data that is not time-critical. Usage of either technology is highly dependent on the needs and requirements of the IT team and organisation. Edge computing is also preferred over cloud computing in remote locations that require local storage, and when operating specialised and intelligent devices, such as in factories.
“We will continue to see cloud computing as a key technology as it will continue to fill a need in the market. However, I believe the popularity of edge computing will grow as it fills the gaps and provides solutions in areas that cloud computing doesn’t cover. This will be in areas where the retrieval of huge data volumes and resource-hungry applications need a real-time solution, as edge computing offers flexibility and brings the data closer to the end-users.
“For instance, in the area of data analytics, we are already seeing manufacturers use cloud computing to create and test models and deploy on the edge. While many manufacturers have successfully migrated data and information to the cloud, latency, and costs, among other factors, have some manufacturers exploring edge computing as another option.”
McMullen takes a degree of exception to one of our position statements. “However, I think the conclusion that “local execution (of anything) will be a thing of the past” goes a step too far. It is similar to the logical fallacy made when people think that edge computing is going to replace cloud computing. In my view, the movement toward edge computing is about moving computations to their optimal place in the network. It’s one of the reasons why I think the term “edge” actually can be a misnomer, or at least misleading. When we talk about edge computing, we’re really talking about moving computations into the network, instead of just at either end of a connection from client to server. Some computations make a lot of sense to run in a large, centralised datacentre. A good example of this is training of AI models on large datasets. Others make sense to run locally at a client. Computations which involve only locally held data and which match the compute power of the client fall into this bucket. It is all the things that fall between those two extremes that I believe benefit from edge computing.”
Lukie however, counters with “Solutions that move security closer to the edge (people, places, and things) will become much more popular.”
Having multiple copies of data is good, right?
Derek Cowan, director of systems engineering – APAC at Cohesity points to the broad array of duplicated data in any organisation. “IT executives increasingly recognise that the key driver of exponential data growth is redundant copies of primary corporate data created by the various point-solution tools used to protect, share, and analyse information. IDC quantifies this as the copy data ratio, which is measured as the total data in the environment over the total amount of production data. The increase in this ratio for enterprises stems from data residing in backup storage systems, in disaster recovery environments, in test and development clusters, and in archives for long-term retention, all of which quickly multiplies across applications. Gaining visibility and control over these data copies is becoming paramount to gaining operational efficiency and agility.
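The copy data ratio Cowan cites is simple to compute once an organisation can inventory where its copies live. A toy calculation (all figures invented for illustration) makes the definition concrete:

```python
# Toy copy-data-ratio calculation: total data under management divided by
# production data. All figures are in TB and purely illustrative.
production_tb = 100.0
copies_tb = {
    "backup": 300.0,    # multiple backup generations
    "dr": 100.0,        # disaster recovery mirror
    "test_dev": 150.0,  # test/dev clusters
    "archive": 200.0,   # long-term retention
}

total_tb = production_tb + sum(copies_tb.values())
copy_data_ratio = total_tb / production_tb
print(f"Copy data ratio: {copy_data_ratio:.1f}x")  # Copy data ratio: 8.5x
```

In this sketch, 100 TB of production data drags along 750 TB of copies – exactly the kind of multiplication across backup, DR, test/dev and archive tiers that Cowan describes.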
“The challenge today is that the different tiers of storage and backup solutions work in siloed hardware and software form factors with their corresponding data management and protection software. The solutions are mostly passive in nature, becoming expensive insurance policies that sit idle until a restore request is received. Also, in many cases, customers only test these backup copies infrequently, and long gaps between tests increase the chance of issues being found when data actually needs to be recovered – at which point it may be too late.”
The evolution of the VPN
Daniel Markuson, digital privacy expert at NordVPN (for obvious reasons) supports the use of VPNs in this environment. “The majority of the people who used corporate VPNs back in the day might say that VPN as technology should be sunsetted. And I agree with them. Corporate VPNs, where all of your traffic flows through the physical appliance in your office for you to access some Word documents on the SMB server, is how we did things when there was no better way. But the VPN technology is not about the round trip your traffic needs to take, nor is it about some clunky app where you need to configure some ports and upload some XML files for your connection to work.
“So one way of handling the abuse on early operating systems was to restrict everything. An excellent example of such an approach is iOS, which learned from early Windows mistakes. Another way of handling it is hoping that the market will fill the gap for anti-abuse software while you gradually migrate to a model with more restrictions. This model for OS developers looks quite attractive but narrows the market for anti-abuse software over time. That’s what happened to the antivirus market, and that’s where I believe it’s headed.
“Some sandcastles will be torn down, but many niche communities will still prefer the more modern systems. Likewise, we saw predictions of change in the past, but many of the tools from the past still exist. Tech is evolving, which means more all-around tech tools or “sandcastle” creators, if you will. And it takes a lot of time to redo things completely.”
It’s the software (as a service), stupid
Amir Khan, CEO and co-founder of Alkira, would like to take the discussion in a slightly different direction, pointing out that more and more tasks once enshrined in hardware are now hosted in software. “From energy to entertainment to enterprise networking, physical infrastructure is nothing more than a temporary inconvenience on the road to ubiquitous service. Whether the service is electricity or TV shows or applications and data, the long-term trend is the same. The user wants the content – the cables, connectors and appliances needed to access the good stuff are a necessary evil. Evolution will take care of it eventually.
“Just as enterprises don’t generate their own electricity, in future they won’t own and manage their own networks. Cable TV providers are no longer infrastructure providers solving problems of TV distribution that the broadcast networks couldn’t solve. Those that have clung on to the remnants of their former empires in the Netflix era have done so by moving out of cabling and set-top boxes and into content. It’s just another example of that decade-old “software eats the world” prediction.
“In the networking industry we’re seeing the same evolutionary process working itself out. Why do big enterprises run networks? Not because it’s fun, or cheap or easy, but because until now they had little choice. Network infrastructure, like compute and storage infrastructure, is heading into the cloud. Customer premises equipment will become simpler. Networking will be a frictionless service, consumed in the same way as electricity today.
“That’s not to say that enterprises will forego the management and control that come from running their own networks, just that they will be liberated from the hassle (and capital expenditure) of dealing with an increasingly complex array of equipment and software.
“This is not crystal ball gazing. It’s already happening. Our solution works with existing network transports, SD-WAN fabrics, data centre networks, routers and other premises equipment. Some of this technology will persist, some of it will die, but all of it is moving further from customers, who will increasingly consume networking as a pure service.”
New business opportunities
Of course, whenever we see new paradigms appear, there will be accompanying business opportunities both for existing organisations and for start-ups.
Mohammed nicely encapsulates this thought. “Today’s modern workforce has been impacted by COVID-19 with the need to work from anywhere, which means more and more solution vendors will be looking at transforming their product offering to leverage this and provide solutions that will adapt to modern day work habits. We could almost see hardware and software vendors joining forces to deliver this in the short term using edge technology.”
Feist notes that, “We’re approaching the saturation of use cases made possible by data residing only in the cloud—that’s why we have the rise of edge computing. The next business model is building value around using data that’s at the edge and making decisions based on it. Edge computing is changing the way devices and infrastructure interact. For example, in mapping apps, we can expect to see lots more sensors providing prescriptive insights improving our experience well beyond what we are used to today. We will also see AR/VR augmenting our traditional experiences to create more immersive opportunities.”
Varma adds that, “We expect to see a proliferation of different approaches and vendors in this space. This will likely be followed by standards being established, and that will lead to consolidation and M&A activity, as the IT industry determines the technology approaches and specific industries align on their key use cases.
“We’ve seen this trend repeated both over the years, and across technologies. In the past, this occurred across technologies from browsers, to databases, to clouds. Today we are seeing technologies like Artificial Intelligence/Machine Learning, edge computing, and 5G converging on use cases like online fraud detection.”
Additionally, Dragatsis suggests that “Companies need technology partners that focus on integration, reducing risk by wrapping around flexible services and capabilities for things like highly scalable, resilient and secure environments. They need flexibility in terms of consumption models and right-sized options to suit a range of business sizes, needs and the different stages that businesses are at in their hybrid cloud strategies.”
“Artificial intelligence and the continued drive for automation will lead to a huge explosion in internet-connected devices and edge computing,” says Taubman, “as this will be an area of competitive advantage for industries like manufacturing, logistics, and supply chain. The global IoT market is expected to be worth US$1,463.19 billion by 2027 – a 25 per cent compound annual growth rate (CAGR), now higher than cloud computing’s 18 per cent. IT services will shift to providing services that exploit these new technologies, with an augmentation of the existing workforce to support this growth.”
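Taubman’s figures compound quickly. As a back-of-the-envelope check – assuming, purely for illustration, a seven-year horizon ending in 2027 – the starting market size implied by that CAGR can be worked backwards:

```python
def implied_base(future_value: float, cagr: float, years: int) -> float:
    """Present value implied by a future value at a given compound
    annual growth rate: future / (1 + r)^n."""
    return future_value / (1 + cagr) ** years

# US$1,463.19bn in 2027 at a 25 per cent CAGR implies roughly
# US$307bn at the start of an (assumed) seven-year horizon.
print(round(implied_base(1463.19, 0.25, 7), 1))  # → 306.9
```

The horizon length is an assumption for illustration; the point is simply that a 25 per cent CAGR roughly quintuples a market over seven years.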
Brent Paterson, managing director – Australia and New Zealand, SNP also sees these opportunities. “The disassociation of the compute and data networks from the physical infrastructure is a powerful driver for businesses because it lets them significantly increase their innovation and customer reach while at the same time considerably reducing their costs. With this rapid growth and development there are strong business drivers for mergers and acquisitions at both the application and hardware ends of the market.
“At the app level vendors want to fill product gaps and build into adjacent markets. Hardware vendors need scale to drive down costs. Who will these be? No one can tell; however, the app winners will be those companies that keep an absolute focus on what the customer wants and on delivering that to them. In my opinion, the most successful will do this largely through in-house product development. If a company does decide to acquire, then the imperative should be to focus on integrating that acquired technology with their existing products so that the customer has a seamless experience.”
With a slightly more pessimistic view of things, Comarmond points out the difficulties of ‘forklift upgrades’ to infrastructure. “Compounding this is the way in which innovative technology is procured – usually one project at a time. Yes, there are some new online-only banks, for example, that get the luxury of starting an architecture from a clean slate… but just about everyone else simply doesn’t get the chance to design using only new infrastructure from the ground up.
“The piecemeal nature of projects and migrations means that despite the dream of a confluence of technologies, the reality is ‘legacy’ technology is going to be king for quite some time yet… where different on-prem and cloud workloads have different attack surfaces that are perpetuated by a myriad of powerful admin-level privileged accounts that are located across a variety of directory stores – and then an auditor turns up and asks you to report on how you attribute which application is being run where and which authenticated identity is using it.”
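The attribution question Comarmond’s auditor asks – which application runs where, and which authenticated identity uses it – is, at heart, a grouping problem across scattered inventories. A minimal sketch, using an entirely hypothetical record format:

```python
from collections import defaultdict

# Hypothetical inventory records gathered from different directory
# stores: (application, location, authenticated identity).
access_log = [
    ("payroll", "on-prem", "svc-payroll-admin"),
    ("crm",     "cloud",   "alice@corp"),
    ("payroll", "on-prem", "bob@corp"),
    ("crm",     "cloud",   "svc-crm-admin"),
]

def attribution_report(records):
    """Group identities by (application, location) for an audit report."""
    report = defaultdict(set)
    for app, location, identity in records:
        report[(app, location)].add(identity)
    return {key: sorted(ids) for key, ids in report.items()}

for (app, loc), identities in sorted(attribution_report(access_log).items()):
    print(f"{app} @ {loc}: {', '.join(identities)}")
```

The hard part in practice is not the grouping but collecting consistent records from the “myriad” of directory stores in the first place – which is precisely Comarmond’s point.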
Winners and Losers
As was suggested in the previous section, there will be plenty of opportunities for new businesses and for new offerings. However, there is no guarantee of success.
For instance, Taubman notes that “The casualties here will be hardware producers that are essentially providing compute power at uncompetitive prices. Traditional infrastructure will bend to these economics leaving the competitive landscape in the domain of software houses and niche edge computing providers and the IoT.”
Mark Fazackerley, ANZ regional vice president at Talend, speaks to convergence and merger opportunities. “Confluence and acquisitions – the area that seems ripe is Data Integration (DI) and Robotic Process Automation (RPA). Both are areas in which the big native cloud vendors (MSFT, Amazon, Google) are making slow but steady progress, and some acquisitions and consolidation are already in play.
“The opportunity that convergence of AI, ML, RPA and Data Integration platforms would bring is that AI- and ML-driven RPA could seamlessly flow into an automated DI platform, or Data Fabric, as another mechanism to capture, manage, govern and utilise data, providing a truly comprehensive integration platform.”
Varma concludes, “The impact will be widespread – starting with cybersecurity companies, and gradually permeating to the businesses they directly impact. Initially this would be other technology companies, financial services institutions, ecommerce and media.”
According to Thangaraj, “Remote work and cloud adoption are here to stay. When the traditional network perimeter expands, enterprises find it difficult to extend their on-premises security levels to remote work and cloud models. Security solutions that operate in silos only further complicate things; this type of setup increases integration costs and doesn’t provide a complete picture of the network’s security posture. On top of this, the blind spots created by siloed security solutions can be exploited by hackers to launch attacks or laterally move within the network.”
Dollinger adds that “The real winners in the cloud game will be those that figure out how to harness its power without neglecting that we need a fail-safe way for our economy to continue.”
“The key influence,” according to Varma, “for the current acceleration is the already hyper-converged technology landscape, driven by two main factors. From a technology perspective, this is driven by the ability to increase security without compromising availability or performance by stopping threats where they start – at the edge, and far away from the origin where they can cause the most damage.
“At a more macro level, this hyper-convergence will be influenced by everything from connected commercial engagements and the proliferation of consumer and industrial IoT, to the pandemic that forced every aspect of society to deal with work and life remotely, and finally the emergence of new technologies like cryptocurrencies and NFTs.”
Bringing this topic to a close, Dragatsis observes that “Enterprise requirements will be best served by a true hybrid cloud: software-defined infrastructure across storage, compute, networking and virtualisation, with intrinsic security across all the layers that applications and data consume.
“As that comes together there is certainly a confluence of technologies, because the concept of the one-size-fits-all, big bang technology transformation is out-dated and doesn’t serve real-world business needs.”