Techonomy and Celestica recently co-hosted an event in Silicon Valley that brought together a diverse group of tech enthusiasts and industry insiders. A key theme was that while software remains a top focus in the tech industry, the role of hardware is not to be forgotten. This theme reaffirmed what I have been hearing in other conversations with industry experts.
The first hard disk drive weighed over a ton and held five megabytes of data—enough to store a single MP3. In the sixty years since then, the advances made in both hardware and software have been profound. In 2016 alone, the world generated 16.1 zettabytes of data.
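To put that growth in perspective, a quick back-of-the-envelope calculation (assuming decimal units, and taking the 1956 IBM RAMAC as that first five-megabyte drive) shows how many such drives one year of 2016 data would fill:

```python
# Rough scale comparison: first hard disk drive vs. global data in 2016.
# Assumes decimal (SI) units: 1 MB = 10**6 bytes, 1 ZB = 10**21 bytes.
ramac_bytes = 5 * 10**6             # ~5 megabytes on the 1956 drive
world_2016_bytes = 16.1 * 10**21    # 16.1 zettabytes generated in 2016

ratio = world_2016_bytes / ramac_bytes
print(f"Equivalent first-generation drives: {ratio:.2e}")  # ~3.22e+15
```

In other words, storing one year of the world's data on 1956-era hardware would take on the order of quadrillions of one-ton drives.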
Throughout most of this transformation, software and hardware improved together, the evolution of one driving innovation in the other. However, the arrival of disaggregated hardware and software has fundamentally disrupted this relationship. Today, enterprises have access to the hardware innovation cycle by leveraging IT as a service (ITaaS) through the cloud; platforms and infrastructure that were financially out of reach in the past are now available to all. This has enabled companies to adopt software at greater levels and in different ways than ever before, driving rapid innovation and evolution in software solutions.
The rise of software-defined solutions
In the past few years, the growth of ITaaS, cloud computing and big data has put the emphasis on the software enabling these trends. At the same time, the disaggregation of hardware and software and the virtualization of technologies across the data center gave companies a new way to optimize hardware and software performance and capacity throughout the data center.
As a result, software became a particular focus of the enterprise. With investments in software-defined solutions, enterprise IT deployments sought to maximize performance from existing hardware and lower capital investments by standardizing on simpler hardware. The impact of this trend is evident in the growth of the enterprise software market: Forrester estimates that global business spending on software will exceed $462 billion by 2021.
While innovation in software is radically redefining the role of hardware in the data center, it’s becoming clear that off-the-shelf, standard hardware is not optimized for the increasingly sophisticated software and applications of the enterprise, and is limiting the potential for improvements in efficiency and performance.
The re-emergence of quality hardware
The enterprise is becoming increasingly demanding—seeking higher levels of security and automation, higher-quality IoT devices, and better connectivity. According to Gartner, hardware spending on connected things among businesses will reach an estimated $964 billion in 2017. By 2020, hardware spending in the enterprise segment is expected to reach almost $1.5 trillion.
With so much at stake, customized, high-quality hardware is experiencing something of a renaissance. While advanced software is able to optimize existing hardware stacks, advanced hardware is becoming necessary to optimize software potential.
Think about what would happen if you tried to run today’s operating systems on a computer built in 2001. It would work, for the most part, but it wouldn’t be as fast, as efficient or as capable as if you were running it on a 2017 machine.
In our everyday lives, we take it for granted that our online banking interfaces with a secure data center, delivering uncompromising performance and security. The AI assistants in our cars and homes seem to present an effortlessly simple software interface, but they must be underpinned by ultra-fast, reliable hardware in the data center.
Take this concept one step further into future applications, such as blockchain solutions. In the agriculture market, for example, sensors are being deployed throughout farmers’ fields, shipping containers and grocery stores, collecting and storing data from every step of the process. This massive data collection and processing relies on software applications that are matched with hardware designed to optimize performance in specific environments.
Without innovations in hardware, advances in software are only going to get us so far. Advancing and customizing the hardware for each application environment, such as adding cores to processors or adding more sophisticated memory to machines, will help planes adapt more deftly, factories become more automated, networks relay information more efficiently, and data centers perform search requests, recognize images, and process video faster.
Innovation in hardware
While software enjoys its moment in the spotlight, there have been innovations in hardware that are also attracting much interest. For example, increasing speed and density requirements and rapid growth in available bandwidth cannot be conveniently met by traditional pluggable optics. Alternative technologies such as onboard optics (OBO) are delivering significant performance improvements in the data center environment.
Meanwhile, NVMe flash storage is enabling cutting-edge developments in many I/O-intensive applications, such as high-transaction-rate and computationally intensive financial applications. The NVMe standard eliminates much of the protocol latency associated with traditional hard disk drives and legacy storage interfaces. A prime example of hardware innovation is the emergence of NVMe over Fabrics, which combines technologies such as data offloading, hardware acceleration and remote direct memory access (RDMA) to create a shared pool of storage across the entire data center network.
It’s now clear that quality hardware and advanced software are living within the same ecosystem once again. The advancements in each are inspiring innovations in the other.
Jason Phillips is Senior Vice President of Enterprise Solutions at Celestica