By Joe Clabby, Clabby Analytics
IBM Edge 2016, held in Las Vegas last week, focused on three themes: 1) cognitive solutions – such as the cognitive/Big Data/advanced analytics technologies that have worked their way into various IBM products; 2) cloud architectures, services and platforms (especially the seamless blending of public and private clouds); and 3) industry innovation (such as the progress being made with Blockchain protocols as they relate to the open Hyperledger standard).
The Biggest News – Industry innovation: Blockchain and the Hyperledger standard
At breakfast on the first morning after my arrival at the conference, I sat next to an IBMer who asked, “Hey, what do you think of the Blockchain news?” I told him that Blockchain, the architecture that underlies Bitcoin currency exchange, “wasn’t even on my radar screen.” He said, “Wait till the end of the conference and let’s see what you have to say then.” He was right…
Here’s what I now have to say about the Blockchain protocol and related standards: “Blockchain, as manifest in the evolving Hyperledger standard, represents one of the most momentous changes to business process flow, transaction processing and accountability that I have ever seen – and I’ve been in the computing industry for almost forty years!”
Over my career I’ve witnessed the rise of mainframe computing; the arrival of distributed computing; the arrival of the personal computer; the standardization of networking protocols and the rise of the Internet; the move toward business process reengineering and the standardization of business applications (ERP, CRM, SCM, …); the rise of Big Data analytics; the arrival of cognitive computing, and much more. But with the arrival of the Blockchain protocol, I see a new wave of transaction processing that has the potential to greatly increase the security of transactions while eliminating process delays and human (middleman) overhead. When it comes to next generation transaction processing, I rate Blockchain and associated process flow and security activities as being as important to the future of conducting business as the Internet has been to collaboration and knowledge sharing.
For those not familiar with the technology, I suggest that you read this article by the Brookings Institution: The Blockchain: What It Is and Why It Matters. The Brookings Institution states that “the elegance of the Blockchain is that it obviates the need for a central authority [such as a bank] to verify trust and the transfer of value. It transfers power and control from large entities to the many, enabling safe, fast, cheaper transactions despite the fact that we may not know the entities we are dealing with.”
In other words, it changes the way transactions will be handled in the future: rather than requiring a centralized authority to handle every step of a transaction, all the elements needed to conduct it (such as contractual and payment information) become locked into a shared public record – a series of cryptographically linked steps known as the Blockchain. Using this approach, computers can verify the validity of each transaction and create an immutable (untamperable, if there were such a word) record of a transaction’s flow. Say goodbye to bills of lading; to courier services; to other middlemen; to various contractual services; and to other time- and cost-consuming overhead (such as bank payment delays) related to processing today’s transactions.
And, as a result, expect tremendously lower transaction processing costs.
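The immutability described above comes from cryptographic chaining: each block carries the hash of the block before it, so altering any past record invalidates every link that follows. Here is a minimal, hypothetical Python sketch of that idea – an illustration only, not Bitcoin’s or Hyperledger’s actual implementation:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents (including the previous block's hash)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, transaction: dict) -> None:
    """Link a new transaction to the chain via the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"transaction": transaction, "prev_hash": prev_hash}
    block["hash"] = block_hash({"transaction": transaction, "prev_hash": prev_hash})
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash; any tampering breaks the links that follow it."""
    prev_hash = "0" * 64
    for block in chain:
        expected = block_hash({"transaction": block["transaction"],
                               "prev_hash": block["prev_hash"]})
        if block["prev_hash"] != prev_hash or block["hash"] != expected:
            return False
        prev_hash = block["hash"]
    return True

chain = []
append_block(chain, {"from": "alice", "to": "bob", "amount": 10})
append_block(chain, {"from": "bob", "to": "carol", "amount": 4})
print(verify(chain))                        # True: the untouched chain validates
chain[0]["transaction"]["amount"] = 1000    # a "middleman" tries to alter history
print(verify(chain))                        # False: every later link now fails
```

Because any participant can rerun this verification independently, no central authority is needed to vouch for the record – which is the property the Brookings article highlights.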
Why so? The Blockchain protocol itself is only a method through which transactions can be processed. Numerous other elements are involved in streamlining Blockchain transaction flows, including consensus algorithms, various storage models, and services that help establish identity and trust, that control and regulate access, and that establish the rules that govern a transaction (a contract). To incorporate these services, the Linux Foundation now hosts a project known as Hyperledger that adds the needed services (such as smart contracts) to support Blockchain-based transactions.
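The “smart contract” services mentioned above can be thought of as transaction rules expressed as code that every participant runs independently before accepting a transaction. The following is a hypothetical Python sketch of that concept – not Hyperledger’s actual chaincode API, and the escrow scenario, field names and function are invented for illustration:

```python
# A smart contract here is simply a rule, expressed as code, that every
# participant can evaluate independently to validate a proposed transaction.
def escrow_contract(tx: dict, ledger_state: dict) -> bool:
    """Release payment only if the goods are delivered and funds are in escrow."""
    shipment = ledger_state.get(tx["shipment_id"], {})
    return (shipment.get("status") == "delivered"
            and tx["amount"] <= shipment.get("escrow", 0))

# Current shared ledger state: one shipment, delivered, with $500 in escrow.
state = {"SH-1": {"status": "delivered", "escrow": 500}}

print(escrow_contract({"shipment_id": "SH-1", "amount": 400}, state))  # True
print(escrow_contract({"shipment_id": "SH-1", "amount": 900}, state))  # False: exceeds escrow
```

Because the rule itself lives in the shared ledger and executes deterministically everywhere, no bank, courier or contractual middleman is needed to enforce it – which is where the cost savings come from.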
As is my nature as a systems and infrastructure analyst, I naturally considered what kind of systems environment I might run Hyperledger applications on. My first thought was about security – how to protect transactions from internal fraud as well as cybercriminals. As it happens, we’re just about to release a report on IBM’s LinuxONE architecture – an architecture that offers the most securable computing environment in the industry, with strong cryptographic support, solid access control, a huge emphasis on privacy, an immense input/output communications subsystem, and tons of computing capacity.
My immediate reaction to Hyperledger was that large enterprises with strong security requirements should first and foremost consider adopting Hyperledger on LinuxONE servers. The next day I learned that IBM’s own beta-Hyperledger environment had gone into production this month – and IBM had selected LinuxONE as its platform of choice for production-grade Hyperledger. I talked with representatives of IBM’s Power Systems group and found that organization had other priorities at this time – so I don’t expect much action on Hyperledger on Power Systems in the near term.
Further, my assessment of Intel servers is that they can indeed run Hyperledger – but I also know that Intel servers would require extensive reengineering to match the security, scalability and communication subsystem advantages of IBM LinuxONE servers. Accordingly, when Hyperledger gains greater acceptance across various industries, I expect LinuxONE sales to escalate.
Cognitive solutions: systems management
For five years Clabby Analytics has been tracking IBM’s progress in automating systems management, in overlaying systems management with machine-driven analytics, and in streamlining application performance management. So, at the Edge conference, IBM did not have to convince us that it was aggressively building cognitive solutions to manage its hardware offerings.
Next week we plan to publish a report entitled “Building a Mainframe IT Service Management Plan to Deliver Operational Excellence” that describes how IBM is infusing several of its systems/storage/network management products with analytics facilities. In this Clabby Analytics Research Report we state that “We see only one vendor (IBM) that can provide a comprehensive systems operations management suite that can efficiently monitor and manage applications, databases and systems by using intuitive, cognitive and analytics driven software.”
We note that IBM “has taken a clear leadership position in using system intelligence and analytics to solve operational and performance issues. Further, over the past few years we’ve seen the company restructure its operations management products into cost-efficient, easily consumable suites – and offer selected management capabilities through cloud services.” And we note “that IBM has also simplified its pricing and license management practices – helping to make its operations management solutions more affordable.”
Based upon our research, we would strongly recommend that information technology (IT) operations managers consider building a holistic plan to manage their systems/storage/network/cloud environments. That plan should include integrated monitoring and management solutions as well as cognitive tools that simplify management and lower the skill levels needed to troubleshoot and tune systems (thus reducing administrative costs).
Cloud architecture and platforms
I’ve never been a big fan of cloud architecture, though my fellow Clabby Analytics researcher, Jane, is. To me, the idea of virtualizing resources and delivering computing functions as services has always seemed “old hat” (I remember when IBM introduced MSNF to virtualize mainframe facilities as far back as the 1970s). But although I find cloud plumbing to be boring, I’m strongly encouraged by the number of cloud services and service delivery models that I’ve been seeing at IBM events such as InterConnect – and now, Edge.
In the past, IBM largely focused on on-premises software licensing. But now, more and more of its products are becoming service offerings – available for deployment on-premises, or available as services from IBM and/or managed service providers. What I saw at Edge was clear progress in opening up access to IBM services using application program interfaces (APIs), with a major emphasis on building gateways between public and private (hybrid) clouds, as well as increased emphasis on DevOps tools (much of IBM’s software development is now done using agile development methods).
As an example of a new cloud services offering, consider IBM’s z Operational Insights, which just became generally available. This offering is designed to provide insights into z Systems operational performance without the need to install any on-premises software. It represents a way to take advantage of IBM expertise to efficiently manage and tune a given enterprise’s mainframe environment at lower cost – and to anonymously benchmark that environment against z Systems peers to see how efficient a given mainframe really is (data is anonymized and compared with other installations). When mainframe users express concerns about where the next generation of mainframe managers is going to come from, I steer them to products like this, which make it possible either to own management products on premises or to obtain additional support from external trained professionals through a secure cloud.
One of our key research agenda items at Edge was to investigate why storage hardware sales at the traditional storage leaders have been declining over the past few years. Jane Clabby’s write-up in this week’s Pund-IT Review describes what she learned at Edge regarding storage. What I learned was that storage has been in flux – teetering between traditional HDD and solid-state Flash solutions; that new Big Data-intensive applications are driving analytics workloads to differing types of storage solutions; and that software-defined storage represents the future growth opportunity for traditional storage vendors. This topic is of great interest to us – so, accordingly, expect a more detailed report on our storage market findings in late October.
A few years ago, when we started looking at large in-memory databases, we pointed out that IBM’s Power Systems would make an ideal host for SAP’s HANA environment. Our reasoning: a POWER8 microprocessor can process 8 threads per core to Intel’s 2 – and Power Systems also offer larger memory capacity and a faster communications subsystem (with the ability to give solid-state storage fast access to the CPU using the IBM CAPI interface). At Edge we were delighted to learn that Power Systems-based HANA solutions are performing at least twice as fast as similar Intel systems – and we were told that SAP is extremely pleased with HANA on Power Systems performance.
Also worthy of note in the Power Systems group is the release of the oh-so-easy-to-remember-named “S822LC for HPC.” This is a server that employs POWER processors as well as NVIDIA graphics processors to speed the processing of compute- and parallel-intensive workloads. Newly released, hundreds of these servers are already on order. And given that a lot of analytics workloads now resemble high-performance computing (HPC) workloads, we’re expecting S822LC for HPC to be very well received in the now very broad HPC marketplace (high performance computing now spans many industries, including the medical, financial and scientific communities). IBM’s Power Systems sales representatives are actively seeking to find a home for the new S822LC for HPC solution in genomics, defense, financial services, fraud detection, market intelligence and other data-, compute- and parallel-computing markets.
Finally, the OpenPOWER Foundation reported great progress in membership increases – and hinted that some important new industry relationships may be in the offing. I’ll track and report on these new members when they are officially announced. The OpenPOWER Foundation also proudly discussed a number of new POWER microprocessor-based products that have made their way to market over the past year (I covered many of these in announcements in my blog on the OpenPOWER Summit).
I think the key messages that IBM wanted to deliver at Edge were “Partner with us to modernize and transform your computing environment. Let us help you maximize systemic workload performance across traditional data center and new analytics environments leveraging cloud architecture.”
From what I saw of advanced systems designs, of cognitive management products, and of APIs that transparently enable the union of public and private cloud environments, I am convinced that IBM can, today, already meet its goal of helping its customers transform their existing traditional data centers into the highly integrated, highly efficient hybrid cloud-based compute environments of the future.
As for innovation, the many advances that I saw in IBM hardware, software and cognitive computing were overshadowed by the stunning work that the company has done in conjunction with other vendors and enterprises as part of the Hyperledger project. Hyperledger will have a momentous impact on IBM, on small, medium and large businesses – and across the entire computing industry. We at Clabby Analytics plan to track this technology very closely in the future. Blockchain and Hyperledger are definitely on our radar screen now.