Last week, Clabby Analytics (my wife Jane Clabby and I) attended IBM’s InterConnect 2015 in Las Vegas. At industry conferences we each go our separate ways (we have separate research agendas) – so this blog represents a summary of my findings.
Thoughts on the InterConnect 2015 Conference in General
To deliver its strategic and product messages more efficiently – while containing costs – IBM is scaling back the number of IBM-focused trade shows it runs each year, while scaling up attendance:
- IBM’s InterConnect 2015 focuses on cloud and mobile technologies – and represents the combination of three prior annual conferences (IBM’s Impact, Innovate and Pulse events) into a single event. Meanwhile, attendance has risen from a cumulative total of about 18,000 people across the separate events to over 21,000 at this year’s combined event.
- In October, IBM will run its Insight 2015 conference (formerly Information on Demand), which focuses on business analytics, enterprise content management and data management. Customer and business partner events will run before and during the conference – and attendance should rise from about 12,000 people last year to between 16,000 and 20,000 this year.

Having attended four IBM conferences in Las Vegas in the past five months, I’m personally thankful that the company is cutting back on the number of major events it runs each year. Attending fewer shows means that I spend less time traveling and less money on other expenses – and it also means less duplication and redundancy in messaging. Attending fewer events also gives me more time to focus on my own research agenda. So I’m wholly supportive of IBM’s conference strategy.

As for the messaging at each conference, I’d suggest that attendees look very closely at each respective agenda, because these conferences cover far more than just cloud/mobile and analytics/content/data management. Each conference also contains separate keynotes and separate tracks on other key initiatives. For instance, IBM’s InterConnect conference focuses heavily, but not exclusively, on cloud computing and mobile solutions. Yes, there were keynotes on cloud and mobile – but there were also keynotes on DevOps, infrastructure, application integration, security, business process and decision management, Big Data and analytics, and cognitive computing.

The bottom line on these mega-conferences is this: attendees can pursue multiple research agendas across a broad spectrum of solutions. Just because one of these conferences says it is about analytics doesn’t mean that it is exclusively about analytics. A wealth of other solutions can also be found at each of these IBM mega-conferences.
I now view these conferences as checkpoints: I can get strategic briefings and product updates across all of IBM’s focus markets by attending twice a year.
My Own Research Agenda
At Clabby Analytics, my primary focus is IT infrastructure – which includes platforms, systems design, storage, middleware, and infrastructure management. I also have two secondary research foci: analytics and security. Jane, on the other hand, focuses heavily on cloud, application performance management, and DevOps; her secondary research focus is the Internet of Things (sensor data that feeds analytics engines). So, while I do want to know what is going on in mobile computing and the cloud, I also seek out information pertaining to infrastructure and analytics. The good news is that plenty of information on infrastructure and analytics can easily be found at these events – in customer sessions, in keynotes – and particularly on the exposition floor.
At InterConnect 2015 I spent most of my time in keynotes or on the exposition floor. Here’s how I spent my time at this year’s event:
- The first thing I do in an exposition center is walk the entire floor looking for the most crowded booths. This tells me which topics are hottest for attendees. As expected, the mobile and cloud booths were crowded – but the most consistently crowded booth was security. No matter when I went to the security booth, all of its stations were active with customers looking at demos and seeking strategic and product information. I found this really curious – most people in the information technology market know very little about security; it is a heavily specialized field. Yet attendees ranged from developers to line-of-business executives – and everyone in between. What this indicates to me is that there is a stronger awareness of security risks – and people from all around the enterprise want to know more about the subject, how solutions are structured, and how to employ them to reduce risk. In other words, security is no longer just a specialist concern – it’s becoming an enterprise-wide, cross-organizational concern.
- The next thing I do after walking the expo floor is focus on infrastructure suppliers. I visit the booths of hardware suppliers, application development suppliers, middleware suppliers, and independent software vendors. What I’m looking for are new products as well as information on evolving technologies. Here’s what I found at the InterConnect 2015 expo:
- Intel’s booth included demonstrations on how the company’s components are being used in Big Data environments, in storage, in networking, and more. But I also managed to connect with Intel product people who told me about the company’s future microprocessor roadmap – particularly about what Intel is doing to accelerate processing (this is paramount on my research agenda). Now I’ve had a rocky road with Intel over the past ten years (I criticized Intel for its slow response moving x86 from 32-bit to 64-bit architecture, for its AMD-related business practices, and for its miserable Itanium roll-out and subsequent development) – but ever since Intel moved to multi-core processors I’ve become more tolerant. Now, with some of the activities that are taking place with Intel “off-load engines”, I may become increasingly positive. We’ll see…
- Rocket Software was demonstrating a new analytics environment – but I actually spent a lot of time with one of Rocket’s key mainframe developers. It seems that Rocket read one of my reports last year on why mainframers should stop transferring data to distributed systems (The ETL Problem) – and really liked it. Rocket suggested that with IBM’s new z Systems mainframe, the z13 (which has the ability to do in-line transaction processing), and with IBM’s improved IDAA architecture (a complex-query off-load sidecar that attaches to the mainframe), a case could be made that data (particularly data in an Oracle database) should be sent to the mainframe for processing. I’m intrigued – and may jointly blog with Rocket on this topic.
- I stopped by Arrow’s booth (a technical education organization) – and learned that they are seeing a surge in demand for mainframe training (their boot camps are flourishing). I found this particularly interesting given that many of my fellow IT research analysts are telling customers to move off of the mainframe due to an allegedly pending skills shortage. I guess those analysts’ customers just aren’t listening – because it looks to me like mainframe shops are continuing to invest in everything from COBOL fundamentals to zLinux while building new z13 system administration skills.
- There was a ton of activity taking place in the BlueMix (application component development) and Docker (containers) booths. Jane covers BlueMix, so I’ll defer to her impressions of this product. But I cover application infrastructure – and Docker is becoming an important means to create portable, lightweight application runtimes that can be packaged and easily deployed as cloud services. Further, workflows can easily be integrated with Docker. My visit showed me that I need to write a research report on software containers – which I will probably do over the next few months.
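To make the container idea concrete, here is a minimal, purely illustrative Dockerfile that packages a small Python web service as a portable image – the file names and base image are my own hypothetical sketch, not any IBM or Docker product example:

```dockerfile
# Hypothetical sketch: package a small Python web service into an image.
# Base image, app.py, and the port are invented for illustration.
FROM python:3
WORKDIR /app
COPY app.py .
EXPOSE 8080
CMD ["python", "app.py"]
```

Built once with `docker build`, the resulting image carries its runtime dependencies with it – which is exactly why the same packaged application can be deployed unchanged on a laptop, an on-premises server, or a cloud service.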
- Likewise, several vendors were displaying software that can automatically integrate applications with back-end data using a wide range of application program interfaces (APIs). I’ve been covering APIs for three decades (starting way back in the NetBIOS and CPI-C days) – but there is a distinct change currently happening in this market. Clouds are making it easier to link into a wide variety of APIs – making it simpler to develop applications that exploit a wide variety of back-end data. This, in turn, is making it far easier to link back-end data with front-end mobile devices – creating a movement known as the API economy. This trend may also deserve more of my attention in the coming months.
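The pattern behind the API economy can be sketched in a few lines: a back-end data source exposes a simple JSON API, and a front end (mobile or otherwise) fetches records over HTTP. Everything below – the `CustomerAPI` class, the `/customers/...` path, and the record fields – is a hypothetical toy example of mine, not any vendor’s actual interface:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class CustomerAPI(BaseHTTPRequestHandler):
    """Hypothetical back end: serves customer records as JSON."""
    RECORDS = {"42": {"id": "42", "name": "Acme Corp", "tier": "gold"}}

    def do_GET(self):
        # e.g. GET /customers/42 -> one JSON record,
        # the kind of call a mobile front end would make.
        key = self.path.rsplit("/", 1)[-1]
        record = self.RECORDS.get(key)
        body = json.dumps(record if record else {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

# Start the "back end" on an ephemeral local port.
server = HTTPServer(("127.0.0.1", 0), CustomerAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "front end": fetch a record through the API.
url = f"http://127.0.0.1:{server.server_port}/customers/42"
customer = json.loads(urlopen(url).read())
print(customer["name"])  # prints "Acme Corp"
server.shutdown()
```

The point of the sketch is the decoupling: the front end knows only the URL and the JSON shape, so the same back-end data can feed web, mobile, or partner applications without change.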
- Finally, I visited the booth of a Watson service provider known as Cognitive Scale. Watson is IBM’s cognitive computing environment – a new type of “thinking” computer. IBM claims to have 4,000 Watson developers (internal and external), 100 clients and 160 partners across 24 countries and 17 industries, with 7 new APIs and support for 3 languages – demonstrating that this platform is progressing (the company has pledged a billion dollars to Watson development). As for Cognitive Scale, company executives indicated that they have an industry-focused Watson solution that is priced very attractively. From what I saw and heard, I was impressed – but I need to hand this over to Jane for further research (she tracks Watson much more closely than I do). I wouldn’t be surprised to see a Cognitive Scale report come out of Clabby Analytics in the near future based upon Jane’s findings.

In addition to casing the exposition floor, I also attended several keynotes and one press announcement. At InterConnect, IBM and Juniper Networks announced an initiative to provide real-time insights into network behavior (see this excellent analysis by Charles King of Pund-IT for more details). From my perspective, IBM and Juniper played up the major benefit of this solution – that communications service providers could gain a better understanding of their networks and thus be positioned to improve utilization of networks and related equipment, as well as improve quality-of-service for network users – as well they should.

To me, this announcement really means that Juniper and IBM are now delivering a solution for communications service providers (CSPs) that will enable CSPs to maximize their network use while offering customers more precise tiered services. As a result, customers will be able to pay for the guaranteed service levels they need – and CSPs will be able to easily prove that they are delivering those service levels.
For CSP customers who rely heavily on Internet performance (like me, for instance), this Juniper SDN solution may prove to be very important over time.
What intrigued me most about this announcement were three things: 1) the solution that Juniper has built is a software-defined networking (SDN) solution (a technology that hands more control of networked devices over to the buyer of Juniper switches); 2) it uses predictive analytics (the wave of the future when it comes to systems/storage/network management – as I have mentioned previously in several reports, such as this one); and 3) it is based on a technology that IBM purchased known as “The Now Factory”. Because I was one of the very few research analysts to cover The Now Factory acquisition in any depth, I knew that The Now Factory is a very special appliance that uses field-programmable gate arrays (FPGAs) and Intel processors to read network data at line speed (see this report).
Robert LeBlanc, the SVP for IBM Cloud, launched InterConnect 2015 with a speech on the company’s cloud computing efforts and strategy. I liked this keynote because it focused on the solutions that IBM now delivers – and it had zero discussion on cloud technologies. The rhythm introduced by LeBlanc was this: 1) here’s a solution that we deliver; 2) here’s what a customer has to say about it; and, 3) here is the business value derived by this solution. LeBlanc followed this rhythm throughout his keynote.
The reason I enjoyed LeBlanc’s keynote is historical. Several years ago IBM embarked on its cloud strategy by taking a bunch of technologies that it already had (many of which were in its Tivoli organization) and gluing them together to create the IBM SmartCloud. SmartCloud worked fine from a private cloud perspective – and IBM priced it aggressively – but it never took off. The problem with SmartCloud, I believe, was that the market wanted a fresh product written from the ground up for cloud computing – not one assembled from a variety of pre-existing virtualization, provisioning, and management offerings.
So, the way I see it, IBM’s top management gave SmartCloud a chance – and then decided that a new approach was needed that could accommodate bare metal provisioning (something very few other cloud vendors offered) and fully support public/private hybrid cloud services. And, for this reason, IBM purchased the fast-growth SoftLayer cloud organization. To me, this move positioned IBM to better serve the hybrid cloud marketplace and to grow quickly in this dynamic market.
Of interest, LeBlanc mentioned that he believes the cloud marketplace has entered a second phase (as do several technology research organizations) – and that, thanks to SoftLayer as well as various ancillary products, IBM is extremely well positioned for growth in the hybrid cloud marketplace.
One of the big problems that IBM has had in recent years has been breaking down silos within its own organization. In days gone by, IBM’s software organization operated separately from its hardware organization – and both operated separately from its services groups. A few years ago we wrote about how Steve Mills, IBM’s SVP for systems and software, had broken down some of these silos by forcing organizational changes that blended software and hardware development.
IBM’s last major reorganization started to focus the company on selling solutions and business outcomes first, and technology second. The company decided to organize around four solution areas (cloud, analytics, mobile and social) – and proceeded to rework sales and marketing to go after opportunities in those markets. The problem I had with this reorganization was that it separated sales and marketing from development – a move that misaligns a company. (In my early career I was a product manager who had no control over what development was building. I found this situation supremely frustrating given that customers and the sales organization were telling me specifically what they needed and wanted – but development was building what it, not customers, wanted. Needless to say, the development organization was particularly happy when I left that company – but it went under just a few years later and was eventually acquired by EMC.)
Over the past several months, IBM has been in the midst of another reorganization – one that makes a lot more sense to me. It is now organizing around a systems group (infrastructure and hardware – the on-premises stuff, IBM’s traditional data center business); it has introduced a cloud organization under LeBlanc to drive the new infrastructure, Platform-as-a-Service, mobile, and secure hybrid cloud designs of the future; and it has a solutions group (under John Kelly) to conduct research, drive Watson technologies, and create new solutions in chosen markets (like commerce, security, etc.).
This reorganization recognizes that the company has two markets to serve – its traditional market and the evolving cloud market – and it has given general managers the charter and budget to drive initiatives in each. Development is now better aligned to serve these markets. Further, individual initiatives (such as security, cloud, mobile and analytics) can be combined to build solutions for IBM’s markets. For instance, IBM can now more easily build, say, a cloud-enabled, mobile, secure application for accessing Big Data analytics environments. Each individual organization’s sales group receives compensation for its portion of the solution – so everybody’s happy…
I had actually considered not attending InterConnect 2015. IBM promoted the conference as the “premier cloud and mobile conference” – and since Jane Clabby covers cloud computing and I only sporadically cover mobile (see this mainframe/Apple blog as an example), it didn’t seem like a natural fit for me. But on closer scrutiny of the InterConnect agenda – and particularly the Solution EXPO exhibitor list – I decided to spring for plane fare to Las Vegas from Charleston, South Carolina, and spend a few days. I’m glad that I decided to go.
Probably the biggest change that I noticed at InterConnect was IBM’s strong focus on solutions and on customer testimonials. IBM wants to be sure that customers understand that the company offers specific solutions – and that those solutions deliver clear business value (whether that value be improved competitiveness, better performance, faster insights – or whatever). Thirty-five years ago, when I was taught how to sell information systems, one of the first lessons that I learned was to “sell solutions”. At InterConnect, IBM deemphasized the technological aspects of its solutions and instead strongly focused on the benefits that those solutions deliver. To my mind, this is exactly what IBM’s sales and marketing organizations should be doing.
For potential attendees of IBM’s Insight 2015 conference, I suggest that you take a close look at the agenda and the EXPO exhibitor list. You may find that the solutions you are looking for are more than adequately covered at that conference (even though it is nominally an analytics/content/data management conference). Also, I suggest that you plan on spending time at the many social events at these conferences – I have found that attendees are generally more than willing to share their insights in less structured, informal settings. These informal gatherings are an excellent means of understanding how various enterprises are actually deriving value from the solutions they have chosen.