My Dubai eGovernment Cloud/Big Data Speeches: A Failure to Communicate

I just returned from delivering two speeches in Dubai at the annual Datamatix eGovernment conference – and man, did I miss the mark. I was asked to speak about data center operational issues and Big Data – and neither of my presentations resonated. In the twenty-two years that I’ve been traveling to this region, I’ve never experienced such a disconnect between my topics and my audience – but I had a long plane ride (it’s about 22 hours of flight time between my home in South Carolina and Dubai, UAE) to ponder what went wrong. This report explains what happened – and how I’m going to remedy it in time for my trip back in October to the region’s largest tech event (Gitex).

My Data Center Speech

My host and friend, Ali Kamali of Datamatix, asked me to speak about data center operational issues – more specifically, about application integration/optimization, legacy systems, migration, cloud deployment, risk management/disaster recovery, security, storage and data center design. In other words, pretty much everything that goes on in a data center from an architectural and management perspective. As I pondered how to create a cohesive presentation, I gravitated toward using a case study example of a facility that does all of these activities – and performs them more efficiently and effectively than any other data center I’ve ever visited: Companion Data Services’ (CDS) “Companion Cloud”.

What is so special about the Companion Cloud? First and foremost, it is a hyper-efficient multi-platform environment with extremely well defined process management. The multi-platform element enables Companion Cloud managers to choose the right type of system to execute any given workload, resulting in better overall system performance and significantly lower costs. By my estimates, enterprises that mimic Companion’s environment could lower their cloud computing costs by 30-40%, while vastly improving service levels (especially security). Well defined processes also mean that people know exactly what their roles and responsibilities are – and this leads to greater efficiency in application development, business process flow and overall information systems management.

I started my speech with a question that I’ve asked many times at GCC eGovernment conferences: “How many of you have virtualized and provisioned your computing environments?” I should have known I was in trouble when zero hands went up (out of one hundred and fifty attendees!). I’ve asked this question at past speeches and usually about ten percent of the audience raises their hands – but this time, zero, zip, nada…

I remember thinking “Uh-oh”, because my speech was designed for advanced systems managers who I presumed had some familiarity with cloud operations. But nope. Not a one.

Fortunately I had some high-level slides in my presentation, so I initially focused on what cloud architecture is – and what kinds of benefits it delivers. I explained that by architecting a multi-platform cloud environment, attendees could save big money (as stated previously, as much as 30-40% of their cloud computing acquisition and operating costs). I also showed how Companion Cloud architects were able to dramatically improve application delivery cycles (using a proven, rigid application development methodology combined with pre-packaged cloud services to aid infrastructure integration). Further, I showed how Companion Cloud architects had been able to greatly decrease testing/quality assurance/deployment cycles (using a process CDS refers to as “the conveyor belt”).

I also described how CDS architects are able to increase overall systems, infrastructure, application and database availability (decreasing downtime while speeding up problem resolution) using various methodologies and tools. At this juncture I realized that the audience was having difficulty understanding how cloud computing architecture could deliver all of these benefits. So I tried to switch gears and explain the concepts of virtualization, provisioning and utilization. Without visual aids for those basics, however, even my high-level presentation was missing the mark. (Note: translation devices were available – so language was not the issue.)

After explaining cloud architecture basics, I continued to cover the topics that I had been asked to address. I explained how the Companion Cloud design leads to improved cross-platform integration by breaking down virtualization silos (enabling VMware, Microsoft Hyper-V, mainframe virtualization and RISC virtualization to all work together in an integrated fashion). I also described how the Companion Cloud can be integrated with external public clouds (such as Amazon, Google, etc.).

I could tell that there was some interest in the public cloud discussion because several people in the audience were taking notes – so I dwelled a bit on the concept of hybrid cloud computing. I then went on to explain how the Companion Cloud can greatly lower systems, storage, network, application, database and infrastructure management costs by automating operations management wherever possible, and how it can strengthen internal and external security (using advanced systems and software combined with well-defined security procedures). But by the time I wrapped things up, I felt that I was giving a graduate-level course to a group of bewildered underclassmen.

The Big Data Speech

When I took the stage the following day to present my thoughts on Big Data, I had again prepared a low-level, technology-oriented presentation for systems/database managers. My host had asked me to speak on data sharing in the age of smart cities and governments; on Big Data analytics from a government perspective; and on potential government data and information sharing strategies. My host had also asked me to discuss why most government organizations don’t share their data – but I chose not to address this topic because I considered it a bottomless pit (it would lead to endless discussions of obstacles that prevent change and that block the deployment of new technologies).

I started my speech with a discussion of how various governments are using Big Data (including a list of 82 Big Data government projects that are currently underway in the United States). But I could soon tell that this discussion was going nowhere. Why? Because GCC governments do not provide vast budgets for projects like advanced scientific computing research, space and nuclear research, biology and science research, and others that can profit from Big Data solutions. There is interest in visual intelligence (such as video and image retrieval and analysis for surveillance and reconnaissance purposes, as well as for crowd analysis and control) – but I struggled to find many other Big Data projects that interested the audience.

Having failed to identify Big Data projects and subjects relevant to my audience, I embarked on a new tack – a discussion about how to build highly efficient Big Data environments. I described the differences between mainframes (which can now do near real-time analysis on transactional data); RISC architectures (such as POWER8-based servers that, thanks to their ability to process eight threads per core as compared with two for x86 competitors, can process at least twice as much data per clock cycle); and x86-based servers, which can be used to efficiently process Hadoop Big Data workloads.
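This wasn’t on my slides, but for readers who want a concrete picture of the kind of scale-out processing that makes commodity x86 clusters a good fit for Hadoop, here is a minimal, self-contained Python sketch of the map/shuffle/reduce pattern. The tab-delimited input format and the choice of the first field as the grouping key are assumptions for illustration only.

```python
#!/usr/bin/env python3
"""Minimal local simulation of a Hadoop-style map/shuffle/reduce pass.

Counts records per key from tab-delimited input on stdin. Illustrative only:
in a real deployment the map and reduce steps would run as separate tasks
spread across the x86 nodes of a cluster.
"""
import sys
from collections import Counter


def map_records(lines):
    # Map step: emit (key, 1) for every record; the key is assumed to be
    # the first tab-delimited field (e.g., an agency or service name).
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            yield fields[0], 1


def reduce_counts(pairs):
    # Reduce step: sum the counts for each key (the "shuffle" grouping is
    # implicit here because Counter groups identical keys for us).
    totals = Counter()
    for key, value in pairs:
        totals[key] += value
    return totals


if __name__ == "__main__":
    for key, total in sorted(reduce_counts(map_records(sys.stdin)).items()):
        print(f"{key}\t{total}")
```

In an actual Hadoop deployment the same map and reduce logic would be packaged as separate tasks and distributed across the cluster rather than run in a single local process – which is exactly the workload pattern that inexpensive x86 servers handle well.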

I further explained how Big Data server architecture is changing with the addition of different processor types in system designs (such as graphics processing units [GPUs] and field programmable gate arrays [FPGAs]). And I described how vendors were building specific system designs to process particular types of data (systems like Oracle’s Exalogic and Exadata, and various distinct models of IBM’s PureSystems offerings). I also discussed the arrival of new technologies that vastly speed up data processing (such as IBM’s BLU Acceleration technology).

I then turned my discussion to how different types of analytics demand differing system designs in order to operate most efficiently. This discussion took a close look at: 1) deep analytics; 2) operational analytics; 3) pre-defined reporting; 4) ad-hoc queries; 5) on-line analytical processing (OLAP); and 6) advanced analytics. And just like the previous day, I could tell that I was missing the mark by reading the faces of my audience.
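For readers unfamiliar with these distinctions, here is a small Python/pandas sketch (not taken from my presentation) contrasting a pre-defined report with an ad-hoc, OLAP-style slice over invented eGovernment service-request data; the column names and figures are made up for the example.

```python
import pandas as pd

# Invented sample data: monthly service-request volumes, purely illustrative.
df = pd.DataFrame({
    "department": ["Health", "Health", "Transport", "Transport"],
    "channel":    ["portal", "mobile", "portal",    "mobile"],
    "month":      ["Jan",    "Jan",    "Feb",       "Feb"],
    "requests":   [1200,     800,      950,         400],
})

# Pre-defined report: a fixed aggregation that runs the same way every period.
monthly_totals = df.groupby("month", as_index=False)["requests"].sum()

# Ad-hoc / OLAP-style query: slice and pivot along whatever dimensions the
# analyst chooses at run time -- a very different access pattern for the
# underlying system to optimize.
by_dept_channel = pd.pivot_table(
    df, values="requests", index="department", columns="channel", aggfunc="sum"
)

print(monthly_totals)
print(by_dept_channel)
```

The pre-defined report can be tuned and cached ahead of time; the ad-hoc pivot is shaped at query time – one reason the two workloads reward different system designs.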

At the conclusion of this speech my host took the stage and started an interactive discussion with the audience – and the members who got involved grew quite animated. Some complained that managers with little IT experience had been put in charge of major IT initiatives such as cloud computing and Big Data analytics. They also described other organizational issues and obstacles: how developers needed permission to innovate, and how most GCC IT managers are overprotective of their data – and resist change. When it comes to innovation, GCC eGovernments are very hierarchical – so workers constantly seek direction from management. I wanted to say that it is sometimes better to ask forgiveness than to seek permission, but I didn’t want to start a discussion on cultural differences between the U.S. approach to information systems and business and the GCC approach (GCC IT executives tend to gravitate toward hierarchical management, whereas their counterparts elsewhere do so less) – discussing cultural differences wouldn’t resolve anything.

Summary Observations

As I reflected on these two speeches on my flight back home, I realized that the way to teach this audience about cloud computing and Big Data is to find regional case studies that show how change was accomplished and how positive results were achieved. The active audience response I saw during the post-Big Data Q&A suggests that they are thirsting for working models they can replicate. Case studies could provide them with the proof they can take to their managers, showing that certain technologies and investments can deliver improved results at lower cost than the approaches currently in use.

During my cloud presentation on Day 1, I mentioned that there were few examples of highly-virtualized cloud environments to be found in the region. However, one of the attendees, the dean of IT and eLearning at the University of Hali in Saudi Arabia, asked to speak about his cloud environment. He described how the university consolidated its large, distributed x86 server environment onto a single VCE Vblock converged system – along with the resulting savings and efficiencies that the university was seeing.

I was asked to rank the University of Hali’s environment on a scale of 1-10 (with 10 being the highest in terms of value and innovation) – and I gave it a 10 (this was the first virtualization environment case study that I had ever seen out of the region). The dean’s comments resonated with the attendees – and showed me how I can better reach my future GCC eGovernment audiences.

So when I head to Gitex in October, I’ll be prepared with case study examples of successful cloud and Big Data deployments within GCC countries. And next time, I will hit my mark.