August 31, 2013
This might be controversial.
Doug Hadden, VP Products
More and more governments are adopting data centre and application consolidation, from e-mail to ERP, with the promise of higher efficiency, lower costs, standardization and innovation. Some consider it a no-brainer.
Here’s the thing: it’s not really a question of savings. It’s the illusion that these advantages make logical sense (“truthiness”) because it appears that closing data centres and sharing software licenses will save money. And it all fits the narrative that governments are inefficient. It can be sold to the public.
It delivers short-term political gain by showing government doing something.
And when the inevitable happens and the benefits do not accrue, the attention span of the press has usually been exceeded. Or the previous government can be blamed. Or, more likely, the blame falls on public servants.
Why do government CIOs agree to this?
Government CIOs are political. The private sector beckons. And there is career advantage in being seen to overcome big challenges. That’s not to say that government CIOs are bad people. They aren’t. But incentives are important to understand.
There are also significant risk concerns in government: a preference for the tried and true over thinking differently. That means buying software, hardware and services from large companies, because no one ever got fired for buying [fill in the blank].
This holds even when the tried and true has been proven risky. It’s conventional thinking, otherwise known as political thinking.
Is this a cynical view?
Perhaps. Consider the shared services promise:
- All government organizations will run standard processes – from military to health to education to small organizations of fewer than 50 people responsible for a small initiative. Good luck with that.
- Innovation through these standard processes because, apparently, nothing says innovation like the lowest common denominator.
- Consolidated data centres will reduce costs, despite the need to run all sorts of specialized software; support elasticity, performance and fault tolerance (through remote data centres in large countries); and manage system administration changes, virus protection and intrusion detection. And it’s not as if government departments and agencies are likely to accept lower SLAs for shared services than they experience with on-premises data centres.
- Lower costs through virtualization, even though all government organizations share the same fiscal year and fiscal periods, putting significant burden on computing power at the very same time. (At least the major public cloud services pool organizations with different fiscal periods, plus consumers whose peak activity comes when those organizations are running with fewer people.)
- Improved governance through managing more stakeholders. Nothing reduces complexity like adding more organizations with divergent needs.
- Reduced costs by reducing vendors is a super idea, because everyone knows that virtual monopolies reduce costs. Not to mention that most of those vendors have proprietary technology, meaning that consulting rates for certified specialists are likely to go up.
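The virtualization point in the list above can be illustrated with a back-of-envelope sketch. All numbers here are hypothetical assumptions for illustration, not figures from the post: each tenant is assumed to run at 1 unit of load except in its fiscal-close month, when it spikes to 5 units. Virtualized capacity must be sized for the worst simultaneous load, so twelve tenants that all peak in the same month need far more capacity than twelve whose peaks are staggered across the year.

```python
# Back-of-envelope sketch (hypothetical load numbers) of why consolidating
# tenants that share the same fiscal calendar saves less capacity than
# pooling tenants whose peaks are staggered.

def required_capacity(peak_months):
    """Worst-case simultaneous load across 12 months, assuming each tenant
    runs at a baseline of 1 unit and spikes to 5 units in its peak month."""
    BASELINE, PEAK = 1, 5
    monthly_load = [
        sum(PEAK if month == p else BASELINE for p in peak_months)
        for month in range(12)
    ]
    return max(monthly_load)

tenants = 12

# Shared-services scenario: every agency closes the books in the same month.
same_period = required_capacity([0] * tenants)       # 12 tenants x 5 units

# Public-cloud scenario: tenant peaks spread across the year.
staggered = required_capacity(list(range(tenants)))  # 1 peak + 11 baselines

print(same_period, staggered)  # 60 16
```

Under these assumed numbers, the shared government platform must be sized for 60 units while the mixed pool gets by with 16 — the consolidation "savings" shrink when everyone’s peak lands in the same fiscal period.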