Wednesday, March 28, 2012

The Forgotten Cost of New Hardware

In recent days I have been thinking a lot about the costs of owning hardware.  In my household we have six kids.  On a daily basis at least one or two of them say they have some homework assignment that involves using a computer.  It has become painfully apparent that a new PC might be needed.  Right now I have two work laptops, my wife has another laptop, and then we have a castoff PC that I reimaged so the kids could use it.  Unfortunately the castoff PC is at the end of its useful life, and over the last six months I have replaced the power supply in my wife's PC twice.  So it is time for me to start thinking about something new.  The PC that I bought for my wife is probably only three years old, which doesn't seem that old to me, but it is having its issues.

So what do my family's issues have to do with a forgotten cost?  The issue I am experiencing at home is the same one IT departments are grappling with on a daily basis.  Most electronic equipment is meant to have a useful life of around 3-5 years.  By the time you hit the 2-to-3-year mark you start seeing that the applications you have running on it just aren't performing like they used to.  Things also just start to break: fans get noisy, hard drives crash, strange errors start to appear, etc.

All these things bring on the desire to replace infrastructure every three years or so.  In a home setting that typically is not that big a deal, but it can cause some disruption.  You have to go through the old computer and decide which stuff is worth keeping and which can be tossed.  You need to look at the applications you have on the PC and move them to the new one.  You have to find all the software that you loaded on the old PC and hope that it will work on the new PC (and hope that the license keys will still be good).  More than likely there is a new version of the operating system, so you need to learn where Microsoft moved everything just to do the most mundane tasks.  There are also a hundred other things a home user needs to do to get up and running, and it all takes time.

This same thing happens to businesses when they have to refresh their technology.  They need to go through the process of figuring out what is important on that computer and whether what they are using is still going to work on a new server.  Many organizations put their storage on external disk arrays so that the migration from one server to another becomes easier, and virtualization technology has helped make the move more seamless.  Still, new software may need to be purchased, the new server may need to be reloaded, and multiple groups may need to be involved in the configuration process.  Many organizations have become quite efficient at migrating and standing up new environments, but migrations have always been high on the risk meter.  Let's also assume that you are a few revisions behind on the software and need to upgrade it as part of the move; that can lead to an enormous amount of work.

A lot of people, when considering replacing equipment, get caught up in how much the equipment is going to cost them, and they will pit vendors against each other in a death match to provide the best price.  That, however, is one of the smaller costs of the transaction.  The true costs lie in getting the old system moved onto the new hardware.  There is planning, testing, and change windows that need to be scheduled.  There is potentially a reconfiguration process to work through and the cost of the labor to set everything up again.  Most people may look at these as sunk costs because they have to pay their employees anyway, but the time it takes to get everything up and running on the new versions of the software can be significant.  Add upgrades to software or operating systems on top of that and you start talking real money.

This is why I have become an advocate for more cloud-based approaches to computing.  Instead of footing the bill for new hardware every three years, you let someone else worry about that.  They are also responsible for making sure your systems still work at the end of the day.  I am particularly fond of SaaS offerings because the provider is in charge of not just the hardware but also making sure the application continues to work after an upgrade.  If you compare only the price of the service against the price of the hardware you would otherwise buy every three years, you are probably going to come up with a small ROI.  If you think of it as a way to avoid the three-to-five-year migration project that disrupts a lot of people, the benefits become a lot more obvious.
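To make that comparison concrete, here is a rough back-of-envelope sketch.  Every dollar figure below is a hypothetical placeholder of my own, not a real quote; the point is only that the labor around the migration, not the hardware purchase, tends to dominate the three-year cycle.

```python
# Back-of-envelope comparison of a three-year hardware refresh vs. a SaaS
# subscription. Every dollar figure is a hypothetical placeholder -- plug in
# your own numbers for anything resembling a real decision.
hardware_purchase = 15_000   # new server at the vendor's "best price"
migration_labor   = 30_000   # planning, testing, change windows, reconfiguration
software_upgrades = 10_000   # new OS / application versions forced by the move
refresh_total     = hardware_purchase + migration_labor + software_upgrades

saas_per_month = 1_000       # subscription that covers hardware and upgrades
saas_total     = saas_per_month * 36   # same three-year window

print(f"Three-year refresh cost: ${refresh_total:,}")
print(f"Three-year SaaS cost:    ${saas_total:,}")
print(f"Hardware alone is only {hardware_purchase / refresh_total:.0%} of the refresh cost")
```

With these made-up numbers the hardware you haggled over is barely a quarter of what the refresh actually costs, which is exactly the forgotten cost this post is about.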

Well,  that still leaves me with my old PC at home that I need to replace.  Wish me luck as I go through my triennial pilgrimage to the land of laptop migrations.   Hopefully I don't hit many of those forgotten costs. 

Wednesday, March 14, 2012

Disruption in the IT Landscape


Computing as we know it will disappear.  Right now there is so much innovation in the compute space that it is hard to avoid.  There are many things driving this, but the main cause is that a lot of creative people all over the world are looking at the way business is done and finding the inefficiencies in the system.  They are looking for the areas where corporations have extracted large amounts of profit and for disruptive ways to get a piece of the pie (or even make a new pie).

For example, I recently ran across a startup named The Currency Cloud that is looking to disrupt the foreign exchange (FX) market.  The FX market is a trillion-dollar business.  The big banks and clearing houses that convert currency each day make enormous amounts of money by playing on the swings in currency values between countries, and any business that deals with this is at their mercy.  It has been a very profitable sector for many years.  Many businesses hate it because it introduces uncertainty into global transactions: how do you set prices for products and services and still remain profitable in different parts of the world?  The Currency Cloud has created a faster, cheaper alternative to the traditional structure.  If they are successful, the traditional forex markets will be disrupted.

A second example of this is related to investments.  People have been trading stocks, bonds, and commodities for years.  There has been a small group of companies and investors with the ability to get in on the big ideas before they became big; the rest of the people had to wait on the sidelines until those companies went public.  Second Market has started to change that by offering a way for regular investors to get into these emerging companies before it is too late.  Government regulations make it a little difficult for the small investor to use Second Market (you need to have $1 million of net worth, which rules out a lot of people), but it is an example of another disruptive technology.

While attending an entrepreneurs' group in NYC I ran across a company creating a new paradigm for providing wireless connectivity to underserved communities.  In poorer areas many people cannot afford their own internet connection, so a company called Keywifi has come up with a way to share the bandwidth of your wireless network.  In those neighborhoods a few people can sign up to be “hotspots” and others can rent bandwidth from them at a lower cost.  This could have huge implications in developing regions of the globe.

These three examples just scratch the surface of how a difficulty with the way technology is used today opens up new markets in the future.  Right now there is an enormous amount of change going on in IT: Cloud Computing, Tablet Devices, Video, Social Networking, Converged Networks, fully integrated platforms (pre-staged networking, storage, and compute resources), etc.  All of these things have a common thread.  Someone looked at what we were doing and said, “there has to be an easier, more efficient, more effective way to do this.”

Let’s take cloud computing, for instance.  In the past every company that had computational needs was forced to create or buy its own infrastructure.  Even small businesses would buy servers, some networking gear, and maybe some storage and backup gear to carry out day-to-day operations.  The problem was that they had to hire a staff to support that environment.  This is great for employment, because lots of people have jobs, but for a business it added cost and complexity.  Supporting that environment meant a lot more than just buying some hardware.  There needed to be:
  • space for the equipment
  • power and cabling
  • cooling for the equipment
  • cables run to all the users’ desks so they could get access to the equipment
  • operating systems and software loaded on each of those systems
  • systems secured, backed up, and patched
When it is all said and done, the amount of work and expense needed to run even a small environment can be significant.  Many companies don't actually realize the true cost and scale of running an IT department.

This changed in the early 2000s when some companies started to offer subscription services for their software.  Instead of running your applications on your own hardware, they would run in someone else's datacenter and you would just add your data.  This was not a new concept; IBM had pioneered it with their mainframe software, where you paid for what you used.  The early innovators in the cloud space were able to say that it is easier for one provider to run the entire infrastructure for all of these companies than for every company to do it for itself.  There were some factors that accelerated this move.  Virtualization (once again borrowing from the IBM mainframe concept) enabled companies to take advantage of less expensive hardware and make this sharing model more affordable to end users.  Increases in network bandwidth also made it more palatable.  If everyone were still running on 10 Mb Ethernet we would see far fewer people willing to do this.  Now most desktops have 1 Gb connections, and 40 and 100 Gb networks are just starting to ship.  The wireless space has experienced the same bump: with 4G networks I can get my data almost as fast as on a Wireless N device.  Companies like Google and Amazon have been pioneers in this space.  And it was all because someone finally figured out an easier way for the consumer to do this.
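To put the bandwidth point in rough numbers, here is a small back-of-envelope sketch.  The 2 GB transfer size is a hypothetical figure of my own, chosen only to show why relying on someone else's datacenter was painful at 10 Mb and is painless at 1 Gb and beyond.

```python
# Rough sketch: time to move the same amount of data at the link speeds
# mentioned above. The 2 GB transfer is a made-up example, not a measurement.
SPEEDS_MBPS = {
    "10 Mb Ethernet": 10,
    "1 Gb desktop link": 1_000,
    "40 Gb datacenter link": 40_000,
    "100 Gb datacenter link": 100_000,
}

transfer_gb = 2                                  # hypothetical transfer size
transfer_megabits = transfer_gb * 8 * 1_000      # GB -> megabits (decimal units)

for name, mbps in SPEEDS_MBPS.items():
    seconds = transfer_megabits / mbps
    print(f"{name:>22}: {seconds:8.1f} s")
```

The same transfer that crawls along for close to half an hour on 10 Mb Ethernet finishes in seconds on a modern 1 Gb link, which is a big part of why the shared-datacenter model stopped feeling risky.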

Converged networking is also a good example of this metamorphosis.  In the past there were multiple ways to attach to the network: Ethernet, Token Ring, FDDI, etc.  In the end one standard, Ethernet, won out, not necessarily because it was the best technology; it was just easier.  In the late '90s attaching to external storage arrays became a new way to do business.  The reasoning was that it was more cost effective to share disks among multiple servers than to let them go under-utilized inside the servers.  This led to new ways to attach to storage and new network-like infrastructures.  Fibre Channel and SCSI cables were introduced initially to attach the disks to servers.  As time went on a separate infrastructure of switching equipment was built, with its own connectors and cabling, and the complexity of the environment increased.

Amid this growing mess, the bandwidth that could be achieved by Ethernet and FC interfaces increased to an extent that outstripped the computer's ability to move that data.  Somewhere some engineer said, "what would happen if we ran the storage traffic over the same cable and adapters?"  That would mean one less component in the servers, one less cable, and one less switch to deal with.  Converged networks were born.  Cisco was one of the innovators in this field, and the reasoning is obvious: they benefit the most by moving things onto a converged network infrastructure.  Now Cisco, HP, IBM, Dell, and a host of other companies all have their own converged networking platforms.  All of this came about because existing networks were getting too complex.
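As a rough illustration of the "one less of everything" argument, here is a small sketch.  The counts are made up and assume a redundant dual-path design for a hypothetical 20-server rack; any real environment will differ, but the shape of the savings is the point.

```python
# Rough component count: separate LAN + SAN vs. a converged network.
# Assumes a redundant (dual-path) design; all counts are illustrative.
servers = 20

# Separate networks: two NICs plus two Fibre Channel HBAs per server,
# each with its own cable, landing on a pair of Ethernet switches and
# a pair of FC switches.
separate = {
    "adapters": servers * (2 + 2),
    "cables":   servers * (2 + 2),
    "switches": 2 + 2,
}

# Converged: two converged network adapters per server carrying both
# LAN and storage traffic into a single pair of converged switches.
converged = {
    "adapters": servers * 2,
    "cables":   servers * 2,
    "switches": 2,
}

for item in separate:
    print(f"{item:>9}: {separate[item]:3} separate vs {converged[item]:3} converged")
```

Halving the adapters, cables, and switch fabrics is the simpler, cheaper path the engineers were chasing.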

The old guard will continue to fight the erosion of their margins by competitors that embrace the new changes.  I recently read an article entitled Oracle has a cloud computing secret about Oracle's dilemma regarding the pricing of on-demand instances of its software.  They stand to lose a significant chunk of revenue if they adopt the model.  The main problem with that stance is that the train may have already left the station.  In numerous customer calls over the years I have dealt with people who are looking for ways around Oracle's licensing, so much so that tools like MySQL (which is now owned by Oracle) and Microsoft SQL Server are alternatives that are having a great deal of success.  The cable companies are starting to feel this as well: people are starting to say that they just want specific channels, not every channel.  Subscribers are shutting off traditional cable and getting a lot of the same content from the Internet; just last night I watched the NCAA tournament on my computer.  Being the vendor that is still clinging to the old model is a precarious position.
 
Right now is a very exciting time for IT.  It is changing very rapidly, and the barriers to innovation and the costs of innovating have been reduced.  It makes for a wild ride.