March 9, 2016
Posted by: Peter Marcum
It seems that everyone wants to lower the cost of technology. Whether that means building something new or maintaining something old, nearly everyone expects costs to decrease. As technology becomes more advanced, this seems like a logical expectation: with increased automation and better development tools, one would expect prices to drop while the products improve.
So, one must wonder why companies have trouble controlling the cost of technology. The answer is that the problem is not the cost of the technology itself; it is the cost of getting people together to agree on what they want the technology to do, and how it will be used and managed.
As developers of technology, we see this happen all the time. Someone comes in with an idea or a plan that defines what they want us to build, and we provide an estimated cost and timeline for the project. They approve it, and we deploy resources to build out the technology. As the project progresses, the original plan inevitably begins to change, the scope of the original ideas begins to creep, and before you know it, what was originally planned for has changed significantly, causing a considerable rise in cost and pushing out the timeline for completion. Then, without fail, the parties start questioning the cost and the delays in the original project. This is a common problem with the cost of technology, and it isn't really about the cost of building the technology. The real cost lies in reaching a clear and frank definition of what all the stakeholders want the technology to do, for whom, and why. In other words, a thorough and complete understanding of the scope of your technological needs, before you employ resources to meet those needs, will reduce the cost.
I’ve created a seven-point checklist that companies ought to use before engaging any supplier to build technology. The objective is to reduce problems, which in turn reduces cost. Does that make sense?
Here is my checklist:
1. Make sure all users of the technology have input on the design of the project, as well as their desired benefits from use. Identify the desired functions, features, engagement points and messaging.
2. Make sure that existing work will be reduced or eliminated rather than increased or added to. There is nothing worse than building technology that increases work, i.e., cost.
3. Make sure you consider the prioritization and tracking process for work tasks in any enterprise workflow technology initiative. Data is king.
4. Make sure the technology gives all stakeholders improved visibility into all of the connected processes that weren’t previously visible.
5. Make sure you quantify and qualify the time and expense savings gained from the development of the technology.
6. Make sure the right resources are allocated and dedicated to providing the right input to the development of the technology.
7. Make sure you are connecting all of the dots. Unless you connect things, you are likely only separating them, creating more work and increasing cost rather than reducing it.
After building more than 900 projects and watching people and organizations of all sizes manage them, the list above summarizes some of the recommendations I’ve given over the years. Technology cost has indeed gone down, but the cost of not managing it continues to rise.