Episode 62

One of the most common questions we get asked is, “How much will this cost?” The answer is a disappointment to most, as there is an expectation that a product or service will have a clearly defined price. Unfortunately, with custom development it is extremely difficult to produce an accurate number. With enough experience, a vendor can provide a cost estimate from intuition; however, the estimate will come with a long list of assumptions. The purpose of this blog is to explain why determining the cost of building a web application is so difficult and to provide a rough guide for estimating the cost of your idea.

There are four major factors that will influence the cost of an application. They are the type of application, development resources required, payment terms, and the allocation toward maintenance, security, and scalability.

Native and web are the two most common types of applications (we are going to ignore hybrid apps). A native application works on the device’s operating system, which means it has direct access to all device features like the camera, microphone, thumbprint reader, etc. Web applications work through the browser, which has access to some features of the device, but the application runs outside the operating system. Native application development is more expensive but can have better performance and feature depth than equivalent web applications. For more details, check out this blog.

There are numerous job titles related to development resources with the job function often interchanged, shared, or blended. A typical core development team will consist of:

  • Developers: make technology work, break stuff, and continuously Google for solutions;
  • Quality Assurance: verify the application works the way it was intended to;
  • Business Analysts: translate between business and technical needs;
  • UI/UX Designers: translate user desires into a design that is pleasing, intuitive, and easy to use; and
  • Project Managers: coordinate activities and ensure the team and resources are used effectively.


This list is not comprehensive, but it should be obvious that the more specialized the skillset (architect, security specialist, platform specialist, software language specialist, etc.) required to meet your application’s objectives, the more difficult and expensive it will be to source that talent. Likewise, if your project is being developed by a single individual, it will carry risks associated with that individual’s gaps in competencies. For more details on team size and competency, see this blog.

There are three ways to source development resources: in-house, individual contractors, and outsourcing. At a particular scale, an organization’s IT requirements are large enough that it can be advantageous to have development staff as employees. While full-time employees can be less expensive than outside consultants, the fixed nature of their employment means that unless the individuals have opportunities to grow, the development knowledge base of the organization can stagnate, especially if the organization views IT as overhead.

With individual contractors you can target your desired skills and cost, with a resourcing flexibility that is hard to achieve with full-time staff. This approach is growing in popularity, but specialists command high compensation and are hard to find. Likewise, assembling a team of contractors can be problematic because the newly formed team has not worked together before. Most commonly, contractors are used to fill gaps in existing teams rather than replace them completely.

Lastly, you can outsource. In this case you are likely paying more upfront, as a firm will have more overhead compared to individual contractors, but you are gaining access to specialized resources who have likely worked together and on a diverse set of projects. Cost and competency can vary dramatically between regions, individuals, and organizations. Specialization, and the overhead required to assemble a consistent team and development process, differentiate low-cost from high-cost service providers. While you may be able to save money with cheaper resources, you may introduce project risks that lead to additional expenses later. It is not uncommon for two firms to be hired: the first to build the application, the second to fix it.

When paying for development services there are two methods: fixed price, and time and materials. Fixed price is desirable for the purchaser; however, it places the vendor under a lot of risk. To compensate for that risk, the vendor will adjust their offer price. Therefore, even though your organization might enjoy a fixed price, the risk profile of the project will be priced in. An alternative is to use a fixed price but allow the project scope to be flexible. Time and materials is the other side of the coin. While desirable for the vendor, it places risk on the purchaser if the project runs into trouble. For best results, use agile methodologies and ensure that working code is delivered as soon as possible and that value is delivered every sprint. For more details on these dynamics, see this blog.
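To make the risk premium concrete, here is a minimal sketch of how a vendor might price project risk into a fixed-price bid. The flat `risk_premium` rate is hypothetical; in practice a vendor derives it from the project’s uncertainty, not a single number:

```python
def fixed_price_offer(expected_cost: float, risk_premium: float = 0.30) -> float:
    """Return a fixed-price bid with the project's risk priced in.

    risk_premium is a hypothetical flat rate for illustration only;
    a real vendor sets it based on the project's risk profile.
    """
    return expected_cost * (1 + risk_premium)

# A project expected to cost $100k, bid with a 30% risk premium:
offer = fixed_price_offer(100_000)  # 130,000.0
```

The point is not the formula itself but the structure: under fixed price, the purchaser still pays for risk, just invisibly, inside the bid.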

The last major cost factor is the set of considerations made before the first line of code is written, and these are often afterthoughts. It is good practice to budget 20% of the initial development cost for ongoing maintenance. While it is not uncommon for an application to remain untouched for years, with attention paid only when something breaks, the nature of technical debt shows that the cost to fix errors compounds over time like interest on financial debt, with dividends in the form of future savings if one is proactive. Maintenance is tightly coupled with security. Security vulnerabilities have a near-binary cost structure: ignoring security considerations during development or maintenance can have catastrophic consequences. Lastly, it is important to ask, “what if our application is wildly successful, can it scale?” Spending a bit of extra time and resources upfront to ensure the architecture is ready to scale can have a large impact on the application’s lifetime cost. It is very important to understand that a software application’s development cost is only one piece of the cost equation.
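As a rough sketch of that budgeting rule, here is the 20% guideline expressed in code. Treating the 20% as an annual allocation is one common reading of the rule of thumb, and an assumption here, not a universal standard:

```python
def total_cost_of_ownership(build_cost: float, years: int,
                            maintenance_rate: float = 0.20) -> float:
    """Initial build cost plus a yearly maintenance allocation.

    The 0.20 default reflects the 20% rule of thumb; the annual
    interpretation is an assumption for illustration.
    """
    return build_cost + build_cost * maintenance_rate * years

# A $200k build maintained for 5 years:
# 200_000 + 200_000 * 0.20 * 5 = 400,000.0
```

Run over a five-year horizon, maintenance roughly doubles the headline build cost, which is why it belongs in the budget from day one.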


At this stage, I hope to have painted a picture highlighting the numerous variables that make it extremely difficult to put a number to the cost of a web application. The cost factors discussed are high level; it would be far too difficult to reach the depth of comparing .NET versus Node.js on development cost. But let’s try to put some numbers to paper.


Simple application

  • 3 to 5 features
  • 1 to 2 user types
  • The entire application can be explained in detail on a single sheet of paper
  • We want to build an app that notifies users of a particular event and provides guidance
  • Real world example: An app to notify users of municipal waste pick up days
  • $20k to $100k


Moderate application

  • 6 to 15 features
  • 3 to 4 user types
  • Multiple simple interactions in one application, or a single-purpose application with moderately complex functionality
  • We want to build an app that notifies users of multiple event types, provides two-way information exchange between multiple users, and allows users to post or request information
  • Real world example: An app that allows factory floor operators to coordinate with logistics personnel to request new materials and remove finished goods with a summary for management.
  • $100k to $500k


Complex application

  • 15+ features
  • 5+ user types
  • We want to build an application to process insurance claims
  • Real world example: Any off the shelf customer relationship or enterprise resource management application
  • $500K+
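The three categories above can be condensed into a rough lookup. This sketch simply mirrors the thresholds from the bullet lists; it is for ballparking, not quoting:

```python
def estimate_tier(features: int, user_types: int) -> tuple[str, str]:
    """Map rough feature and user-type counts to the cost bands above.

    Thresholds are taken directly from the blog's three categories
    and are illustrative only.
    """
    if features <= 5 and user_types <= 2:
        return ("simple", "$20k to $100k")
    if features <= 15 and user_types <= 4:
        return ("moderate", "$100k to $500k")
    return ("complex", "$500k+")

# The municipal waste pick-up example: a handful of features, one user type
print(estimate_tier(4, 1))  # ('simple', '$20k to $100k')
```

Note that the lookup only captures the countable inputs; the factors discussed earlier (team sourcing, payment terms, maintenance, security, scalability) push a project within, or beyond, its band.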

There are two ideas I want to close with. The first: why the $20k minimum? This is a safe(ish) estimate of the minimum cost at which one is not gambling on low-cost freelancers or development talent. On the simple end of the spectrum, you primarily get what you pay for, and it would be difficult to find a firm that staffs all the required resources and can build an app for less. This does not hold true as complexity grows, where uncertainty becomes a more powerful driver of final cost than the price of development talent.

The second: why is there such a massive range within each category? Custom development projects are highly diverse; if a repeatable solution were possible, there would already be a SaaS product on the market for that problem. Initial assumptions can be proven incorrect after development starts, and the larger the project, the more costly it is to change direction. Greater application complexity means a higher chance of errors, greater upfront design cost, a higher likelihood of requiring specialist skills, and a larger compounding rate on technical debt while errors remain undiscovered. Most development cost is spent on the exceptions, and with complexity come more exceptions. If this blog does anything, I hope it limits the disappointment when your vendor provides a vague estimate for the development cost of your idea.