Choose an appropriate software development life cycle. Your organization should define several development life cycles for projects of different types and different degrees of requirements uncertainty (McConnell, 1996). Each project manager then selects the cycle best suited to his or her project, and that cycle should include requirements development activities. If requirements or project scope are not clearly defined in the early stages, develop the product incrementally, starting with the best-understood requirements and the most stable elements of the architecture. Where possible, implement sets of features so that you can periodically release interim versions of the product and give the customer working software as early as possible.
Base project plans on requirements. Develop project plans and schedules gradually, as the scope and detailed requirements become clearer. Start by estimating the effort needed to implement the functional requirements derived from the initial product vision and scope. Schedules and cost estimates based on fuzzy requirements will be highly inaccurate, but you can refine them as the requirements are elaborated.
Renegotiate project commitments when requirements change. As new requirements are added to the project, assess whether you can still meet your schedule and quality commitments with the available resources. If not, communicate the project realities to management and negotiate new, achievable commitments (Humphrey, 1997; Fisher, Ury, and Patton, 1991; Wiegers, 2002). If negotiation fails, inform managers and customers of the outcome so that missed plans do not come as a surprise to them.
Document and manage requirements-related risks. Part of project risk management is identifying and documenting the risks associated with requirements. Brainstorm ways to reduce or prevent those risks, implement mitigating actions, and track their effectiveness.
Track the effort spent on requirements engineering. Record the effort your team spends on developing and managing requirements. This data lets you assess whether planned effort matched reality and helps you plan resources for future projects more effectively. Also track how requirements management activities affect the project as a whole; this lets you evaluate the return on that work.
Learn from experience. To do this, the organization should hold project retrospectives, also called postmortems, on completed projects (Robertson and Robertson, 1999; Kerth, 2001; Wiegers and Rothman, 2001). Learning about the requirements problems encountered and the approaches that worked on previous projects helps managers and requirements analysts work more effectively in the future.
Software development in a small organization
How can a small group take advantage of the software development practices of large companies? Small teams certainly need such methods, but they often lack the infrastructure and economies of scale of large organizations. A small team usually has no separate quality-assurance group, no dedicated pool of test machines, and often not even a system administrator. A thorough introduction to these issues is provided in a Software Engineering Institute report on improving work processes in small teams.
Consider, as an example, a hypothetical company that produces commercial software products with both closed and open source code. Some products are developed entirely in-house; others result from joint efforts with subcontractors scattered around the world. Small projects may employ one or two people, large ones more than thirty. An acceptable development approach must therefore scale to large projects with thousands of source files while keeping overhead low on small ones. Given these requirements, plus the need for cross-platform development on Windows, Linux, and Mac OS, it is not hard to see that any viable solution must have a fair amount of flexibility.
Origins of the methodology
In 1999, the US National Library of Medicine launched a competition to develop a freely distributed registration and segmentation toolkit, which eventually became known as ITK. Initially, the development team included three commercial partners (GE, Kitware, and MathSoft) and three academic partners (Universities of North Carolina, Tennessee, and Pennsylvania). A large, geographically dispersed team was given the difficult task of developing a freely distributed, cross-platform research and development toolkit in C++ that would incorporate cutting-edge advances in medical imaging algorithms.
The process takes into account recent trends in software development, including Extreme Programming techniques and test-driven development, which offer many good ideas for improving software processes. Of particular interest is the testability requirement borrowed from Extreme Programming: ITK must support multiple platforms and involves a large number of developers, which makes continuous testing especially valuable.
The principles of Extreme Programming also encourage the use of coding standards and collective code ownership. The pursuit of Six Sigma quality leads to the use of metrics such as daily code-coverage analysis during testing. In many ways this approach resembles the Agile Unified Process: both are simple, iterative, incremental, and tool-independent.
Details of the development process
There are five main parts to the development process:
- communication and documentation;
- change control;
- testing;
- build management;
- release management.
This approach is designed to keep the development process sustainable, which is especially important for long-term projects. Communication, for example, goes on all the time and is an integral part of forming new requirements, discussing the project, and gathering user feedback.
The process had to be lightweight to fit well with agile development methods, and most of the tools it uses are open source. For communication and documentation, Mailman was originally used to manage mailing lists, Doxygen to generate source-code documentation automatically, and phpBugTracker to track bugs; all are available at sourceforge.net.
In recent years a wiki, another powerful communication tool, has also been used actively. The CVS (Concurrent Versions System) was chosen for change control because of its convenient client-server model, which avoids file locking. For the build process, no tool existed at the start of the ITK project that could build complex C++ programs on all supported platforms, so a custom tool called CMake was developed.
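A minimal, hypothetical `CMakeLists.txt` gives the flavor of how CMake describes a cross-platform build; the project and file names here are assumptions for illustration only:

```cmake
cmake_minimum_required(VERSION 3.10)
project(Example CXX)

# One platform-neutral description of the build: CMake generates
# native Makefiles or IDE project files for Windows, Linux, and
# Mac OS from this single file.
add_library(imaging src/filter.cxx)

add_executable(example src/main.cxx)
target_link_libraries(example imaging)
```

The same file drives every supported compiler and IDE, which is what lets a small team avoid maintaining parallel per-platform build scripts.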
Another tool, Dart, was developed for automated testing (svn.na-mic.org:8000/svn/Dart/trunk/Dart.pdf). A methodology was created around all of these tools, and a new toolkit, CPack, was introduced to simplify the release process. The methodology has proven flexible and able to adapt to new tools as they become available.
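The testing side can be sketched in the same configuration style: CTest, which ships with CMake, registers tests that can be run locally or as part of a nightly dashboard submission. The test name and source file below are hypothetical:

```cmake
# Enable CTest support in this CMake project.
enable_testing()

# Register one test executable; `ctest` will run it and report
# pass/fail based on its exit code.
add_executable(clamp_test test/clamp_test.cxx)
add_test(NAME ClampTest COMMAND clamp_test)
```

Running `ctest` executes the registered tests, while `ctest -D Nightly` performs a full update-build-test cycle and submits the results to the project's dashboard.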