The “Software Development As A Service” Model
Rebaca is an enterprise that provides software development services to organizations like Cisco and Ericsson. The secret to keeping our longstanding customers happy is our high coding standards, with appropriate checks and balances in place to monitor code quality and authenticity. Our Software Development As A Service (SDAAS) model is designed to ensure that the customer experience is Fast, Accurate, Time-Bound and Cost-Bound. Read about how we assure quality of service to our customers here. The core of this model is the continuous implementation of requirements and feedback from our customers. Customers communicate their requirements and feedback over a variety of channels such as agile systems, documents, emails, online chats, phone calls and even face-to-face meetings. These are then fed through a robust process where they are implemented, validated and shipped back to the customer.
Giving Unique ID To Every Requirement
A typical service delivery team at Rebaca receives about 20 to 40 new requirements and feedback items in a month. Keeping track of this many items is difficult unless there is a system for assigning each requirement and feedback item a unique ID. This ID could be a user story ID in a customer agile database; it could be a bug ID in a customer Bugzilla system; it could be the subject line and date of a customer email; it could even be a URL and a section number inside a customer wiki page. The body of the requirement or feedback stays inside the customer system and can be pulled up for viewing by the delivery team via this unique ID.
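The ID scheme described above can be pictured as a small reference record. The following is a minimal sketch, not Rebaca's actual tooling; the class and field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch: a requirement or feedback item is tracked only by a
# unique ID that points back into the customer's own system; the body of
# the item stays there and is fetched on demand via that ID.
@dataclass(frozen=True)
class RequirementRef:
    source: str       # e.g. "agile", "bugzilla", "email", "wiki"
    unique_id: str    # e.g. a story ID, bug ID, email subject+date, or URL

req = RequirementRef(source="bugzilla", unique_id="BUG-10482")
print(f"{req.source}:{req.unique_id}")
```

Keeping the record frozen reflects the idea that the team never copies or edits the customer's item, only points at it.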
Implementing Every Requirement
An implementation is an INDEPENDENT, ESTIMATE-ABLE, SMALL and VALIDATE-ABLE EFFORT which REALIZES the whole or part of a requirement or feedback item and produces either coding or documentation artifacts. In our Rebaca database, also known as the PMS (Progress Management System), this is recorded as an implementation story or output with a unique ID. Every implementation ID is associated with the requirement ID it realizes. An implementation ID can pull up its effort estimate, completion date and owner. An in-progress or completed implementation item produces runnable artifacts (software) or readable artifacts (documents). Software artifacts are stored in digital repositories such as Git and SVN. Document artifacts are stored in systems such as Confluence and SharePoint. Each artifact is also issued a unique ID, which is used to pull up the body of the artifact, such as code chunks and document chunks, from its respective repository.
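The relationships in this paragraph can be sketched as a single record: each implementation carries its own ID, the requirement ID it realizes, its estimate and owner, and the artifact IDs it produces. This is an illustrative data model only; the field names are assumptions, not the actual PMS schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a PMS implementation record: every implementation
# ID is tied to the requirement ID it realizes, and collects the unique IDs
# of the artifacts (code commits or document versions) it produces.
@dataclass
class Implementation:
    impl_id: str
    requirement_id: str            # the requirement or feedback ID realized
    owner: str
    estimate_days: float
    artifact_ids: list = field(default_factory=list)

impl = Implementation("IMPL-7", "BUG-10482", owner="alice", estimate_days=2.5)
impl.artifact_ids.append("git:commit/3fa9c1e")   # a software artifact ID
print(impl.requirement_id, len(impl.artifact_ids))
```

Storing only artifact IDs, rather than artifact bodies, mirrors the text: the code lives in Git or SVN and the documents in Confluence or SharePoint, and the ID is enough to pull either up.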
Classifying Every Implementation
Rebaca classifies implementations using a simplified system to establish their value to the customer.
| If Implementation Produces A | If Implementation Realizes A | Then Implementation Type Is A | Artifact ID Produced By The Implementation Type | Customer Value |
| --- | --- | --- | --- | --- |
| Runnable AND Field Deployable Artifact | Requirement | Feature | Software ID such as a Git commit URL, an SVN revision URL or similar | High |
| Runnable AND Field Deployable Artifact | Feedback | Bug | Software ID such as a Git commit URL, an SVN revision URL or similar | High |
| Runnable AND NOT Field Deployable Artifact | Requirement | POC | Software ID such as a Git commit URL, an SVN revision URL or similar | Medium |
| Readable AND Editable Artifact | Requirement | Document | Document ID such as a wiki page version URL or similar | Medium |
| Readable AND NOT Editable Artifact | Requirement | Action Item | Report ID such as a wiki page URL, a wiki page comment URL, a JIRA comment URL or similar | Low |
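The classification rules in the table above amount to a short decision function. The sketch below is an illustration of that logic under the stated conditions; the function name and boolean parameters are hypothetical.

```python
# Hypothetical sketch of the classification table: given what an
# implementation produces (runnable/field-deployable/editable) and what it
# realizes ("requirement" or "feedback"), return its type and customer value.
def classify(runnable, field_deployable, editable, realizes):
    if runnable and field_deployable:
        # Runnable, field-deployable artifacts are the high-value outcomes.
        return ("Feature", "High") if realizes == "requirement" else ("Bug", "High")
    if runnable:
        return ("POC", "Medium")         # runnable but not field-deployable
    if editable:
        return ("Document", "Medium")    # readable and editable
    return ("Action Item", "Low")        # readable but not editable

print(classify(True, True, False, "feedback"))
print(classify(False, False, True, "requirement"))
```

Encoding the table as one function makes the precedence explicit: deployability is checked first, then runnability, then editability.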
Validating Every Implementation
As you can see, each requirement ID may produce multiple implementation IDs, which in turn will have multiple artifact IDs. The challenge is to assure that each of these artifact IDs meets the strict quality standards of Rebaca. High-value implementations are expected to meet a more meticulous quality standard: a feature implementation, which is higher up the value chain, is required to be associated with three quality reports. The quality reports may be produced manually or through automation such as test automation scripts or static code analysis scripts. An action item implementation, which is lower down the value chain, is not required to have any quality report.
| Type of Implementation | Artifact ID Produced By The Implementation Type | DUT Test Report of Artifact ID | DIT Test Report of Artifact ID | Code Review Report of Artifact ID | Document Review Report of Artifact ID |
| --- | --- | --- | --- | --- | --- |
| Feature | Software ID such as a Git commit URL, an SVN revision URL or similar | Applicable | Applicable | Applicable | NA |
| Bug | Software ID such as a Git commit URL, an SVN revision URL or similar | Applicable | Applicable | Applicable | NA |
| POC | Software ID such as a Git commit URL, an SVN revision URL or similar | NA | NA | NA | NA |
| Document | Document ID such as a wiki page version URL or similar | NA | NA | NA | Applicable |
| Action Item | Report ID such as a wiki page URL, a wiki comment URL, a JIRA comment URL or similar | NA | NA | NA | NA |
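The report-applicability table above can be read as a simple completeness check: for a given implementation type, which required quality reports are still missing for an artifact? The mapping and function names below are hypothetical, a sketch of the rule rather than the real tooling.

```python
# Hypothetical sketch: map each implementation type to the quality reports
# its artifacts must carry, per the applicability table.
REQUIRED_REPORTS = {
    "Feature":     {"DUT Test", "DIT Test", "Code Review"},
    "Bug":         {"DUT Test", "DIT Test", "Code Review"},
    "POC":         set(),
    "Document":    {"Document Review"},
    "Action Item": set(),
}

def missing_reports(impl_type, reports_attached):
    """Return the required quality reports not yet attached to an artifact."""
    return REQUIRED_REPORTS[impl_type] - set(reports_attached)

print(sorted(missing_reports("Feature", ["DUT Test"])))
```

A gate like this can block shipment until the set of missing reports is empty, which is exactly the discipline the table encodes.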
In this model, requirements are intentionally broken into implementation stories so that they can be associated with artifacts. This association makes the rigorous quality-check process intuitive for our experts when they review the artifacts. The SDAAS model dictates that each artifact undergo meticulous quality checks. Only when our subject matter experts are satisfied with the artifacts and all internal bugs have been addressed are the implementations shipped to the customers. All of this has to be done in a time-bound manner while meeting our high engineering standards. With a customized process that enables proper quality checks, code review for authenticity and regular monitoring for time-bound delivery, Rebaca has seen customer satisfaction grow in leaps and bounds.