At Altamira.ai, we are committed to the following principles:
We believe that transparency is not only the single most important success factor in software development, but that it defines our organization as a whole.
- Transparency gives a clear picture of how realistic it is to meet your objectives
- Transparency aligns the team, eliminating information asymmetry so that everyone stays on the same page
- Transparency simplifies implementation of best practices, and makes inefficiencies visible
- Transparency allows the team to be self-organized and efficient
- Transparency is the foundation of trust
How is transparency manifested in practice? At Altamira.ai, we use a set of metrics combined with extensive data gathering and visualization techniques to quantify and demonstrate progress and results. For each project or client, we provide a custom dashboard that visualizes status, performance, quality, and trend forecasts in real time. There is no longer a need to wait for calls or status reports: everything is available online, 24/7. Whether you are a client, a development team member, or part of management, everyone has the same access and always sees the same data.
Our second guiding principle is our commitment to continuous improvement. There are various methods that promote this principle:
- Plan-Do-Check-Act (PDCA)
- Design For Six Sigma (DFSS)
- Define, Measure, Analyze, Improve, Control (DMAIC)
- Observe-Orient-Decide-Act (OODA)
Although named differently, all of the methodologies listed above share the same underlying concept. We take small, deliberate steps, evaluate the results, compare them against expectations, and implement any needed corrections.
So, how do we apply this to real projects? At Altamira.ai, we start by collecting data. We source the data from issue-tracking systems, code repositories, and ERP, CRM, and HRM systems. We then pair this quantitative, data-driven approach with an analysis of the sources of project inefficiencies and the planning of remediation measures. We also compare patterns across projects to ensure we never make the same mistake twice.
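As a minimal illustration of the kind of analysis this data enables, the sketch below computes the average cycle time per issue type from a handful of issue-tracker records; the field names and values are invented for the example, not taken from any particular tracker or project.

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical records as they might be exported from an issue-tracking
# system; field names and dates are invented for this example.
issues = [
    {"type": "bug",     "created": "2024-01-02", "resolved": "2024-01-10"},
    {"type": "bug",     "created": "2024-01-05", "resolved": "2024-01-25"},
    {"type": "feature", "created": "2024-01-03", "resolved": "2024-01-08"},
    {"type": "feature", "created": "2024-01-04", "resolved": "2024-01-07"},
]

def cycle_time_days(issue):
    """Days from creation to resolution of a single issue."""
    fmt = "%Y-%m-%d"
    start = datetime.strptime(issue["created"], fmt)
    end = datetime.strptime(issue["resolved"], fmt)
    return (end - start).days

# Group cycle times by issue type to see where work stalls.
by_type = defaultdict(list)
for issue in issues:
    by_type[issue["type"]].append(cycle_time_days(issue))

avg_cycle_time = {t: mean(days) for t, days in by_type.items()}
print(avg_cycle_time)  # {'bug': 14, 'feature': 4}
```

In this toy data set, bugs take far longer to close than features, which is exactly the kind of inefficiency pattern that would prompt a closer look at the defect-handling workflow.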
Our clients’ success is instrumental to Altamira.ai’s success. That is why we often start our projects with an ideation phase, when we sit down with clients to discuss the path forward at length and ensure that an investment in their technology will indeed produce the desired outcome. We openly and honestly discuss the potential risks with our clients and, if the situation demands it, candidly suggest that the client first pursue a low-fidelity prototyping approach before investing in the design and full-scale development of a solution.
Sometimes, prototype testing reveals negative outcomes, and therefore the development phase never starts. This is when Altamira.ai delivers value without even writing a single line of code, saving our clients’ budgets for future initiatives.
How we work
Our pragmatic and proven approach to designing and developing solutions has matured through years of experience with industry-leading frameworks and best practices. Our focus on design thinking means that we take the time to collaborate with our customers and create a design tailored to their unique needs.
- First, we take time to define the specific strategic and operational outcomes our clients want.
- Then we combine analytical and design thinking to create a design that ensures all elements of the business and user needs are considered and understood.
- Finally, using our proven development approach, we rapidly deploy our expert engineers to implement the new design so that it works as well in practice as it did on paper, delivering real and measurable results.
Our skilled engineers are experts in their fields, combining the depth of enterprise environments with the agility of a start-up, allowing us to adapt and respond to your needs.
Making the Secret Sauce: Design Thinking
- We hold “get to know you” interviews that feel more like conversations, where clients can share their comments, questions, wishes, and concerns throughout the development process. We also observe user behavior in real life to spot intricacies that may be left out of, or forgotten in, conversations.
- We select a limited set of needs that are most important for a particular user group.
- We put users in situations where they have to interact with prototypes and assess their feedback. After repeating the process many times, we arrive at a backlog for a Minimum Viable Product (MVP).
- We start with low-fidelity prototypes that are quick and inexpensive, and gradually move to more refined prototypes through iterations of testing.
- We employ brainstorming techniques and use group synergy to generate solution concepts.
- Journey Mapping
- Visual Design & Wireframes
- Opportunity Prioritization & Stakeholder Alignment
- UX Research
- Information Architecture
- Low & High-Fidelity Prototypes
Making the Secret Sauce: SCRUM
When organizing the project environment, the tools that you choose will be critical to the success of the project.
We recommend the tools below as a toolkit that, when used together, offer the best team experience and virtual collaboration. We will help you select the right tools to fit your needs with considerations such as licensing costs, setup effort, scalability, and data privacy:
Tracking & Organization
Knowledge base & Content sharing
Testing & Automation
Data Driven Delivery
“If you cannot measure it, you cannot manage it” – Peter Ferdinand Drucker.
For each project, we encourage the use of tracking and assessment metrics (agreed-upon measures used to evaluate how well the project is progressing toward its goals):
Productivity and Predictability
- Planned Velocity
- Velocity and Velocity AVG
- Team Capacity
- Sprint Completion Ratio
- Team Utilisation
- Required Velocity
- Defects Found and Defects Closed
- Defects Reopened
- Defect Removal Efficiency
- Defect Density
- Total Defects Found after implementation
- Total Number of Test Cases
- Number of Automated Test Cases
- Approve Burn-down
- Scope on Review
- Scope Approved DoD
- Scope Granularity
- Scope Readiness
- Scope Estimates Completeness
- High Estimated User Stories
- Blocked User Stories
- Grooming Horizon
- Backlog Changes
- Release Scope Growth
- Estimate Accuracy
- Total Lines of Code
- Changed Lines of Code
- Number of commits
- Duplicated Lines
- Code Violations Growth
- Code Violations (by priority)
- Code Review Coverage
- Code Duplication Growth
- Technical Debt
- Release Total Scope
- Release Completed Scope
- Release Remaining Scope
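Several of the metrics above reduce to simple ratios. The sketch below, using made-up numbers, shows how a few of them (Velocity AVG, Sprint Completion Ratio, Defect Removal Efficiency, and Defect Density) might be computed; the formulas are common industry definitions, not necessarily the exact logic behind our dashboards.

```python
from statistics import mean

# Illustrative sprint history; all numbers are made up for the example.
sprints = [
    {"planned_points": 40, "completed_points": 36},
    {"planned_points": 42, "completed_points": 42},
    {"planned_points": 38, "completed_points": 30},
]

# Velocity AVG: mean completed story points per sprint.
velocity_avg = mean(s["completed_points"] for s in sprints)

# Sprint Completion Ratio: completed vs. planned scope, per sprint.
completion_ratios = [
    s["completed_points"] / s["planned_points"] for s in sprints
]

# Defect Removal Efficiency: share of defects caught before release.
defects_before_release = 45
defects_after_release = 5
dre = defects_before_release / (defects_before_release + defects_after_release)

# Defect Density: defects per thousand lines of code (KLOC).
total_defects = 50
total_lines_of_code = 25_000
defect_density = total_defects / (total_lines_of_code / 1000)

print(velocity_avg)    # 36
print(dre)             # 0.9
print(defect_density)  # 2.0
```

On a real dashboard these values would be derived automatically from the tracking tools, and tracked as trends over time rather than as single snapshots.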
Customer & Employee Satisfaction
- Attrition Rate
- Turnover Ratio