Rapidly evolving business ecosystems, regulatory environments, and consumerization-of-IT mandates demand greater flexibility, security, and resilience from our IT systems than ever before.
So far, we have witnessed artificial intelligence (AI) transform every aspect of business and operations, the underlying IT systems, and the development processes. While the software development life cycle (SDLC) is already being streamlined and accelerated using Agile and DevOps, challenges remain around prevailing mindsets and skill gaps in achieving hyperautomation and consistently leveraging best-in-class engineering practices.
Interestingly, AI and machine learning (ML) can come to the rescue by capturing the large volumes of data generated by various software engineering tools, including CI/CD tools, to create models and detect patterns. These models can be used to pinpoint anomalies, predict failures, and suggest remediation, enabling us to make a quantum leap towards building high-performing autonomous systems.
Let’s look at how different stages of DevOps can benefit from AI:
Business stakeholders expect applications to deliver new functionality and resolve issues swiftly. Continuous planning ensures that inputs are received in various structured and unstructured forms, including feature or service requests, trouble tickets, user feedback, surveys, and market analysis. These inputs are constantly evaluated, converted into user stories, and moved into the product backlog.
Natural Language Processing (NLP) can be leveraged to interpret unstructured inputs that may arrive via emails, voice messages, calls to agents, and comments on the website. It helps capture user requirements and pain points, along with the relevant intent, more accurately. Further, these inputs can be aggregated and summarized to provide insights to product owners and other business stakeholders, enabling them to plan and prioritize features and bug fixes for subsequent releases.
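As a minimal illustration of routing unstructured inputs by intent, consider the keyword-based sketch below. The intent names and lexicons are hypothetical placeholders; a real continuous-planning pipeline would use a trained NLP model rather than keyword matching.

```python
from collections import Counter

# Hypothetical intent lexicons; a production system would replace these
# with a trained classifier over the raw feedback text.
INTENT_KEYWORDS = {
    "bug_report": {"error", "crash", "broken", "fails", "exception"},
    "feature_request": {"add", "support", "wish", "feature"},
    "praise": {"great", "love", "excellent", "thanks"},
}

def classify_intent(text: str) -> str:
    """Return the intent whose keyword set best overlaps the input text."""
    words = set(text.lower().split())
    scores = Counter({intent: len(words & keywords)
                      for intent, keywords in INTENT_KEYWORDS.items()})
    intent, score = scores.most_common(1)[0]
    return intent if score > 0 else "unknown"

print(classify_intent("The app crashes with an error on login"))  # bug_report
```

Classified items could then flow straight into backlog grooming, with each intent mapped to a story type or ticket queue.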
This step involves integrating code from multiple developers and generating incremental builds regularly to minimize risk. A chatbot with Natural Language Generation (NLG) capability can trigger builds on demand and send custom alerts and notifications in case of issues or failures. Further, historical data from previous code commits, builds, and generated logs can be analyzed to understand patterns and identify hot spots, helping teams avoid similar pitfalls in the future. Static code analysis and unit testing are two other vital activities that can benefit from AI. Once code analysis, triggered in the background after a developer commits code, has completed, the results can be fed into a conversation engine that summarizes them as text or synthesized voice, guiding the developer to improve code quality before testing.
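The summarization step of such a conversation engine can be sketched as follows. The finding format (severity, rule, file) is an assumption for illustration; a real pipeline would consume the output of an actual static-analysis tool and could pass the summary string to a text-to-speech engine for the voice channel.

```python
from collections import Counter

def summarize_findings(findings):
    """Condense raw static-analysis findings into a short, speakable summary.

    findings: list of (severity, rule, file) tuples -- an assumed shape,
    standing in for the output of a real static-analysis tool.
    """
    if not findings:
        return "No issues found. Good to proceed to testing."
    by_severity = Counter(sev for sev, _, _ in findings)
    hot_files = Counter(f for _, _, f in findings).most_common(2)
    parts = [f"{n} {sev} issue(s)" for sev, n in by_severity.most_common()]
    hot = ", ".join(f for f, _ in hot_files)
    return f"Found {', '.join(parts)}. Hot spots: {hot}."

print(summarize_findings([
    ("high", "null-deref", "auth.py"),
    ("low", "style", "auth.py"),
    ("high", "resource-leak", "db.py"),
]))
```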
AI can be applied effectively in the quality assurance (QA) process to augment less obvious but critical auxiliary activities beyond test execution and reporting. For instance, test engineers can be supplemented with an intelligent assistant to classify defects and detect any duplicates automatically during testing execution. This can significantly optimize the defect triaging process, which is otherwise cumbersome and time-consuming.
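A bare-bones version of duplicate-defect detection can be sketched with plain string similarity, as below. The 0.6 threshold is an illustrative choice, and real triage assistants typically compare semantic embeddings of defect text rather than character sequences.

```python
from difflib import SequenceMatcher

def find_duplicates(new_summary, existing, threshold=0.6):
    """Return existing defect IDs whose summaries closely match a new defect.

    existing: dict mapping defect ID -> summary text.
    threshold: similarity cutoff (an assumed, illustrative value).
    """
    matches = []
    for defect_id, summary in existing.items():
        score = SequenceMatcher(None, new_summary.lower(),
                                summary.lower()).ratio()
        if score >= threshold:
            matches.append((defect_id, round(score, 2)))
    return sorted(matches, key=lambda m: -m[1])

backlog = {
    "D-101": "Login button unresponsive on mobile",
    "D-102": "Report export times out",
}
print(find_duplicates("Login button not responding on mobile", backlog))
```

Flagged candidates would be surfaced to the test engineer for confirmation rather than auto-merged, keeping a human in the triage loop.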
For failed tests, logs can be analyzed to uncover recurring patterns, which can then be used to create and train models that predict the causes of failures in future test runs. For systems in production, where most test cases may already be available, NLP can be applied to convert these test cases into scripts that can be directly consumed by popular automated testing frameworks like Selenium or Appium. Further, similar tests can be grouped into clusters, based on semantic similarity and their history of success or failure, to save execution time and optimize regression testing.
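The clustering idea can be illustrated with a greedy grouping of tests by name similarity, as in the sketch below. Jaccard overlap on test names and the 0.5 threshold are simplifications chosen for the example; a production system would also weigh semantic embeddings and historical pass/fail correlation.

```python
def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two test names."""
    a_words, b_words = set(a.lower().split()), set(b.lower().split())
    return len(a_words & b_words) / len(a_words | b_words)

def cluster_tests(test_names, threshold=0.5):
    """Greedy single-pass grouping: each test joins the first cluster
    whose representative (first member) is similar enough, else starts
    a new cluster. Threshold is an illustrative assumption."""
    clusters = []
    for name in test_names:
        for cluster in clusters:
            if jaccard(name, cluster[0]) >= threshold:
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

print(cluster_tests([
    "login with valid password",
    "login with invalid password",
    "export report to pdf",
]))
```

A regression run could then execute one representative per cluster first, deferring near-duplicates until time allows.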
Technology has played a vital role in automating software deployment, from the days when deployment jobs were manually triggered using handwritten scripts to today's single-click, multi-stage automated deployments. Despite this advancement, failed and sub-optimal deployments with frequent rollbacks remain a problem for many organizations, leading to delayed launches and lost revenue. AI can be crucial in handling the complexity of deployments and reducing failure rates.
For instance, ontologies of an organization's infrastructure assets, including software, databases, and hardware, can be created for different environments such as dev-test, staging, and production. This is possible using a combination of subject matter experts' knowledge, Configuration Management Databases (CMDBs), and network discovery tools. System and application-specific logs generated during past deployments can be stored, parsed, and analyzed along with the ontology elements to predict possible failures in future deployments. These predictions can be compared with actual deployment outcomes to identify new patterns, and the resulting learnings can be applied as preventive measures to make future deployments more predictable and reliable.
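A minimal sketch of learning from past deployment logs might look like the following. The error-token heuristic (uppercase runs such as `DB_TIMEOUT`) and the log format are assumptions for illustration; a real system would join parsed log features with the ontology elements described above and train a proper classifier.

```python
import re
from collections import defaultdict

ERROR_TOKEN = re.compile(r"[A-Z_]{4,}")  # crude heuristic for error codes

def learn_failure_signatures(past_deployments):
    """Estimate, per error token, how often it co-occurred with failure.

    past_deployments: list of (log_text, failed) pairs from deployment
    history -- an assumed input shape for this sketch.
    """
    stats = defaultdict(lambda: [0, 0])  # token -> [failures_seen, total_seen]
    for log, failed in past_deployments:
        for token in set(ERROR_TOKEN.findall(log)):
            stats[token][1] += 1
            if failed:
                stats[token][0] += 1
    return {token: fails / total for token, (fails, total) in stats.items()}

def risk_score(new_log, signatures):
    """Risk of a new deployment: worst failure rate among tokens present."""
    tokens = set(ERROR_TOKEN.findall(new_log))
    return max((signatures.get(t, 0.0) for t in tokens), default=0.0)

history = [
    ("schema migration hit DB_TIMEOUT", True),
    ("rollout done, all checks green", False),
    ("DB_TIMEOUT while applying migration", True),
]
signatures = learn_failure_signatures(history)
print(risk_score("health check saw DB_TIMEOUT", signatures))  # 1.0
```

High-risk deployments could then be paused for review or routed to a canary environment before promotion.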
Monitoring production releases allows product owners, QA, and development teams to understand how the applications are performing and being used. Enormous data in the form of alerts, incidents, logs, events, and metrics are generated from the applications, dependent systems, tools, and other network components. AI can help generate insights from this large data pool by creating trained models using supervised and unsupervised learning. These models can assist in identifying anomalous behavior, which can cause potential vulnerabilities and failures.
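As one of the simplest unsupervised baselines for spotting anomalous behavior in a metric stream, a z-score detector is sketched below. The threshold of 3 standard deviations is a conventional but illustrative choice; production monitoring stacks layer on seasonality-aware and multivariate models.

```python
from statistics import mean, stdev

def detect_anomalies(values, z_threshold=3.0):
    """Flag (index, value) pairs lying more than z_threshold standard
    deviations from the mean of the series. A deliberately simple,
    unsupervised baseline for illustration."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mu) / sigma > z_threshold]

# A latency series that is flat except for one spike at the end.
latencies_ms = [10.0] * 20 + [500.0]
print(detect_anomalies(latencies_ms))  # [(20, 500.0)]
```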
Additionally, explicit feedback on issues faced by end users can be obtained through different channels such as emails, text messages, and voice-based interactive conversations. This feedback, along with usage patterns, can be processed to perform sentiment and usability analysis and better understand the customer experience with the product or service. Eventually, the outcome of this analysis can serve as a critical input for perfective maintenance or new user story creation, leading to user experience enhancements.
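The sentiment side of that analysis can be sketched with a tiny lexicon-based scorer, as below. The word lists are hypothetical placeholders; production analysis would run a trained sentiment model over the aggregated feedback channels.

```python
# Hypothetical sentiment lexicons for illustration only.
POSITIVE = {"great", "love", "fast", "easy", "helpful"}
NEGATIVE = {"slow", "crash", "confusing", "broken", "hate"}

def sentiment(feedback: str) -> str:
    """Label feedback by counting positive vs negative lexicon hits."""
    words = set(feedback.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("love how fast and easy it is"))       # positive
print(sentiment("the export is slow and confusing"))   # negative
```

Aggregated labels per feature would then feed directly into the backlog prioritization described above.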
Digital technologies are already transforming businesses across industries today. DevOps plays a pivotal role in this transformation story by ensuring that products and services based on new-age technologies are ready for consumption seamlessly and reliably. AI promises to take the DevOps movement to the next level by infusing intelligence based on best practices and eliminating human and system errors. This can not only accelerate the concept-to-deployment cycle significantly but also enable us to realize the seemingly unattainable goal of building flexible, self-learning, and responsive autonomous systems.
Aman has over 25 years of experience in the IT industry spanning digital engineering, intelligent automation, and delivery excellence. He currently leads the DevOps and Quality Engineering practice at Virtusa, which is responsible for consulting and advisory services, competency development, and solution engineering for global clients. Aman has multiple patents to his credit in applying AI/ML techniques to automate and optimize the SDLC process.