Artificial intelligence is revolutionizing the software delivery life cycle (SDLC). From writing and optimizing code to reviewing and debugging, AI is making its mark, and SaaS offerings are leveraging it for summarization, chat features, and various forms of analysis. However, the rise of AI, particularly generative AI, brings new challenges, such as managing non-deterministic output, where hallucinations may become the new programming joke alongside classic errors.
Yet the need for DevOps remains unchanged. DevOps continues to play a crucial role in delivering high-quality code faster and more frequently. According to DORA’s State of DevOps reports, four key metrics describe software delivery performance: lead time for changes, deployment frequency, change failure rate, and failed deployment recovery time.
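These four metrics can all be derived from basic deployment records. The sketch below is illustrative only; the record fields and function name are assumptions, not a DORA-prescribed schema:

```python
from datetime import datetime

# Hypothetical deployment records: when the change was committed,
# when it was deployed, whether it failed, and when it was recovered.
deployments = [
    {"committed": datetime(2024, 5, 1, 9), "deployed": datetime(2024, 5, 1, 15),
     "failed": False, "recovered": None},
    {"committed": datetime(2024, 5, 2, 10), "deployed": datetime(2024, 5, 3, 11),
     "failed": True, "recovered": datetime(2024, 5, 3, 13)},
]

def dora_metrics(deploys, period_days=7):
    # Lead time for changes: commit-to-deploy duration, in hours.
    lead_times = sorted(
        (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys
    )
    failures = [d for d in deploys if d["failed"]]
    # Failed deployment recovery time: deploy-to-recovery, in hours.
    recovery = [
        (d["recovered"] - d["deployed"]).total_seconds() / 3600
        for d in failures if d["recovered"]
    ]
    return {
        "deployment_frequency_per_day": len(deploys) / period_days,
        "median_lead_time_hours": lead_times[len(lead_times) // 2],
        "change_failure_rate": len(failures) / len(deploys),
        "mean_recovery_hours": sum(recovery) / len(recovery) if recovery else 0.0,
    }

print(dora_metrics(deployments))
```

A real pipeline would pull these records from a CI/CD system rather than a hard-coded list, but the calculations stay the same.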
The DevOps and AI Intersection
Interest in DevOps peaked in March 2022, according to Google Trends, and has since seen a gradual decline. In contrast, AI has surged in popularity since the launch of ChatGPT in November 2022. This shift in interest doesn’t imply that AI is replacing DevOps. Rather, DevOps is integral to AI’s success. Strong DevOps practices ensure that AI demands for frequent code updates, automated quality processes, and rapid remediation methods are met.
The DevOps Muscle Group
The DevOps “muscle group” is essential for AI. Without strong DevOps practices, organizations risk shipping vulnerabilities to production. Hugging Face, often described as the GitHub of AI, exemplifies how rapidly foundation models evolve, requiring constant updates and prompt engineering. For organizations already struggling with DevOps for non-AI code, implementing AI will be even more challenging. There is no shortcut around DevOps.
MLOps and AIOps: Extending DevOps
MLOps applies DevOps principles to machine learning models, helping extend DevOps practices to AI workloads. While MLOps builds on DevOps foundations, AIOps focuses on using AI and ML to assist IT operations. This distinction is crucial for understanding how to operate AI effectively.
Case Study: DevOps at the Edge
DevOps and the cloud have matured together, with cloud-native tooling supporting DevOps outcomes. However, software that cannot move to the cloud, such as applications running on oil rigs, factory floors, or in retail stores, presents unique challenges. Data generated at the edge can be costly and slow to send back to the cloud for processing.
For instance, a factory using video cameras to scan for production defects would find the cost of transferring that footage to the cloud prohibitive. Similarly, an oil rig’s pressure-sensor readings may need analysis faster than a round trip to the cloud allows. Edge environments therefore require tailored DevOps practices distinct from cloud-native assumptions.
Overcoming Edge Challenges
To address these challenges, begin with the basics:
– Embrace Edge Realities: Understand that edge environments differ from the cloud.
– Utilize DORA Metrics: Baseline your current edge code performance using key metrics.
– Platform Approach: Solve for all components and layers of the stack, not just individual workloads.
– Ensure Observability: Monitor code behavior in production to avoid flying blind.
– Prioritize Security: Implement zero-trust environments and secure enclaves.
– Add MLOps: Manage AI/ML models with MLOps practices, including model versioning and monitoring performance.
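The last point, tracking model versions and watching production performance against an offline baseline, can be sketched minimally. All names below (the `ModelRecord` class, its fields, and the drift threshold) are illustrative assumptions, not the API of any specific MLOps tool:

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """Tracks one deployed edge model's version and recent accuracy."""
    name: str
    version: str
    baseline_accuracy: float  # accuracy measured offline before deployment
    recent_accuracy: list = field(default_factory=list)

    def record(self, accuracy: float, window: int = 50):
        # Keep a rolling window of production accuracy samples.
        self.recent_accuracy.append(accuracy)
        self.recent_accuracy = self.recent_accuracy[-window:]

    def degraded(self, tolerance: float = 0.05) -> bool:
        # Flag for retraining or rollback if the rolling average drops
        # more than `tolerance` below the offline baseline.
        if not self.recent_accuracy:
            return False
        rolling = sum(self.recent_accuracy) / len(self.recent_accuracy)
        return rolling < self.baseline_accuracy - tolerance

model = ModelRecord("defect-detector", "2.1.0", baseline_accuracy=0.94)
for acc in (0.93, 0.88, 0.85):
    model.record(acc)
print(model.degraded())  # rolling average ~0.887 is below 0.89, so True
```

In practice a model registry and monitoring stack would handle this, but the core loop, version the model, measure it in production, and compare against a baseline, is the same.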
DevOps remains vital in the AI era. Its success measures continue to guide AI deployments. Strengthening DevOps practices, particularly at the edge, ensures high-quality AI code deployment.
Note: This article is inspired by content from https://thenewstack.io/ai-devops-is-dead-ai-at-the-edge-long-live-devops/. It has been rephrased for originality. Images are credited to the original source.