
Mumbai, September 26: In the fast-evolving landscape of technology leadership, staying abreast of innovations while upskilling is paramount. Early adoption of emerging trends positions businesses for long-term success. In the transformation of healthcare and insurance, DevOps is the linchpin, enhancing performance, scalability, and security, while DevSecOps ensures compliance and bolsters data protection. This article delves into the crucial skills a DevOps practitioner should possess and explores how the open-source community is driving ethical AI development, with a focus on transparency and explainability.

To shed light on these topics, Mr. Marquis Fernandes, Head of Business at Quantic India, sat down for a conversation with Dr. Anand Mahalingam, Vice President – Data Labs and Head of AI at HDFC Life. Dr. Mahalingam has over 22 years of industry experience in information technology and data science (AI/ML). He is highly skilled in digital transformation, using digital technologies (artificial intelligence, machine learning, deep learning, NLP, data analytics, mobile, and IoT) to radically improve business processes, operational efficiency, and customer experience. He has developed and implemented several enterprise-grade artificial intelligence products for the customer service, healthcare, telecom, and insurance industries.

How do you stay up to date on new innovations as a tech leader while focusing on upskilling at the same time?
As a technology leader, you need to continuously watch the innovation happening around you and spot the emerging trends that can positively or negatively impact your business domain. Being an early adopter of innovation prepares you and your organisation for the long term and positions you at the top of the market. Everyone has limited time to keep up with every innovation; however, following tech trends through newsletters and blogs helps you track things, and, more importantly, attending webinars and conferences lets you absorb developments quickly. Further, a city like Bangalore gives you additional opportunities to join meet-ups and focused group discussions regularly. Once you have spotted something new and need deeper knowledge of the subject, the opportunities are wide open to gain formal expertise through MOOCs and certifications accredited by technology partners (e.g., AWS, GCP, Azure, OpenAI).

You believe in leveraging digital technology to transform the healthcare and insurance industries. How do you think DevOps can play a role in this transformation?
In healthcare systems, DevOps has the potential to improve the performance, reliability, and scalability of information technology systems while ensuring regulatory compliance and the protection of sensitive patient data. The same applies to the insurance industry. DevOps helps healthcare organisations develop and release software applications more quickly: a new feature for a patient app or a significant update to an existing electronic health records system can be deployed faster, improving patient care and overall efficiency. DevSecOps, a subset of DevOps, helps ensure security by incorporating measures such as security testing, code scanning, and compliance checks into the development process, so that applications adhere to industry regulations like HIPAA (Health Insurance Portability and Accountability Act). In both insurance and healthcare, analytics plays an essential role; DevOps supports data engineering to collect and organise structured and unstructured data through ETL pipelines and continuously feeds that data to downstream applications like BI dashboards and machine learning algorithms.
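
As a minimal illustration of such an ETL step, the Python sketch below extracts records from a CSV export, normalises a few fields, and emits JSON lines for downstream consumers. The file names and claim schema are hypothetical, chosen only to show the extract-transform-load pattern.

```python
import csv
import json

# Hypothetical ETL step: pull raw claim records from a CSV export,
# normalise a few fields, and emit JSON lines for downstream consumers
# such as BI dashboards or ML feature pipelines.

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(record):
    return {
        "claim_id": record["claim_id"].strip(),
        "amount": float(record["amount"]),           # enforce numeric type
        "status": record["status"].strip().lower(),  # normalise casing
    }

def load(records, out_path):
    with open(out_path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")

if __name__ == "__main__":
    load((transform(r) for r in extract("claims.csv")), "claims.jsonl")
```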

As a senior, experienced leader, what do you consider the essential skills of a DevOps practitioner?
To succeed in this dynamic field, DevOps professionals must possess a range of essential skills. The first and foremost is scripting and coding. Proficiency in scripting languages like Python, Bash, or PowerShell allows DevOps engineers to automate repetitive tasks such as configuration management and deployment, streamlining processes and reducing errors.
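
As an illustration of this kind of automation, here is a small, self-contained Python sketch of a post-deployment smoke test. The health-check URL and retry budget are assumptions made for the example, not details from any specific system.

```python
import sys
import time
import urllib.request

# Minimal post-deployment smoke test: poll a service health endpoint and
# fail the pipeline if it never reports healthy. URL and retry budget are
# illustrative placeholders.

HEALTH_URL = "http://localhost:8080/health"

def check_health(url, retries=5, delay=3):
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    print(f"healthy after {attempt} attempt(s)")
                    return True
        except OSError as exc:  # covers URLError/HTTPError and socket errors
            print(f"attempt {attempt} failed: {exc}")
        time.sleep(delay)
    return False

if __name__ == "__main__":
    sys.exit(0 if check_health(HEALTH_URL) else 1)
```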

In most organisations, the DevOps lead is in charge of source code management. Utilising tools like Git and platforms like GitHub or GitLab, DevOps practitioners can manage version control, collaborate on code with team members, and track changes. This ensures code integrity and facilitates teamwork.
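
Teams often wrap routine version-control chores in scripts as well. The sketch below, a hypothetical release-tagging helper around the git CLI, shows the idea; the version number and remote name are illustrative.

```python
import subprocess

# Hypothetical release-tagging helper: a thin Python wrapper around the
# git CLI of the kind DevOps scripts use to automate version-control chores.

def run(*args):
    """Run a git subcommand and return its trimmed stdout (raises on failure)."""
    return subprocess.run(
        ["git", *args], check=True, capture_output=True, text=True
    ).stdout.strip()

def tag_release(version):
    head = run("rev-parse", "--short", "HEAD")  # commit being released
    run("tag", "-a", f"v{version}", "-m", f"Release v{version}")
    run("push", "origin", f"v{version}")
    print(f"tagged {head} as v{version}")

if __name__ == "__main__":
    tag_release("1.4.2")  # version is a placeholder for this sketch
```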

DevOps encourages the use of Infrastructure-as-Code (IaC), and experience with tools like Terraform or Ansible enables DevOps teams to define and manage infrastructure as code, provisioning and configuring servers, networks, and cloud resources efficiently. For every DevOps practitioner, knowledge of CI/CD concepts (Continuous Integration and Continuous Deployment/Delivery) is essential, and practical experience with tools like Jenkins, Travis CI, or GitLab CI/CD for automating code integration, testing, and deployment helps in a big way.
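
The underlying CI/CD principle, integrate and test before you deploy, can be sketched in a few lines of Python. The stage commands below are placeholders for what a Jenkins or GitLab CI/CD pipeline would run declaratively; `deploy.sh` in particular is a hypothetical script.

```python
import subprocess
import sys

# Sketch of the CI/CD gate: each stage must succeed before the next runs.
# Commands are placeholders; real pipelines declare equivalent stages in
# Jenkinsfiles or .gitlab-ci.yml.

STAGES = [
    ("test",   ["pytest", "-q"]),                          # continuous integration
    ("build",  ["docker", "build", "-t", "app:ci", "."]),  # build artefact
    ("deploy", ["./deploy.sh", "staging"]),                # hypothetical deploy step
]

for name, cmd in STAGES:
    print(f"--- stage: {name} ---")
    if subprocess.run(cmd).returncode != 0:
        print(f"stage '{name}' failed; aborting pipeline")
        sys.exit(1)
print("pipeline succeeded")
```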

Apart from technical skills, DevOps professionals need soft skills such as effective communication, collaboration, and problem-solving. These facilitate cooperation between development, operations, and other teams, leading to smoother workflows and faster issue resolution. Proficiency in cloud-native technologies like Kubernetes and Docker allows DevOps engineers to build and manage containerised applications that scale seamlessly in cloud environments, enhancing flexibility and scalability. Beyond hands-on skills and experience with the various DevOps tools, industry certifications like AWS Certified DevOps Engineer, Microsoft Certified: Azure DevOps Engineer Expert, or Certified Kubernetes Administrator (CKA) validate a practitioner's skills and knowledge, boosting career prospects and credibility.
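
As a small example of working with cloud-native tooling, the sketch below uses the official Kubernetes Python client to check deployment readiness. The namespace and the assumption of a local kubeconfig are illustrative.

```python
from kubernetes import client, config  # pip install kubernetes

# Readiness check sketch: list deployments in a namespace and compare
# ready replicas against the desired count. Namespace is a placeholder.

config.load_kube_config()  # reads ~/.kube/config; use load_incluster_config() in-cluster
apps = client.AppsV1Api()

for dep in apps.list_namespaced_deployment(namespace="default").items:
    desired = dep.spec.replicas or 0
    ready = dep.status.ready_replicas or 0
    state = "OK" if ready >= desired else "DEGRADED"
    print(f"{dep.metadata.name}: {ready}/{desired} replicas ready [{state}]")
```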

In conclusion, a successful DevOps practitioner needs a multifaceted skill set, combining technical proficiency with soft skills, to thrive in this fast-paced domain.

As artificial intelligence becomes more advanced, how is the open-source community contributing to ethical AI development and preventing biases?
The core principle of Responsible AI is to keep human well-being at the centre of AI development. As more and more companies suffer incidents of bias in their models, the tech community is looking for more efficient and powerful tools to overcome these challenges. Among the core principles of Responsible AI, transparency and explainability attract the most interest. Fortunately, a set of open-source tools is available to keep teams on track during the development phase itself.

  • The What-If Tool, launched by Google, is a feature of the open-source TensorBoard web application. It helps users analyse an ML model without writing separate code. Given pointers to a TensorFlow model and dataset, the What-If Tool offers an interactive visual interface for exploring model results. It also allows manual editing of examples from your datasets to see the effect of those changes.
  • The Fairlearn tool by Microsoft assesses an AI system's impact on people and their opportunities. It helps ensure equal treatment in hiring, school and college admission, and lending-related algorithms (a minimal usage sketch follows this list).
  • The Responsible AI Toolbox comprises a set of tools, including interfaces and libraries, designed to enhance comprehension of AI systems. These resources empower developers and stakeholders to responsibly build and oversee AI systems, enabling more informed data-driven decision-making.
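
As a minimal example of the Fairlearn approach, the sketch below uses its MetricFrame to compare model accuracy across a sensitive attribute. All data here is synthetic, invented purely for demonstration.

```python
import pandas as pd
from sklearn.metrics import accuracy_score
from fairlearn.metrics import MetricFrame  # pip install fairlearn

# Synthetic labels, predictions, and a sensitive attribute for a toy model.
y_true = pd.Series([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = pd.Series([1, 0, 0, 1, 0, 1, 1, 1])
gender = pd.Series(["F", "F", "F", "F", "M", "M", "M", "M"])

# MetricFrame slices any sklearn-style metric by the sensitive feature.
mf = MetricFrame(metrics=accuracy_score,
                 y_true=y_true, y_pred=y_pred,
                 sensitive_features=gender)

print("overall accuracy:", mf.overall)
print("accuracy by group:\n", mf.by_group)
print("max disparity between groups:", mf.difference())
```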

In a world of real-time data processing and analytics, how can DevOps practices be applied to ensure timely deployment and continuous monitoring of streaming data pipelines?
DevOps practices play a significant role in managing streaming data pipelines end to end. For example, DevOps-based auto-scaling mechanisms driven by metrics and thresholds help achieve optimal resource utilisation and performance when data volumes or processing requirements fluctuate. They also help manage data-drift-related problems.
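
A simplified sketch of that metric-and-threshold logic, loosely modelled on the proportional algorithm autoscalers such as the Kubernetes HPA use, might look like this; the lag numbers and replica bounds are illustrative.

```python
# Toy scaling decision: size consumer replicas proportionally to how far
# the observed queue lag exceeds a target. All thresholds are illustrative.

def desired_replicas(current, queue_lag, target_lag=1000,
                     min_replicas=2, max_replicas=20):
    factor = queue_lag / target_lag          # how far we are from target
    proposed = max(1, round(current * factor))
    return min(max(proposed, min_replicas), max_replicas)

print(desired_replicas(current=4, queue_lag=3000))  # lag high -> scale to 12
print(desired_replicas(current=4, queue_lag=400))   # lag low  -> scale to 2
```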

In some cases, such as a feed from an operational database, data drift is an anomaly, and any unforeseen change in the data structure should be flagged for attention. With semi-structured data, by contrast, drift is expected as upstream data sources change unexpectedly. A carefully configured DevOps system captures this and propagates changing schemas to their destinations.
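
A minimal sketch of such a drift check might look like the following; the field names and the strict-versus-lenient split between the two feed types are illustrative assumptions.

```python
# Schema-drift check sketch: a strict feed treats any deviation as an
# anomaly; a lenient (semi-structured) feed records new fields so they can
# be propagated downstream. Field names are hypothetical.

EXPECTED_FIELDS = {"policy_id", "premium", "status"}

def check_schema(record, strict):
    fields = set(record)
    missing, extra = EXPECTED_FIELDS - fields, fields - EXPECTED_FIELDS
    if strict and (missing or extra):
        raise ValueError(f"schema drift: missing={missing} extra={extra}")
    return extra  # downstream can propagate newly seen fields

# Operational-database feed: any drift is an anomaly.
check_schema({"policy_id": 1, "premium": 950.0, "status": "active"},
             strict=True)

# Semi-structured feed: new fields are expected and propagated.
new = check_schema({"policy_id": 2, "premium": 1200.0,
                    "status": "active", "rider": "accident"}, strict=False)
print("new fields to propagate:", new)
```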

What are your views on the role of collaboration tools, communication practices, and cross-functional teams in the successful integration of DevOps with AI/ML projects?
Collaboration is the core of DevOps, which seeks to integrate development and operations teams seamlessly. This integration is built on the principle that when these teams work closely together, they can achieve several key benefits in successfully delivering software products.

Firstly, collaboration ensures that development and operations teams align their efforts. Developers can build the software with operational considerations in mind, making it easier to deploy and maintain. Operations teams, in turn, can provide valuable insights during the development phase, identifying potential issues and requirements upfront. Communicating these points promptly, whether through a Slack chat, a team call, or a JIRA ticket, makes the environment more productive.

Effective collaboration also relies on robust information-sharing practices. When an issue arises during deployment or operation, it should be meticulously documented. A dedicated section in Confluence or SharePoint for project-specific updates serves as a feedback loop for the development team, enabling them to address and rectify issues in future builds.

To know more about us / publish your article, reach us at
www.quanticindia.com
marquis@quanticindia.com
