Generative AI, with its ability to create content and generate code, is poised to reshape technology and programming. By leveraging advanced machine learning models, it can help developers automate tedious tasks, accelerate software development cycles, and generate new solutions to complex problems. It may also democratize programming by enabling people with limited coding experience to build sophisticated applications and tools. As generative AI continues to evolve, its impact on the landscape of technology and programming will be profound, opening the way for innovation and creativity in ways previously unimaginable.
We held a roundtable discussion on the topic with a group of experienced and distinguished industry leaders. The session was moderated by Kashyap Raibagi, Associate Director – Growth at AIM. The panelists were Yogananda Domlur Seetharama, Director of Data Science at Walmart Global Tech; Amberle Carter, Chief Data and Analytics Officer at the Texas Department of Family and Protective Services; Niharika Nanda, Vice President, Product Data & Analytics at Mastercard; Kalyana Bedhu, AI/ML Leader at Fannie Mae; and Senguttuvan Thangaraju, Senior Director, Enterprise Data Governance at McKesson.
The Rise of Gen AI in Technology
Over the past decade in this space, I have noticed that the accumulation, accessibility, and digitization of vast coding datasets, like those on GitHub, have been crucial in training Generative AI to assist with programming tasks. This data-driven evolution has enabled AI not just to support but to collaborate in the software development process, revolutionizing how we approach coding.
– Yogananda Domlur Seetharama, Director of Data Science at Walmart Global Tech
Fostering Innovation through Collaborative AI Development
It’s critical for us to collaborate no matter what it is because we, as peers, can bounce ideas off of each other and continue to improve. Whether it is creating a cohort, a centre of excellence, or whatever you want it to be where you’re running code or developing something new, someone else has new insights. It’s either to optimize whatever you’ve created, or to say, hey, have you thought of this? You’ve got a great foundation. Let’s build upon it and create something new and innovative you haven’t already considered. Especially in AI, collaboration strengthens what we’re doing because you’re helping to minimize things such as built-in bias by having different perspectives, and enhancing your product.

Collaboration is key, and it needs to happen as we move forward; otherwise, we’re just going to become siloed in our efforts, and we won’t maximize our output.

From a leader’s perspective, some people are hesitant to collaborate. They don’t know how, or are afraid to ask because they feel like they’re bothering someone or it’s not within their role. As leaders, we need to come in and help educate people to encourage that collaboration and growth within our teams, because that’s really how we’ll see more success as we continue forward. So a key point for us is to remove barriers to collaboration and continued growth.
– Amberle Carter, Chief Data and Analytics officer at Texas Department of Family and Protective Services
Trifecta of AI Advancements: Computing Power, Data, and Transformer Models
Advancements in AI systems have been recurrent over the last few decades. Every few years, we see significant advances that make true believers marvel and skeptics question their doubts.
With Generative AI, we are in another such cycle. Gen AI has been dominating the headlines. There is hype, and then there is also unparalleled progress and breakthroughs in numerous diverse fields, driven by a new class of AI models, “foundation models,” which are very powerful and adaptable.
The machine learning trifecta of compute, models, and data is what has made this possible. The adoption of GPUs for general-purpose computing between 2000 and 2010 provided the essential computing power for models to learn from large amounts of data. The ‘transformer’ model, introduced in 2017 by Google scientists, revolutionized AI and is largely responsible for the current Gen AI systems. And without enough rich, pertinent data to train the models, the GPT (‘generative pre-trained transformer’) systems wouldn’t exist today.
And while the popularity of Gen AI has sparked a race to innovate in technology, data, in both its quality and quantity, will ultimately exert an absolute influence over this space.
– Niharika Nanda, Vice President, Product Data & Analytics at Mastercard
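The 2017 ‘transformer’ model the panelist references is built around a single core operation, scaled dot-product attention, in which each token’s output is a similarity-weighted mix of all tokens’ values. As a minimal illustrative sketch (the matrix names and toy dimensions below are our own, not from the discussion):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted mix of values

# Toy example: 3 tokens, each a 4-dimensional embedding
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one mixed vector per token
```

Because every token attends to every other token in parallel, this operation scales well on exactly the GPU hardware the quote describes, which is one reason compute, models, and data reinforce one another.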
Translating Strategic Education into Tactical Team Growth
It means multiple things. The best form of education is not putting people in a classroom and telling them what to do; that’s the least effective. If people actually see other people doing it, that’s the best form of education. If they see colleagues who have accelerated slightly, they can enjoy life a little more: those colleagues go home on time while you’re struggling to finish your next PI, or they take on more tasks in the same time. That opens up the motivation and intent, and all education follows intent. This is one idea: create that delta and create acceleration there. The second thing is supporting them, knowing that not everybody immediately adopts a new technology. There is a change management aspect to it. Be friendly and make it as easy as possible for them to adopt the technology, with the least amount of friction.
So don’t create the solutions before they need them; instead, create a demand or intent to accelerate, and provide the supply (AI copilots and training) to make that happen. That’s the real education I’ve seen in teams that moved naturally. It’s the culture at the end of the day.
– Kalyana Bedhu, AI/ML Leader at Fannie Mae
AI Success Blueprint: A Strong Foundation in Data Governance, Quality, Security, Privacy, and Overall Data Management
Developing AI models and delving into the technological aspects of AI can be compared to constructing a house. Just as a house requires a robust foundation, in the realm of AI that foundation comprises the essential data management principles that practitioners have adhered to over the years: data governance, data quality, security, privacy, and others. Any AI project must adhere to these foundational aspects; none should be overlooked. Adhering to these basics ensures a stronger outcome, and by prioritizing these fundamental principles, we can harness the full power of AI to expedite our progress.
– Senguttuvan Thangaraju, Senior Director, Enterprise Data Governance at McKesson