Council Post: Roundtable Discussion on Building a Team for Generative AI

The success of generative AI projects relies on assembling a diverse team with expertise in various domains, including AI research, data science, domain knowledge, and software engineering.

In today’s rapidly evolving landscape of artificial intelligence, harnessing the power of generative AI has become a game-changer for organizations seeking to drive innovation and unlock new possibilities. Building a highly skilled team that can effectively navigate and leverage the potential of generative AI is crucial for staying ahead in this competitive arena. Such a team requires a unique blend of expertise, combining technical prowess in AI algorithms, data science, and machine learning with a deep understanding of the specific domain or industry. With the right talent in place, organizations can unleash the creative potential of generative AI, pushing the boundaries of what is possible and revolutionizing the way we approach problem-solving, creativity, and intelligent decision-making.

The AIM Leaders Council hosted a roundtable session on “Building a team for Generative AI” during MachineCon 2023 with the following leaders: Anish Agarwal, Global Head of Analytics at Dr. Reddy’s Laboratories; Swati Jain, Vice President, Analytics at ExlService; Purnesh Gali, Cofounder and CEO at Actalyst; Narasimha Medeme, VP, Head Data Science at MakeMyTrip; Shirsha Ray Chaudhuri, Director of Engineering – TR Labs at Thomson Reuters; Shan Duggatimatad, Data & AI Leader, Sr Director at Ascendion; Raj Bhatt, CEO at Knowledge Foundry; and Deepika Kaushal, Head of Everyday AI & Digital Analytics at Piramal Capital & Housing Finance.

Building High-Performing Teams for Generative AI: Challenges and Considerations

One of the biggest challenges organizations face today is setting up a team to build and work on generative AI use cases. This problem is multi-fold. The first challenge is the availability of the right skill set. Large language models (LLMs), the backbone of generative AI, have gained prominence only in the past few months, yet companies float job specs asking for experts with several years of LLM experience, which is not pragmatic. Rather than waiting for applications from seasoned professionals, it would be prudent for companies to upskill their existing data scientists, given that they already have a good sense of the business.

The second challenge is businesses’ limited understanding of generative AI, which impedes identifying the use cases most likely to deliver true value. There is a huge overlap between generative AI and automation, and most leaders see generative AI as a quick fix for complex business challenges without considering whether data, particularly the good-quality data on which LLMs run, is actually available.

The third and most critical challenge is defining a tangible outcome or business benefit from generative AI. When identifying use cases, it is important to follow a funnel approach with several qualifying parameters: a) the use case should have a clearly quantifiable business benefit, and b) the support and resources required to build it should be assessed, including whether the existing team of data scientists can solve it and, if not, whether the gap is one of capacity or capability that calls for partnering with an expert.

Mainstream use of generative artificial intelligence (AI) has arrived, and with it the promise of transformative potential for business. Businesses and large organisations see potential everywhere they look: to transform complex and expensive processes, and to do things that were out of practical reach until now. Key to success is a clear understanding of the strengths and weaknesses of these tools, as well as the future opportunities they will create.

Anish Agarwal, Global Head of Analytics at Dr. Reddy’s Laboratories

Nurturing Teams for Generative AI with Adaptation and Collaboration

Teams are learning and ramping up. There are team members who have been reading about transformers and LLMs and have a deep interest in them; the moment generative AI arrived, they started exploring it. A lot of internal and external courses have started on various aspects of generative AI, so that more and more people get trained and we can serve organizations by enhancing customer experience, increasing employee productivity, and improving enterprise efficiency. From a technical-expertise point of view, we are identifying and training data scientists who may not be working in these areas but are very keen to. Secondly, in every vertical we are trying a variety of use cases of direct relevance to clients, such as search, recommendations, or conversational BI. Thirdly, we emphasize domain knowledge, to reap the maximum value out of generative AI.

Swati Jain, Vice President – Analytics at ExlService

The Role of Research and Talent in the Evolving AI Ecosystem

Since generative AI hit us, people from college interns to VPs have been using it because it is so ubiquitous; you don’t need a particular programming language to use it. People have been playing with it and trying things, and the talent has pivoted to actual production. Talent within labs has had to pivot as well. Machine learning engineers are no longer only looking at packaging a model or at operations; they also need to zoom out and build sustainable solutions with generative AI. There is a lot of work in that whole prompt layer, which nobody talks about. Secondly, research scientists who used to work through the data, build models, and retrain them are now thinking about building solutions by fine-tuning the foundational models, or taking it even further to make them work for the domain. So together we are adding to those layers, but the areas of focus have changed: engineers now look at solution development, and researchers look at adapting foundational models to the domain.

Shirsha Ray Chaudhuri, Director of Engineering – TR Labs at Thomson Reuters

What kind of talent are we looking for in this new field?

A year ago, for text analytics use cases, there were a lot of engineers doing NLP, building MLOps, and customizing the SOTA models of the time. Now, the foundational LLMs available are so good that they can be adapted to these use cases with limited fine-tuning. As a result, there is more demand for people with prompt-engineering skills than for engineers with NLP algorithmic skills.
The challenge has been keeping the NLP engineers excited, because no NLP engineer likes being relabelled as a ‘prompt engineer’.

We need a different breed of people who are into prompt engineering and this has created an opportunity. For prompt engineering, we look for people who have specific domain expertise. The domain expert needs to know what kind of generative AI models are working well in that specific domain, and how to write the prompt to pass on relevant information to the model.

The engineering breed of people, meanwhile, is now involved in comparing different LLMs and fine-tuning the models.

Raj Bhatt, CEO at Knowledge Foundry
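To make the idea of domain-grounded prompting concrete, here is a minimal, purely illustrative Python sketch; it is not taken from the roundtable. It assumes the OpenAI Python SDK and an API key in the environment, and the model name, prompt template, and domain snippets are hypothetical placeholders that a domain expert would replace with their own material.

```python
# Illustrative sketch only: a domain expert supplies context that is packed into the prompt.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY in the environment;
# model name, template, and snippets below are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()

def build_prompt(question: str, domain_snippets: list[str]) -> str:
    """Wrap the user question with domain context supplied by a subject-matter expert."""
    context = "\n".join(f"- {s}" for s in domain_snippets)
    return (
        "You are assisting analysts in a regulated pharmaceutical domain.\n"
        "Answer only from the context below; say 'not enough information' otherwise.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

snippets = [
    "Batch release requires QC sign-off on assay and dissolution results.",  # example context
    "Stability studies follow the agreed storage conditions.",
]
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever model is available to you
    messages=[{"role": "user", "content": build_prompt(
        "What must be completed before a batch can be released?", snippets)}],
    temperature=0,
)
print(response.choices[0].message.content)
```

The point of the sketch is the division of labour described above: the domain expert decides what context and constraints go into the prompt, while engineers handle model comparison and fine-tuning behind it.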

Fostering Collaboration: Driving Success through Stakeholder Engagement

Breakthroughs in technology usually benefit the underserved more than the well-served. This applies to talent as well. If a company has already invested heavily in data scientists, the well-served, it will continue to benefit from AI. However, companies that haven’t been able to use data science, the underserved, will benefit greatly from the latest advancements in AI.

In the past, there was a belief that to be a data scientist or a software developer, you needed a wide range of skills and years of experience to create something amazing. But now, with LLMs and CoPilots, you don’t need to spend years mastering a skill. You can do it much faster and much better. The time it takes for a beginner developer to become an expert has significantly decreased.

I believe this will have huge implications for how we look at talent. If you’re a big company, it means you need to carefully consider what skills will be needed: if suddenly anyone can become a good data scientist, what do you hire for? For startups, it means being more efficient with resources and delivering more value, allowing them to compete with large companies. This is a powerful paradigm shift that might change how we look at technology talent in the years to come.

Purnesh Gali, Cofounder and CEO at Actalyst

Talent and Scalability: Unraveling the Implications for Generative AI

There are many stand-alone applications of LLMs and generative AI. However, there is much more potential when generative AI, and LLMs in particular, are used in conjunction with other technology, especially speech-to-text and conversational multi-modal HCI (human-computer interface) frameworks. To build a genuinely useful product, especially conversational tech, we sometimes need almost a small village of a team: UX designers, data engineers, front-end and back-end engineers, DevOps, and data scientists. Not all applications need a large team, of course; you can derive value from this tech stack with a small team for other solutions. Consider the Pareto principle, which applies in most use cases: with, say, 20% of the effort you can get 80% of the total outcome. That is true here too, and with the strength of a good language model, a significant number of NLP problems can be solved without custom effort. For some of these, you may not need teams with specialized skills, especially if you are just integrating with a pre-trained model through an API. However, you might hit a point where such a setup is not sufficient: you might need custom models or an ensemble of solutions, additional NLU/NLP methods, or domain-specific modeling variations, besides other engineering and DevOps capabilities. In those situations, data engineers and AI/ML modelers with significant experience working with data and modeling in their respective problem domains will continue to add value. So I believe the role of AI/ML scientists and engineers will continue, although generative AI has certainly opened up a vast set of opportunities for everyone else.

Narasimha Medeme, VP, Head Data Science at MakeMyTrip
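As a rough illustration of the kind of composition described above, the sketch below chains a speech-to-text step into an LLM call. It is an assumption-laden example, not a description of any panelist's stack: it presumes the OpenAI Python SDK, its Whisper transcription endpoint, and placeholder model names and prompts.

```python
# Illustrative sketch only: pairing speech-to-text with an LLM in a conversational flow.
# Assumes the OpenAI Python SDK; "whisper-1", "gpt-4o-mini", and the system prompt are
# placeholders for whatever components a team actually uses.
from openai import OpenAI

client = OpenAI()

def answer_spoken_question(audio_path: str) -> str:
    """Transcribe a spoken question, then pass the text to an LLM for a reply."""
    with open(audio_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=audio_file
        )
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are a concise travel-booking assistant."},
            {"role": "user", "content": transcript.text},
        ],
    )
    return reply.choices[0].message.content
```

Even a thin pipeline like this touches audio handling, prompting, and response delivery, which is why conversational products tend to need the broader team the quote describes once they move beyond a demo.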

Domain Expertise in Generative AI: Meeting the Growing Talent Demand

Generative AI is new to us, so there is limited understanding of its capabilities at present. Currently, the perception is that it can handle all tasks, including grunt work like annotation for NLP problems, through ChatGPT. Many people are eager to experiment with it because it is as simple as making an API call, without requiring extensive analytics knowledge. However, there are challenges to address. Firstly, it requires supervision due to data security concerns. Once this hurdle is overcome, the next question is: what can generative AI actually achieve, and how should we leverage it? Using it directly for customer handling is not a good idea at present; it should be used in a supervised manner.

So, who should we hire now? Who will determine the right answers and guide us in prompt entry? This is where business teams play a crucial role. Hiring domain consultants with specialized expertise will be essential. However, it will take time for the industry to fully comprehend how to effectively utilize ChatGPT or generative AI. The technology has progressed from generating generic outputs to domain-specific responses. Eventually, we will need business consultants who can provide specific answers within their respective domains, be it employer branding, marketing, or addressing customer queries in various contexts. This transition will occur gradually as we continue to explore and understand the potential of generative AI.

Deepika Kaushal, Head of Everyday AI & Digital Analytics at Piramal Capital & Housing Finance
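One hedged sketch of what "supervised" use might look like in practice is LLM-assisted annotation: the model proposes a label, and a human reviewer confirms or overrides it before anything is stored. The SDK, label set, model name, and review loop below are illustrative assumptions, not the setup discussed by the panel.

```python
# Illustrative sketch only: an LLM drafts annotations, a human makes the final call.
# Assumes the OpenAI Python SDK; the label set and model name are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()
LABELS = ["complaint", "query", "feedback"]  # hypothetical label set

def draft_label(text: str) -> str:
    """Ask the model for a draft label; a human reviewer still decides what is kept."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": f"Classify this message as one of {LABELS}. "
                       f"Reply with the label only.\n\nMessage: {text}",
        }],
        temperature=0,
    )
    return resp.choices[0].message.content.strip()

def supervised_annotation(texts: list[str]) -> list[tuple[str, str]]:
    """Pair each draft label with a human decision before anything is saved."""
    reviewed = []
    for text in texts:
        draft = draft_label(text)
        decision = input(f"{text!r} -> draft label {draft!r}. Accept (Enter) or type override: ") or draft
        reviewed.append((text, decision))
    return reviewed
```

The review step is deliberately the bottleneck here: the model accelerates the grunt work, while accountability for the final label stays with a person, in line with the supervised use the quote argues for.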

The Role of Technology Ethicists in Today’s World

In our society, we have this notion called the “social fabric,” which essentially represents the collective demands and values of the community. It is important to distinguish ethics from personal morality: ethics is about establishing common principles that promote the well-being of humanity and prevent harm. To ensure these shared guidelines are upheld, a governing body becomes essential. Drawing a parallel, just as we recognize the need for responsible control when dealing with nuclear energy, the emergence of generative AI holds similar significance. Instead of ignoring its presence, we should focus on finding ways to harness its potential responsibly and guide it with a clear vision.

Shan Duggatimatad, Data & AI Leader, Sr Director at Ascendion

In conclusion, building a team for generative AI is a multifaceted endeavor that requires a strategic approach. The success of generative AI projects relies on assembling a diverse team with expertise in various domains, including AI research, data science, domain knowledge, and software engineering. Collaboration and effective communication among team members are essential to harness the full potential of generative AI technologies. Additionally, fostering a culture of continuous learning and staying updated with the latest advancements in the field is crucial for building a strong and adaptable team. By investing in talent acquisition, skill development, and collaborative practices, organizations can position themselves at the forefront of generative AI innovation and drive transformative outcomes in their respective industries.

This article is written by a member of the AIM Leaders Council. AIM Leaders Council is an invitation-only forum of senior executives in the Data Science and Analytics industry. To check if you are eligible for a membership, please fill out the form here.
