The transformative impact of AI as a generational technology

Dr. Lisa Su, Chair and CEO of AMD, in a recent interaction with students of the Indian Institute of Science (IISc), Bengaluru, shared her perspectives on AMD's strategy for powering the AI era with end-to-end infrastructure, the role of academia-industry partnerships in advancing AI, opportunities in India, and more.

Foundations of Passion

As a child, I was always fascinated by how things worked. When I was around 10 years old, my younger brother had a remote-controlled car that he loved to play with. One day, the car suddenly stopped working. My curiosity was piqued - what could have gone wrong?

Determined to find out, I unscrewed the car and saw a disconnected wire. After carefully reconnecting it, the toy sprang back to life. That moment was a revelation for me - I realized how fascinating products and engineering could be.

From then on, my interest in math and science grew. That passion ultimately led me to study electrical engineering and semiconductors at MIT in Massachusetts.

Key decisions that made a difference

When I first started working in semiconductors, most of my friends didn't fully understand what the field was about. When I told them that my specialty was semiconductors and that I was joining a semiconductor company, they often asked, "What's a semiconductor?"

What drew me most to semiconductors was the realization that chips are the foundation of so many devices - they are the brains of computers and of countless other technologies that can change the way we work.

I joined AMD, one of the leading companies in the US focused on the most advanced semiconductor technology. My primary focus was driving advancements in high performance computing - designing and building some of the most powerful and efficient semiconductors. That vision is what attracted me to AMD.

Mark Papermaster, AMD's CTO, joined the company only a couple of months before I did. Together, we charted a strategy for the company. As with any business, one of the key decisions was figuring out what to focus on and what we wanted to accomplish as we grew. Our aspiration was to be the leader in high performance computing. To accomplish that, we had to have very strong cores, CPUs, and GPUs, with an overall architecture that could make that bigger vision a reality. Those fundamental choices laid the groundwork for everything we have achieved since.

Future of computing & strategies to drive AMD's leadership

When we talk about AI, it's important to step back and consider the role of high performance computing. Technology has the power to make our lives better and businesses more productive. It also helps us solve some of the most important problems in the world and drives the next big discoveries in science.

Against this backdrop, AI represents the next logical step. While artificial intelligence has been around for a long time, it was mostly the domain of experts. Over the last few years, that has fundamentally changed. The rise of generative AI, large language models, and tools like ChatGPT has transformed AI from a niche technology into something accessible to everyone.

What's striking is the simplicity of its adoption - using natural language to interact with AI has democratized its use. Suddenly, anyone can engage with and benefit from this technology.

When you are able to use natural language to unlock certain capabilities, it suddenly changes who can actually use the technology. We are just at the beginning of what AI will be capable of. I have been in the semiconductor industry for over 30 years, and in my career this is the most impactful, highest-potential technology - one that can make all of us more productive and our discoveries more powerful. It is an important opportunity for us to take computing to the next level and accelerate the power of technology.

Future of specialized and focused machine learning tools

I firmly believe there is no one-size-fits-all solution when it comes to the future of computing. The key is to use the right type of compute for the right application.

For instance, much of the current conversation revolves around large GPUs and accelerators designed for cloud computing to handle training and inference for massive language models. However, we anticipate the need for a variety of model sizes in the future. There will be large, general models as well as specialized models tailored for narrower use cases. These more compact models might be fine-tuned and trained on smaller datasets, requiring different types of silicon to support their unique demands.

At AMD, we view AI as an end-to-end opportunity, with cloud and enterprise at the very high end. We also believe in edge AI - distributed AI that runs closer to where the data is being generated - and in client AI.

Everyone should have their own AI PC that lets them run their own models and work with their own data locally. In this era, we expect a wide variety of devices to run AI, along with a broad range of models powering AI applications.
