(AP Illustration/Peter Hamlin)
Adam Selipsky is leading Amazon’s cloud division during one of the most important moments in technology history.
Mr. Selipsky, CEO of the company’s cloud-computing arm AWS, said Amazon has been rolling out a variety of generative AI products over the past few months as it competes with the likes of Microsoft in the growing AI arms race.
AWS is the cloud market leader and a highly profitable business for Amazon. But some of that growth has slowed in the past few quarters as economy-wide challenges have caused customers to cut back on spending.
At the same time, the business has been at the forefront of Amazon’s efforts in generative AI, which exploded into national consciousness last year with the release of OpenAI’s popular chatbot ChatGPT. In a talk at the company’s AWS conference in Las Vegas in late November, Selipsky unveiled Amazon’s answer to ChatGPT: an enterprise AI assistant called Q.
The Associated Press recently spoke with Selipsky about how companies are spending on cloud services, Amazon’s investment in artificial intelligence startup Anthropic, and the future of generative AI. The conversation has been edited for length and clarity.
Q: Businesses are reducing cloud spending this year. Is it still happening?
A: Over the past few quarters, many of our customers have been focused on cost optimization. From the beginning, we’ve said that AWS and the cloud are the place to do it. We’ve seen many customers achieve great results through cost optimization, and there are other customers who are still working through it. We’re further along, but we’re not done yet.
In addition, many customers are investing right now. The companies that will thrive are those that invest during uncertain economic conditions while others hesitate to invest at all. We work with many customers who are doing just that. We’re also seeing a lot of interest in our generative AI services.
Q: What is your vision for generative AI?
A: We actually think about three different layers of the generative AI stack.
At the bottom of the stack is the infrastructure required to run generative AI. We have a very large Nvidia GPU-based business and have designed and delivered our own custom designed chips, including Trainium and Inferentia chips.
Most of our enterprise customers don’t intend to build models. Most of them want to use models built by others. This is the middle layer of the stack. Through a service called Amazon Bedrock, we offer our customers multiple foundation models, including models from Anthropic, Meta, and Amazon itself. The idea that one company will supply all the models in the world is unrealistic. We’ve found that our customers need to experiment, and we provide that ability.
The top layer of the stack is applications built using generative AI. That’s where our coding companion for developers sits.
Q: Speaking of models, there have been reports that Amazon is building a large-scale language model called Olympus. Is that something we should expect to see soon?
A: Amazon’s first-party models are already available under the Titan brand, and you should definitely expect multiple versions to come. It goes back to the idea that there is no single model that rules everything. You need multiple models for different use cases. And I expect them to be collectively very capable and powerful.
Q: Can you talk about Amazon’s investment in artificial intelligence startup Anthropic? There are also reports that Google, which also supports Anthropic, is increasing its investment. Some say this is becoming some kind of proxy war between Amazon and Google. Does it look like that?
A: No, it’s not. We have a very close relationship with Anthropic, which is very beneficial for both companies. Anthropic has selected Amazon as its primary cloud provider for mission-critical workloads. The majority of Anthropic’s workloads run on AWS. Period.
Now, this is not an exclusive relationship. Anthropic has very large computing needs, and it has clearly been using other cloud providers for a long time.
Anthropic has been using AWS since its founding in 2021. We have worked together and deepened our relationship. Anthropic is training larger and more capable models, and they saw an opportunity to get from AWS the tremendous amount of computing power they need to train those models. For Anthropic, it is also very important to be within Amazon Bedrock, a trusted service for accessing foundation models.
We know that the people at Anthropic are very smart and world experts at what they do. By collaborating on training and running Anthropic’s models on our chips, they’ll help us improve that technology. So this is a truly mutually beneficial relationship that I think will benefit both companies and, most importantly, our mutual customers for years to come.
Q: How is Amazon thinking about safeguards when building this technology?
A: Responsible AI is very important, and Amazon takes it very seriously. We have published a number of principles for responsible AI. We’ve done things like create service cards that explain what a model should and shouldn’t be used for and how it was trained. We are working to provide more transparency about how some of these AI services are built and what they are used for.
We believe responsible AI requires a multifaceted solution. It takes collaboration among cloud industry leaders like AWS, model makers like Anthropic, government, academia, and more. That’s why we have actively participated in responsible AI forums at the White House and in the U.K.
Q: What do you think the AI race will be like next year?
A: I think we’ll see very rapid evolution and change. And that partially reflects the fact that the evolution of generative AI is still in its infancy. That’s why I think adaptability and flexibility are actually very important benefits for customers. To achieve business goals and satisfy customers, we need to be highly flexible, nimble, and adaptable as we evolve how we use generative AI.