Beyond Amazon HealthScribe: how AWS is taking wing in generative AI


How AWS is dipping its toes into generative AI, and what stiff competition in Artificial Intelligence (AI) means for businesses.

The generative AI gold rush is on, and Amazon Web Services (AWS) has emerged as a dark horse vying for the lead against its high-profile competitors. One of the first signs came in July 2023, when the cloud service provider unveiled AWS HealthScribe, an AI-powered service that automates clinical documentation for healthcare providers. A couple of months later, AWS announced a $1.25 billion investment in Anthropic, one of the leading startups developing conversational AI. These moves signal that AWS is staking a claim in next-generation AI through major investments and product launches, despite the company’s economic slowdown. In this article, we take a closer look at AWS’s generative AI strategy and explore how businesses of all sizes can take advantage of democratized access to powerful AI tools.

AWS and new partnerships

AWS and Hugging Face

While AWS has been steadily developing its own generative AI capabilities, this work has also produced numerous partnerships. In February 2023, just a couple of months after OpenAI’s viral release of ChatGPT, AWS formed a partnership with Hugging Face. The team behind this open-source AI model repository is committed to making the technology accessible and transparent, and plans to rely on AWS tools such as Amazon SageMaker, AWS Trainium, and AWS Inferentia chips to pursue that goal. Through this collaboration, Hugging Face gains more cloud-based opportunities to build, train, and deploy advanced generative AI models.
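To make this concrete, here is a minimal sketch of what deploying a Hugging Face Hub model on SageMaker involves in practice: the SageMaker Hugging Face inference containers read a small environment configuration (`HF_MODEL_ID` and `HF_TASK`) to pull a model straight from the Hub. The helper function below is our own illustration, not AWS-published code; the actual deployment call goes through the `sagemaker` SDK, which we omit to keep the sketch self-contained.

```python
import json

def hub_deployment_config(model_id: str, task: str) -> dict:
    """Build the environment config that SageMaker's Hugging Face
    inference containers read to fetch a model from the Hub."""
    return {
        "HF_MODEL_ID": model_id,  # any public model on huggingface.co
        "HF_TASK": task,          # pipeline task, e.g. "text-generation"
    }

config = hub_deployment_config("gpt2", "text-generation")
print(json.dumps(config))
```

In the `sagemaker` SDK, a dictionary like this is passed as the `env` of a Hugging Face model object before calling `deploy()`; the point is that model selection is reduced to two strings rather than a custom training and packaging pipeline.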

AWS and Anthropic

Another turning point was the $1.25 billion investment in Anthropic, which could grow to $4 billion in the future. In an official statement, the startup declared that the partnership’s purpose was to empower Anthropic to ‘develop the most reliable and high-performing foundation models in the industry.’ Having joined forces with the largest cloud provider, the company will also gain access to AWS chips such as AWS Trainium and AWS Inferentia, which means it will be able to benefit from one of the largest compute infrastructures in the world. This massive injection of funds and computing power positions Anthropic to experiment with new approaches to the technology.


The driving force behind partnerships

AWS’s strategic moves come amid broader positioning by major cloud providers to take the lead in generative AI. Microsoft and Google have also invested heavily in next-generation models and talent with an eye toward advancing their cloud platforms. As this technology becomes integral to business, cloud leadership and generative AI supremacy are converging as dual goals. AWS is leveraging its vast resources to shape the technology’s direction through calculated collaboration and exclusion. Supporting open source answers the growing calls for fairness and accountability, but retaining proprietary access concentrates any advances in AWS’s hands.

While competitive forces will drive generative AI forward quickly, balance is required. If dominant players wall off essential resources or saturate talent pools, they risk stifling the broader ecosystem. Likewise, failing to commercialize research could prevent real-world benefits from materializing. As AWS asserts its ambitions in this space, balancing openness that spurs collective growth against competitive edges will be key. If technology leaders recognize that advancing AI requires both rivalry and cooperation, the technology can fruitfully transform businesses.

AWS and NVIDIA

Although competitive forces exist, AWS and NVIDIA also recognize the value of strategic collaboration in generative AI. The two companies possess complementary strengths that can accelerate innovation in large language models (LLMs) and generative applications. AWS contributes its broad cloud infrastructure and services like Amazon SageMaker to manage the model-building process, while NVIDIA provides the specialized GPUs and AI accelerators that supply the processing power needed to train complex generative models. Through this partnership, AWS can tap into NVIDIA’s state-of-the-art hardware optimized for AI workloads, while NVIDIA benefits by reaching more developers through AWS’s cloud platform and tools.

Notably, both AWS and NVIDIA are investing in next-generation chips tailored for generative AI. In late 2020, AWS announced the AWS Trainium chip and launched it in 2021 as a custom AI training chip designed to provide cost- and performance-optimized Machine Learning (ML) training in the cloud. The first-generation AWS Trainium chips are currently available to AWS customers through Amazon EC2 instances. Meanwhile, NVIDIA has recently announced the Grace Hopper superchip, whose architecture includes key optimizations for generative tasks. Ongoing innovation in custom silicon from both companies will drive advances in the scale, speed, and efficiency of generative model development.

At the same time, intense competition motivates both AWS and NVIDIA to excel in generative AI. Each invests deeply in next-generation AI infrastructure, customized chips, and advanced architectures tuned for capabilities like natural language generation. Both rightfully view leadership in this domain as crucial to their future, which sparks major R&D investments and drives rapid iteration of new products and services. AWS and NVIDIA strive to offer the most powerful platforms for building, training, and deploying generative models. Yet despite the competition, each seems to recognize that well-targeted collaboration avoids redundant effort and lets both companies focus investment where they respectively excel.

What this competition means for generative AI deployment

We are still awaiting the mechanisms that will define the role and form of generative model deployment, but the potential of these models to transform businesses is clear. A recent MIT Technology Review survey underlines this promise: 96% of 1,000 respondents believe their organization will adopt AI at some point. However, only 9% have currently rolled out AI in any form, and just 13% of companies with less than $500 million in annual revenue have implemented at least one use case of generative AI in their operations (see Fig. 1). This gap highlights the need for greater understanding of the technology before adoption reaches its full scope.

Figure 1. Businesses are only starting to experiment with the potential of generative AI technologies, as this MIT Technology Review survey suggests.

Companies want to harness generative models, yet they remain unsure of specific use cases and strategies. Determining where AI can augment human capabilities, rather than just automate rote work, will maximize its value. Thoughtful integration and oversight are necessary to mitigate harmful biases baked into AI systems. What’s more, a focus on honing productivity and creativity, rather than full automation, will allow businesses to fully extract AI’s benefits. As leaders contemplate adoption, blending emerging best practices with a tailored approach is key. Although the mechanisms enabling wide deployment are still evolving, generative AI’s breakthroughs make its ascendance feel inevitable.

As we mentioned before, balancing competition and collaboration will be an integral part of that process. As technology behemoths like AWS strive to tap into generative AI’s potential, their competitive moves can accelerate innovation across industries. This tech arms race spreads advances that smaller firms can then access, whether via cloud services or by hiring from the expanded pool of AI experts. According to the MIT Technology Review survey, in their quest for generative AI adoption, 43% of companies plan to partner with a small provider, while 32% are going to opt for a partnership with a major tech company. The chart below illustrates these intentions:

Figure 2. Companies will choose from a range of tech providers for generative AI adoption purposes, according to the MIT Technology Review survey.

Democratization of AI will likely increase the chances of meaningful business transformation. As leading providers like AWS make generative models accessible via easy-to-use APIs and cloud services, adoption can scale rapidly. Companies that once struggled with the complexities of developing AI in-house can now tap into these capabilities. Strategic guidance on applications, data practices, and model selection is key to ensuring that the technology delivers productivity gains without compromises in ethics or oversight. With the right partner curating and building around democratized tools, businesses can deploy AI thoughtfully to gain an edge, and combining cutting-edge technologies with prudent strategy can lead the way forward.
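As a small illustration of what these easy-to-use APIs look like in practice, the sketch below builds the JSON request body that Anthropic models on Amazon Bedrock accept. The helper function, prompt, and model ID in the comment are our own illustrative choices, not AWS-published code; the body shape (an `anthropic_version` tag plus a `messages` list) follows Bedrock’s documented format for Anthropic models.

```python
import json

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Serialize a chat-style request body in the shape Anthropic's
    models on Amazon Bedrock expect."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_request("Summarize the key risks of deploying generative AI.")

# With AWS credentials configured, invoking the model is a single SDK call
# (the model ID here is illustrative):
# import boto3
# bedrock = boto3.client("bedrock-runtime")
# response = bedrock.invoke_model(modelId="anthropic.claude-v2", body=body)
```

The point of the sketch is the distance it collapses: access to a frontier-scale model is a JSON payload and one SDK call, rather than an in-house training effort.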

How to take the first steps in generative AI

For companies new to generative AI, experimentation and partnerships can pave the way to adoption. Here are three recommendations for entering or expanding your business opportunities in generative AI:

  • Move steadily

Don’t expect an overnight transformation. Build a strategy that matches your goals, data, and use cases. Run controlled pilots to evaluate generative AI’s value and risks before a broader deployment. Move steadily to integrate AI where it clearly assists human team members. Let successes guide expansion.

  • Build reliable partnerships

Work with trusted technology companies or AI specialists to benefit from their expertise. Partners can help curate the best generative APIs, customize solutions, and oversee responsible integration. Collaborate to handle data preparation, model training, explainability, and oversight.

  • Monitor new regulations and risk management strategies

Keep a close eye on the legal landscape and ethical concerns around generative AI. Data practices, bias mitigation, and transparency will require special attention. Review formal guidance, like the EU AI Act, to shape local best practices. Seek explanations of models’ behaviors and encourage knowledge-sharing within your teams. Here is a snapshot of how the number of AI-related bills has grown since 2016:

Figure 3. The number of bills related to AI has been growing globally, as MIT Technology Review underscores.

The key to generative AI adoption lies in the ability to strike the right balance between excitement and pragmatism, and to move ambitiously but thoughtfully. Build internal skills and external partnerships as a foundation. A growth mindset, governance, and patience will lead to AI that enhances workflows without disruption.

Final thoughts

The steady advance of generative AI continues, with technology powerhouses like AWS searching for ways to deploy it efficiently and meaningfully. Crucially, no single company owns all of generative AI’s progress. AWS’s partnerships show that collaboration and open research remain integral to responsible advancement. As more stakeholders contribute innovations, the field keeps advancing, and there is still significant room for experimentation by companies of all sizes. A diversity of perspectives and priorities will enrich this technology as it evolves, and the path forward will be shaped by many hands.

The possibilities of scaling up with generative AI are tantalizingly close. Grab them easily with Avenga: contact us.
