1. Find inspiration for a presentation
Making a killer pitch deck is an art form, and one that usually takes hours. Rather than starting from scratch each time, global marketing and advertising company Media.Monks is using generative AI to make sure great ideas don’t go to waste. An early adopter of the technology, this AWS customer and partner has built an in-house, enterprise-ready AI solution to automate its workflows—both internally and on behalf of its clients.
The platform, Monks.Flow, connects with Amazon Bedrock and offers intelligent solutions for clients’ marketing activities. Designed to work across existing tech stacks, it can, depending on a client’s needs, perform tasks such as document reviews and extract valuable data and insights from a company’s existing systems. For example, say the Media.Monks team is preparing a presentation for a prospective client. Monks.Flow uses different LLMs, accessed via Amazon Bedrock, to search through the vast number of decks the company has produced over the years. Not only can it quickly find information applicable to whatever new business the team is pitching for today, it can also rank it in order of relevance, providing instant access to a wealth of knowledge and turbocharging the team’s imagination.
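To make the pattern concrete, here is a minimal sketch of one way such a ranked deck search could be wired up on AWS, using an Amazon Bedrock knowledge base built over a deck archive. The knowledge base ID, the query, and the approach itself are illustrative assumptions, not a description of how Monks.Flow works internally.

```python
"""Sketch: query a Bedrock knowledge base built over past pitch decks.

Assumes the decks have already been ingested into a knowledge base; the ID and
query are placeholders, and this is one possible approach, not Monks.Flow's.
"""
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.retrieve(
    knowledgeBaseId="KB1234567890",  # placeholder knowledge base ID
    retrievalQuery={"text": "campaign ideas we've pitched to direct-to-consumer banks"},
    retrievalConfiguration={"vectorSearchConfiguration": {"numberOfResults": 5}},
)

# Results come back already scored, so they can be shown in order of relevance.
for result in response["retrievalResults"]:
    source = result["location"].get("s3Location", {}).get("uri", "unknown source")
    print(f"{result['score']:.3f}  {source}")
    print(f"       {result['content']['text'][:120]}...")
```

Because retrieval results are returned with similarity scores, the relevance ranking falls out of the same call that finds the material.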
2. Get insights from customer reviews in any language
If you’re an Amazon seller, or any other business that sells products online, your customer reviews hold a treasure trove of information. But if you sell globally and receive hundreds, even thousands, of reviews in a range of languages you don’t speak, it can be pretty tricky to glean insights from the data. Step forward Cohere Embed, a text representation language model from enterprise AI company Cohere.
Available on Amazon Bedrock, it can work across more than 100 languages. Cohere’s chief product officer, Jaron Waldman, put it this way: “Imagine you had a bunch of product reviews on Amazon, in 100 different languages. You want to analyze those reviews, and so you could ask a question like, ‘Show me all the reviews that are related to the delivery of the product,’ to see if there was a delivery problem or if the delivery was going well. You can do all that with Cohere Embed.”
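Waldman’s example maps naturally onto a small embedding-and-similarity sketch. The reviews below are invented, and the model ID and request fields follow the Cohere Embed v3 format on Amazon Bedrock at the time of writing; treat it as an illustration rather than production code.

```python
"""Sketch: surface delivery-related reviews across languages with Cohere Embed."""
import json
import math

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
MODEL_ID = "cohere.embed-multilingual-v3"  # multilingual Cohere Embed model on Bedrock

def embed(texts: list[str], input_type: str) -> list[list[float]]:
    """Embed a batch of texts; input_type distinguishes documents from queries."""
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        body=json.dumps({"texts": texts, "input_type": input_type}),
    )
    return json.loads(response["body"].read())["embeddings"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Made-up reviews in several languages.
reviews = [
    "The package arrived three days late and the box was crushed.",
    "La entrega fue rapidísima, llegó un día antes de lo previsto.",
    "El producto es bonito pero el color no coincide con la foto.",
    "Die Lieferung kam pünktlich, alles bestens verpackt.",
]

query = "Show me all the reviews that are related to the delivery of the product"

review_vecs = embed(reviews, input_type="search_document")
query_vec = embed([query], input_type="search_query")[0]

# Reviews most similar to the query are about delivery, regardless of language.
scored = sorted(zip((cosine(query_vec, v) for v in review_vecs), reviews), reverse=True)
for score, review in scored:
    print(f"{score:.2f}  {review}")
```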
3. Tackle employee turnover with better schedules
It can be hard for business owners to organize staffing schedules that suit the needs of both their customers and employees. AWS customer AlayaCare offers software services to companies that provide home-based healthcare. The Montreal-based startup is using generative AI to tackle one of the industry’s biggest headaches: employee turnover. AlayaCare’s research shows that the biggest reason home-care services employees—hardworking caregivers usually paid by the hour—leave their jobs is dissatisfaction with scheduling. “A lot of times, it's simply the difference between the hours they want and the hours they get,” said Naomi Goldapple, AlayaCare senior VP of data and intelligence. Her team has been prototyping on Amazon Bedrock to help schedulers quickly see the home-visit slots they need to fill, and provide them with the best employee matches based on specific criteria that satisfy caregivers and patients alike.
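A prototype along those lines might hand the open slot and the candidate caregivers to a Bedrock-hosted model and ask for a ranked, explained match list. The model ID, the data, and the prompt below are all illustrative assumptions, not AlayaCare’s implementation.

```python
"""Sketch: ask a Bedrock-hosted model to rank caregivers for an open visit slot."""
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Invented example data: one open visit and two candidate caregivers.
slot = {
    "patient": "Patient A",
    "time": "Tuesday 09:00-11:00",
    "needs": ["wound care", "French-speaking", "within 15 km of Verdun"],
}
caregivers = [
    {"name": "M. Tremblay", "skills": ["wound care", "palliative"], "languages": ["French"],
     "preferred_hours": "mornings", "home_base": "Verdun"},
    {"name": "S. Patel", "skills": ["medication management"], "languages": ["English"],
     "preferred_hours": "evenings", "home_base": "Laval"},
]

prompt = (
    "You are helping a home-care scheduler fill an open visit.\n"
    f"Visit slot: {json.dumps(slot)}\n"
    f"Available caregivers: {json.dumps(caregivers)}\n"
    "Rank the caregivers for this slot. Favor matches that also respect each "
    "caregiver's preferred hours, and explain every ranking in one sentence."
)

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example Bedrock model ID
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 500, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```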
AlayaCare is also using generative AI to gather and then summarize what Goldapple affectionately calls “note droppings”: notes made by different caregivers and nurses who go into a patient’s home to provide care. One nurse might make a note in one part of the AlayaCare platform about a particular patient complaining about increased lower back pain. Then another nurse, visiting the same patient the following day, might make a note in a different part of the platform about that patient experiencing dizziness and shortness of breath. LLMs can read all of these notes left in different places, pull together the most important information (such as continued mentions of pain), and summarize it in a way that allows a clinician to spot trends and intervene earlier, to avoid having the patient hospitalized.
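The note-summarization idea can be sketched the same way: gather the scattered notes and ask a Bedrock-hosted model for a clinician-facing summary that flags recurring or worsening symptoms. Again, the notes, model ID, and prompt are invented for illustration; in practice the notes would be pulled from wherever they live in the care platform.

```python
"""Sketch: summarize scattered caregiver notes so a clinician can spot trends."""
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Notes left by different caregivers, in different parts of the platform, on different days.
notes = [
    "Mon, visit note: patient mentioned lower back pain is worse than last week.",
    "Tue, medication log comment: patient seemed dizzy getting up from the chair.",
    "Tue, shift report: shortness of breath climbing the front steps, resolved after rest.",
]

prompt = (
    "You are summarizing home-care notes for a clinician.\n"
    "Notes:\n- " + "\n- ".join(notes) + "\n"
    "Summarize the patient's status, flag any symptoms that recur or are worsening, "
    "and suggest what the clinician may want to follow up on."
)

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example Bedrock model ID
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 400, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```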
4. Get advice from your digital twin
If you need advice or want to brainstorm, but no one’s around, why not ask your digital doppelgänger? Media.Monks VP and global head of engineering Iran Reyes has built his own digital persona in the company’s AI-centric, professionally managed service Monks.Flow. “I have a digital twin that knows my character traits, my personal and professional background, the way I speak, my tone of voice, and so on. For this interview, I asked it what I should keep in mind. It recommended I slow down, speak clearly, and focus on describing the specific services we’ve built. It was great advice.” Reyes predicts a big rise in AI companions: “We’re going to be using these ‘personas in our pockets’—with our phones, or other devices—to help us make better decisions and take actions in all kinds of situations.”
Find out how to transform your business with generative AI on AWS.
Learn more about the latest additions to Amazon Bedrock.
Amazon Bedrock represents the middle layer of the three-layer “generative AI stack.” Learn what this means, and why Amazon is investing deeply across all of it.