@awsdevelopers
AWS Developers | Migrating a Kafka Workload from On-Premises to AWS with Amazon Q @awsdevelopers | Uploaded 3 months ago
In this demo, Ricardo Ferreira, Developer Advocate at AWS, shows how he used Amazon Q to migrate a Kafka connector developed locally to AWS. Besides solving build problems with Amazon Q, he also uses the AI coding companion to generate the Terraform code required to host the connector.
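The Terraform the video generates isn't reproduced in this description, but a minimal sketch of hosting a custom connector on Amazon MSK Connect (one plausible target; the resource names, version numbers, and referenced resources such as aws_s3_bucket.plugins and aws_msk_cluster.kafka are illustrative, not taken from the demo) might look like:

```hcl
# Hypothetical sketch: hosting a custom Kafka connector on Amazon MSK Connect.
# All names, sizing, and referenced resources are illustrative.

resource "aws_mskconnect_custom_plugin" "connector_plugin" {
  name         = "my-connector-plugin" # hypothetical name
  content_type = "JAR"
  location {
    s3 {
      bucket_arn = aws_s3_bucket.plugins.arn # bucket holding the Maven-built JAR
      file_key   = "my-connector.jar"
    }
  }
}

resource "aws_mskconnect_connector" "connector" {
  name                 = "my-connector"
  kafkaconnect_version = "2.7.1"

  capacity {
    provisioned_capacity {
      mcu_count    = 1
      worker_count = 1
    }
  }

  connector_configuration = {
    "connector.class" = "com.example.MyConnector" # placeholder connector class
    "tasks.max"       = "1"
  }

  kafka_cluster {
    apache_kafka_cluster {
      bootstrap_servers = aws_msk_cluster.kafka.bootstrap_brokers_tls
      vpc {
        security_groups = [aws_security_group.connect.id]
        subnets         = aws_subnet.private[*].id
      }
    }
  }

  kafka_cluster_client_authentication {
    authentication_type = "NONE"
  }

  kafka_cluster_encryption_in_transit {
    encryption_type = "TLS"
  }

  plugin {
    custom_plugin {
      arn      = aws_mskconnect_custom_plugin.connector_plugin.arn
      revision = aws_mskconnect_custom_plugin.connector_plugin.latest_revision
    }
  }

  # IAM role the connector assumes — the video's Amazon Q walkthrough
  # covers explaining policies like the ones this role would carry.
  service_execution_role_arn = aws_iam_role.connect.arn
}
```

The split between a custom-plugin resource (the uploaded JAR) and a connector resource (runtime config, capacity, cluster wiring) mirrors how MSK Connect itself separates the artifact from the deployment.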

Resources:
๐ŸŒ Check out the docs to get started with Amazon Q ๐Ÿ‘‰ docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/getting-started-q-dev.html

Follow AWS Developers!
๐Ÿฆ Twitter: twitter.com/awsdevelopers
๐Ÿ’ผ LinkedIn: linkedin.com/showcase/aws-developers
๐Ÿ‘พ Twitch: twitch.tv/aws
๐Ÿ“บ Instagram: instagram.com/awsdevelopers/?hl=en

Chapters:
00:00 - Introduction
00:21 - Building the connector code with Maven
01:08 - Fixing pending deployment code with Amazon Q
02:54 - Using Amazon Q to explain the IAM policies
03:46 - Outro

#apachekafka #amazonq #terraform

