Lab-1: Cluster Creation
---------------------------
1. MSK Workshop link: https://amazonmsk-labs.workshop.aws/en/clustercreation.html
2. Download the following CloudFormation template to your laptop:
   https://github.com/vikasbajaj/msk-kafka-workshop/blob/master/msk-infra-and-kafka-clients/MSK-VPC-Clients.yaml
3. Make sure you are running this lab in the ap-southeast-2 (Sydney) region.
4. Go to the EC2 console and create a key pair (click "Key Pairs").
   Name: msk-workshop-pem
   We will use it to log in to the EC2 Kafka instance. (An optional CLI sketch follows this step.)
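   If you prefer the CLI, a minimal sketch that does the same thing (assuming AWS credentials for this account are already configured on your laptop):
       aws ec2 create-key-pair --key-name msk-workshop-pem --region ap-southeast-2 \
           --query 'KeyMaterial' --output text > msk-workshop-pem.pem   # save the private key locally
       chmod 400 msk-workshop-pem.pem                                   # restrict permissions so ssh will accept the key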
5. Go to the CloudFormation console (AWS Console).
6. Click "Create stack" --> "With new resources (standard)".
7. Select "Upload a template file", choose the "MSK-VPC-Clients.yaml" file and click Next.
   Stack name: MSK-VPC-Client-Stack
   Key name: msk-workshop-pem
8. Click Next through the remaining screens; on the last screen, select both checkboxes:
   I acknowledge that AWS CloudFormation might create IAM resources with custom names.
   I acknowledge that AWS CloudFormation might require the following capability: CAPABILITY_AUTO_EXPAND
9. Click "Create stack" (it takes about 4 minutes). An equivalent CLI sketch follows this step.
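   If you prefer the CLI, a sketch of the same stack creation. It assumes the template's key pair parameter is named KeyName; check the Parameters section of MSK-VPC-Clients.yaml for the actual parameter name:
       aws cloudformation create-stack --region ap-southeast-2 \
           --stack-name MSK-VPC-Client-Stack \
           --template-body file://MSK-VPC-Clients.yaml \
           --parameters ParameterKey=KeyName,ParameterValue=msk-workshop-pem \
           --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM CAPABILITY_AUTO_EXPAND
       # block until the stack finishes creating (roughly the 4 minutes mentioned above)
       aws cloudformation wait stack-create-complete --stack-name MSK-VPC-Client-Stack --region ap-southeast-2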
10. What's happening behind the scenes:
    A VPC with 1 public subnet and 3 private subnets, plus the required plumbing including a NAT Gateway.
    A Cloud9 instance you will use as a bastion.
    1 Apache KafkaClientInstance - an EC2 instance with Apache Kafka, AWS CLI v1, AWS CLI v2 (aws2), jq and Docker installed.
    A security group for the Apache Kafka client EC2 instance; the security group for the Amazon MSK cluster created in this lab needs to allow access from it.
    Wait for the CloudFormation stack to complete before continuing.
11. Go to the EC2 console and click "Security Groups" in the left pane.
    Click "Create security group".
       Security group name: MSKWorkshop-KafkaService
       Description: Access to the Kafka service on the MSK cluster
       VPC: select the VPC you are using for your lab (MSKVPC)
    Create the inbound rules below. (An equivalent CLI sketch follows this step.)
    a. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 9092
       Source: MSK-VPC-Client-Stack-KafkaClientInstanceSecurityGroup-xxxxxxxx
       (copy the Physical ID of the resource named KafkaClientInstanceSecurityGroup from the CloudFormation stack (MSK-VPC-Client-Stack) resources, e.g. sg-06c50c298ff117c45)
       Description: Plaintext Kafka
    b. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 9092
       Source: Cloud9 security group (you'll see something like aws-cloud9-msklab...)
       Description: Plaintext Kafka
    c. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 9094
       Source: MSK-VPC-Client-Stack-KafkaClientInstanceSecurityGroup-xxxxxxxx (same security group as in rule a)
       Description: Encrypted Kafka
    d. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 9094
       Source: Cloud9 security group (you'll see something like aws-cloud9-msklab...)
       Description: Encrypted Kafka
    e. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 2181
       Source: MSK-VPC-Client-Stack-KafkaClientInstanceSecurityGroup-xxxxxxxx (same security group as in rule a)
       Description: Zookeeper access
    f. Click "Add rule" and use:
       Type: Custom TCP
       Protocol: TCP
       Port range: 2181
       Source: Cloud9 security group (you'll see something like aws-cloud9-msklab...)
       Description: Zookeeper access
    Click "Create security group".
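    The same group and rules can also be created from the CLI. A sketch, assuming you have already looked up the VPC ID, the new group's ID, the KafkaClientInstanceSecurityGroup ID and the Cloud9 security group ID (the IDs below are placeholders):
        aws ec2 create-security-group --group-name MSKWorkshop-KafkaService \
            --description "Access to the Kafka service on the MSK cluster" --vpc-id vpc-xxxxxxxx
        # allow plaintext Kafka (9092), TLS Kafka (9094) and Zookeeper (2181) from both client security groups
        for PORT in 9092 9094 2181; do
            for SRC in sg-kafkaclient-xxxx sg-cloud9-xxxx; do
                aws ec2 authorize-security-group-ingress --group-id sg-mskservice-xxxx \
                    --protocol tcp --port $PORT --source-group $SRC
            done
        done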
12. Go to the Amazon MSK console, click "Create cluster" and select "Custom create".
    Cluster name: MSKWorkshopCluster
    Kafka version: 2.7.1
    Configuration: select "Custom configuration", then click "Create configuration".
       Configuration name: msk-cluster-configuration
       Make sure you add/update the following configuration properties:
          auto.create.topics.enable=true
          delete.topic.enable=true
          log.retention.hours=8
       Click Create. (A CLI sketch for creating the same configuration follows this step.)
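    If you prefer the CLI, a sketch that creates the same cluster configuration (the property file name is arbitrary):
        printf 'auto.create.topics.enable=true\ndelete.topic.enable=true\nlog.retention.hours=8\n' > msk-cluster-configuration.txt
        aws kafka create-configuration --name msk-cluster-configuration \
            --kafka-versions 2.7.1 \
            --server-properties fileb://msk-cluster-configuration.txt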
13. Refresh the cluster configuration dropdown, select the configuration you created in the previous step, and set Configuration revision = 1.
14. VPC = MSKVPC, Number of zones = 3
    a. Zone = ap-southeast-2a
       Subnet = PrivateSubnetMSKOne
    b. Zone = ap-southeast-2b
       Subnet = PrivateSubnetMSKTwo
    c. Zone = ap-southeast-2c
       Subnet = PrivateSubnetMSKThree
15. Security groups: select "Custom settings".
    Make sure you select only the security group that you created in the previous steps.
    Security group = MSKWorkshop-KafkaService
16. No changes required in the Brokers section.
17. Storage
    Change it to 100 GiB.
18. Access control method = None
19. Encryption
    Tick all the options:
       Enable encryption within the cluster
       TLS encryption
       Plaintext
    Encrypt data at rest: select "Use AWS managed CMK".
20. Monitoring
    Amazon CloudWatch metrics for this cluster: select "Enhanced topic-level monitoring".
    Broker log delivery: select "Deliver to Amazon CloudWatch Logs".
       Click "visit Amazon CloudWatch Logs console" to open CloudWatch Logs in a separate browser tab.
       Click "Create log group".
       Log group name = MSKClusterLogs (a CLI sketch for this follows below)
       Go back to the MSK cluster creation tab, click the "Browse" button and select "MSKClusterLogs".
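    If you prefer to create the log group from the CLI instead of the CloudWatch console, a one-line sketch:
        aws logs create-log-group --log-group-name MSKClusterLogs --region ap-southeast-2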
21. Finally, click "Create cluster".
22. Cluster creation takes around 15 minutes. (A sketch for polling cluster status from the CLI follows below.)
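    While you wait, you can poll the cluster state from any shell that has the AWS CLI configured; the cluster is ready once the state changes from CREATING to ACTIVE. A sketch (replace the ARN with the one shown on the MSK console):
        aws kafka describe-cluster --cluster-arn <your-cluster-arn> \
            --query 'ClusterInfo.State' --output text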
23. Go to the Cloud9 console; you should see a Cloud9 environment (e.g. MSK-VPC-Client-Stack-Cloud9EC2Bastion). Click "Open IDE".
24. Click File ===> Upload Local Files, and select the EC2 pem file that we created in step 4.
25. In the Cloud9 terminal, run the following command:
    chmod 600 msk-workshop-pem.pem
26. We'll log in to the Kafka EC2 instance. In the Cloud9 terminal, run the following command:
    ssh -i msk-workshop-pem.pem ec2-user@10.0.1.124
    note: 10.0.1.124 is the private IP of the Kafka EC2 instance; change it to your EC2 instance's private IP
---------Wait for the cluster to be up and running before you execute the next steps-------------
----------You need to run the remaining steps in the Kafka EC2 instance terminal/shell---------------
27. Set the region by running the following command (a non-interactive alternative follows this step):
    aws configure
       Default region name: ap-southeast-2
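    If the instance already has an IAM role attached, the other prompts can be left blank. A one-line, non-interactive equivalent for setting just the region:
        aws configure set region ap-southeast-2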
28. Set the required environment variables by running the following commands in the EC2 Kafka instance shell.
    note: replace the cluster ARN below with your MSK cluster ARN (go to the MSK console and copy it)
    export CLUSTER_ARN=arn:aws:kafka:ap-southeast-2:xxxxxxxxxxxxxxxxx:cluster/MSKWorkshopCluster/69314637-895b-44f9-8ee6-3450661994e9-2
    export BS=$(aws kafka get-bootstrap-brokers --cluster-arn $CLUSTER_ARN --output text)
    export MYZK=$(aws kafka describe-cluster --cluster-arn $CLUSTER_ARN --output json | jq ".ClusterInfo.ZookeeperConnectString" | tr -d \")
29. Verify the MSK broker addresses (bootstrap servers); a sketch for selecting a single listener follows this step:
    echo $BS
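    get-bootstrap-brokers returns both a plaintext and a TLS connection string, and $BS above captures the whole text output. If you want just one listener, a sketch using --query (field names per the MSK GetBootstrapBrokers API):
        export BS_PLAINTEXT=$(aws kafka get-bootstrap-brokers --cluster-arn $CLUSTER_ARN \
            --query 'BootstrapBrokerString' --output text)      # port 9092 listeners
        export BS_TLS=$(aws kafka get-bootstrap-brokers --cluster-arn $CLUSTER_ARN \
            --query 'BootstrapBrokerStringTls' --output text)   # port 9094 listeners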
30. Run the following to verify the cluster setup:
    cd kafka
    ./bin/kafka-topics.sh --bootstrap-server $BS --list
    You should see a list of topics.
31. Let's run a console producer and a console consumer.
32. In the working terminal (Kafka EC2 instance):
    ./bin/kafka-topics.sh --bootstrap-server $BS --create --topic test_topic --partitions 3 --replication-factor 3
    ./bin/kafka-topics.sh --bootstrap-server $BS --list
    ./bin/kafka-topics.sh --bootstrap-server $BS --describe --topic test_topic
33. Create a producer:
    ./bin/kafka-console-producer.sh --bootstrap-server $BS --topic test_topic
    Send a few messages.
34. Open a new terminal (it will be a Cloud9 terminal).
    SSH into the Kafka EC2 instance:
    ssh -i msk-workshop-pem.pem ec2-user@10.0.1.124
    Set the environment variables (an optional sketch for persisting them follows this step):
    export CLUSTER_ARN=arn:aws:kafka:ap-southeast-2:xxxxxxxxxxxxxx:cluster/MSKWorkshopCluster/69314637-895b-44f9-8ee6-3450661994e9-2
    export BS=$(aws kafka get-bootstrap-brokers --cluster-arn $CLUSTER_ARN --output text)
    export MYZK=$(aws kafka describe-cluster --cluster-arn $CLUSTER_ARN --output json | jq ".ClusterInfo.ZookeeperConnectString" | tr -d \")
    echo $BS
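    Every new terminal starts with a clean environment, so you may find it convenient to persist these variables once instead of re-exporting them in each session. An optional sketch:
        echo "export CLUSTER_ARN=$CLUSTER_ARN" >> ~/.bashrc
        echo "export BS=$BS" >> ~/.bashrc
        echo "export MYZK=$MYZK" >> ~/.bashrc
        source ~/.bashrc    # new SSH sessions will pick these up automatically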
35. Create a consumer:
    cd kafka
    ./bin/kafka-console-consumer.sh --bootstrap-server $BS --topic test_topic --from-beginning
    You should see the messages that the producer produced in the previous step. (An optional TLS sketch follows this step.)
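    The cluster was created with both plaintext (9092) and TLS (9094) listeners, and the steps above use the plaintext one. If you also want to exercise the encrypted listener, a sketch - it assumes BS_TLS is set as in the sketch after step 29, and the JDK truststore location varies by instance:
        # locate and copy the JDK's default truststore (MSK broker certificates chain to Amazon Trust Services)
        JVM_CACERTS=$(find /usr/lib/jvm -name cacerts | head -n 1)
        cp "$JVM_CACERTS" /tmp/kafka.client.truststore.jks
        printf 'security.protocol=SSL\nssl.truststore.location=/tmp/kafka.client.truststore.jks\n' > /tmp/client-ssl.properties
        ./bin/kafka-console-producer.sh --bootstrap-server $BS_TLS --topic test_topic \
            --producer.config /tmp/client-ssl.properties
        ./bin/kafka-console-consumer.sh --bootstrap-server $BS_TLS --topic test_topic --from-beginning \
            --consumer.config /tmp/client-ssl.properties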
Your MSK cluster is up and running