AWS EKS not able to connect to RDS instance in the same VPC

I have launched an EKS cluster version 1.21 using this Terraform module. The cluster is up and running, no issues there. I have launched my app in the cluster, but it is not able to connect to the RDS Postgres instance.

The RDS instance and the cluster nodes are in the same VPC. The RDS instance, the EKS control plane, and the worker nodes are all in public subnets (the default ones already created by AWS). The RDS instance only allows private access, and I have allowed inbound traffic on port 5432 from anywhere in the RDS security group.
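For reference, this is roughly how I verify that rule; a minimal sketch assuming the AWS CLI is configured (the security group ID is a placeholder):

# Inspect the inbound rules on the RDS security group (sg-xxxxxxxx is a placeholder)
aws ec2 describe-security-groups \
    --group-ids sg-xxxxxxxx \
    --query 'SecurityGroups[0].IpPermissions'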

I have also tried creating a Service of type ExternalName to access the database, but even that does not connect.
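A rough sketch of what I mean, assuming kubectl access to the cluster (the service name is a placeholder); this only creates a DNS alias inside the cluster and does not change network reachability:

# Create an ExternalName Service that aliases the RDS endpoint (service name is a placeholder)
kubectl create service externalname demo-db \
    --external-name demo-db-1.xxx.ap-south-1.rds.amazonaws.com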

This is from inside the pod:

# traceroute demo-db-1.xxx.ap-south-1.rds.amazonaws.com
traceroute to demo-db-1.xxx.ap-south-1.rds.amazonaws.com (172.31.33.117), 30 hops max, 46 byte packets
 1  ip-172-31-27-98.ap-south-1.compute.internal (172.31.27.98)  0.004 ms  0.004 ms  0.004 ms
 2^C
#

It is able to resolve the IP of the instance but not able to connect. When I try connecting using psql, it just hangs and nothing happens.
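For completeness, this is how I check raw TCP reachability from inside the pod; a sketch assuming netcat is available in the image and using placeholder database credentials:

# Test TCP reachability of the Postgres port from inside the pod
nc -zv -w 5 demo-db-1.xxx.ap-south-1.rds.amazonaws.com 5432

# Or bound the psql attempt with a connect timeout instead of letting it hang
# (dbname and user are placeholders)
psql "host=demo-db-1.xxx.ap-south-1.rds.amazonaws.com port=5432 dbname=postgres user=postgres connect_timeout=5"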

Any idea what is going on here? How can I debug this?

EDIT 1: The RDS instance is in a different availability zone than the worker nodes. However, I did launch an RDS instance in the same availability zone and faced the same issue.

EDIT 2: The RDS instance is in a public subnet, but the public endpoint is not enabled; it can only be accessed using the private endpoint. I launched a separate EC2 instance in the same VPC and was able to access the RDS instance from there using the private endpoint.
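Since a standalone EC2 instance can reach the private endpoint but the pods cannot, one thing I am comparing is the security group attached to the test EC2 instance versus the one attached to the worker nodes. A sketch of the rule I would expect to need, assuming the AWS CLI and placeholder group IDs (sg-rds-xxxx for the RDS security group, sg-nodes-xxxx for the worker-node security group):

# Allow Postgres traffic from the worker-node security group into the RDS security group
# (sg-rds-xxxx and sg-nodes-xxxx are placeholders)
aws ec2 authorize-security-group-ingress \
    --group-id sg-rds-xxxx \
    --protocol tcp \
    --port 5432 \
    --source-group sg-nodes-xxxx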

amazon-web-services

amazon-rds

amazon-eks
