GCE worker node connection issue
I have set up a two-node cluster. The worker node runs in GCE. I created the instance in a new (non-default) VPC that allows all traffic.
I used the default subnets/regions when defining the new VPC.
I can deploy a basic pod, and I can deploy a basic service, but for some reason I cannot reach the service from the other node, where the pod is not running.
k get svc -o wide
NAME           TYPE        CLUSTER-IP      EXTERNAL-IP   PORT(S)   AGE    SELECTOR
basicservice   ClusterIP   10.109.212.68   <none>        80/TCP    131m   type=webserver
kubernetes     ClusterIP   10.96.0.1       <none>        443/TCP   42h    <none>
I deactivated AppArmor and ufw.
Here are the iptables rules on the GCE node: iptables.txt
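In case it helps, this is the kind of check I ran against that dump: if kube-proxy is syncing rules on the node, the `iptables-save` output should contain a KUBE-SERVICES entry matching the service's ClusterIP. The rule below is a hypothetical stand-in for the real iptables.txt:

```shell
# Hypothetical excerpt standing in for the attached iptables.txt
cat > /tmp/iptables_sample.txt <<'EOF'
-A KUBE-SERVICES -d 10.109.212.68/32 -p tcp -m tcp --dport 80 -m comment --comment "default/basicservice cluster IP" -j KUBE-SVC-XXXX
EOF

# Count rules that mention the service's ClusterIP; 0 would suggest
# kube-proxy is not programming rules on that node
grep -c '10.109.212.68' /tmp/iptables_sample.txt
```

If the real dump has no such rule, I would suspect kube-proxy rather than the VPC firewall.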
Any ideas why the ClusterIP of the service is not reachable, and how these IP ranges (10.109.x.x) are defined/assigned?
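For the second question, my understanding is that ClusterIPs are allocated from the service CIDR passed to kube-apiserver via `--service-cluster-ip-range` (kubeadm defaults this to 10.96.0.0/12, which would explain both addresses above). A quick sketch checking that assumption; the CIDR here is the assumed kubeadm default, not read from my cluster:

```python
import ipaddress

# Assumed kubeadm default service CIDR; the real value can be read from the
# kube-apiserver manifest's --service-cluster-ip-range flag
service_cidr = ipaddress.ip_network("10.96.0.0/12")

# 10.96.0.1 (the kubernetes API service) and 10.109.212.68 (basicservice)
for ip in ("10.96.0.1", "10.109.212.68"):
    print(ip, ipaddress.ip_address(ip) in service_cidr)  # both print True
```

So the 10.109.x.x address is consistent with the default range; the reachability problem would then be in the node-level routing/proxying, not the allocation.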