OK, so I am new to vCenter. I recently tried to add a cluster, and now networking between my VMs seems to have stopped working, although I can still reach each VM individually from my laptop via RDP/SSH/etc.
Backstory with pertinent details: I had three servers running ESXi 6.5 Free individually, two for work-related VMs and one for personal use. Two of the servers are off most of the time; one is on 24/7. Each host had a VM Network port group added, with all VMs connecting to and using the same network (VLANs are coming). I then decided to join VMUG so I could get vCenter and manage the servers from one screen. This set up fine, the three servers were added, and everything was good.

Recently I wanted to start playing around with Ansible for provisioning VMs, and that is when I realized the hosts were not in a cluster. I looked into that and was able to create a cluster, and the system pulled in all the VMs from the one server that is on. After this, I was able to provision from Ansible and access the machines.
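(For context, a minimal pyVmomi sketch of the kind of clone-from-template call that provisioning amounts to is below. This is just an illustration, not my actual Ansible task, and the vCenter address, credentials, template name, and cluster name are all placeholders.)

```python
#!/usr/bin/env python3
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def find_by_name(content, vimtype, name):
    """Return the first inventory object of the given type with a matching name."""
    view = content.viewManager.CreateContainerView(content.rootFolder, [vimtype], True)
    try:
        return next(obj for obj in view.view if obj.name == name)
    finally:
        view.DestroyView()

# All names and credentials below are placeholders.
si = SmartConnect(host="vcenter.example.local",
                  user="administrator@vsphere.local",
                  pwd="changeme",
                  sslContext=ssl._create_unverified_context())
try:
    content = si.RetrieveContent()
    template = find_by_name(content, vim.VirtualMachine, "ubuntu-template")
    cluster = find_by_name(content, vim.ClusterComputeResource, "Cluster01")
    datacenter = content.rootFolder.childEntity[0]  # first datacenter in the inventory

    # Clone the template into the cluster's root resource pool and power it on.
    relocate = vim.vm.RelocateSpec(pool=cluster.resourcePool)
    clone_spec = vim.vm.CloneSpec(location=relocate, powerOn=True, template=False)
    task = template.Clone(folder=datacenter.vmFolder, name="test-vm-01", spec=clone_spec)
    print("Clone task started:", task.info.key)
finally:
    Disconnect(si)
```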
However, I recently tried to access a couple of machines that talk to each other and noticed they no longer communicate by hostname. Looking at vCenter, it seems like I may need to set up a Distributed Switch and move the VMs to that, rather than the VM Network port group that existed on the individual ESXi hosts.
Does this sound correct? If so, is there a recommended tutorial for how to accomplish this?
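For reference, a quick pyVmomi sketch like the one below can dump the standard-switch port groups each host exposes, to check whether every host really has an identically named VM Network. The vCenter hostname and credentials are placeholders.

```python
#!/usr/bin/env python3
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Placeholder connection details -- swap in your own vCenter and credentials.
si = SmartConnect(host="vcenter.example.local",
                  user="administrator@vsphere.local",
                  pwd="changeme",
                  sslContext=ssl._create_unverified_context())
try:
    content = si.RetrieveContent()
    # Walk every ESXi host in the inventory.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        print(host.name)
        # Standard vSwitch port groups defined on this host.
        for pg in host.config.network.portgroup:
            print(f"  {pg.spec.name} (VLAN {pg.spec.vlanId})")
    view.DestroyView()
finally:
    Disconnect(si)
```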