Motion retargeting is a promising approach for generating natural and compelling motions for non-humanoid characters and robots. However, translating human motions into semantically equivalent motions for target characters with very different morphologies is challenging due to the inherent ambiguity of the problem. This work presents a novel learning-based motion retargeting framework, Adversarial Correspondence Embedding (ACE), to retarget human motions onto robots with different body dimensions and structures. Our framework produces natural and feasible robot motions by leveraging generative adversarial networks (GANs), while preserving high-level motion semantics through an additional feature loss. In addition, we pretrain a robot motion prior that can be controlled in a latent embedding space and learn a compact correspondence from human motions to robot latent vectors. We demonstrate that the proposed framework can produce convincing retargeted motions for three robot characters: a quadrupedal robot with a manipulator, a hexapod, and a wheeled manipulator. We further evaluate our design choices through baseline comparisons and ablation studies.
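To make the loss composition described above concrete, the following is a minimal sketch, not the paper's implementation: a correspondence encoder maps human motion features to a latent vector, a frozen pretrained prior decodes that latent into a robot pose, and the generator objective combines an adversarial realism term with a semantic feature-preservation term. All dimensions, the linear stand-ins for the encoder/decoder/discriminator, and the feature extractor are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): human motion feature dim H,
# robot latent dim Z, robot pose dim R.
H, Z, R = 32, 8, 24

# Pretrained robot motion prior decoder: latent z -> robot pose.
# A fixed random linear map stands in for the learned (frozen) decoder.
W_dec = rng.normal(size=(R, Z))
decode = lambda z: np.tanh(W_dec @ z)

# Correspondence encoder (the trainable part): human features -> latent z.
W_enc = rng.normal(size=(Z, H)) * 0.1
encode = lambda h: W_enc @ h

# Stand-in discriminator: probability in (0, 1) that a pose looks "real".
W_disc = rng.normal(size=(R,)) * 0.1
discriminate = lambda x: 1.0 / (1.0 + np.exp(-(W_disc @ x)))

# Hypothetical feature extractor used by the semantic feature loss
# (e.g. end-effector or base-motion statistics); here a fixed projection.
W_feat = rng.normal(size=(4, R)) * 0.1
features = lambda x: W_feat @ x

def generator_losses(h_motion, feat_target, w_feat=1.0):
    """Compose the two generator-side terms described in the abstract:
    an adversarial (realism) loss plus a feature-preservation loss."""
    z = encode(h_motion)                        # compact latent correspondence
    pose = decode(z)                            # robot pose from the frozen prior
    l_adv = -np.log(discriminate(pose) + 1e-8)  # non-saturating GAN loss
    l_feat = np.sum((features(pose) - feat_target) ** 2)
    return l_adv + w_feat * l_feat, l_adv, l_feat
```

In a full training loop, only the encoder (and discriminator) would be updated, keeping the pretrained prior fixed so generated poses stay on the robot's feasible motion manifold.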