Most U.S. jobs are now held by women
The shift is "likely to be permanent," one expert says, as women dominate growing industries such as health care and education.
Source: Stonecom Interactive, 2020-01-10