In sociology, feminization is the shift in gender roles and sex roles in a society, group, or organization toward a focus on the feminine. It can also refer to the incorporation of women into a group or profession that was once dominated by men.[1]