According to this report, the majority of Americans believe that a father's most important role is to teach values, rather than to earn money or enforce discipline. The study frames this as a shift in perspective: fathers are now expected to share household duties more equally and to be more emotionally available to their children.
Read the report here.