Q:

What is the meaning of social ethics?

A:

Quick Answer

Social ethics are the set of rules or guidelines, based on ethical choices and values, that a society adheres to. Many of these rules are unspoken and are simply expected to be followed.


Full Answer

Social ethics are not meant to be a detailed list of rules to apply in any given situation. Rather, they act as a guide by setting the ground rules for what society deems acceptable. The welfare of society as a whole is placed ahead of the interests of any individual, which helps ensure that members of society hold one another accountable.


Related Questions

  • Q:

    Can sociology be value-free?

    A:

    Because sociology is the study of human behavior in society, and human behavior is governed by values, in one sense sociology cannot be value-free. The goal of sociologists is instead to be value-neutral, meaning they do not bring their own prejudices to their research.

  • Q:

    What is the role of a family in society?

    A:

    Although some cultural variance exists, the primary role of a family in society is to foster an environment where children learn skills, morals and values. Families provide initial socialization for children that shapes their self-worth, attitudes, values and behaviors. Families create structure and stability in the lives of family members.

  • Q:

    What are the main concepts of sociology?

    A:

    The main concepts of sociology include the history and theories used to explain and predict human behavior and interaction; the symbols, values, norms and languages of cultures and societies; and social interaction and social control. Class systems within cultures and groups are also a main concept of sociology.

  • Q:

    What is social studies?

    A:

    Social studies is a field of study that seeks to unify the diverse subjects of history, politics, economics, sociology, geography and anthropology, ultimately aiming toward a comprehensive understanding of society. It is often taught to young students as preparation for college or careers.

