Definition of organicism

organicism (noun)
  1. The treatment of society or the universe as if it were an organism.
  2. The theory that the total organization of an organism is more important than the functioning of its individual organs.
  3. The theory that disease is a result of structural alteration of organs.