culture:    Western medicine
statement:  Western medicine is formally taught in medical schools and universities to transmit knowledge.

1 similar statement
#   Concept                  Culture            Statement                                             Freq.
1   knowledge transmission   western medicine   Formally taught in medical schools and universities   1