The prevailing view among Whites is that Black culture is a separate subject Blacks can study on their own, while the general study of U.S. geography, history, culture, and institutions will reflect what White people were doing. The notion that White culture (its history and institutions) developed out of the interaction between Europeans and Africans is dismissed as radical, as Black liberation theology, or under whatever other label will keep it marginal and out of the classrooms where White children might hear it.
Yet the culture taught in our classes (the humanities, social sciences, arts, etc.) is a distortion of reality in the interest of maintaining the centrality of Whiteness in the American story.
Update 8/3/18: James Loewen's Lies My Teacher Told Me details the process by which this happens.