Oceania Definition

The term Oceania refers to the body of people whose cultures and interests are deeply rooted in the ocean. "Anyone who has lived in our region and is committed to Oceania is an Oceanian," Hau'ofa states. He opens the term up to a far larger body of people than Pasifika peoples alone, including all who consider the ocean part of their identity.
