Human learners acquire not only disconnected bits of information, but complex interconnected networks of relational knowledge. The capacity for such learning naturally depends on the architecture of the knowledge network itself. I will describe recent work assessing network constraints on the learnability of relational knowledge, and theories from statistical physics that offer an explanatory model for such constraints. I will then broaden the discussion to the generic manner in which humans communicate using systems of interconnected stimuli or concepts, from language and music to literature and science. I will describe an analytical framework for studying the information generated by a system as perceived by a biased human observer, and provide experimental evidence that this perceived information depends critically on a system's network topology. Applying the framework to several real networks, we find that they communicate a large amount of information (having high entropy) and do so efficiently (maintaining low divergence from human expectations). Moreover, we find that such efficient communication arises in networks that are simultaneously heterogeneous, with high-degree hubs, and clustered, with tightly connected modules -- the two defining features of hierarchical organization. Together, these results suggest that many real networks are constrained by the pressures of information transmission to biased human observers, and that these pressures select for specific structural features.
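The two quantities named in the abstract -- the entropy of information generated by a network and its divergence from a biased observer's expectations -- can be sketched in a toy computation. Below, information generation is modeled as a random walk on a small modular graph; the observer model (`eta`-weighted smearing of transitions one step into the future) is an illustrative assumption for this sketch, not the specific framework from the talk.

```python
from math import log2

# Toy network: two triangles (tightly connected modules) joined by a bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
adj = [[0] * n for _ in range(n)]
for u, v in edges:
    adj[u][v] = adj[v][u] = 1

deg = [sum(row) for row in adj]
two_m = sum(deg)  # twice the number of edges

# Random-walk transition matrix P and its stationary distribution pi
# (for an undirected graph, pi_i = deg_i / 2m).
P = [[adj[i][j] / deg[i] for j in range(n)] for i in range(n)]
pi = [deg[i] / two_m for i in range(n)]

# Entropy rate of the walk: H = -sum_i pi_i sum_j P_ij log2 P_ij
H = -sum(pi[i] * P[i][j] * log2(P[i][j])
         for i in range(n) for j in range(n) if P[i][j] > 0)

# Toy biased observer: expected transitions smeared one step ahead,
# Phat = (1 - eta) * P + eta * P^2  (an illustrative stand-in model).
eta = 0.2
P2 = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
Phat = [[(1 - eta) * P[i][j] + eta * P2[i][j] for j in range(n)]
        for i in range(n)]

# Perceived inefficiency: KL divergence of true transitions from expectations,
# D = sum_i pi_i sum_j P_ij log2(P_ij / Phat_ij)
D = sum(pi[i] * P[i][j] * log2(P[i][j] / Phat[i][j])
        for i in range(n) for j in range(n) if P[i][j] > 0)

print(f"entropy rate H = {H:.3f} bits, divergence D = {D:.3f} bits")
```

In this picture, a network that "communicates efficiently" is one whose structure keeps H large while keeping D small for the biased observer; comparing graph topologies under a fixed observer model reproduces the qualitative trade-off the abstract describes.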
Danielle Bassett / University of Pennsylvania