While Bill Gates, Stephen Hawking, Elon Musk, Peter Thiel, and others engaged with OpenAI debate whether AI, robots, and machines will replace humans, proponents of human-centric computing continue to extend work in which humans and machines partner in contextualized and personalized processing of multimodal data to derive actionable information. In this talk, we discuss how the maturing paradigms of semantic computing (SC) and cognitive computing (CC), complemented by the emerging paradigm of perceptual computing (PC), provide a continuum through which to exploit the ever-growing volume and diversity of data that could enhance people’s daily lives. SC and CC sift through raw data to personalize it according to context and individual user, creating abstractions that move the data closer to what humans can readily understand and apply in decision-making. PC, which interacts with the surrounding environment to collect data relevant and useful for understanding the outside world, is characterized by interpretative and exploratory activities supported by prior/background knowledge. Using the examples of personalized digital health and smart cities, we demonstrate how SC, CC, and PC form complementary capabilities that will enable the development of the next generation of intelligent systems.
Amit Sheth, "Computing for Human Experience: Semantics-Empowered Sensors, Services, Social Computing on the Ubiquitous Web," IEEE Internet Computing, 14 (1), January/February 2010.
Amit Sheth, Pramod Anantharam, Cory Henson, "Semantic, Cognitive, and Perceptual Computing: Advances toward Computing for Human Experience," IEEE Computer, March 2016. http://online.qmags.com/CMG0316/default.aspx?pg=67&mode=2#pg67&mode2
Amit Sheth, "Internet of Things to Smart IoT Through Semantic, Cognitive, and Perceptual Computing," IEEE Intelligent Systems, March/April 2016.