
Thomas Dean (computer scientist)

Thomas L. Dean is an American computer scientist known for his work in robot planning, probabilistic graphical models, and computational neuroscience. He was one of the first to introduce ideas from operations research and control theory to artificial intelligence. In particular, he introduced the idea of the anytime algorithm and was the first to apply the factored Markov decision process to robotics. He has authored several influential textbooks on artificial intelligence.

Academic and Scientific Contribution
Artificial Intelligence

Control

Dean and Wellman's book Planning and Control helped introduce ideas from operations research and control theory into artificial intelligence.

Anytime Algorithms

The term anytime algorithm was coined by Dean and Boddy in the late 1980s. Their work in this area focused on deliberation scheduling applied to time-dependent planning problems. Deliberation scheduling is the explicit allocation of computational resources to tasks (in most cases anytime algorithms) so as to maximize the total value of an agent's computation. Time-dependent planning problems are planning problems in which the time available for responding to events varies from situation to situation. In addition to defining the basic concepts, Dean and Boddy provided theoretical analyses and applications in robotics and operations research.

Markov Processes

Dean played a leading role in the adoption of the framework of Markov decision processes (MDPs) as a foundational tool in artificial intelligence. In particular, he pioneered the use of AI representations and algorithms for factoring complex models and problems into weakly interacting subparts to improve computational efficiency. His work in state estimation emphasized temporal causal reasoning and its integration with probabilistic graphical models. His work in control includes state-space partitioning and hierarchical methods.

AI Textbook

Working with his collaborators James Allen and Yiannis Aloimonos, specializing respectively in natural language processing and computer vision, Dean wrote one of the first modern AI textbooks to incorporate probability theory, machine learning, and robotics, placing traditional AI topics such as symbolic reasoning and knowledge representation using the predicate calculus within a broader context.

Robotics

As co-chair of the 1991 AAAI Conference, Dean organized a press event featuring mobile robots carrying trays of canapés and barely avoiding the participants.
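The defining property of an anytime algorithm is that it can be interrupted at any point and still return its best answer so far, with answer quality improving the longer it runs. As a minimal sketch of the idea (the function and numbers below are invented for illustration and are not taken from Dean and Boddy's work), a deadline-limited Newton iteration for square roots behaves this way:

```python
import time

def anytime_sqrt(x, deadline):
    """Illustrative anytime algorithm (hypothetical example): Newton
    iteration for sqrt(x) that stops at the deadline and returns its
    best estimate so far."""
    estimate = x if x > 1 else 1.0   # crude initial answer, available immediately
    for _ in range(1000):
        if time.monotonic() >= deadline:
            break                    # interrupted: return current best answer
        estimate = 0.5 * (estimate + x / estimate)  # each step refines the answer
    return estimate

# With no deliberation time the caller still gets a (rough) answer;
# with more time, a more refined one.
rough = anytime_sqrt(2.0, time.monotonic())          # deadline already reached
refined = anytime_sqrt(2.0, time.monotonic() + 0.1)  # 100 ms of deliberation
```

A deliberation scheduler in Dean and Boddy's sense would then decide how much time to allocate to each such procedure so as to maximize the total value of the agent's computation.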
The coverage on the evening news was enthusiastically positive, and in 1992 Dean and Peter Bonasso, with feedback from the robotics community, created the AAAI Robotics Competition, featuring events in which robots competed at performing tasks in home, office, and disaster-site settings. The competition was still being held in 2010.

Computational Neuroscience

Stanford Course

After starting as a research scientist at Google, Dean was appointed a consulting professor at Stanford and began teaching a course entitled Computational Models of the Neocortex. Over the next fifteen years he invited top neuroscientists from all over the world to give talks and advise students working on class projects. Several of the classes resulted in papers coauthored by students that led to research projects at Google.

Neuromancer Project

In an effort to create a team focusing on scalable computational neuroscience, Dean and his students at Stanford produced a white paper entitled Technology Prospects and Investment Opportunities for Scalable Neuroscience. Viren Jain is currently the project manager and lead scientist for the ongoing effort at Google. The resulting data on brain connectivity has been publicly released, including the 'hemibrain' connectome, a highly detailed map of neuronal connectivity in the fly brain, and the 'H01' dataset, a 1.4-petabyte rendering of a small sample of human brain tissue.

Google Brain

Dean led some of the earliest investigations into the use of neural networks at Google, work that directly led to the creation of the Google Brain project. He experimented with approaches for using hardware acceleration to overcome performance limitations in building industrial-scale web services, and collaborated with Dean Gaudet on the Google Infrastructure and Platforms Team to make the case for introducing graphics processing units (GPUs) into Google data centers.
He worked closely with Vincent Vanhoucke, who led the perception research and speech recognition quality team, to demonstrate the value of GPUs for training and deploying deep neural network architectures in the cloud, focusing on speech recognition for Google Search by Voice.
Administrative and Professional Services
University Administration

Dean served as Deputy Provost of Brown University from 2003 to 2005, as chair of Brown's Computer Science Department from 1997 to 2002, and as Acting Vice President for Computing and Information Services from 2001 to 2002. As Deputy Provost he helped develop and launch new multidisciplinary programs in genomics and the brain sciences and oversaw substantial changes in the medical school and university libraries.

Professional Leadership

Dean was named a fellow of AAAI in 1994 and a fellow of the ACM in 2009. He has served on the Executive Council of AAAI and the Computing Research Association Board of Directors. He received an NSF Presidential Young Investigator Award in 1989. He served as program co-chair for the 1991 National Conference on Artificial Intelligence and as program chair for the 1999 International Joint Conference on Artificial Intelligence, held in Stockholm. He was a founding member of the Academic Alliance of the National Center for Women and Information Technology and a former member of the IJCAI Inc. Board of Trustees.