Eliezer Yudkowsky

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Minority Report (talk | contribs) at 10:28, 23 November 2004 (rv edit by 216.27.179.248 -the mission statement is as I quoted. NPOVing the contentious descriptions of Yudkowsky as an "AI researcher" (what peer reviewed papers has he published?) etc). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Eliezer Yudkowsky (born September 11, 1979) is a self-proclaimed artificial intelligence researcher, a "Research Fellow" and board member of the Singularity Institute for Artificial Intelligence, and a dedicated Singularitarian. The mission statement of the Institute reads as follows:

SIAI was founded for the pursuit of ethically enhanced cognition by creating Friendly AI. We believe the ethical and significant enhancement of cognition will help solve contemporary problems – disease and illness, poverty and hunger – more readily than other philanthropic pursuits.

See also