The Age of Artificial Intelligence, also known as the AI Era[1][2][3][4] or the Cognitive Age,[5][6] is a historical period characterized by the rapid development and widespread integration of artificial intelligence (AI) technologies across many aspects of society, the economy, and daily life. Artificial intelligence refers to computer systems that enable machines to learn and to make decisions in pursuit of defined goals.[7]
This era is marked by significant advancements in machine learning, data processing, and the application of AI in solving complex problems and automating tasks previously thought to require human intelligence.[7][10]
British neuroscientist Karl Friston's work on the free energy principle is widely seen as foundational to the Age of Artificial Intelligence, providing a theoretical framework for developing AI systems that closely mimic biological intelligence.[11] The concept has gained traction in fields including neuroscience and technology.[12] Many specialists place the era's beginnings in the early 2010s, coinciding with major breakthroughs in deep learning and the growing availability of big data, optical networking, and computational power.[13][14]