The article discusses how a 2,400-year-old problem illustrates how close ChatGPT's AI has come to human-like intelligence, placing the progress of artificial intelligence in its historical context.
The article explores the evolution of the human brain, emphasizing the role of the neocortex and social complexity in the development of advanced cognition, and arguing that brain growth was driven by survival challenges and social demands rather than by selection for intelligence itself.
Meta's AI chief Yann LeCun stated that large language models (LLMs) like ChatGPT and Google's Gemini will not achieve human-level intelligence, citing their limited grasp of logic, lack of persistent memory, and inability to plan hierarchically. LeCun emphasized the need for new AI approaches, such as "world modeling," which he estimates could take up to 10 years to yield human-level AI. This comes as Meta continues to invest heavily in AI despite investor concerns and significant market value losses.
Elon Musk predicts that artificial intelligence will surpass human intelligence next year, raising concerns about the potential impact of advanced AI on society.
A new study published in Science Advances explores the relationship between brain size, cognitive function, and energy consumption. Large brains are rare in animals because of their high energy costs, but the study suggests that size alone does not determine intelligence. The researchers found that the neocortex, responsible for higher cognitive functions, demands significantly more energy than other parts of the brain, implying that neocortical expansion during human evolution drove up energy consumption and that the brain's connectivity and circuitry play a crucial role in human intelligence. The study underscores the importance of understanding the link between energy expenditure and cognition in the evolution of human intelligence.
The size of the human brain has long been associated with intelligence, but recent research suggests that brain size alone is not the determining factor. Instead, changes in the brain's wiring diagram, the shapes of neurons, and gene expression play crucial roles in human cognition. Evidence from studies on species with small brains, such as Homo floresiensis and Homo naledi, challenges the notion that larger brains are necessary for complex behaviors. The human brain's unique connectivity patterns and gene expression contribute to its cognitive abilities, but further research is needed to fully understand how these factors interact to shape human behavior.
Neural networks have achieved a breakthrough in capturing a critical aspect of human intelligence, according to a study published in the journal Nature. Historically, neural networks have been criticized for their inability to combine known concepts in new ways, a capacity called "systematic compositionality." However, researchers have now developed a method called meta-learning for compositionality (MLC) that allows neural networks to practice applying different sets of rules to newly learned words, resulting in performance that matches or exceeds that of humans. While the model still has limitations in generalization, the study represents a significant step forward in training networks to be fully compositional.
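To make the idea of "systematic compositionality" concrete, here is a minimal sketch of the behavior being tested: once the meanings of a few primitive pseudo-words and modifier rules are known, any novel combination of them can be interpreted without ever having been seen before. The pseudo-words (`dax`, `wif`, `lug`) and the tiny grammar below are hypothetical, in the spirit of SCAN-style tasks; this is an illustration of the target behavior, not the study's actual MLC model.

```python
# Toy grammar: a handful of known primitives and modifier rules.
PRIMITIVES = {"dax": "RED", "wif": "GREEN", "lug": "BLUE"}  # pseudo-word -> symbol
MODIFIERS = {"twice": 2, "thrice": 3}                        # modifier -> repeat count

def interpret(phrase: str) -> list[str]:
    """Compositionally interpret a phrase like 'dax twice lug'.

    Each primitive maps to a symbol; an immediately following modifier
    repeats that symbol. Novel combinations are handled by rule, not memory.
    """
    tokens = phrase.split()
    output: list[str] = []
    i = 0
    while i < len(tokens):
        symbol = PRIMITIVES[tokens[i]]
        count = 1
        if i + 1 < len(tokens) and tokens[i + 1] in MODIFIERS:
            count = MODIFIERS[tokens[i + 1]]
            i += 1  # consume the modifier token
        output.extend([symbol] * count)
        i += 1
    return output

# A combination never enumerated anywhere above is still interpretable:
print(interpret("wif thrice"))      # ['GREEN', 'GREEN', 'GREEN']
print(interpret("dax twice lug"))   # ['RED', 'RED', 'BLUE']
```

A classical symbolic interpreter like this is compositional by construction; the contribution of the Nature study is that a neural network, trained via meta-learning over many such rule systems, can exhibit the same behavior on held-out combinations.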
The hype surrounding AI is growing, with Microsoft Research claiming that GPT-4 is a nascent example of artificial general intelligence (AGI). However, AI experts are skeptical of labeling algorithms "machine intelligence" and of attributing consciousness to them. The distinction between human and machine intelligence is becoming crucial as companies train models on ever more data and researchers look for emergent capabilities. While benchmark testing helps researchers gauge improvement, the ability to pass the bar exam does not mean an algorithm is sentient. The physical world remains hard to navigate, and robots succeed only at narrowly defined tasks.