🤖 AI Summary
A recent analysis revealed that training OpenAI's GPT-4 model consumed approximately 50 GWh of electricity — roughly 3,170 times the 0.015768 GWh (about 2.4 kWh per day) a human typically consumes while growing up to the age of 18. This stark comparison highlights not only the substantial energy demands of advanced AI models but also underscores the current limitations of such models in achieving true general intelligence. Despite GPT-4's capabilities in specific tasks, it lacks the adaptable, efficient functioning of human cognition; for instance, it cannot perform everyday tasks like driving a car.
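The comparison can be reproduced directly from the figures stated above; a minimal sketch (the 2.4 kWh/day intake is the value implied by the article's 18-year total, not a figure the article states explicitly):

```python
# Reproduce the article's energy comparison from its stated figures.
GPT4_TRAINING_GWH = 50.0     # reported GPT-4 training energy
HUMAN_18YR_GWH = 0.015768    # reported human energy use over 18 years

# Per-day intake implied by the 18-year total (GWh -> kWh, 365-day years)
kwh_per_day = HUMAN_18YR_GWH * 1e6 / (365 * 18)

# Ratio of training energy to one human's 18-year energy use
ratio = GPT4_TRAINING_GWH / HUMAN_18YR_GWH

print(f"Implied human intake: {kwh_per_day:.1f} kWh/day")  # 2.4 kWh/day
print(f"Training/human ratio: {ratio:,.0f}x")
```

The exact ratio is about 3,171; the article's "3,170 times" figure appears to truncate rather than round.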
This finding is significant for the AI/ML community as it emphasizes that merely scaling existing models will not suffice to achieve Artificial General Intelligence (AGI). The energy-intensive nature of training large models like GPT-4 indicates that new fundamental breakthroughs are essential for developing AI systems that can operate more like humans, suggesting a need for innovation beyond current methodologies in AI development.